Solving brain dynamics gives rise to flexible machine-learning models

Published in Artificial Intelligence.

MIT CSAIL researchers solve a differential equation behind the interaction of two neurons through synapses to unlock a new type of speedy and efficient AI algorithm.

Last year, MIT researchers announced that they had built “liquid” neural networks, inspired by the brains of small species: a class of flexible, robust machine learning models that learn on the job and can adapt to changing conditions, suiting them to real-world safety-critical tasks like driving and flying. The flexibility of these “liquid” neural nets made them a strong fit for our connected world, yielding better decision-making on many tasks involving time-series data, such as brain and heart monitoring, weather forecasting, and stock pricing.

But these models become computationally expensive as their number of neurons and synapses increases, and they require clunky computer programs to solve their underlying, complicated math. Like many physical phenomena, this math becomes harder to solve with scale: the solver must compute many small steps to arrive at a solution.
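To make the cost concrete, here is a minimal sketch of why step-by-step solving is expensive. It simulates a single liquid-style neuron whose state follows an ordinary differential equation, using an explicit Euler solver that must take many small time steps. The ODE form, the nonlinearity `f`, and the parameters `tau`, `A`, `dt`, and `steps` are illustrative assumptions, not the researchers' exact model.

```python
import numpy as np

def f(x, I, w=1.0, b=0.0):
    # Illustrative nonlinearity: a sigmoid of the input plus the
    # neuron's own state (stands in for synaptic activation).
    return 1.0 / (1.0 + np.exp(-(w * I + b + x)))

def simulate(x0, I, tau=1.0, A=1.0, dt=0.01, steps=1000):
    # Integrate dx/dt = -x/tau + f(x, I) * (A - x) with explicit Euler.
    # Accuracy demands a small dt, so reaching time t = steps * dt
    # requires `steps` sequential updates -- the cost the article
    # describes. Per-neuron work scales with the number of steps.
    x = x0
    for _ in range(steps):
        dxdt = -x / tau + f(x, I) * (A - x)
        x = x + dt * dxdt  # one small step; thousands are needed
    return x

x_final = simulate(x0=0.0, I=0.5)
```

With a network of thousands of such neurons, this inner loop runs for every neuron at every step, which is why a closed-form solution (computing the state directly, without stepping) is such a significant speedup.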