Cryonics Revival Scenarios & Potential Roadmaps & Hypotheses

A.I. Is Getting Better at Mind-Reading

Published in Artificial Intelligence, Brain/Neurology.

In a recent experiment, researchers used large language models to translate brain activity into words.

Scientists recorded fMRI data from three participants as they listened to 16 hours of narrative stories, using the recordings to train a model that maps between brain activity and semantic features capturing the meanings of phrases.
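
The mapping described above can be thought of as a voxelwise encoding model: predict the brain response a phrase evokes from a semantic representation of that phrase. Below is a minimal sketch of that idea using ridge regression and synthetic stand-in data; the dimensions, variable names, and data are illustrative assumptions, not values from the study.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_phrases = 2000   # phrases heard across the stories (illustrative)
embed_dim = 768    # dimensionality of the semantic features (illustrative)
n_voxels = 5000    # number of recorded voxels (illustrative)

# Stand-ins for real data: semantic features of each phrase and a noisy,
# linearly related brain response.
phrase_features = rng.standard_normal((n_phrases, embed_dim))
true_weights = 0.1 * rng.standard_normal((embed_dim, n_voxels))
brain_response = phrase_features @ true_weights + rng.standard_normal((n_phrases, n_voxels))

X_train, X_test, y_train, y_test = train_test_split(
    phrase_features, brain_response, test_size=0.2, random_state=0
)

# Ridge regression is a common choice for voxelwise encoding models:
# one weight matrix predicts every voxel's response from the features.
encoder = Ridge(alpha=10.0)
encoder.fit(X_train, y_train)
print("held-out R^2:", encoder.score(X_test, y_test))
```

Regularized linear regression is used here only because it is the standard workhorse for this kind of many-voxel prediction problem; the study's actual feature extraction and fitting pipeline may differ.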

Think of the words whirling around in your head: that tasteless joke you wisely kept to yourself at dinner; your unvoiced impression of your best friend’s new partner. Now imagine that someone could listen in.

On Monday, scientists from the University of Texas at Austin took another step in that direction. In a study published in the journal Nature Neuroscience, the researchers described an A.I. that could translate the private thoughts of human subjects by analyzing fMRI scans, which measure the flow of blood to different regions in the brain.

Already, researchers have developed language-decoding methods to pick up the attempted speech of people who have lost the ability to speak, and to let paralyzed people write just by thinking of writing. But the new language decoder is one of the first that does not rely on implants. In the study, it was able to turn a person’s imagined speech into actual speech and, when subjects were shown silent films, it could generate relatively accurate descriptions of what was happening onscreen.
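
One way a decoder like this can work without reading words directly off the scan is to propose candidate phrases, predict the brain activity each would evoke with a trained encoding model, and keep the candidate whose prediction best matches the observed pattern. The sketch below illustrates that candidate-scoring idea only; the `featurize` stub and all names are assumptions for illustration, and it reuses the `encoder` fit in the earlier sketch rather than the study's actual pipeline.

```python
import numpy as np

def featurize(phrase: str, embed_dim: int = 768) -> np.ndarray:
    """Stand-in for a language-model embedding of a candidate phrase."""
    seed = abs(hash(phrase)) % (2**32)
    return np.random.default_rng(seed).standard_normal(embed_dim)

def decode(observed_activity: np.ndarray, candidates: list[str], encoder) -> str:
    """Pick the candidate whose predicted brain activity best matches the scan."""
    best_phrase, best_score = None, -np.inf
    for phrase in candidates:
        predicted = encoder.predict(featurize(phrase)[None, :])[0]
        # Score by correlating the predicted and observed voxel patterns.
        score = np.corrcoef(predicted, observed_activity)[0, 1]
        if score > best_score:
            best_phrase, best_score = phrase, score
    return best_phrase

# Example (using the `encoder` and a held-out scan from the sketch above):
# guess = decode(y_test[0], ["the dog ran home", "she opened the door"], encoder)
```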

https://www.nytimes.com/2023/05/01/science/ai-speech-language.html