Thoughts can be read by recording brain activity with functional magnetic resonance imaging (fMRI), as scientists from the University of Texas at Austin (USA) have shown by successfully applying the technique to a group of volunteers.

Although for now the technique only works if the person being analyzed agrees to cooperate, the authors of the research call for rules to protect the mental privacy of citizens before more sophisticated mind-reading techniques are developed.

“We hope that this technology will help people who have lost the ability to speak due to injuries or diseases such as ALS,” said Jerry Tang, first author of the research, at a press conference on Thursday. But “no one’s brain should be decoded without their cooperation; (…) it is important to enact policies that protect mental privacy”.

The research, a collaboration between specialists in neuroscience and computing, was carried out in two phases. First, a computer system was taught how language is processed in the brain. To do this, the brain activity of two men aged 23 and 36 and a woman aged 26 was recorded with functional magnetic resonance imaging while they listened to narratives over the course of 16 hours.

To train the computer system, the fMRI data were combined with a GPT artificial-intelligence language model, which generates sequences of words by estimating which word is most likely to come next.
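That next-word mechanism can be sketched in a few lines of Python. The snippet below is a minimal illustration using the publicly available GPT-2 model through the Hugging Face transformers library; it is an assumed stand-in, not the language model actually used by the Texas team.

```python
# Minimal sketch of next-word prediction, the mechanism the article
# attributes to GPT-style language models. GPT-2 and the Hugging Face
# `transformers` library are illustrative assumptions, not the exact
# system used in the study.
import torch
from transformers import GPT2LMHeadModel, GPT2Tokenizer

tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
model = GPT2LMHeadModel.from_pretrained("gpt2")
model.eval()

prompt = "I don't have my driver's"
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (batch, sequence, vocabulary)

# Probability distribution over the vocabulary for the next token
probs = torch.softmax(logits[0, -1], dim=-1)
top = torch.topk(probs, k=5)
for p, token_id in zip(top.values, top.indices):
    print(f"{tokenizer.decode([int(token_id)])!r}  p={float(p):.3f}")
```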

In the second phase of the research, once the computer system had been trained, the researchers tested whether it could interpret the thoughts of the same three volunteers. They were asked to imagine telling a story without saying it aloud, to listen to new narratives, and to watch videos without sound.

While they carried out these activities, their brain activity was again recorded with functional magnetic resonance imaging. From these recordings, the computer system decoded the thoughts of the three volunteers.
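The decoding described here suggests a propose-and-score loop: the language model suggests candidate word sequences, and the system keeps whichever candidates best explain the recorded brain activity. The sketch below is a toy illustration of that idea; every function and value in it is a hypothetical stand-in, not code from the study.

```python
# Simplified sketch of the decoding idea as the article describes it:
# a language model proposes candidates, and each candidate is scored by
# how closely the brain activity it would be expected to evoke matches
# the activity actually recorded. All functions here are toy stand-ins.
import numpy as np

rng = np.random.default_rng(0)
VOCAB = ["she", "hasn't", "started", "learning", "to", "drive", "yet"]

def propose_continuations(seq, k=3):
    # Stand-in for GPT proposing likely next words given the words so far
    return list(rng.choice(VOCAB, size=k, replace=False))

def predict_brain_response(seq):
    # Stand-in for a trained encoding model: maps a candidate word
    # sequence to the fMRI activity it would be expected to produce
    return rng.standard_normal(100)

def decode_step(candidates, observed_response, beam_width=5):
    # Keep the candidates whose predicted response best matches the scan
    scored = []
    for seq in candidates:
        for word in propose_continuations(seq):
            extended = seq + [word]
            predicted = predict_brain_response(extended)
            score = -np.linalg.norm(predicted - observed_response)
            scored.append((score, extended))
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [seq for _, seq in scored[:beam_width]]

observed = rng.standard_normal(100)  # one (toy) fMRI measurement
beams = [["she"]]
for _ in range(4):
    beams = decode_step(beams, observed)
print(beams[0])  # best candidate sequence after four decoding steps
```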

The results of the project were presented yesterday in the journal Nature Neuroscience. “We have seen that the decoder can predict what the user is imagining or seeing” even if they do not express it in words, emphasized Jerry Tang.

Previous research had converted brain signals into language by surgically implanting electrodes in the brain. Although some patients have partially recovered the ability to communicate with this technique, it is an invasive intervention that cannot be applied on a large scale.

The new research is the first to decode language non-invasively. According to the director of the research, Alexander Huth, the advance has now become possible thanks, on the one hand, to the ability to process a larger amount of brain data than in the past; and on the other, to the development of language models such as GPT.

The system does not reconstruct the exact sentence that the volunteers have in mind, but rather the idea. “For example, when the user heard the phrase ‘I don’t have my driver’s license yet’, the decoder predicted ‘she hasn’t started learning to drive yet’,” explained Tang. What the system captures is a thought “deeper than language that becomes language”, clarified Huth at the press conference.
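One way to see why those two sentences count as the same idea is to compare them with sentence embeddings, which score meaning rather than exact wording. The snippet below is a hedged illustration; the sentence-transformers library and model choice are illustrative assumptions, not tools used in the study.

```python
# Hedged illustration of "same idea, different words" in Tang's example,
# using sentence embeddings to measure gist similarity. The library and
# model here are illustrative assumptions, not part of the study.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")
heard = "I don't have my driver's license yet"
decoded = "she hasn't started learning to drive yet"

embeddings = model.encode([heard, decoded], convert_to_tensor=True)
similarity = util.cos_sim(embeddings[0], embeddings[1]).item()
print(f"cosine similarity: {similarity:.2f}")  # close to 1 = same gist
```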

Although the experiments were conducted only in English, “there is no reason to think that it would not work in other languages; representations [of language] in the brain are shared between languages,” Huth added.