Three volunteers agreed to give up 16 hours of their time to lie inside a magnetic resonance imaging (MRI) machine while listening to podcast stories. What happened next has been described by neural engineers as a fascinating feat. So is mind-reading another frontier about to be conquered?

A brain scan can potentially be used to read people's minds, a new study by scientists at the University of Texas at Austin has shown. For this purpose, a semantic decoder model resembling those that power OpenAI's ChatGPT was developed, according to research published this month in Nature Neuroscience.

The study sought to develop a noninvasive way to aid people unable to physically speak, such as those recovering from a stroke. In other words, no implants are required: brain activity is measured using a functional magnetic resonance imaging (fMRI) scanner. First, however, the decoder had to be trained, with a volunteer placed in the scanner and left there listening to several hours of podcasts.

© Photo : Nolan Zunk/University of Texas. Alex Huth (left), Shailee Jain (center) and Jerry Tang (right) preparing to collect brain activity data in the Biomedical Imaging Center at The University of Texas at Austin.

The scans of the three participants tracked changes in blood flow in the brain, according to the leaders of the research, Alexander Huth, an assistant professor of neuroscience and computer science, and Jerry Tang, a doctoral student in computer science at the University of Texas at Austin.

Under the important precondition that the participant in the study is willing to have their thoughts decoded, the machine is able to generate text from brain activity, albeit not "word-for-word".

© Photo : UNIVERSITY OF TEXAS AT AUSTIN. Decoder predictions from brain recordings collected while a user listened to four stories.
"For a noninvasive method, this is a real leap forward compared to what’s been done before, which is typically single words or short sentences. We’re getting the model to decode continuous language for extended periods of time with complicated ideas," Huth said.
The researchers weighed in on concerns regarding the potential misuse of this technology, explaining that the decoder produced good results only when "cooperative" volunteers were involved.
"We take very seriously the concerns that it could be used for bad purposes and have worked to avoid that. We want to make sure people only use these types of technologies when they want to and that it helps them," Tang said.
Following the completion of the research, which was supported by the Whitehall Foundation, the Alfred P. Sloan Foundation and the Burroughs Wellcome Fund, Alexander Huth and Jerry Tang filed a PCT patent application.