AUSTIN (KXAN) — Can artificial intelligence read your mind? A new study from the University of Texas at Austin shows how this is possible.
The new brain decoding study shows how AI can translate someone’s brain activity into a stream of text while they listen to a story or imagine telling a story.
Jerry Tang, a UT doctoral student who worked on the decoding system, said the study looked at language decoding, specifically how brain scans were used to predict the words a user was hearing or imagining.
“We found that the decoder predictions could capture the gist of what the users were hearing,” Tang said.
The brain activity decoder aims to help people who are mentally conscious but cannot speak.
“The ultimate goal of this field is to help restore communication to people who have lost the ability to speak due to injuries like strokes or diseases like ALS,” Tang said.
This decoder differs from other systems because it is non-invasive and doesn’t require surgical implants. The system uses functional MRI, or fMRI, to collect brain scans from outside the skull.
Tang said having a non-invasive brain decoder would make the technology more accessible to more people and help with a wider range of language impairments.
The team’s next goals include adapting the decoding approach to smaller, cheaper portable devices, since fMRI machines are large and expensive. Tang said the team is also interested in working with potential patients and creating interfaces that fit their needs.
More information about the system was published Monday in the journal Nature Neuroscience.