Mind-Reading Computer Instantly Decodes People’s Thoughts
A new computer program can decode people’s thoughts almost in real time, new research shows.
Researchers can predict what people are seeing based on the electrical signals coming from electrodes implanted in their brains, and this decoding happens within milliseconds of someone first seeing the image, the scientists found.
The new results could one day help people who cannot speak, or who have trouble communicating, express their thoughts, Rajesh Rao, a neuroscientist at the University of Washington in Seattle, said in a statement.
“Clinically, you could think of our result as a proof of concept toward building a communication mechanism for patients who are paralyzed or have had a stroke and are completely locked in,” Rao said.
In recent years, scientists have made tremendous strides in decoding human thoughts. In a 2011 study, researchers translated brain waves into the movie clips people were watching at the time. In 2014, two scientists transmitted thoughts to each other using a brain-to-brain link. And other studies have shown that computers can “see” what people are dreaming about, using only their brain activity.
Rao and his colleagues wanted to see if they could further this effort. They asked seven people with severe epilepsy, who had already undergone surgery to implant electrodes into their temporal lobes, if they would mind having their thoughts decoded. (The patients had the electrodes implanted for a single week so that doctors could pinpoint where the seizures originated within the temporal lobe, which is a common source of seizures, the researchers said.)
“They were going to get the electrodes no matter what; we were just giving them additional tasks to do during their hospital stay while they are otherwise just waiting around,” said study co-author Dr. Jeff Ojemann, a neurosurgeon at the University of Washington Medical Center in Seattle.
The temporal lobe is also the brain region responsible for processing sensory input, including recognizing objects that a person sees.
Rao, Ojemann and their colleagues had the participants watch a computer screen as several images briefly flickered by. The images included pictures of faces and houses, as well as blank screens, and the subjects were told to stay alert and watch for an image of an upside-down house.
At the same time, the electrodes were hooked up to a powerful computer program that analyzed brain signals 1,000 times a second, determining what brain signals looked like when someone was viewing a house versus a face. For the first two-thirds of the images, the computer program got a label, essentially telling it, “This is what brain signals look like when someone views a house.” For the remaining one-third of the pictures, the computer was able to predict, with 96 percent accuracy, what the person actually saw, the researchers reported Jan. 21 in the journal PLOS Computational Biology. What’s more, the computer accomplished this task within 20 milliseconds of the instant the person looked at the object.
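The train-then-test procedure described above can be sketched in code. The example below is a hypothetical illustration, not the researchers' actual method: it generates synthetic "electrode" readings (random noise around made-up per-class means, standing in for real brain signals), trains a simple nearest-centroid classifier on the first two-thirds of the labeled samples, and then predicts "house" versus "face" for the held-out final third.

```python
# Hypothetical sketch of the study's approach: train a classifier on labeled
# brain-signal samples for two-thirds of the images, then predict labels for
# the remaining third. The signals here are synthetic, not real recordings.
import random

random.seed(0)

def synthetic_signal(label, n_channels=8):
    """Fake electrode snapshot: each class has its own (made-up) mean response."""
    base = 1.0 if label == "house" else -1.0
    return [base + random.gauss(0, 0.5) for _ in range(n_channels)]

# Build a labeled dataset of 300 simulated "viewings".
labels = [random.choice(["house", "face"]) for _ in range(300)]
signals = [synthetic_signal(lab) for lab in labels]

# Train on the first two-thirds: compute a mean signal (centroid) per class.
split = 2 * len(signals) // 3
centroids = {}
for lab in ("house", "face"):
    rows = [s for s, l in zip(signals[:split], labels[:split]) if l == lab]
    centroids[lab] = [sum(col) / len(rows) for col in zip(*rows)]

def predict(signal):
    """Assign the class whose centroid is nearest in squared Euclidean distance."""
    return min(centroids, key=lambda lab: sum(
        (a - b) ** 2 for a, b in zip(signal, centroids[lab])))

# Evaluate on the held-out final third.
held_out = list(zip(signals[split:], labels[split:]))
correct = sum(predict(s) == l for s, l in held_out)
accuracy = correct / len(held_out)
print(f"held-out accuracy: {accuracy:.0%}")
```

Because the synthetic classes are well separated, this toy classifier scores highly on its held-out third; the real study faced far noisier signals, which makes its 96 percent figure notable.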