AI can read our minds and reproduce images we THINK of, new breakthrough test reveals
Test subjects were hooked up to electroencephalography (EEG) equipment by neuroscientists at the University of Toronto Scarborough; the full findings of the study were published in the journal eNeuro.
Adrian Nestor, one of the co-authors of the study, has previously succeeded in reconstructing facial images from functional magnetic resonance imaging (fMRI) data.
Speaking of the results, Professor Nestor said: “What’s really exciting is that we’re not reconstructing squares and triangles but actual images of a person’s face, and that involves a lot of fine-grained visual detail.
“The fact we can reconstruct what someone experiences visually based on their brain activity opens up a lot of possibilities.
“It unveils the subjective content of our mind and it provides a way to access, explore and share the content of our perception, memory and imagination.”
The EEG method was developed by Dan Nemrodov, a postdoctoral fellow at Professor Nestor’s lab.
He said: “When we see something, our brain creates a mental percept, which is essentially a mental impression of that thing.
“We were able to capture this percept using EEG to get a direct illustration of what’s happening in the brain during this process.
“fMRI captures activity at the time scale of seconds, but EEG captures activity at the millisecond scale.
“So we can see with very fine detail how the percept of a face develops in our brain using EEG.”
Using the equipment, the researchers were able to record the test subjects’ brain activity as they were shown images of faces.
Specially designed software was then used to digitally recreate the images from the recorded brain activity.
The resulting reconstructions were found to closely match the faces the volunteers had observed.
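In broad strokes, decoding of this kind can be thought of as a learned mapping from brain-signal features to image pixels. The snippet below is a hypothetical toy sketch of that general idea using synthetic data and a simple ridge-regularised linear model; it is an illustration only, not the study's actual method.

```python
# Toy sketch only: a linear decoder from synthetic "EEG features" to image
# pixels. All data is randomly generated; the real study's pipeline is far
# more sophisticated.
import numpy as np

rng = np.random.default_rng(0)

n_trials, n_eeg_features, n_pixels = 200, 64, 16 * 16  # 16x16 toy images

# Synthetic stand-ins: the EEG features recorded on each trial, and the
# face image shown on that trial (generated via a hidden linear mapping).
true_mapping = rng.normal(size=(n_eeg_features, n_pixels))
eeg = rng.normal(size=(n_trials, n_eeg_features))
images = eeg @ true_mapping + 0.1 * rng.normal(size=(n_trials, n_pixels))

# Fit a ridge-regularised linear decoder: images ≈ eeg @ W.
lam = 1.0
W = np.linalg.solve(eeg.T @ eeg + lam * np.eye(n_eeg_features),
                    eeg.T @ images)

# "Reconstruct" the image for an unseen trial and compare it with the
# image that trial actually corresponds to.
test_eeg = rng.normal(size=(1, n_eeg_features))
test_image = test_eeg @ true_mapping
reconstruction = test_eeg @ W
corr = np.corrcoef(reconstruction.ravel(), test_image.ravel())[0, 1]
print(f"pixel-wise correlation: {corr:.2f}")
```

With clean synthetic data like this, the decoded pixels correlate strongly with the "shown" image; real EEG decoding is far noisier and typically works on learned face-space features rather than raw pixels.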
Past breakthroughs in this area have relied on fMRI scans that monitor changes in the brain’s blood flow, rather than electrical activity.
The AI used in the test could provide a means of communication for people who cannot speak, and could also aid the development of thought-controlled prosthetics.