MinD-Vis could one day be integrated into virtual reality headsets, the idea being that users could control their presence in the metaverse with their minds.
Researchers in Singapore have developed an AI system that interprets brain activity patterns and generates an image of what a person is looking at.
The research team collected brain scan datasets from 58 participants, who each viewed between 1,200 and 5,000 different images of animals, food, buildings and human activities while undergoing MRI scans.
Each viewing lasted nine seconds, with a break in between.
The mind-reading AI, called MinD-Vis, then matched the brain scans with the images to generate a personalised AI model for each participant.
These models allow computers to “read” thoughts and recreate the scenes a person is seeing.
“It can understand your brain activity in the same way that ChatGPT understands the natural languages of humans. And then it will translate your brain activity into a language that Stable Diffusion [an open-source AI which generates images from text] can understand,” said Jiaxin Qing, one of the study’s lead researchers and a PhD student at the Chinese University of Hong Kong (CUHK).
According to Qing, the decoded images consistently matched those shown to participants.
Li Ruilin, one of the participants, was impressed by the brain decoding.
“This brain decoding, like using brain signals to generate natural modalities, is very interesting and exciting work. I am also interested in what happened in my brain, what outputs my brain can give, and what I am thinking,” Ruilin said.
The research team says that this technique can be applied to help people in the future.
“Say for some patients without motor ability, maybe we can help them control their robots … [or] communicate with others using their thoughts instead of speech,” said Chen Zijiao at the National University of Singapore’s School of Medicine.
Chen said the technology could also be integrated into virtual reality headsets, so that users could control their presence in the metaverse with their minds rather than with physical controllers.
The researchers say their mind-reading AI has become possible thanks to the increased availability of MRI datasets and recent advances in the computational power needed to sift through the data.
However, according to the team, it will be several years before MinD-Vis can read the minds of the general public.
“We’re trying to test the possibility right now, but I would say, in terms of the datasets available right now, the computational power we have, as well as the enormous variation or inter-individual differences in our brain anatomy, it’s going to be very, very difficult,” said Juan Helen Zhou, associate professor at the National University of Singapore.
There is also the risk that data learned by the AI could be shared without consent. The researchers also acknowledged that the relative lack of legislation governing AI research may be hindering progress.
“Privacy concerns are the first important thing, and then people may be concerned about whether the information we provide here can be evaluated or shared without prior consent. So, to address this, we have to have very strict guidelines, ethics and laws on how to protect privacy,” Zhou said.
Watch the video in the media player above for more on this story.
video editor • Rosalyn Min