by Cyrus Moulton, Northeastern University
Srinivas Sridhar demonstrates a patented headset that can scan a brain while listening to a podcast in the Egan Research Center on Sept. 14, 2023. Credit: Matthew Modoono/Northeastern University
The device pairs a smartphone mounted in a virtual reality headset with a brain sensor, and it plays podcasts.
But the NeuroVEP's purpose is not to entertain—its purpose is to diagnose vision and related neurological problems by converting brain signals into a map representing which parts of a user's visual field may have decreased function.
"After heavy data processing of brain electrical signals, we can deduce what the vision looks like for a person," Srinivas Sridhar, inventor of the NeuroVEP, says. "From that, we can deduce where the problem is."
Sridhar, a university distinguished professor of physics, bioengineering and chemical engineering at Northeastern University, associate research scientist Craig Versek, professor of psychology Peter Bex and data scientist Ali Banijamali received a patent for the device this summer, and the team has formed a spin-out company called NeuroFieldz to market the technology.
The device grew out of the team's work under a DARPA challenge grant to develop a neuromonitoring technique called electric field encephalography, and it evolved into a project to measure Visual Evoked Potentials, or VEP, for eye-brain diagnostics.
"What it is, is when light strikes the eye, the eye converts them to electrical signals which go to the brain, and which the brain then tries to decode what it's seeing," Sridhar explains. "This process is really central to our vision, perception and recognition."
By isolating and analyzing these brain signals, the NeuroVEP can turn them into a map of where the user's visual field responds, or fails to respond, to targeted stimuli, then compare that map against a baseline model of healthy vision using machine learning techniques to identify potential problems.
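To make the idea concrete, here is a minimal sketch in Python of how per-sector VEP responses could be compared against a baseline model of healthy vision. The grid layout, baseline values and threshold are illustrative assumptions, not parameters of the actual NeuroVEP system.

```python
# Illustrative sketch only: map per-sector VEP response strength onto a visual
# field grid and flag sectors that fall well below a healthy-vision baseline.
# The 8x8 sector grid, baseline amplitudes and 60% threshold are assumptions.
import numpy as np

# Hypothetical measured VEP amplitudes (microvolts), one value per visual
# field sector, evoked by stimuli targeting each sector.
measured = np.random.default_rng(0).normal(loc=5.0, scale=1.0, size=(8, 8))

# Hypothetical baseline model of healthy vision: expected amplitude per sector.
baseline = np.full((8, 8), 5.0)

# Flag sectors whose response falls below 60% of the healthy baseline.
relative_response = measured / baseline
suspect = relative_response < 0.6

print("Sectors with possibly decreased function:")
for row, col in zip(*np.nonzero(suspect)):
    print(f"  sector ({row}, {col}): {relative_response[row, col]:.0%} of baseline")
```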
So what does this have to do with a virtual reality headset, a smartphone and podcasts?
Well, the headset is equipped with a sensor that monitors the brain signals the user produces, using artificial intelligence and machine learning to isolate the VEP. The smartphone provides the visual stimuli that evoke those brain signals, and the podcast is an option for keeping the user's attention.
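For readers curious about the signal side, the following is a rough sketch of the classic way a visual evoked potential is pulled out of noisy brain recordings: averaging EEG epochs time-locked to stimulus onsets. The sampling rate, timing and synthetic data are assumptions for illustration, and this is not the patented, machine-learning-based pipeline.

```python
# Minimal sketch, not the NeuroVEP pipeline: isolate a visual evoked potential
# by averaging EEG epochs time-locked to stimulus onsets.
import numpy as np

fs = 500                                      # assumed sampling rate, Hz
rng = np.random.default_rng(1)
eeg = rng.normal(scale=10.0, size=fs * 60)    # 60 s of noisy single-channel EEG

# Assume the smartphone display flashes a stimulus once per second.
onsets = np.arange(0, 59) * fs
template = 5.0 * np.sin(2 * np.pi * 10 * np.arange(0, 0.1, 1 / fs))  # fake VEP
for t in onsets:
    eeg[t + 50 : t + 50 + template.size] += template  # response ~100 ms after onset

# Stimulus-locked averaging: noise cancels across trials, the evoked response remains.
window = int(0.4 * fs)                        # 400 ms epoch after each stimulus
epochs = np.stack([eeg[t : t + window] for t in onsets])
vep = epochs.mean(axis=0)

peak_ms = 1000 * np.argmax(np.abs(vep)) / fs
print(f"Estimated VEP peak latency: {peak_ms:.0f} ms after stimulus onset")
```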
"We avoid political topics for the podcasts—we want them alert, but not riled up," Sridhar says. "It's usually science stuff—that's neutral."
Sridhar says the device has two crucial features.
First, it is objective.
Compared with a standard vision test in which the user tells the ophthalmologist what they can or cannot see—and who among us has not fibbed a little during that process—the brain signals give an objective look at what the user sees, Sridhar says.
Second, it is portable.
The comparable existing technology is wheeled around on a cart, involves needle pokes and does not use machine learning to isolate the VEP.
Sridhar says that the product has been tested with roughly 200 patients through partnerships at Tufts Medical Center, Boston Children's Hospital and other clinics.
"Patients really loved our system because there are no buttons to press and all they have to do is listen to a podcast," Sridhar says.
Meanwhile, Sridhar says that clinicians love the portability, the efficiency and the objectivity of the device.
Versek says clinician feedback reported that "NeuroVEP outperformed the current standard of care cart-based system in my clinic," and that it "worked flawlessly" in testing more than 150 patients with macular degeneration.
The device has a way to go before it is available for widespread use, however, as it has to be manufactured and approved by the Food and Drug Administration.
"So far we have done the tests," Sridhar says. "Next we want to get it into (the clinicians') hands."
But he says that the technology opens many doors. The researchers envision the technology also being used for people who have had strokes or traumatic brain injury. Meanwhile, the idea that you can isolate VEP and translate brain signals into visual field function maps has many applications.
"In the bigger picture, we are some of the few people who can decode the brain and what the brain sees," Sridhar says. "I want to take this to another next level: I want to decipher what people are actually seeing, their perception and how they react—if I see two people here and recognize them, how does that happen and can I duplicate the process?"
"There are higher-level functions that could also be detected with this kind of instrument," Sridhar says. "It requires major processing, money and effort, but in principle, it's possible."
Provided by Northeastern University