by Lori Dajose, California Institute of Technology
Anatomical recording planes and behavioral tasks. a, Coronal fUS imaging planes used for monkeys P and L. The approximate fUS field of view superimposed on a coronal MRI slice. The recording chambers were placed surface normal to the skull above a craniectomy (black square). The ultrasound transducer was positioned to acquire a consistent coronal plane across different sessions (red line). The vascular maps show the mean power Doppler image from a single imaging session. Different brain regions are labeled in white text, and the labeled arrows point to brain sulci. D, dorsal; V, ventral; L, left; R, right; A, anterior; P, posterior; ls, lateral sulcus; ips, intraparietal sulcus; cis, cingulate sulcus. Anatomical labels are based upon ref. 63. b, Memory-guided saccade task. * ±1,000 ms of jitter for fixation and memory periods; ±500 ms of jitter for hold period. The peripheral cue was chosen from two or eight possible target locations depending on the specific experiment. Red square, monkey’s eye position (not visible to the monkey). NHP, nonhuman primate, that is, monkey. c, fUS-BMI algorithm. Real-time 2-Hz functional images were streamed to a linear decoder that controlled the behavioral task. The decoder used the last three fUS images of the memory period to make its prediction. If the prediction was correct, the data from that prediction were added to the training set. The decoder was retrained after every successful trial. The training set consisted of trials from the current session and/or from a previous fUS-BMI session. d, Multicoder algorithm. For predicting eight movement directions, the vertical component (blue) and the horizontal component (red) were separately predicted and then combined to form each fUS-BMI prediction (purple). e, Memory-guided BMI task. The BMI task is the same as in b except that the movement period is controlled by the brain activity (via fUS-BMI) rather than eye movements. After 100 successful eye movement trials, the fUS-BMI controlled the movement prediction (closed-loop control). During the closed-loop mode, the monkey had to maintain fixation on the center fixation cue until reward delivery. Red square, monkey’s eye position (not visible to the monkey); green square, BMI-controlled cursor (visible to the monkey). Credit: Nature Neuroscience (2023). DOI: 10.1038/s41593-023-01500-7
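The decoding scheme summarized in panels c and d can be pictured with a short sketch. The code below is a minimal illustration only, assuming scikit-learn-style linear classifiers; the class and method names are hypothetical, and the study's actual choice of linear decoder and preprocessing may differ.

# Minimal sketch of the fUS-BMI decoding scheme from panels c and d.
# Illustrative only: names are hypothetical and the study's exact linear
# decoder and preprocessing may differ.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


class FusMulticoder:
    """Predict the vertical and horizontal components of the planned movement
    with two linear classifiers, then combine them into one of eight radial
    directions (panel d)."""

    def __init__(self):
        self.vertical = LinearDiscriminantAnalysis()    # up / center / down
        self.horizontal = LinearDiscriminantAnalysis()  # left / center / right
        self.features, self.v_labels, self.h_labels = [], [], []

    @staticmethod
    def _feature_vector(memory_images):
        # Panel c: the decoder uses the last three 2-Hz fUS images of the
        # memory period, flattened into a single feature vector.
        return np.concatenate([img.ravel() for img in memory_images[-3:]])

    def add_successful_trial(self, memory_images, v_label, h_label):
        # Panel c: only data from correct predictions are added to the
        # training set, and the decoder is retrained after every such trial.
        self.features.append(self._feature_vector(memory_images))
        self.v_labels.append(v_label)
        self.h_labels.append(h_label)
        X = np.vstack(self.features)
        self.vertical.fit(X, self.v_labels)
        self.horizontal.fit(X, self.h_labels)

    def predict_direction(self, memory_images):
        # Combine the two component predictions into one fUS-BMI prediction.
        x = self._feature_vector(memory_images)[None, :]
        return self.vertical.predict(x)[0], self.horizontal.predict(x)[0]

In closed-loop use, the training set would be seeded with labeled trials before predictions begin, for example from the initial eye-movement block or from a previous session, matching the caption's description of how the training set is built.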
Brain–machine interfaces (BMIs) are devices that can read brain activity and translate that activity to control an electronic device like a prosthetic arm or computer cursor. They promise to enable people with paralysis to move prosthetic devices with their thoughts.
Many BMIs require invasive surgeries to implant electrodes into the brain in order to read neural activity. However, in 2021, Caltech researchers developed a way to read brain activity using functional ultrasound (fUS), a much less invasive technique.
Now, a new study offers a proof of concept that fUS technology can be the basis for an "online" BMI: one that reads brain activity, deciphers its meaning with decoders trained using machine learning, and uses the decoded movement intentions to accurately control a computer with minimal delay.
The study was conducted in the Caltech laboratories of Richard Andersen, James G. Boswell Professor of Neuroscience and director and leadership chair of the T&C Chen Brain–Machine Interface Center; and Mikhail Shapiro, Max Delbrück Professor of Chemical Engineering and Medical Engineering and Howard Hughes Medical Institute Investigator. The work was a collaboration with the laboratory of Mickael Tanter, director of physics for medicine at INSERM in Paris, France.
"Functional ultrasound is a completely new modality to add to the toolbox of brain–machine interfaces that can assist people with paralysis," says Andersen. "It offers attractive options of being less invasive than brain implants and does not require constant recalibration. This technology was developed as a truly collaborative effort that could not be accomplished by one lab alone."
"In general, all tools for measuring brain activity have benefits and drawbacks," says Sumner Norman, former senior postdoctoral scholar research associate at Caltech and a co-first author on the study.
"While electrodes can very precisely measure the activity of single neurons, they require implantation into the brain itself and are difficult to scale to more than a few small brain regions. Non-invasive techniques also come with tradeoffs. Functional magnetic resonance imaging [fMRI] provides whole-brain access but is restricted by limited sensitivity and resolution. Portable methods, like electroencephalography [EEG] are hampered by poor signal quality and an inability to localize deep brain function."
Ultrasound imaging works by emitting pulses of high-frequency sound and measuring how those sound vibrations echo throughout a substance, such as various tissues of the human body. Sound waves travel at different speeds through these tissue types and reflect at the boundaries between them. This technique is commonly used to take images of a fetus in utero and for other diagnostic imaging.
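As a rough illustration of this pulse-echo principle (not a detail of the study), the depth of a reflecting tissue boundary follows from the echo's round-trip delay and an assumed average speed of sound in soft tissue of about 1,540 m/s:

# Back-of-the-envelope pulse-echo arithmetic; illustrative numbers only.
SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, a typical average for soft tissue


def reflector_depth_m(round_trip_time_s: float) -> float:
    """Depth of a reflecting boundary: the pulse travels down and back."""
    return SPEED_OF_SOUND_TISSUE * round_trip_time_s / 2.0


# An echo arriving 26 microseconds after the pulse implies a boundary
# roughly 2 cm below the transducer.
print(f"{reflector_depth_m(26e-6) * 100:.1f} cm")  # -> 2.0 cm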
Because the skull itself is not permeable to sound waves, using ultrasound for brain imaging requires a transparent "window" to be installed into the skull. "Importantly, ultrasound technology does not need to be implanted into the brain itself," says Whitney Griggs (Ph.D. '23), a co-first author of the study. "This significantly reduces the chance for infection and leaves the brain tissue and its protective dura perfectly intact."
"As neurons' activity changes, so does their use of metabolic resources like oxygen," says Norman. "Those resources are resupplied through the bloodstream, which is the key to functional ultrasound." In this study, the researchers used ultrasound to measure changes in blood flow to specific brain regions. In the same way that the sound of an ambulance siren changes in pitch as it moves closer and then farther away from you, red blood cells will increase the pitch of the reflected ultrasound waves as they approach the source and decrease the pitch as they flow away.
Measuring this Doppler-effect phenomenon allowed the researchers to record tiny changes in the brain's blood flow down to spatial regions just 100 micrometers wide, about the width of a human hair. This enabled them to simultaneously measure the activity of tiny neural populations, some as small as just 60 neurons, widely throughout the brain.
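The velocity estimate behind this description comes from the standard pulsed-Doppler relation. The sketch below uses illustrative numbers, not parameters reported in the study:

import math

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, typical average for soft tissue


def blood_velocity_m_s(doppler_shift_hz, transmit_freq_hz, beam_angle_deg=0.0):
    """Standard pulsed-Doppler relation: v = f_d * c / (2 * f0 * cos(theta)).
    Red blood cells moving toward the transducer raise the echo frequency;
    cells moving away lower it."""
    return (doppler_shift_hz * SPEED_OF_SOUND_TISSUE
            / (2.0 * transmit_freq_hz * math.cos(math.radians(beam_angle_deg))))


# Illustrative numbers: a 200 Hz shift at a 15 MHz transmit frequency, with
# the beam aligned to the flow, corresponds to roughly 1 cm/s of blood flow.
print(f"{blood_velocity_m_s(200.0, 15e6) * 100:.2f} cm/s")  # -> ~1.03 cm/s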
The researchers used functional ultrasound to measure brain activity from the posterior parietal cortex (PPC) of non-human primates, a region that governs the planning of movements and contributes to their execution. The region has been studied by the Andersen lab for decades using other techniques.
The animals were taught two tasks, requiring them to either plan to move their hand to direct a cursor on a screen or plan to move their eyes to look at a specific part of the screen. They only needed to think about performing the task, not actually move their eyes or hands, as the BMI read the planning activity in their PPC.
"I remember how impressive it was when this kind of predictive decoding worked with electrodes two decades ago, and it's amazing now to see it work with a much less invasive method like ultrasound," says Shapiro.
The ultrasound data was streamed in real time to a decoder that had previously been trained with machine learning to interpret it; the decoder then generated control signals to move a cursor to where the animal intended it to go. The BMI was able to do this successfully for eight radial target directions, with mean errors of less than 40 degrees.
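To make that error figure concrete: with eight radial targets spaced 45 degrees apart, performance can be summarized as the mean angular difference between the decoded and cued directions. The sketch below is illustrative, using made-up values rather than data from the study:

import numpy as np


def mean_angular_error_deg(predicted_deg, cued_deg):
    """Mean absolute angular difference, wrapped to [0, 180] degrees.
    For eight radial targets spaced 45 degrees apart, random guessing gives
    a mean error of 90 degrees; the study reports means below 40 degrees."""
    diff = np.abs(np.asarray(predicted_deg) - np.asarray(cued_deg)) % 360.0
    diff = np.where(diff > 180.0, 360.0 - diff, diff)
    return float(diff.mean())


# Illustrative use (made-up values, not data from the study):
predicted = [0.0, 45.0, 90.0, 315.0]
cued = [0.0, 90.0, 90.0, 0.0]
print(mean_angular_error_deg(predicted, cued))  # -> 22.5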
"It's significant that the technique does not require the BMI to be recalibrated each day, unlike other BMIs," says Griggs. "As an analogy, imagine needing to recalibrate your computer mouse for up to 15 minutes each day before use."
Next, the team plans to study how BMIs based on ultrasound technology perform in humans, and to further develop the fUS technology to enable three-dimensional imaging for improved accuracy.
The paper is titled "Decoding motor plans using a closed-loop ultrasonic brain–machine interface" and appears in the journal Nature Neuroscience.
More information: Whitney S. Griggs et al, Decoding motor plans using a closed-loop ultrasonic brain–machine interface, Nature Neuroscience (2023). DOI: 10.1038/s41593-023-01500-7
Provided by California Institute of Technology