Engineers Decode Conversations in Brain's Motor Cortex

Illustration by Bona Kim, Emory University

How does your brain talk with your arm? The body doesn’t use English, or any other spoken language. Biomedical engineers are developing methods for decoding the conversation by analyzing electrical patterns in the brain’s motor control areas.

The new research is published online in the journal Nature Methods.

In this study, the researchers leveraged advances from the field of “deep learning” – powerful new artificial intelligence-based approaches that have revolutionized many technology industries in the last few years. The new computing approaches, which use artificial neural networks, let researchers uncover patterns in complex data sets that had previously been overlooked, says lead author Chethan Pandarinath, Ph.D.

Pandarinath and colleagues developed an approach that allows their artificial neural networks to mimic the biological networks that make our everyday movements possible. In doing so, the researchers gained a much better understanding of what the biological networks were doing. Eventually, these techniques could help paralyzed people move their limbs, or improve the treatment of people with Parkinson’s, says Pandarinath, an assistant professor in the Wallace H. Coulter Department of Biomedical Engineering at Georgia Tech and Emory University and a researcher in the Petit Institute for Bioengineering and Bioscience at Tech. Pandarinath leads the Emory and Georgia Tech Systems Engineering Lab.

For someone who has a spinal cord injury, the new technology could power “brain-machine interfaces” that discern the intent behind the brain’s signals and directly stimulate someone’s muscles.

“In the past, brain-machine interfaces have mostly worked by trying to decode very high-level commands, such as ‘I want to move my arm to the right, or left’,” Pandarinath says. “With these new innovations, we believe we'll actually be able to decode subtle signals related to the control of muscles, and make brain-machine interfaces that behave much more like a person’s own limbs.”

Network behavior ‘emergent’ from individual neurons

Previous research on how neurons control movement has revealed that it’s difficult to discern individual neurons’ roles the way one might map out the parts of a simple machine. Individual neurons’ behaviors don’t correspond to variables like arm speed, movement distance or angle. Rather, the rhythms of the entire network are more important than any individual neuron’s activity.

Pandarinath likens his team’s approach to ornithologists studying the flocking behavior of birds. To understand how the group holds together, one has to know how one bird responds to its neighbors, and to the flock’s movements as a whole. Flocking behavior is “emergent” from the interactions of the birds with each other, he says. Such emergent behaviors are challenging to characterize with standard methods, but are precisely the way artificial neural networks function.
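
To make that analogy concrete, here is a purely illustrative toy simulation (not part of the study; all parameters are arbitrary): each simulated bird follows only two local rules, drifting toward the group and matching its neighbors’ headings, yet coherent flocking emerges that neither rule specifies on its own.

```python
import numpy as np

rng = np.random.default_rng(1)
pos = rng.uniform(0, 10, size=(30, 2))   # 30 birds scattered over a 2-D patch of sky
vel = rng.normal(size=(30, 2))           # random initial headings

for step in range(200):
    vel += 0.05 * (pos.mean(axis=0) - pos)             # cohesion: drift toward the flock's center
    vel += 0.05 * (vel.mean(axis=0) - vel)             # alignment: match the average heading
    vel /= np.linalg.norm(vel, axis=1, keepdims=True)  # keep every bird at unit speed
    pos += 0.1 * vel

spread = np.linalg.norm(pos - pos.mean(axis=0), axis=1).mean()
print("average distance from the flock's center:", round(spread, 2))  # the group holds together
```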

Pandarinath started investigating this approach, called LFADS (Latent Factor Analysis via Dynamical Systems), while working with electrical engineer Krishna Shenoy, Ph.D., and neurosurgeon Jaimie Henderson, M.D., who co-direct the Neural Prosthetics Translational Lab at Stanford University.
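
At a high level, LFADS works like a sequential autoencoder: it compresses single-trial spiking activity into a small set of time-varying latent factors, then reconstructs firing rates from those factors. The sketch below illustrates that idea in PyTorch under stated assumptions; the network sizes, the Poisson readout, and all names are illustrative, not the authors’ released implementation.

```python
import torch
import torch.nn as nn

class TinySequentialAutoencoder(nn.Module):
    """Toy LFADS-style model: spike counts -> latent factors -> reconstructed rates."""
    def __init__(self, n_neurons=50, n_factors=8, hidden=64):
        super().__init__()
        self.encoder = nn.GRU(n_neurons, hidden, batch_first=True, bidirectional=True)
        self.to_factors = nn.Linear(2 * hidden, n_factors)   # compress to a few factors per time bin
        self.decoder = nn.GRU(n_factors, hidden, batch_first=True)
        self.to_rates = nn.Linear(hidden, n_neurons)

    def forward(self, spikes):                 # spikes: (trials, time bins, neurons)
        summary, _ = self.encoder(spikes)      # read the whole population on each trial
        factors = self.to_factors(summary)     # low-dimensional factors evolving over time
        decoded, _ = self.decoder(factors)
        rates = torch.exp(self.to_rates(decoded))   # nonnegative firing rates for a Poisson model
        return factors, rates

model = TinySequentialAutoencoder()
spikes = torch.poisson(torch.rand(16, 100, 50))           # fake spike counts: 16 trials, 100 bins, 50 neurons
factors, rates = model(spikes)
loss = nn.PoissonNLLLoss(log_input=False)(rates, spikes)  # how well the rates explain the observed spikes
```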

In the Nature Methods paper, the researchers analyzed data from both rhesus macaques and humans, who had electrodes implanted in the motor cortex. In some experiments, monkeys were trained to move their arms to follow an on-screen “maze,” and the researchers tested their ability to “decode” the monkeys’ arm movement trajectories based solely on the signals recorded from the implanted electrodes. Using their artificial neural network approach, the researchers were able to precisely uncover faint patterns that represented the brain rhythms in the motor cortex. They also observed similar patterns in human patients who were paralyzed – one because of motor neuron degeneration (amyotrophic lateral sclerosis), and another with spinal cord injury.
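
Conceptually, that decoding test amounts to fitting a model that maps each time bin’s neural features to arm kinematics and scoring it on held-out data. The sketch below shows the workflow with a simple ridge-regression decoder on synthetic placeholder data; the decoders and datasets actually used in the study are described in the paper.

```python
import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_bins, n_features = 5000, 8
neural = rng.normal(size=(n_bins, n_features))               # per-bin neural features (placeholder)
true_map = rng.normal(size=(n_features, 2))
velocity = neural @ true_map + 0.1 * rng.normal(size=(n_bins, 2))  # x/y hand velocity (synthetic)

X_train, X_test, y_train, y_test = train_test_split(neural, velocity, test_size=0.2, random_state=0)
decoder = Ridge(alpha=1.0).fit(X_train, y_train)             # linear map from neural features to velocity
print("held-out R^2:", round(decoder.score(X_test, y_test), 3))  # decoding accuracy on unseen time bins
```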

In addition to the motor cortex, Pandarinath believes the new approach could be used to analyze the activity of networks in other brain regions involved in spatial navigation or decision making.

Future plans for clinical applications include pairing the new technology with functional electrical stimulation of muscles for paralyzed patients, as well as refining deep brain stimulation technology for Parkinson’s disease. In addition, Pandarinath and colleagues have begun using these techniques to understand the activity of neurons at fundamentally different scales than was previously possible. This future work is supported by a recent grant from the National Science Foundation.

The team’s research was supported by National Institutes of Health grants T-R01NS076460, MH093338, T-R01MH09964703, R01DC014034, R01DC009899, 8DP1HD075623, as well as funding from the Craig H. Neilsen Foundation for Spinal Cord Injury Research.

Contact:

Walter Rich
Communications Mgr.
wrich@gatech.edu
