Science & Research

BrainGate seeks to empower disabled

By
Senior Staff Writer

 

In April 2011, 58-year-old Cathy Hutchinson served herself coffee. The action would have been unremarkable, except that Hutchinson served herself using a robotic arm that she controlled with her thoughts. Hutchinson had suffered a brainstem stroke almost 15 years earlier that left her unable to move any of her limbs.

Such a feat was made possible by the BrainGate research team, a group of scientists and engineers at Brown, Massachusetts General Hospital, Stanford University and the Providence VA Medical Center. The team is working to restore independence to paralyzed individuals through interfaces that translate neural signals into commands that control external devices, like computer cursors and robotic arms. The team’s most recent paper, which described Hutchinson’s coffee drinking, was published in the journal Nature in May.

 

Opening the gates

One of the project’s first major milestones occurred in 2005, when the researchers heard the sounds of neurons firing in the motor cortex of a paralyzed patient for the first time. Stroke and spinal cord injury damage the connections between the brain and the muscles, but the researchers discovered that the brain was still capable of firing signals to initiate movement.

“That was a really big deal because many of us thought, including me, that it might just shut down altogether,” said John Donoghue PhD’79 P’09 P’12, a professor of neuroscience, director of the Brown Institute of Brain Science and co-leader on the second phase of the BrainGate clinical trials, BrainGate2.

Since that catalyzing discovery, researchers have been working to create sensors to implant in the brain to pick up the neural signals, algorithms to translate the complicated patterns of firing neurons and external devices for the algorithms to control. 

Ultimately, the researchers hope to remove the external devices altogether and reconnect the brain to paralyzed muscles using the functional electrical stimulation technology being developed at Case Western Reserve University. Doing so would involve surgically implanting wires in patients’ limbs to replace the damaged connections and directly stimulate their own muscles.

Donoghue said early simulations suggest such a system is possible but is likely years away.

 

Mixed signals

Though the premise is simple, translating neural signals into usable commands is difficult in practice. To calibrate their interfaces, researchers first ask participants to imagine performing a specific action. For example, a participant may watch a computer cursor move around the screen and imagine that he is moving it himself.

Such a thought will stimulate neurons in his motor cortex, and researchers can then pick up and record that sequence using the sensor implanted in his brain. They then calibrate the algorithm so that the cursor’s movement will align with the way the participant imagines it moving.
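Conceptually, this calibration step is a regression problem: given the firing rates recorded during the session and the cursor movements the participant was imagining, fit a mapping from one to the other. A minimal sketch in Python, with a linear model and synthetic data that are purely illustrative (BrainGate’s actual decoders are more sophisticated):

```python
import numpy as np

def calibrate(rates, imagined_velocity):
    """Fit a linear map from firing rates to intended cursor velocity.

    rates:             (T, n_neurons) firing rates recorded during calibration
    imagined_velocity: (T, 2) the (vx, vy) the participant was imagining
    Returns a weight matrix of shape (n_neurons, 2).
    """
    weights, *_ = np.linalg.lstsq(rates, imagined_velocity, rcond=None)
    return weights

def decode(rates, weights):
    """Turn newly recorded firing rates into cursor velocity commands."""
    return rates @ weights

# Synthetic calibration session: 30 neurons, each linearly tuned to velocity.
rng = np.random.default_rng(1)
tuning = rng.normal(size=(30, 2))
v_imagined = rng.normal(size=(500, 2))
rates = v_imagined @ tuning.T + 0.1 * rng.normal(size=(500, 30))

W = calibrate(rates, v_imagined)
v_decoded = decode(rates, W)
```

After calibration, the same weights are applied to live recordings, so the cursor moves the way the participant imagines it moving.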

But the method does not work perfectly. Two participants – Hutchinson and a 66-year-old tetraplegic man – controlled robotic arms and attempted to reach, touch and grasp small foam balls, according to the May 2012 paper. Hutchinson tried out both a robotic prosthetic arm and an arm mounted on a table, while the other participant only worked with the prosthetic arm. Both arms came from outside companies.

Hutchinson was able to touch the ball in about 60 percent of her trials. With the mounted arm, she was able to grasp the target around 20 percent of the time and with the prosthetic, around 45 percent of the time. She performed almost 160 trials in total. The other participant was able to touch the ball in over 95 percent of his 45 trials and grasp it in around 60 percent.

One of the reasons that the participants were unable to control the arms perfectly is that neural signals vary in ways the researchers do not completely understand.

“Even if you repeat the same action every time, the neurons don’t do the exact same thing,” Donoghue said. To program the arms, the team uses a statistical framework that takes into account the neural firings of past trials to determine a subject’s probable intention.

“You sort of take a guess,” Donoghue said. “It’s a well-established mathematical framework that comes from computer science.”
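One standard framework of this kind in brain-computer interface research is the Kalman filter, which blends a prediction carried over from past estimates with each new burst of neural data, weighting the two by how noisy each is believed to be. A minimal sketch, with illustrative matrices and synthetic tuning rather than the team’s actual parameters:

```python
import numpy as np

def kalman_decode(rates, A, C, W, Q):
    """Estimate intended cursor velocity from a stream of firing rates.

    rates: (T, n_neurons) observed firing rates, one row per time step
    A, W:  how the intended velocity evolves, and its noise covariance
    C, Q:  how neurons are tuned to velocity, and their noise covariance
    Returns a (T, n_states) sequence of velocity estimates.
    """
    n = A.shape[0]
    x = np.zeros(n)              # current best guess of the intention
    P = np.eye(n)                # uncertainty in that guess
    estimates = []
    for z in rates:
        # Predict: carry the previous guess forward in time.
        x = A @ x
        P = A @ P @ A.T + W
        # Update: nudge the guess toward what the neurons just said,
        # weighted by the Kalman gain K (noisier data -> smaller nudge).
        K = P @ C.T @ np.linalg.inv(C @ P @ C.T + Q)
        x = x + K @ (z - C @ x)
        P = (np.eye(n) - K @ C) @ P
        estimates.append(x.copy())
    return np.array(estimates)

# Synthetic demo: 20 neurons linearly tuned to a constant intended velocity.
rng = np.random.default_rng(0)
true_v = np.array([1.0, -0.5])
C = rng.normal(size=(20, 2))
rates = true_v @ C.T + 0.1 * rng.normal(size=(200, 20))
est = kalman_decode(rates, A=np.eye(2), C=C,
                    W=0.01 * np.eye(2), Q=0.01 * np.eye(20))
```

Because the filter pools evidence across time steps, its “guess” settles close to the intended velocity even though no single burst of firing is reliable on its own.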

 

Motor learning

One limitation of the study is that the researchers did not address motor learning – changes in individuals’ abilities as they practice – wrote Jose Contreras-Vidal, director of the Laboratory for Noninvasive Brain-Machine Interface Systems and a professor of electrical and computer engineering at the University of Houston, who was not involved in the study, in an email to The Herald.

“That’s one of the things we’re embarking on now,” Donoghue said. “If you practice, if you’re trained in some specific way, can you learn to get better? … Does learning matter? It would be almost unimaginable that it wouldn’t.”

One of the issues in studying motor learning is that it requires participants to live with the brain implant for many years. Hutchinson had her implant for over five years, but it had to be plugged into a power source by someone else.

The goal is to develop a completely wireless, long-lasting system. Such a system would require either a new type of long-lasting battery or a way to harness energy from the body for power. The researchers also need to develop materials that can survive the “hostile environment” of the body, Donoghue said.

Donoghue said picking up neural signals from outside of the scalp would be ideal, but that doing so may be impossible.

But Contreras-Vidal’s lab is working on doing just that, using electroencephalography, or EEG. External sensors avoid the surgical risks of internal ones, he wrote. They can also pick up signals from a wider area of the cortex than small internal sensors can. “We believe that the capability of EEG to acquire signals from large portions of the distributed neural networks involved in action production is essential to our success,” he wrote.

Contreras-Vidal wrote that his team does not know the limits of using EEG recordings to decode neural signals but added that the results seem promising. Unlike the BrainGate2 team, they have not yet run any tests on clinical populations.

 

Restoring language

Despite the excitement surrounding the control of robotic arms, for many paralyzed people restoring communication takes priority over restoring movement.

Daniel Bacher, a senior researcher and development engineer on the BrainGate2 team, is focused on this task. His work involves designing new keyboard interfaces that participants can control using neural cursors. He also works to improve the algorithms for cursor control.

Donoghue said traditional keyboards are ill-suited to paralyzed patients’ needs. The traditional layout dates to the days of typewriters, whose keys would often jam. Keyboards were thus purposefully designed so that letters frequently typed in sequence are spaced far apart.

For a participant using a neural cursor, such a layout is impractical because it requires moving the cursor back and forth, a slow and tedious process.

Bacher said that though Hutchinson’s coffee-drinking has received a lot of attention from the press, for him, one of the most rewarding moments came a few days later when she tested a new keyboard he had designed for her.

He was explaining how to use this new keyboard, and she suddenly started typing letters. “She very simply typed, ‘Thank you,’” he said, “and just looked so happy with a smile on her face.”

Kathryn Tringale ’12, a former BrainGate2 research assistant who now works at MGH as a full-time clinical neurotechnology research assistant with the project, said designing communication interfaces is really about designing products that participants enjoy using. Bacher said the team also works to make its interfaces intuitive, following the design philosophy Apple uses for its products.

 

Going Forward

“There are really two main things that people want back who are in this state,” Bacher said, referring to those with paralysis. “One is control of their environment, and the second is to express their personality.”

Over the past six to seven years, the BrainGate research team has made considerable progress toward those two goals. The researchers have tested their interfaces, ranging from computer cursors to robotic prosthetic arms, on seven different participants, two of whom are currently enrolled in ongoing trials. They have enabled paralyzed people to send emails, chat online, play computer games and drink coffee. And in the process, they have learned a tremendous amount about the basic science of the brain.

Despite their remarkable progress, their ultimate goal of completely restoring independence to paralyzed people remains far off.

“Everything is too far behind for me,” Donoghue said, but he added that everyone working on the project is driven by the possibility of restoring independence and dignity to paralyzed patients. “Having someone care for you all the time … it takes away a lot of what we consider our humanity,” he said.

“We’re all optimistic that we’re on the path to help people,” said Leigh Hochberg ’90, a professor of engineering and co-director of the BrainGate2 research effort. “It’s an amazing team.”

 

- With additional reporting by Shefali Luthra