
Closing the loop

Researchers from Lehigh and Johns Hopkins are measuring the effect of sensory feedback on the transmission of brain signals. Their goal is to help people with brain damage regain lost function.

Roy Loya is a tax attorney who has piloted airplanes, worked as an aeronautical engineer and traveled to most corners of the world.

Today, sitting perfectly still with 64 electrode sensors attached to his scalp, he is summoning the waning powers of his mind.

Loya has struggled for 10 years with ataxia, a degenerative condition that robs the brain of its ability to coordinate the body’s fine motor movements. He cannot stand on his own, walk or hold a pen. His speech is slurred and raspy.

But Loya is conceding nothing to illness. Today, guided by a team of researchers, he is marshaling his brain signals to do what his body no longer can.

Without flexing a muscle, Loya imagines he is moving his right hand. This generates brain signals, which are conveyed to an amplifier and an electroencephalograph (EEG). A computer processes the signals and carries out their intention. The results play out on a 40-inch TV, where a cursor inches down from the middle of the screen until it hits a rectangular target at the bottom.
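The setup Loya is using works, in essence, like a motor-imagery BCI: imagining a hand movement changes the strength of sensorimotor rhythms (roughly 8-12 Hz) recorded over the motor cortex, and software maps that change onto cursor motion. The short Python sketch below illustrates only the general idea; the sampling rate, frequency band, gain and synthetic signals are assumptions for demonstration, not the team's actual processing pipeline.

    import numpy as np
    from scipy.signal import welch

    FS = 256  # sampling rate in Hz (assumed for illustration)

    def mu_band_power(channel, low=8.0, high=12.0, fs=FS):
        # Estimate power in the mu band (8-12 Hz) of one EEG channel
        # using Welch's method.
        freqs, psd = welch(channel, fs=fs, nperseg=fs)
        band = (freqs >= low) & (freqs <= high)
        return psd[band].mean()

    def cursor_step(window, baseline, gain=50.0):
        # Imagined movement suppresses the mu rhythm; translate the drop
        # below baseline into a cursor displacement in pixels.
        power = mu_band_power(window)
        return gain * (baseline - power) / baseline

    # Synthetic one-second recordings standing in for a motor-cortex electrode.
    rng = np.random.default_rng(0)
    t = np.arange(FS) / FS
    rest = np.sin(2 * np.pi * 10 * t) + 0.3 * rng.standard_normal(FS)
    imagery = 0.3 * rng.standard_normal(FS)  # mu rhythm suppressed during imagery

    baseline = mu_band_power(rest)
    print("cursor step at rest:        %+.1f px" % cursor_step(rest, baseline))
    print("cursor step during imagery: %+.1f px" % cursor_step(imagery, baseline))

In this toy version, the rhythm's suppression during imagined movement is what drives the cursor; at rest the cursor holds still.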

Loya next focuses mentally on a left-hand motion, and the cursor moves slowly toward a target at the top of the screen.

Loya is seated in a neuroengineering lab at Johns Hopkins University’s School of Medicine. Minutes pass as he performs each task. Six researchers stand quietly around him, moving only to read the signals displayed on the computer or to adjust the signal-processing software.

The research team is led by Sarah Ying and Nitish Thakor of Johns Hopkins and Mayuresh Kothare of Lehigh. Ying, an assistant professor of neurology and ophthalmology at the Johns Hopkins Hospital, is an expert in ataxia, dizziness and eye movement abnormalities.

She has worked with Loya for seven years. Thakor, professor of biomedical engineering and neurology, directs Johns Hopkins’ Neuroengineering and Biomedical Instrumentation Lab. Kothare, an expert in control systems, is professor of chemical engineering at Lehigh.

The researchers’ goal is to help Loya and others like him learn to use brain-computer interface (BCI) technology to regain control of functions they have lost due to brain damage or disease. They are focusing on the feedback mechanism that enables the cerebellum to correct for errors while executing extremely precise multijoint movements. Located at the base of the brain, the cerebellum contains more than half the brain’s neurons (brain cells) but takes up only 10 percent of its volume. It integrates sensory perception, coordination and motor control, and signals the cortex, the outermost layer of the brain, to command muscles to move.

Scientists and engineers have developed BCIs that signal actuators to move cursors and operate artificial hands and other neuroprosthetic devices. Thakor’s lab has helped transradial (below-the-elbow) amputees control prosthetic hands that are linked to the signals generated by the muscles on their residual limbs. In another project, his group decodes brain signals to command prosthetic hands to open and close.

But little research has been done to enable BCIs and neuroprostheses to respond as the cerebellum does to sensory feedback from the environment. A driver changing lanes on the highway and an outfielder chasing a fly ball must adjust to changes in the environment and in their perception of the environment. Similarly, as a hand reaches for a cup, “force feedback” from the hand and visual feedback from the eye tell the cerebellum how much farther and in what direction the hand must move to complete its task. The cerebellum processes this feedback and sends new signals ordering the hand to take corrective action.

It is this feedback process that the Lehigh-Johns Hopkins team is seeking to interpret and to model mathematically. “Force, visual and all other kinds of feedback influence how the brain changes its signals,” says Kothare. “We want to interpret these modified signals and convert them into action.

“To do that, we have to understand the entire closed loop. This includes the brain signals, the interpretation of the signals, the implementation of the brain’s command, the feedback by sensors to the brain, and the transmission of new brain signals that are modified in response to the environment. We want to measure and model this entire system.”
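One simple way to picture the closed loop Kothare describes is a reach that is corrected by sensory feedback on every cycle. The Python sketch below simulates such a loop under plain proportional control; the one-dimensional setup, gain and noise level are illustrative assumptions, not the team's mathematical model.

    import numpy as np

    rng = np.random.default_rng(1)

    def closed_loop_reach(target=30.0, gain=0.4, noise=0.5, cycles=12):
        # Simulate one reach toward a cup 30 cm away. Each cycle:
        #   1. sensory feedback reports the remaining error,
        #   2. a corrective command proportional to the error is issued,
        #   3. the hand executes the command imperfectly (added noise).
        # The loop is "closed" because step 1 depends on the outcome of step 3.
        position = 0.0
        for cycle in range(1, cycles + 1):
            error = target - position                     # feedback
            command = gain * error                        # corrective command
            position += command + rng.normal(0.0, noise)  # imperfect execution
            print(f"cycle {cycle:2d}: position {position:6.2f} cm, "
                  f"remaining error {target - position:+6.2f} cm")

    closed_loop_reach()

Run open loop, with the sensing step removed, the same motor noise would accumulate instead of being corrected; capturing that difference in a measurable model is what closing the loop means here.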

Three-way serendipity
The idea for the Lehigh-Johns Hopkins collaboration originated when Kothare and Thakor met at a reunion of alumni of the Indian Institute of Technology and discovered a common interest in neuroengineering.

“I have always been interested in control theory and feedback and its impact on learning,” says Kothare, who previously engineered a chip-based microreactor that catalytically converts methanol into hydrogen and also investigated an implantable device for insulin delivery.

In 2008, Kothare and Thakor became one of about 30 teams, out of 1,250 that competed, to win a grant through NSF’s Cyber-Enabled Discovery and Innovation (CDI) initiative. The three-year award has partially supported Kothare’s sabbatical at Johns Hopkins’ School of Medicine.

Two months into the project, Kothare met Ying when she proposed a project using BCIs to study ataxia patients and other people with cerebellar deficits. Ying secured approval to conduct studies on human subjects, while Kothare obtained certification from the Johns Hopkins Institutional Review Board. Thakor, who is an investigator for DARPA’s Revolutionizing Prosthetics Program, already had a lab set up for BCI research and EEG monitoring.

Kothare then recruited Geoffrey Newman, a graduate student in biomedical engineering at Johns Hopkins, and Youngseok Choi, a postdoctoral neuroengineer who develops neurological signal analysis methods for brain injuries. Choi, a visiting researcher at Johns Hopkins, now oversees signal processing for the ataxia project. Newman has helped modify a previously developed BCI hardware-software platform and equip it with an interface that runs the cursor-target experiments.

Roy Loya (seated, left) is learning to control his brain signals with help from Sarah Ying (left) and Mayuresh Kothare (far left).