University of Houston Cullen College of Engineering

Brain-Controlled Exoskeleton One Step Closer to Reality

By: Toby Weber
Professor Jose Luis Contreras-Vidal (left) at a recent demonstration of a Rex Bionics exoskeleton at the University of Houston. Contreras-Vidal is developing a brain-machine interface that will allow users to control such devices through their thoughts. Photo by Nine Nguyen.

Researchers have put in decades of hard work developing an interface that would allow the human brain to control prosthetic limbs. It’s ironic, then, that the end result may be surprisingly simple.

Jose Luis “Pepe” Contreras-Vidal, a professor of electrical and computer engineering with the UH Cullen College of Engineering and director of the Laboratory for Non-invasive Brain-Machine Interface Systems, has made great strides in developing a brain-machine interface that is far simpler than what was once predicted. He recently published a paper in a special issue of IEEE Transactions on Neural Systems and Rehabilitation Engineering showing his latest findings in this arena.

For years, said Contreras-Vidal, most researchers believed decoding movement intentions in the brain would require invasive technologies, such as electrodes implanted in the skull, “but we have been going against dogma for a long time.” Contreras-Vidal’s earlier research demonstrated that movement intentions related to the legs (such as walk, turn, and sit down) can be decoded with high accuracy through the scalp electroencephalogram (or EEG), which records the brain’s electrical activity through a skullcap fitted with electrodes that simply contact the scalp.

Those earlier studies used 64 electrodes to decode movement intention. In this latest paper, Contreras-Vidal demonstrated the same capability with just 12 electrodes. Such a drastic reduction, he said, could allow for an interface system that is far less intrusive and much easier for the user to work with than previously thought.
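To make the electrode-reduction idea concrete, here is a toy sketch, not the researchers' actual method: a nearest-centroid classifier on synthetic "EEG feature" vectors, showing that if a small subset of channels carries the discriminative signal, decoding with that subset alone can match decoding with all channels. Only the channel counts (64 and 12) come from the article; the data, the channel assignments, and the classifier are invented for illustration.

```python
# Illustrative sketch only: toy nearest-centroid decoding of a movement
# intention ("walk" vs. "stop") from synthetic feature vectors, comparing
# all 64 channels against a 12-channel subset. Hypothetical throughout.
import random

random.seed(0)
N_CHANNELS = 64
INFORMATIVE = list(range(12))   # assume 12 channels carry the signal

def make_trial(intent):
    """Synthesize one trial: informative channels shift with intent, rest are noise."""
    shift = 1.0 if intent == "walk" else -1.0
    return [random.gauss(shift if ch in INFORMATIVE else 0.0, 0.5)
            for ch in range(N_CHANNELS)]

def centroid(trials):
    """Per-channel mean of a list of trials."""
    return [sum(t[ch] for t in trials) / len(trials) for ch in range(len(trials[0]))]

def decode(trial, centroids, channels):
    """Classify a trial by its nearest class centroid, using only `channels`."""
    def dist(c):
        return sum((trial[ch] - c[ch]) ** 2 for ch in channels)
    return min(centroids, key=lambda label: dist(centroids[label]))

# Train class centroids on labeled trials, then decode held-out trials.
train = {label: [make_trial(label) for _ in range(50)] for label in ("walk", "stop")}
centroids = {label: centroid(trials) for label, trials in train.items()}
test_trials = [(label, make_trial(label)) for label in ("walk", "stop") for _ in range(25)]

for channels in (list(range(64)), INFORMATIVE):   # all 64 vs. just 12
    correct = sum(decode(t, centroids, channels) == label for label, t in test_trials)
    print(f"{len(channels)} channels: {correct}/{len(test_trials)} correct")
```

In this toy setting both configurations decode accurately, which is the intuition behind the result: fewer electrodes need not mean less information, provided the right electrodes are kept.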

The paper also showed that the research group can decode how feedback from stimuli, such as visual information during walking or proprioception (the sense of where one’s body is in space), is integrated with the movement intention that preceded it. This advance should allow Contreras-Vidal and his colleagues to devise a brain-machine interface that can more closely mimic the near-automatic response able-bodied individuals have to the unexpected, such as stumbling. “This is an important capability for any robotics system. If something goes wrong, if the feedback is not expected, you need to be able to make quick corrections,” he said.
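The correction behavior described in the quote can be sketched as a minimal feedback loop, purely as an illustration of the idea rather than any system the lab has built: compare intended motion against sensed feedback, and issue a corrective command only when the mismatch exceeds a threshold (an unexpected event like a stumble). The threshold, gain, and trajectory values below are all invented.

```python
# Illustrative sketch only: a hypothetical feedback loop in which
# unexpected sensory feedback triggers a quick corrective command.

def step_controller(intended, measured, threshold=0.2, gain=0.8):
    """Compare intended motion with sensed feedback; if the mismatch
    exceeds the threshold (e.g., a stumble), command a correction."""
    error = intended - measured
    if abs(error) > threshold:
        return gain * error   # quick correction toward the intended motion
    return 0.0                # feedback matches expectation; no correction

# Simulate a walking trajectory with one perturbed sample (a "stumble").
intended_path = [0.1 * i for i in range(10)]
measured_path = list(intended_path)
measured_path[5] -= 0.5       # unexpected deviation at step 5

corrections = [step_controller(i, m) for i, m in zip(intended_path, measured_path)]
print(corrections)
```

The loop stays quiet while feedback matches expectation and reacts only to the perturbed step, the same qualitative behavior the quote describes.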

With these findings, Contreras-Vidal hopes to demonstrate a human walking with an exoskeleton controlled by a brain-machine interface in a matter of months. A successful demonstration will be an important milestone for both science and patient care – and a small victory for elegant, simple solutions.

Support in part from the National Institute of Neurological Disorders and Stroke (NINDS, award R01NS075889) is gratefully acknowledged.
