In Melody Moore Jackson’s BrainLab, there are items one doesn’t expect to find in computing research. Like a washer and dryer, and a healthy supply of towels. A three-dimensional printer. An area sectioned off with furniture as a living room, complete with couches and TV. These are the tools required to learn how to read people’s minds—literally.
Jackson, an associate professor in the School of Interactive Computing and director of the Georgia Tech BrainLab, studies devices that interact directly—and non-invasively—with the human brain. They are used with individuals “locked in” their own heads by severe neurological disorders such as advanced-stage amyotrophic lateral sclerosis (ALS), commonly known as Lou Gehrig’s disease. These people, who usually retain full cognitive function, have no physical way to communicate; they cannot move to speak or sign, and the most afflicted cannot even blink or move their eyes.
“My grandmother had multiple sclerosis and used a wheelchair,” says Jackson, who holds both M.S. (1988) and Ph.D. (1998) degrees in computer science from Georgia Tech. “She was just my grandmother; I never noticed her disability. As I grew up I realized the remarkable things she could do with no technology whatsoever, and I became inspired to help others like her. My main motivation now is to develop assistive technologies for people who have no other options: people who are completely paralyzed and on a ventilator.
“Some can move their eyes very slightly,” she says. “But usually they lose that as well.”
After finishing her Ph.D., Jackson taught at Georgia State University, where she created her first BrainLab. She’d met a neurologist, Philip Kennedy, who was working on an electrode that could interact directly with the motor cortex, though it did have to be implanted in the brain (in fact, Jackson says Kennedy was granted the first FDA approval for such an implant in humans).
But Jackson wondered if she could achieve similar success with non-surgical techniques. At Georgia State she created her first non-invasive brain interface, based on electroencephalography (EEG).
It consists of a cap the user wears together with a special conductive gel that helps the cap’s electrodes pick up its wearer’s brain waves (this is where the towels and on-site laundry facilities come in). After a series of baseline tests (analogous to the control questions asked at the beginning of a lie-detector test), users then perform mental tasks, such as imagining movement or paying attention to a stimulus such as a flashing box on a screen.
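To make the flashing-box idea concrete: in one common paradigm (steady-state visual evoked potentials, not necessarily the BrainLab's exact method), each on-screen choice flickers at its own rate, and the interface picks the choice whose flicker frequency dominates the recorded signal. The sketch below is purely illustrative; the sampling rate, flicker rates, and synthetic "EEG" signal are all assumptions.

```python
import math
import random

def goertzel_power(samples, fs, target_hz):
    """Signal power at target_hz (Goertzel algorithm, a single DFT bin)."""
    n = len(samples)
    k = round(n * target_hz / fs)          # nearest frequency bin
    coeff = 2.0 * math.cos(2.0 * math.pi * k / n)
    s_prev = s_prev2 = 0.0
    for x in samples:
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2

fs = 250                                   # sampling rate in Hz (illustrative)
t = [i / fs for i in range(2 * fs)]        # two seconds of "recording"
random.seed(0)
# Synthetic EEG: the user attends the box flickering at 12 Hz,
# so a 12 Hz component rides on top of random noise.
signal = [math.sin(2 * math.pi * 12 * ti) + 0.5 * random.gauss(0, 1) for ti in t]

flicker = {"yes": 12.0, "no": 15.0}        # one flicker rate per on-screen box
powers = {label: goertzel_power(signal, fs, hz) for label, hz in flicker.items()}
choice = max(powers, key=powers.get)
print(choice)                              # -> "yes": the attended frequency wins
```

In a real system the signal would come from the cap's electrodes across several channels and be filtered first, but the decision logic is essentially this frequency comparison.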
Another technology the BrainLab is studying is functional near-infrared imaging (fNIR). By shining an infrared light into the brain and measuring what is reflected, an fNIR device can determine how much oxygen is present at a specific point. This indicates the amount of brain activity there, which the user can learn to control. “The fNIR device is simple—only a headband—and not as powerful as the EEG. However, it is promising as a communication device because any family member or caregiver can help the user to operate it,” Jackson says.
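The oxygen measurement typically rests on the modified Beer-Lambert law: changes in light attenuation at two wavelengths give two equations that can be solved for changes in oxygenated and deoxygenated hemoglobin. A minimal sketch follows; the extinction coefficients and path length are made-up illustrative numbers, not values from any published table or from the BrainLab's hardware.

```python
# Modified Beer-Lambert law: dA(wavelength) =
#   (eps_HbO2 * d[HbO2] + eps_HbR * d[HbR]) * path_length.
# Two wavelengths -> two equations -> solvable for the two unknowns.

EPS = {760: (1.4, 3.8), 850: (2.5, 1.8)}   # (eps_HbO2, eps_HbR), illustrative only

def hb_changes(dA, path_cm):
    """Solve the 2x2 Beer-Lambert system for (d[HbO2], d[HbR])."""
    (a, b), (c, d) = EPS[760], EPS[850]
    det = (a * d - b * c) * path_cm
    d_hbo = (d * dA[760] - b * dA[850]) / det
    d_hbr = (a * dA[850] - c * dA[760]) / det
    return d_hbo, d_hbr

# Round trip: assume oxygenation rose during a mental task, compute the
# attenuation changes a sensor would see, then recover the concentrations.
true_hbo, true_hbr, path = 0.010, -0.004, 6.0
dA = {wl: (e1 * true_hbo + e2 * true_hbr) * path for wl, (e1, e2) in EPS.items()}
print(hb_changes(dA, path))                # recovers (0.010, -0.004) up to float error
```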
It sounds straightforward, but in practice the responses can be ambiguous (especially at first), and the work requires an incredible amount of patience. The fastest brain interfaces allow patients to communicate at a rate of 10 to 12 characters per minute, Jackson says. Still, as a metric of success, speed matters relatively little next to the assurance that communication is happening. Cutting through the fog and establishing a reliable connection with the person inside the disability is everything.
“One of the hardest questions is, how do you know if the person’s still there?” Jackson says. “Are they unresponsive, or are they choosing not to respond? We’re studying why this works with some people but not with others. What we try to do is get to the patients before they’re completely locked in, so we can communicate with them and tell that their cognition is still intact.”
The EEG-based device is only one such technology tested or developed in the BrainLab; Jackson says she has about 12 projects ongoing at any given moment. For 10 years she has worked on her “Aware Chair,” a wheelchair fitted with an interface meant to let its user guide the chair by thought; the chair’s computer can also remember the geography of small spaces such as someone’s home (hence the living room setup in the lab). Her team has even developed a “BrainBrowser” that enables users to surf the Web through brain activity.
Overall, Jackson’s brain-computer interfaces show much promise. Jackson beams when she talks about a 24-year-old former Georgia Tech student from Huntsville, Ala., who suffers from a degenerative neurological disorder and whose physical ability to communicate is virtually gone. But with the fNIR headband, he can “talk” with his parents.
“He’s tracking with near 100 percent accuracy [using the fNIR device],” Jackson says. “That makes everything worth it.”