Brain Powered

In the new world of brain-actuated technology, you'll fly a plane, play a video game, or maneuver a wheelchair just by thinking about it.

By Bennett Daviss | May 1, 1994 5:00 AM

In a windowless, cavernlike laboratory, David Tumey wriggles into a steel box, about the size of a large refrigerator, that is mounted on a four-foot horizontal shaft and connected through a computer to an electrical control system. The box is an ordinary flight simulator, much like some two dozen others found here at Wright-Patterson Air Force Base near Dayton, Ohio. On the wall in front of Tumey, a small video screen displays a line representing the horizon. A heavier black bar sits atop that line, indicating the simulator's angle relative to the horizon.

Suddenly a green line appears, at a different angle from the simulator's black bar. It's Tumey's job to try to bank the simulator so that the green line covers the black bar--in other words, he needs to bank the simulator either to the right or to the left, so that the two lines are tilted at the same angle. He does so easily. But Tumey's feet aren't resting on pedals, and he holds nothing besides a limp scrap of paper that he's been fingering just to give his hands something to do.

Tumey is controlling the simulator, in effect, just by thinking about it. Researchers at Wright-Patterson's Alternative Control Technology Laboratory call this brain-actuated control. They and like-minded researchers around the world are learning how to take the electrical impulses given off by the brain's neurons and use them to control computers, motors, and other devices. In a number of different trials, people have learned to turn lamps on and off, channel surf on TV sets, and play video games without the use of hands or voice. All they use are a few well-placed electrodes and their power of will.

You might call it a case of mind over matter, but the scientists involved aren't comfortable with what the phrase implies. "People have the idea that this technology reads subvocal thoughts--that you think the phrase 'Close the door' and the door will close," says electrical engineer Andrew Junker, who first developed the Wright-Patterson program. "That's not it at all. At this point we're measuring electric signals, not reading minds."

"All control is brain-actuated control, as far as we know," adds Grant McMillan, who heads the Alternative Control Technology Laboratory. "All we're doing is measuring the output at a different point."

The signals from Tumey's brain that are being measured are evoked by two soft white lights--one on each side of the simulator's screen--that pulse in unison in a steady rhythm, at the rate of 13.25 cycles per second (or 13.25 hertz). It's a fast but detectable pulse--more like a flicker. Neurons in Tumey's visual cortex, a section of gray matter near the scalp at the back of the head, are stimulated by the pulsing light and give off quick bursts of electricity with exactly the same frequency. Electrodes resting on Tumey's scalp just above the visual cortex measure the strength, or voltage in microvolts, of that precise rhythm, which varies depending on how many neurons are firing and whether they do so in synchrony. The computer translates changes in that voltage into instructions for the simulator's automated controls.

Tumey controls the simulator by controlling his response to the pulsing light: by suppressing the rhythm in his visual cortex below a fixed threshold, he banks the simulator left; by enhancing his brain's rhythm above another, slightly higher threshold, he banks it right. If the voltage falls somewhere between the two thresholds, the simulator stays put. Tumey sees how he's doing by watching the green line pivot against the black bar.
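To make that control scheme concrete, here is a minimal sketch, in Python, of the two-threshold logic the article describes. It is an illustration, not the Wright-Patterson lab's software; the sampling rate, threshold values, and function names are assumptions.

```python
# Minimal sketch of the threshold logic described above -- not the lab's
# actual software. Sampling rate, thresholds, and names are assumptions.

import numpy as np

SAMPLE_RATE_HZ = 256          # assumed EEG sampling rate
STIMULUS_HZ = 13.25           # frequency of the pulsing lights (from the text)
LOW_THRESHOLD_UV = 2.0        # bank left when amplitude drops below this (assumed)
HIGH_THRESHOLD_UV = 3.0       # bank right when amplitude rises above this (assumed)

def band_amplitude(eeg_window: np.ndarray, freq_hz: float, fs: int) -> float:
    """Rough estimate of the amplitude (microvolts) of one frequency in an
    EEG window, using a windowed discrete Fourier transform."""
    spectrum = np.fft.rfft(eeg_window * np.hanning(len(eeg_window)))
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    idx = np.argmin(np.abs(freqs - freq_hz))   # bin nearest the stimulus frequency
    return 2.0 * np.abs(spectrum[idx]) / len(eeg_window)

def bank_command(eeg_window: np.ndarray) -> str:
    """Map visual-cortex amplitude at the stimulus frequency to a command."""
    amp = band_amplitude(eeg_window, STIMULUS_HZ, SAMPLE_RATE_HZ)
    if amp < LOW_THRESHOLD_UV:
        return "bank left"       # rhythm suppressed below the lower threshold
    if amp > HIGH_THRESHOLD_UV:
        return "bank right"      # rhythm enhanced above the higher threshold
    return "hold"                # between thresholds: simulator stays put
```

In practice the two thresholds would presumably have to be tuned for each subject, since the strength of the evoked rhythm depends on how many neurons fire and how tightly they synchronize.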

How can Tumey suppress or intensify the voltage of a specific electrical frequency in his brain? "No one really knows," says John Schnurer, a physicist working on the Wright-Patterson project. "Just as with any learned response, the exact mechanism is difficult to put into words. We've asked subjects to fill out questionnaires, and their response makes it clear that the more successfully they can control the simulator, the less able they are to explain how they do it."

Tumey, who has logged more time in the simulator than anyone else, doesn't know either. "At first you think of physical images, like pushing and pulling or opening and closing. That didn't work very well for me, and one day I just said the heck with it. Once I just let go and started to let it happen instead of trying to make it happen, I got better control. It was almost like a psychic experience."

Of course, it may not really matter how Tumey does what he does. "It's not a question of the brain's ability to learn this kind of control," notes Jonathan Wolpaw, a neurophysiologist with the Wadsworth Laboratories, which are part of the New York State Department of Health. Wolpaw is trying to use the technology to give people who have little or no control of their limbs, voice, or even their breathing a way to control their surroundings. "I'm fairly sanguine that the brain can learn far more than we have the skill or knowledge to teach it. The hard part is learning how best to implement this technique."

The hard part, in other words, is finding the answers to the somewhat mundane technical questions: how many electrodes to use in reading brain signals, at which points along the scalp's surface those electrodes should rest, which electrical frequencies to read, how to sort one signal from another. The human brain holds somewhere between 10 billion and 100 billion neurons, which form as many as 100 trillion connections among themselves. The electric signal that is at the core of any mental or physical process--voluntary or involuntary, no matter how minor--comes from each of millions of neurons picking up a chemical signal and translating it into an electric pulse that travels down its long, threadlike axon until it reaches the next neuron. Every thought, reflex, or sensation triggers such an electric pulse. Trying to pick out one useful signal from the activity of millions or even billions of neurons is much like attempting to eavesdrop on a single conversation by holding a microphone over a large city. And on the brain's information highway, the traffic is as constant as it is noisy.

The good thing for the researchers is that much of that traffic travels in convoys. Although individual neurons pulse with electricity, it takes groups of neurons working together on a common task, or even resting together, to put out a pattern--a wave--that's loud enough to be detected. Different types of activities result in waves with different frequencies. For instance, delta waves--which are most evident during deep sleep--have frequencies between .5 and 4 Hz. Theta waves also arise during sleep, during its dream stages, and have frequencies between 4 and 7 Hz. Alpha waves span the frequencies from 8 to 13 Hz and are given off by a wakeful but relaxed brain. Finally there are the beta waves, which have frequencies between 13 and 30 Hz. These are the waves that result when you do something that requires real concentration--balancing your checkbook or writing a research paper.
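For reference, the band boundaries named above can be collected into a small lookup. The Python encoding below is just one convenient way to hold them; the ranges themselves come straight from the text.

```python
# Band boundaries from the text; the dictionary and helper are simply one
# convenient encoding, not a claim about how any lab's software stores them.

EEG_BANDS_HZ = {
    "delta": (0.5, 4.0),    # most evident during deep sleep
    "theta": (4.0, 7.0),    # dream stages of sleep
    "alpha": (8.0, 13.0),   # wakeful but relaxed
    "beta":  (13.0, 30.0),  # focused concentration
}

def band_of(frequency_hz: float) -> str:
    """Name the conventional EEG band a frequency falls into, or 'other'."""
    for name, (low, high) in EEG_BANDS_HZ.items():
        if low <= frequency_hz <= high:
            return name
    return "other"

print(band_of(10.0))   # -> "alpha", the range containing the mu rhythms discussed below
```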

To put together their brain-actuated technologies, researchers first have to choose a single wave from among the brain's electrical torrent and then learn how to track it. Wolpaw, for one, looks at recordings of brain waves taken by an electroencephalograph (EEG) while his test subjects--usually volunteers from other, nearby labs--are sitting quietly in a chair. What he's measuring is a subset of alpha waves known as mu waves. Mu waves arise at different frequencies within the alpha bandwidth of 8 to 13 Hz; they're the resting rhythms generated by the neurons in the sensorimotor cortex, a strip of brain geography that lies on top of the head between the ears, almost like a headband. "Mu waves change in response to sensation and movement," Wolpaw explains. "This seems to be the part of the brain most directly related to our normal channels of physical control, so we thought it might make sense to use these waves to develop this new control channel."

Wolpaw and three colleagues screened 60 normal adult volunteers and selected the five whose mu rhythms were the easiest to read. Then Wolpaw sat them in an easy chair in front of a computer screen and fitted them with what looked like a bathing cap studded with electrodes. Their assignment was simple: they were asked to think about something--anything--that would allow them to move a cursor to the top or bottom of the screen on command. Each of the five underwent half-hour practice sessions three times a week for two months. The ideal, says Wolpaw, "is to try to approach as closely as we can the speed and accuracy of control over a computer that you get with a mouse."
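One simple way such a system could translate mu-rhythm strength into cursor motion is a linear rule: compare the measured mu amplitude with a per-subject baseline and let the difference nudge the cursor up or down. The sketch below is illustrative only; the gain, baseline, and update rule are assumptions, not a description of Wolpaw's actual algorithm.

```python
# Illustrative sketch: a linear rule turning mu-band amplitude into vertical
# cursor movement. The gain and baseline are hypothetical and would be tuned
# per subject; this is not a description of Wolpaw's published method.

def cursor_step(mu_amplitude_uv: float,
                baseline_uv: float = 5.0,    # assumed per-subject resting level
                gain_px_per_uv: float = 8.0) -> float:
    """Return the vertical cursor displacement (pixels) for one update.
    Amplitude above baseline moves the cursor one way, below it the other."""
    return gain_px_per_uv * (mu_amplitude_uv - baseline_uv)

# Example: the subject lets the mu rhythm rise to 7 microvolts, and the
# cursor moves 16 pixels toward the top of the screen.
print(cursor_step(7.0))   # -> 16.0
```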

The researchers monitored the mu rhythms until they found each individual's best wave--the wave they were best able to control and which they could then use to move the cursor. "Presumably," says Wolpaw, "general sites and frequencies are common to most people, although we can't really say yet what they are. However, we'll probably always have some customization within set limits for each patient--finding the precise frequency or electrode placement."

One subject was completely unable to get a mental grip on the cursor, but the other four were ultimately able to hit the target 80 to 95 percent of the time. In less formal tests, these same subjects have been able to use their minds quite deftly, maneuvering the cursor over on-screen bars that, when contacted, turn lamps on or off or change the channel on a television set. Like Tumey, however, even the most proficient cursor movers are not able to explain exactly how they do it.

Wolpaw is now taking the next step: with a five-year grant from the National Institutes of Health, he is fashioning his technique into a sort of mental prosthesis for people who don't have the use of their limbs. He hopes eventually to enroll 100 people in his study, about a quarter of whom will have some significant disability. His first test subject, for instance, was a man suffering from amyotrophic lateral sclerosis--otherwise known as ALS or Lou Gehrig's disease. Wolpaw says his ALS patients appear to be able to learn to move the cursor just as ably as those without any disability.

Actually, what has hampered progress in brain-actuated technology is not the speed at which the brain adapts to these new methods of controlling the environment. The speed bump along the learning loop is in the hardware itself: conventional computer programs take too long to analyze the EEG, carry out commands, and show the result to the test subject. "If you told your arms to turn the steering wheel on your car and nothing happened for ten seconds, you could still drive, but you'd have to drive really, really slowly," says Schnurer at Wright-Patterson. "We have to give test subjects feedback in as close to real time as possible, in order for them to be able to learn this control in a meaningful way. Without it, the brain can't associate a particular effort with a particular result, and it begins thinking about other things."

In real time, a visually evoked response--the amount of time it takes for you to see something and respond to it--takes about .07 second. Schnurer considers that sliver of time a personal challenge. In 1986 he built an electronic signal processor that can home in on any selected brain frequency, lift it out of the brain's electrical traffic flow, and begin to speed information about its variations back to the flight simulator's screen in .2 second.

That, says Schnurer, isn't nearly good enough. It's the signal processor's fault. It has a built-in time delay--a bottleneck inherent in the filter that picks up the pulses of brain activity and has to smooth them out into a steady signal the computer can use. He has been hard at work on a new signal processor--which includes three processors working at once, as well as some speeded-up circuitry--to significantly cut into this time delay. The goal is to slash the total response time to as little as .1 second. "If the brain completes a cycle," says Schnurer, "and we can show it what it did before the next cycle is over, that's about as close as you're going to get to instant feedback."
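A back-of-the-envelope budget shows where that delay lives. Any filter that smooths the evoked rhythm must average over a window of samples before it can report a change, so the window length sets a floor on the feedback time. In the sketch below, only the 13.25 Hz rate, the 0.2-second current delay, and the 0.1-second goal come from the text; the window and processing figures are assumptions chosen for illustration.

```python
# Back-of-the-envelope latency budget for the feedback loop described above.
# The 13.25 Hz rate, 0.2 s current delay, and 0.1 s goal come from the text;
# the smoothing-window and processing times are assumptions.

STIMULUS_HZ = 13.25
cycle_s = 1.0 / STIMULUS_HZ                  # ~0.075 s: one cycle of the evoked rhythm

smoothing_window_s = 0.15                    # assumed averaging window in the filter
processing_s = 0.05                          # assumed analysis + display time

estimated_delay_s = smoothing_window_s + processing_s
print(f"one brain cycle:  {cycle_s:.3f} s")
print(f"estimated delay:  {estimated_delay_s:.3f} s (roughly the 0.2 s quoted above)")
print("goal:             0.100 s, i.e. feedback within about one cycle")
```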

Gert Pfurtscheller, a biomedical engineer at the Graz University of Technology in Austria, has decided to get rid of the bottleneck altogether: the process would be much simpler, he believes, if you didn't have to bother showing the brain what it's doing. In his scheme there are no video screens; the brain sends out a command, and the computer acts on it.

In other words, rather than teaching the brain how to control a computer, Pfurtscheller has decided to let the brain act naturally--let it generate its usual, spontaneous wave patterns--and let the computer do all the learning. His immediate goal is to teach a computer program called a neural network--a program that, like the human brain, is designed to draw conclusions based on patterns in the data it's fed--to predict which physical movement a person has decided to make according to the EEG pattern the brain is generating. "We've found that mental preparation for a specific physical movement creates a specific EEG pattern that is recognizable in all subjects," he explains. "This specific pattern is also clearly different from that created by the mental preparation for a different movement."

Pfurtscheller starts with a short list of movements--pressing a button with the right or left index finger, pointing the toes of the right or left foot up or down, placing the tongue behind the teeth as if preparing to pronounce the letter t. To give the computer enough data to work with, he has his test subjects repeat one or more of these movements hundreds of times over three or four hours. Each time, says Pfurtscheller, the brain has to prepare itself to make the movement, and that preparation takes between half a second and a full second. A second's worth of hesitation seems like a long time, but according to Pfurtscheller there's a difference between the preparation for this sort of conscious decision--the decision, for instance, to say the letter t--and the preparation for a more programmed, automatic skill like using the letter t in speech. The chosen movements are discrete, voluntary movements, he points out, "not the movements involved in things like typing or playing the piano, which are part of a subconscious pattern."

During the warm-up time for each single, deliberate act in Pfurtscheller's list, the brain creates a recognizable EEG pattern at a specific brain location. Over time, the neural network learns to recognize the differences in the EEG patterns that occur before each movement and thus can predict which is about to occur.
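In outline, that procedure is a standard supervised-learning loop: extract features from the second of EEG preceding each rehearsed movement, label them with the movement, train a network, then predict from fresh windows. The Python sketch below uses invented synthetic features and a small off-the-shelf network purely to show the shape of that loop; it is not Pfurtscheller's data, feature set, or architecture.

```python
# Toy sketch of the train-then-predict loop described above. The synthetic
# features, labels, and network size are invented for illustration only.

import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

MOVEMENTS = ["right finger", "left finger", "right toes", "left toes", "tongue"]

def fake_preparation_features(label_idx: int) -> np.ndarray:
    """Stand-in for band-power features taken from the ~1 s of EEG that
    precedes a rehearsed movement (8 hypothetical electrode/band channels)."""
    base = np.zeros(8)
    base[label_idx % 8] = 1.0                  # give each movement a distinct signature
    return base + 0.3 * rng.standard_normal(8)

# Hundreds of rehearsals per movement, echoing the training sessions above.
X = np.array([fake_preparation_features(i % len(MOVEMENTS)) for i in range(1000)])
y = np.array([MOVEMENTS[i % len(MOVEMENTS)] for i in range(1000)])

net = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
net.fit(X, y)

# Given a fresh pre-movement EEG window, predict which movement is coming.
new_window = fake_preparation_features(2)
print(net.predict([new_window])[0])            # -> "right toes" (most of the time)
```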

Finding these prescient patterns isn't easy: they don't exist among the brain's well-charted electrical pathways--that is, among brain waves with a frequency below 30 Hz. Most EEG researchers don't even bother looking at brain waves above that: higher frequencies are difficult to pick up and are easily lost amid all the rest of the brain's noise. But Pfurtscheller--expanding on work done by European and U.S. researchers--has learned how to reliably isolate and read 40 Hz signals directly off the surface of the scalp. And that's where he found what he believes to be the electrical signature of the brain's preparation for physical movement.

"When you plan a movement of a right finger, there is an increase in 40 Hz activity over the portion of the brain's left hemisphere that controls the hand; when you plan a left-finger movement, the same pattern occurs in the reciprocal place over the right hemisphere," he says. "These patterns are quite different for each specific type of movement, but they are repeated in the same test subject on different days."

Now that he's snared himself a good, reliable brain wave to monitor, Pfurtscheller intends to use it to help people paralyzed by spinal-cord injuries. His first project is based on technology already in use by paraplegics--an electronic stimulator that gives them bladder control. When the patient presses a button, the device stimulates his bladder and he urinates; when the button isn't being pressed, there is no urination. The device has given paraplegics the ability to urinate when and where they choose--an important bit of independence.

"But what if the person has no use of his hands?" Pfurtscheller asks. "If he can't move his hands, he can't press a button." So in a three- year trial just now beginning, Pfurtscheller's research group will attempt to give such patients bladder control through the use of neural networks able to interpret their desires. "Of course, it's too complicated to read abstract thoughts," says Pfurtscheller. Instead he envisions using a neural network through which a patient might activate an electrical bladder device by mentally preparing to point the toes of his right foot up and then turn it off by thinking about pointing the left toes down.

Eventually, Pfurtscheller hopes his technique could be extended to give paraplegics mental control of electrical stimulators now being implanted experimentally in leg muscles. These stimulators provide the electrical impulses no longer being delivered by the nerves and, in theory, allow the patients to walk. But the only way for the patients to control the speed and direction of their steps is by pressing buttons on a computer that hangs in a box from their shoulders or trails behind them in a cart. Pfurtscheller's neural network could ultimately give them control over the computer that controls their steps. "We have done the basic work," he says. "We need now to do the experiments and improve the basis that we have set down."

One researcher seems to have already begun. Five years ago Andrew Junker left the Wright-Patterson laboratory he created to pursue new ideas about brain-actuated technology. And he's obviously onto something. He's used his device--a narrow cloth headband that holds three postage-stamp-size electrodes, and some proprietary software that can translate what the electrodes pick up--to steer his 35-foot ketch, Gypsy Moon, and to maneuver a wheelchair. A quadriplegic in Pittsburgh has used it to play video games. Musicians in Albany and Philadelphia, alone or in groups, have used it to play a flute, a double bass, and an organ in the normal way while mentally improvising harmonies or counterpoints on an electronic synthesizer.

Junker's advances seem to put him head and shoulders above the rest of the brain-actuated technology crew. But some of them say that he's not really working in the same field. The problem, they say, is that electric signals picked up from the forehead are not pure brain waves--they're littered with signals from the head, neck, and shoulder muscles. "The forehead is a great place to read muscle activity," says Wolpaw, "and that makes it a terrible place to read brain activity."

Junker doesn't disagree. "Those muscle signals are known as 'artifact,' " he explains. "To a trained archeologist, an artifact is a very valuable thing. But often, the way EEG researchers deal with the appearance of artifact is to shut off their machines because they regard it as noise." Junker puts it to use: he considers the signals to be like training wheels, a means by which a person can gain quick control of a task using the more easily manipulated muscle signals, then gradually develop the more subtle brain-actuated control at leisure. "Everyone knows that the brain-body link is there, but most researchers try to work around it," he adds. "We embrace it."

The question, however, is just what role brain actuation plays in Junker's creation. Wolpaw believes that Junker's machine may be nothing but an elaborate muscle switch--a common device that translates movement or tension in a muscle into commands that activate mechanical or electrical devices. "If you're interested in getting a new kind of control for someone who can control only his forehead muscles or his neck muscles, then that's just fine," he says. "But if he can't, then it's not fine. If you have someone who has totally paralyzed muscles, then muscle activity isn't going to work."

While Junker contends that brain waves play an integral role in his device, he freely concedes the presence of muscular electricity in its signal. "The bottom line is that it doesn't matter, as long as people can use it to enhance their quality of life," he says. "The point is, you can't have muscle activity without brain activity. The dilemma for the scientist is how to separate those two. And they can't be separated easily, if at all."

McMillan's lab recently awarded Junker a research contract to try to help him untangle brain signals from muscle signals within the cascade of data his machine collects. "From a scientific point of view, we would like to know exactly what's going on," says McMillan. "But it doesn't necessarily mean the device isn't useful, even if we find out it's purely muscle."

"What's important is that this works," Junker adds. "Typical biofeedback experiments require several sessions before a person gains minimal control. Our device gives most people some sense of connection to the task, and even a degree of control, within a minute or two."

The scientists involved in brain-actuated technology do a lot of blue-skying when talking about advances in their field. Take Tumey, for instance. David Ingle, one of the Wright-Patterson engineers, says that Tumey's mental control of the simulator is so good "he'd have no problem handling the control stick of a Piper Cub." It will undoubtedly be a while before he has the chance--the technology will first be used for tasks that won't endanger anyone's life, such as scrolling a computer display or choosing items from an on-screen menu. "But who knows?" says McMillan. "Twenty or thirty years from now, we might be saying, 'Gee, I'd never want a pilot to control the stick with his hands when he can do it so much better by manipulating his brain activity.' "
