The Rise of the Cyborgs

Melding humans and machines to help the paralyzed walk, the mute speak, and the near-dead return to life

By Sherry Baker
Sep 26, 2008

“First we build the tools, then they build us.” —Marshall McLuhan

It is a stiflingly hot summer day in Atlanta. Scientist and physician Philip Kennedy has a packed schedule, so he suggests that I interview him while he drives to the tiny town of Bowdon, Georgia, just east of the Alabama line. It’s a journey he takes every Memorial Day to a small cemetery.

We park behind the redbrick Sandy Flat Baptist Church. The sun is blazing overhead, the glare from the white gravel parking lot almost blinding. But Kennedy knows exactly where he’s going. He quickly walks through the simple graves carpeted by carefully tended grass, then stops and bends down. Visibly moved by private thoughts, the scientist touches a simple headstone and leaves beside it a tribute he wrote for the man buried here, a man he calls a hero.

Johnny Ray, who died on this date six years ago, was Kennedy’s patient, his research subject, and the world’s first human cyborg, fitted with brain implants that allowed him to communicate directly with a computer.

Kennedy is the chief scientist of Neural Signals, a company he founded in 1987 to develop a brain-computer interface, or BCI, though he prefers the term “neural prosthetics.”

By any name, the devices created by Kennedy and a handful of others can decode the conscious intentions conveyed by neural signals. For those who are missing a leg or who have a broken spine, the signals can control computers, wheelchairs, and prosthetic limbs. For those suffering from “locked-in syndrome,” their bodies so immobilized by catastrophic disorders like amyotrophic lateral sclerosis (ALS) or brain stem stroke that they are unable to speak or communicate their needs, the devices can translate neural signals to spell out words on a computer screen. Spoken language through a voice synthesizer is coming soon.

Although his current work is aimed at the severely disabled and locked-in, Kennedy believes neural prosthetics will have applications for the able-bodied, too. In fact, he awaits a new, technologically driven stage of evolution that will qualify cyborgs for a branch on the human family tree.

“By connecting intimately with computers, we will take the human brain to a new level,” he says. “If we can provide the brain with speedy access to unlimited memory, unlimited calculation ability, and instant wireless communication ability, we will produce a human with unsurpassable intelligence. We fully expect to demonstrate this kind of link between brain and machine.”

Getting Out

Originally a physician from County Limerick in Ireland, Kennedy was so intrigued by the workings of the brain that he decided to go back to school and train as a neuroscientist. After earning a Ph.D., he made his way to Emory University in Atlanta, where, as a postdoc, he began recording and studying neural signals from rats’ brains.

He found the task daunting. Unreliable and laborious, his research required sticking electrodes through holes in the rats’ skulls, risking scarring and infection that could play havoc with data. If the animals moved, the electrodes often slipped out of place.

In 1986, while running a lab at Georgia Tech, Kennedy learned that Canadian scientists were spurring neuron growth in rats’ brains by adding bits of sciatic nerve. An idea took shape: Why not create an implant that would spur the brain to grow into it? If the brain could meld with such a device, the nerve cells would hold it permanently in place and risks would drop.

To build his implant, Kennedy took a tiny glass cone, filled it with a mix of nerve growth factors, and ran two fine, coiled gold wires through. Then he inserted it into a rat’s skull, right over the motor cortex that controls movement. Soon neural cells had grown through the implant, keeping it in place and ensuring a solid electrical connection. The gold wires, meanwhile, conducted neural signals through the skull to the outside, where they could be amplified and analyzed.

Then Kennedy performed a simple study. He implanted rats with electrodes in the part of the brain that receives input from the animals’ long whiskers. When he tapped on certain whiskers, he “heard” neural activity via the electrode, but other whiskers didn’t produce these signals. The observation suggested, he says, that “specific neurons were connected to the movement of specific whiskers.” Next he snipped off the hooked-in whiskers and tapped on the alternate whiskers again. This time, to his surprise, the neurons previously assigned to the now-missing whiskers began to adapt, picking up the signals and even causing the remaining whiskers to move. The brain, apparently, was able to compensate for a loss and adapt to fill a need; in short, it was malleable and plastic.

Instructed by this discovery, Kennedy’s vision of a neural prosthesis took hold. If neural activity corresponding to one body part could adapt and move another body part—in this case, whiskers—then perhaps it would be possible to reroute nerve signals on a larger scale, around an injured spine or into a prosthetic limb. Moreover, if the brain’s intentions for movement or language could be deciphered, there might be a way to create an interface between these nerve patterns and the outside world.

Kennedy patented his device (called the Neurotrophic Electrode) in 1989 and spent years testing it in monkeys. Ultimately his increasingly sophisticated technology could amplify neural signals about 10,000 times and convert them to FM radio signals, which were transmitted to a nearby receiver and relayed to a computer.

In 1996 the FDA gave Kennedy the go-ahead for human trials. His first great triumph came with test subject Johnny Ray.

Photos reveal a man with smiling eyes set in a round, slightly chubby face. He was what Southerners call a “good ol’ boy,” a 53-year-old drywall contractor and Vietnam veteran living in Douglasville, Georgia, who liked to play jazz guitar, have a few beers, and hang out with the guys. But one day in the fall of 1997, while talking on the phone, he became one of the more than 700,000 Americans who have a stroke each year.

“He had high blood pressure and didn’t take his medicine,” Kennedy says. “It was a catastrophic stroke of the brain stem.”

In a coma for weeks, Ray finally woke up in the Atlanta VA Medical Center, his mind intact but his body unable to move or communicate except by the slightest quiver of a few muscles in his face, including his eyelids. He was what doctors call “locked-in.” Blinking twice for yes and once for no, he agreed to participate in Kennedy’s study.

Kennedy and Emory neurosurgeon Roy Bakay implanted a Neurotrophic Electrode near the part of Ray’s brain that controlled his left hand. The outer end was attached to an amplifier and radio transmitter on his skull, under the scalp. In the months that followed, Kennedy encouraged Ray to think about moving a computer mouse with his hand.

As Ray imagined moving the mouse, there was an increase in electrical activity among the neurons that would have controlled that action if his hand could move. These brain impulses were transmitted to a receiver on his pillow, where they were deciphered and translated into digital commands sent to a nearby computer. Over time, the computer began to obey Ray’s neural signals. Within six months Ray was moving a cursor on the screen through intention alone, communicating by clicking on icons for phrases like “I’m cold.” Despite a host of excruciating health problems, including infections (not related to the implant), Ray kept working with the researchers, although it clearly exhausted him. After more months of practice, he could spell words and hold brief conversations.
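
For readers curious how such a translation can work, here is a minimal sketch, in Python, of one common decoding approach: a linear map from binned firing rates to cursor velocity. The unit counts, numbers, and variable names are invented for illustration; the article does not describe Kennedy’s actual algorithm.

```python
# Toy sketch of one common BCI decoding idea: a linear map from binned
# neural firing rates to 2-D cursor velocity, fit by least squares.
# All numbers here are illustrative, not Kennedy's actual method.
import numpy as np

rng = np.random.default_rng(0)

n_units, n_bins = 8, 500                  # hypothetical recorded units and 100-ms time bins
true_weights = rng.normal(size=(n_units, 2))

# Synthetic "calibration" data: firing rates recorded while movement is imagined
rates = rng.poisson(lam=5.0, size=(n_bins, n_units)).astype(float)
velocity = rates @ true_weights + rng.normal(scale=0.5, size=(n_bins, 2))

# Fit decoder weights with ordinary least squares
W, *_ = np.linalg.lstsq(rates, velocity, rcond=None)

# At run time, each new bin of firing rates nudges the cursor
cursor = np.zeros(2)
for new_rates in rng.poisson(lam=5.0, size=(50, n_units)).astype(float):
    cursor += new_rates @ W * 0.01        # the scale factor sets cursor speed
print("final cursor position:", cursor)
```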

When Kennedy asked what he felt when he moved the cursor, Ray spelled “NOTHING.” It was a surprising, significant moment. Ray had learned to control the cursor without thinking about moving his paralyzed arm. Neural activity that had been linked to arm and hand movement had changed. Now his brain was communicating directly with the computer.

“The brain is very adaptable,” Kennedy says in the Irish accent he has kept after almost three decades in the United States. “The brain’s plasticity is the key thing to this whole field.”

Telekinetic Monkeys

The FDA doesn’t readily approve radical procedures like brain implants, so most of Kennedy’s colleagues have pushed the envelope not in humans but in other primates—for instance, a tiny owl monkey named Belle. In 2000 Duke University neuroscientist Miguel Nicolelis trained Belle to move a joystick in sync with a light. If she did the task correctly, she was rewarded with juice.

Belle wore a hat attached to about 100 wires as fine as human hairs. The wires were implanted in Belle’s motor cortex, the part of the brain that plans and initiates movement. As she moved the joystick, her neural signals were picked up by the wires and sent to a computer in the next room. The computer sent them on to a robotic arm, which precisely mimicked the action of Belle’s arm. At the same time, Nicolelis transmitted the brain signals over the Internet to the Touch Lab at MIT in Cambridge, Massachusetts. There, 700 miles from Duke, the neural commands operated another robotic arm. The virtual and physical world had merged.

In January 2008, working with the Computational Brain Project of the Japan Science and Technology Agency, Nicolelis took another step forward, this time with Idoya, a rhesus monkey trained to walk upright on a treadmill for treats. Electrodes were implanted in the part of Idoya’s brain that controls leg movement; these devices recorded the activity of 250 to 300 neurons that fired when her ankles, knees, hip joints, and feet moved or were about to move. Fluorescent stage makeup allowed a special high-speed video camera to capture the details of her limb motions.

The video and neural signals were then combined to show which muscle movements resulted from which neuron firings and how the activity of the neurons appeared to anticipate movement. In the end, computer analysis predicted Idoya’s leg movements about a second before the animal actually carried them out. As if all this weren’t enough, the system transmitted the predictions over a high-speed Internet connection to Kyoto, Japan, and into the actuators of a robot named CB-1 (for Computational Brain), which was designed to have a remarkably humanlike range of motion. Idoya walked and so did CB-1, in perfect sync.
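
The prediction step can be pictured as a regression problem: pair each moment’s firing rates with the limb position recorded a fixed lead time later, then fit a model. The sketch below, built on synthetic data and an assumed 100-millisecond bin size, is an illustration of that idea only, not Nicolelis’s actual pipeline.

```python
# Illustrative sketch: predict a limb trajectory a fixed lead time ahead
# from binned firing rates, using a linear model fit on synchronized
# video-and-spike "training" data. Synthetic data, invented parameters.
import numpy as np

rng = np.random.default_rng(1)
n_neurons, n_bins, lead = 250, 2000, 10   # a lead of 10 bins ~ 1 second at 100 ms per bin

rates = rng.poisson(lam=4.0, size=(n_bins, n_neurons)).astype(float)
# Synthetic ankle-angle trace loosely driven by the neural activity
angle = np.convolve(rates.mean(axis=1), np.ones(5) / 5, mode="same")

X, y = rates[:-lead], angle[lead:]        # pair each bin with the angle ~1 s later
W, *_ = np.linalg.lstsq(X, y, rcond=None)

predicted = rates @ W                     # prediction of the angle about 1 s ahead
print("correlation with future angle:",
      np.corrcoef(predicted[:-lead], angle[lead:])[0, 1])
```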

The 500-pound, 5-foot-tall robot was much larger than the 12-pound, 32-inch-tall monkey whose neural signals were directing it, and this underscored a simple yet remarkable point: Implant technology could enable brainpower to control a huge object (like a robot crane) or a tiny one (like a microscopic surgical tool) just as easily as a life-size mechanical arm.

For the technology to truly aid humans, robot limbs will have to function in a lifelike way. That’s where neuroscientist Andrew Schwartz of the University of Pittsburgh School of Medicine comes in. He recently implanted an array of electrodes about the size of a freckle in the motor cortex of two macaques. The animals’ arms were gently restrained and a mechanical arm with a grasping claw was strapped to their left sides. The neural signals captured by the implant had been produced by the macaques’ brains to direct their arm movements. Instead the signals were shunted to a computer, which in turn directed the movements of the robotic arm. The monkeys didn’t have to think about moving the robotic arm; they simply reached out fluidly with the prosthetic limb as if it were part of them. The success of this experiment was due in part to a computer capable of rapidly interpreting brain signals based on the monkeys’ desire to move their limbs. This “advance warning system” moved the robotic arm in just 150 milliseconds, about the length of time it takes brain activity to spur real arm movement. The rapid response helped the monkeys use the robotic arm in a natural way, reacting quickly if they were about to drop a piece of food, for instance, and refining movements in real time.
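
One way to picture that “advance warning system” is as a timing budget: every cycle of the loop must read a bin of spikes, decode it, and issue a command to the arm within roughly 150 milliseconds. The toy loop below, with an invented 96-channel decoder, illustrates the bookkeeping; it is not Schwartz’s implementation.

```python
# Minimal closed-loop timing sketch: each cycle reads one bin of spike
# counts, decodes a velocity command, and must finish well under the
# ~150 ms budget. Channel count and decoder are invented for illustration.
import time
import numpy as np

rng = np.random.default_rng(2)
W = rng.normal(size=(96, 3))              # hypothetical decoder: 96 channels -> 3-D hand velocity

BUDGET_S = 0.150                          # per-cycle latency target from the article
for cycle in range(20):
    t0 = time.perf_counter()
    spikes = rng.poisson(lam=3.0, size=96).astype(float)  # stand-in for one bin of spike counts
    command = spikes @ W                  # decoded velocity command for the robotic arm
    # (a real system would transmit `command` to the arm controller here)
    elapsed = time.perf_counter() - t0
    assert elapsed < BUDGET_S, "decode cycle exceeded the latency budget"
```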

Work like this lends credence to the cyborg-maker’s long-sought goal: the possibility, in the not-too-distant future, of helping the paralyzed walk, reach, and grasp. Front and center in this effort is Northwestern University neuroscientist Lee Miller, who injects local anesthetic into a monkey’s arm so that the limb is temporarily paralyzed. Then, instead of sending neural signals from the animal’s brain to a robot, he shunts them back into the muscles of the paralyzed arm, thereby bypassing the spinal cord. “The signals are going to a stimulator that is electrically stimulating those same muscles,” Miller explains. “So essentially it allows the monkey to use his arm again, flexing the wrist and playing a video game all entirely voluntarily, despite the fact that the arm is actually paralyzed.”

Merging Man and Machine

The spectacular successes of brain implants in primates have paved the way for new human trials, including one at Brown University, where neuroscientist John Donoghue is moving ahead with BrainGate, a minuscule array of tiny, spikelike electrodes implanted in the motor cortex. Candidates are quadriplegics, with all four limbs paralyzed due to ALS, spinal cord injury, or brain stem stroke. So far, three patients implanted with BrainGate can voluntarily modulate several dozen neurons sufficiently to type on a screen, move a prosthetic hand, or control a robotic arm.

“Our goal is to help restore communication and independence,” says Donoghue’s colleague Leigh Hochberg, a neuroscientist with the Department of Veterans Affairs. One patient, a 37-year-old with ALS, died 10 months into the trial after his respirator was inadvertently disconnected. His untimely death, and the progress he made while participating in the experiments, were especially moving to Hochberg. “He could demonstrate that his mental status was fully intact,” Hochberg says. “He had great insight into his disease and the research we were doing and great humor as well.”

Even with the limited number of subjects, the human research has already confirmed the monkey findings and answered important questions about how the brain works. “One thing we wondered was how a particular part of the brain functioned years after an arm and hand hadn’t moved due to disease or injury,” Hochberg says. “We found some insight thanks to one of our first participants, who had a spinal cord injury. He was paralyzed, but the moment he thought about using his hand, we saw changes in neural activity in the specific part of the motor cortex associated with hand movement. Different neurons fired at different rates depending on what he imagined performing.”

What that means, Hochberg says, is that the brain signals that once controlled the subject’s paralyzed hand and arm were still there and functioning—they just could not pass through the damaged spinal cord to allow the arm and hand to move. He hopes devices like BrainGate will circumvent such damage and allow the brain to communicate with a prosthetic limb or even the actual one, in the manner of Miller’s research. “If we can take those natural signals and send them to a functional electrical stimulation system placed in and around the muscles and nerves of an arm or a leg,” Hochberg says, “someone might be able to control their own limb again using neural technology rather than injured biology.”

Another planned clinical trial involves a miniaturized neural electrode the size of a couple of kernels of corn, pioneered by neuroscientist Richard Andersen at Caltech. Hoping to accomplish what some have compared to mind reading, Andersen wants to implant his device in the brain’s higher-level sensory-motor areas, including the parietal lobe and premotor cortex, the seats of personal preference and intent. From a practical perspective, the implant could empower patients to use their abstract thoughts and feelings to control a medical device—a nuanced form of biofeedback. On another level, it could help physicians interpret thoughts that would normally control the patient’s body. “The first thing a doctor often asks is ‘How are you feeling?’” Andersen says. “By looking at the decoded neural signals, the doctor could know.”

Speaking His Mind

Even better, says Philip Kennedy, would be giving the locked-in the gift of actual voice—and he’s getting close.

Erik Ramsey became the first subject for this research after he suffered a horrible car accident. Surgery repaired a host of broken bones and torn muscles, but Ramsey didn’t seem to wake up. At first his doctors thought that the anesthesia was taking an unusually long time to wear off. But eventually Ramsey’s father, Eddie, realized something was terribly wrong. Ramsey had suffered a catastrophic brain stem stroke, appearing to doom him to a locked-in life at the age of 16.

It might be odd to describe Ramsey as lucky, but in a way he is, because his condition was correctly diagnosed. No one knows how many others in the same state are incorrectly labeled as vegetative or semicomatose and warehoused, doomed to a nightmare world where they exist in a body that feels but cannot move, with a brain that is intact, hearing and seeing but unable to communicate a presence to others. In one 1996 study published in the British Medical Journal, 17 of 40 patients originally diagnosed as vegetative turned out to be locked-in instead. Ramsey found someone who could unlock him.

Ramsey, now 25, has dark, almond-shaped eyes that hint of his mom’s Filipino heritage. He is big and broad-shouldered. If he could get up out of his wheelchair, he’d be well over 6 feet tall. If he were mobile, he’d have the physique of a football player.

His mind is fine but he cannot move, except for tiny eye gestures (up for yes, down for no) and occasional muscle spasms. I ask him if coming to the lab is fun. He looks down. Is it more like work? He looks up. He is an inner-space pioneer whose work holds the promise of freeing himself and others who are locked-in, at least to a degree, by eventually allowing them to have real-time conversations.

“Developing a neural prosthesis for speech is extremely important to me,” says Kennedy, who, as a neurologist, regularly sees patients with ALS and stroke. There are some 30,000 ALS patients in the United States alone. All will become locked-in eventually, and 5,000 to 6,000 each year are at the point where they must decide whether to spend the rest of their lives on a ventilator, unable to speak, or to refuse it and let themselves die. If they knew they could continue to communicate as their disease progressed, they would undoubtedly more often choose to live, and they could even be productive. “People call my office all the time about a loved one who has had a brain stem stroke, lying in bed, unable to speak,” Kennedy says. “I expect to be able to help these people with this research.”

In the effort to unlock the door, Ramsey is treading where no one has gone before. The brain’s precise speech center varies from person to person, so to find Ramsey’s target area—the place where an implant could discern the appropriate speech signals—Kennedy used a functional magnetic resonance imaging (fMRI) scan. Showing Ramsey pictures, he told the young man to say to himself phrases like “This is an elephant” and “This is a dog.”

As Ramsey “spoke” internally, the MRI pinpointed neurons associated with speech, but the results were surprising. The neural signals were not sparked by words or their meanings, per se, but instead by how the muscles of the lips, tongue, jaw, and larynx would move to produce the sounds—movements that Ramsey could only imagine.

In 2004 a neurosurgeon on Kennedy’s team inserted an electrode in the part of Ramsey’s cortex where the signals were most dense. Then an amplifier and transmitter were screwed onto the top of Ramsey’s skull. Ever since then, he has arrived at Kennedy’s lab every Monday, Wednesday, and Friday afternoon. Kennedy’s team affixes an amplifier atop Ramsey’s head to record speech signals from his motor cortex as he imagines physically moving his mouth, tongue, and jaw to make speech sounds, called phonemes.

Over the course of three years, Kennedy’s group has recorded 41 distinct patterns from 56 neurons in Ramsey’s brain. Decoding the signals has been tricky and slow going. But Kennedy collaborator Frank Guenther, associate professor of cognitive and neural systems at Boston University, and his colleague Jonathan Brumberg recently worked out a system that translates neural signals from Ramsey’s implant into vocal form via a synthesizer that produces the corresponding sound.

In February, for the first time, Ramsey heard the synthesized vowels he was “saying” in his head (consonants are harder and will come later) played back in real time, as he was thinking them. He heard the phonemes blare from computer speakers and, at the same time, could see his neural signals directing a cursor to the symbol for the sound (like “ooh” or “aah”) on the screen.
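
A rough way to picture the vowel step: each decoded moment lands somewhere in formant space, defined by the first two resonant frequencies (F1 and F2) that distinguish vowels, and the system identifies the nearest vowel target. The snippet below is a toy nearest-prototype illustration using textbook formant values; Guenther’s actual synthesizer is driven continuously rather than by discrete classification.

```python
# Toy illustration of the vowel step (not Guenther's actual synthesizer):
# given a decoded (F1, F2) point, pick the nearest vowel prototype.
VOWELS = {"ooh": (300, 870), "aah": (730, 1090), "ee": (270, 2290)}   # rough formant values in Hz

def nearest_vowel(f1, f2):
    """Return the vowel label whose (F1, F2) prototype is closest to the decoded point."""
    return min(VOWELS, key=lambda v: (f1 - VOWELS[v][0]) ** 2 + (f2 - VOWELS[v][1]) ** 2)

# A decoded point near the "ooh" region selects that symbol, as on Ramsey's screen
print(nearest_vowel(320, 900))   # -> ooh
```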

Guenther and Brumberg flew in from Boston for this groundbreaking experiment. When the computer “spoke” for Ramsey for the first time, whoops of delight could be heard in Kennedy’s lab. “It was incredibly exciting,” Guenther says. “We finally all knew this was going to work.” Ramsey’s brain is already changing as his neurons learn to fire in specific ways that better control the synthesizer. “We are now convinced we’ll be able to give him rudimentary speech within not too many years,” Guenther says.

The ultimate aim is not just speech but restoration of full bodily function. If Kennedy has his way, someday the blind will see and the paralyzed will walk—and other researchers are racing him to make those things happen.

Miguel Nicolelis says his quadriplegic subjects will walk again—not in 10 or 20 years, but in just a few. We are talking over the phone when he makes this grand statement from a restaurant in Brazil. He is effusive on this call, pausing in a long conversation only once, to order sushi in his native Portuguese. Nicolelis says he’s confident he has solved most of the technical problems once dogging brain electrodes, and he’s mostly ignoring naysayers who think it can’t be done. “They can go on saying what they want to, but I am just going ahead and doing it,” he states.

Specifically, he is leading an international consortium based in Brazil called the Walk Again Project. Other participating nations include the United States, Israel, Switzerland, Germany, Japan, and France, as well as a country in the Arab world and one in Africa he cannot yet announce. Together they are aiming to do what sounds miraculous: help paralyzed quadriplegics walk again, not by fixing their lesions or broken spines but by creating wearable robotic exoskeletons controlled by neural signals. The effort will be based at the Edmond and Lily Safra International Institute of Neuroscience of Natal.

Setting Our Brains Free

While the immediate future is filled with hope for the disabled, cyborg technology may soon spread, giving ordinary people extraordinary skills. The possibilities are both terrifying and amazing: Brain implants might be the key to interspecies communication, for instance, and could offer true immortality as our brain patterns find new life in the belly of a machine. From bloodless wars (fought with cyborg-controlled robots) to apparent mind reading, the cyborg age could change the meaning of being human and thrust us into another evolutionary realm.

It seems like our logical destiny, according to the futurist Ray Kurzweil. “We already have brain implants for people with Parkinson’s disease and computerized implants for other conditions such as deafness and epilepsy,” he says. “Some people may articulate an abstract opposition to the idea of merging with machines, but that is how we will get from where we are now to my conception of the future, through many steps, each one benign and useful.”

Kennedy is convinced that neural implants will inevitably be used to download information into our brains, creating superintelligent humans. He speculates that astronauts equipped with implants will tap into the massive amount of information they will need while colonizing the moon or exploring the universe.

He would also like to work with neural speech prostheses in great apes. “Already we know they can understand a lot of human language and sign language. What if we could find a way to give them synthesized speech?” he asks. “What would we learn from them?”

The melding of man and machine appears inevitable, Kennedy believes. “It’s not hard to imagine that eventually somebody’s brain will be incorporated into a robotic body,” he says. “It could grant humanity a kind of immortality and also make us redefine what a human is.”

The U.S. Department of Defense is sponsoring research along these lines to help amputees injured in war, and the Army is investigating exoskeletons to give soldiers superstrength and resilience. Are cyborg soldiers with machine-enhanced strength, endurance, and vision on the drawing board too?

In a few generations, Nicolelis speculates, brain implants will be as socially acceptable as breast implants are today. “Implants will happen in normals when there is a benefit and they are safe,” he states. He agrees with others that the technology will shape the evolution of Homo sapiens, and his perspective is unmistakably philosophical.

Today, he says, we are all in a sense locked-in, but we won’t be for long. “With these experiments we’ve accomplished something that nobody has noticed yet: We have freed the brain from the body. We have created a profound new paradigm for the brain—and not just the disabled brain—to enact its will without the limitations of the biological machinery that we call a body.

“My children probably will see the day when they can sit physically on a beautiful beach in Brazil but at the same time control a rover on Mars, experience Mars,” Nicolelis reflects. “Their bodies will be here, but their brains will be free.”


Reading Minds From the Outside

To tap into the brain activity of his subjects, Klaus-Robert Müller, a computer scientist at the Technical University of Berlin, does not need to get inside their heads. He just gives each a cap that is embedded with electrodes and stuck to the scalp using conductive gel. Instead of collecting signals from direct contact with neurons, as Philip Kennedy and others do, Müller captures the cacophony of thousands of brain cells chiming in together, recorded noninvasively from the outside of the skull.

Müller uses a computer to track brain waves through a technology known as electroencephalography, or EEG. One especially useful EEG signal, named the P300, registers the brain’s reaction to a novel event or a flash of recognition. Here’s how it works: Subjects wearing an EEG “hat” stare at a computer screen as letters are essentially illuminated one by one. Their brains issue a P300 when and only when a desired letter appears. Using computer software programmed to recognize the P300, the paralyzed can be trained to “type” with the power of the brain alone.
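
The selection logic is simple to sketch: average the EEG epochs recorded after each letter’s flashes and choose the letter whose average shows the largest deflection in the P300 window, roughly 300 milliseconds after the flash. The Python below does this on synthetic data; real spellers add filtering and trained classifiers.

```python
# Toy sketch of the P300 selection step on synthetic EEG epochs:
# average the epochs following each letter's flashes and pick the letter
# with the largest mean amplitude in the P300 window. Illustrative only.
import numpy as np

rng = np.random.default_rng(4)
letters = list("ABCDE")
target = "C"
fs, epoch_len = 250, 200                  # 250 Hz sampling, 0.8-second epochs
p300_window = slice(int(0.25 * fs), int(0.45 * fs))

epochs = {ltr: rng.normal(size=(15, epoch_len)) for ltr in letters}   # 15 flashes per letter
# Add a P300-like bump ~300 ms after the flash to the target letter's epochs
epochs[target] += 2 * np.exp(-0.5 * ((np.arange(epoch_len) - 0.3 * fs) / 10) ** 2)

scores = {ltr: epochs[ltr].mean(axis=0)[p300_window].mean() for ltr in letters}
print("selected letter:", max(scores, key=scores.get))   # prints "C"
```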

For basic communication for the severely disabled, this system seems best to Jonathan R. Wolpaw, chief of the Laboratory of Nervous System Disorders at the Wadsworth Center of the New York State Department of Health in Albany. Wolpaw has shown that a P300-based device could help patients with advanced ALS communicate at a rate of one to four words a minute. When the technology has been fully developed, he hopes, the locked-in “should be able to move a cursor someplace” and perform useful operations, “just as we do when using a computer mouse.”
