Surgeons describe the laparoscopic removal of a gallbladder as a routine operation: in the United States it's performed 600,000 times a year. The procedure begins with the insufflation of the abdomen, which in surgical parlance is just a fancy way of saying that you put a needle through the wall of the abdomen and then inflate the abdomen as if it were a football. Then you punch through the wall again, this time with three long, thin, tubelike devices called trocars. You feed a laparoscope--essentially a computer-chip camera--through one of the trocars so that you can watch on a television monitor what it is you're doing inside your patient's body. Your assistant feeds a long-handled instrument through another trocar, with which she gently takes hold of the liver and lifts it out of the way to expose the gallbladder.
You work through the third trocar: first with a scalpel, to strip away the fatty connective tissue around the gallbladder; then with a clip ligator, to clip off the cystic artery and the cystic duct. Once you've severed both, you've freed up the object of your labors. You need to crush any large stones you find inside the gallbladder, then siphon them out along with the bile. Only now can you pull the deflated gallbladder through the hollow trocar and into the harsh reality of the external world. Your patient, virtually unscarred, goes home the next day.
All in all, a very mundane operation--but it's not without its difficulties. First of all, surgeons say, there's the loss of stereoscopic vision--the TV view is only two-dimensional. They also note that laparoscopic surgery is awkward and unwieldy, and they compare working at the end of those trocars to signing autographs with a 16-inch pen that's sticking through a hole in a wall between the writer and the paper. In fact, a host of other surgeries would open themselves up to the techniques used in laparoscopy--known in general as endoscopy--if they didn't require a level of manual dexterity that no surgeon can achieve.
No mortal surgeon, that is. But with this surgical stumbling block in mind, researchers across the country are pursuing a new technology in which a human surgeon manipulates surgical instruments that appear to extend into a three-dimensional image of a patient's body while computers perfectly translate the surgeon's movements into the parallel actions of a machine wielding real surgical tools inside a flesh-and-blood patient.
It's a technology whose time has come, insists Army colonel Rick Satava, who is a laparoscopic surgeon as well as a program manager at the government's Advanced Research Projects Agency (ARPA). Everything you need--from the movement of the surgeon's hand, to the feel of the scalpel on the fatty tissue, to the image of the abdomen (whether from visual inspection, X-rays, magnetic resonance, computerized tomography, or any other imaging technique)--can now be broken down into a stream of electronic information, fed into a computer, and reconstituted elsewhere. "In other words," says Satava, "you can think of medicine as just another aspect of the information age, and what we used to think of as blood and guts are just bits and bytes. That's all it is--it's transferring information back and forth."
ARPA is now investing millions of dollars each year in research and development on advanced biomedical technologies; in the process Satava has become the impresario and beneficent visionary of a burgeoning field that, for lack of a better name, can be called telepresence surgery. Satava likes to call it the future of medicine: the merger of computer technology, robotics, fiber-optic communications, virtual reality, high-tech medical diagnostics, and surgery. "It is," he says, "the beginning of the age of the 'digital physician.' "
Of course, telepresence--sometimes referred to as teleoperation or telerobotics--is not to be confused with robotics. Robots, the practitioners of telepresence are quick to point out, carry out preprogrammed tasks on their own; a telepresence system does nothing autonomously--every motion of the machine originates, instant by instant, with the human operator.
In Satava's vision of the biomedical future, surgeons will practice their skills in surgical simulators on virtual cadavers generated by computers; they will operate with mechanical limbs on patients hundreds of miles away; and their expertise, or that of any medical specialist, will be made available via the information superhighway to local physicians, small-town clinics, and even battlefields. "A fundamental change in medicine is coming about right now," says Satava. "All medical information--X-ray or lab test or blood pressure or pulse monitor--can now be digitized. It can all be brought to the physician or surgeon in digital format. Now we just close the loop and let the surgeon manipulate and act on the patient through the digital world of information."
This fundamental change in medicine rests on two technological foundations: the augmentation of reality and the enhancement of performance. Satava, not entirely tongue in cheek, calls the former "giving the surgeon X-ray vision." Diagnostic information from, say, an MRI or CT scan is taken from a patient, digitized, and superimposed in three dimensions over the visual information perceived by the surgeon.
Such technology is already in place at facilities like Brigham and Women's Hospital in Boston, which recently inaugurated its "operating room of the future." Among other visionary technologies, the room includes an MRI scanning device that is open to the world so that the surgeon can perform an operation even as the patient is being scanned over and over again, as often as once every second. Position sensors are attached to the surgeon's instruments so that the computer can register their exact location in the body and show it in the three-dimensional MRI data. While the surgeon works, then, he and the radiologist can watch the operation progress on a monitor. Ferenc Jolesz, a former neurosurgeon and head of the Brigham program, likens the effect to that of an airplane pilot flying over land that hasn't been mapped recently. "Roads have changed, there are new bridges, there are new buildings," he notes. "But if you put a camera under your airplane wing, you can get real-time information and use that for navigation."
The true test of this surgical revolution's technological mettle, however, will come in the area of performance enhancement. Rather than giving the surgeon more information, performance enhancement confers superhuman skills in the guise of remotely operated mechanical devices.
Take, for example, the telepresence surgery system developed at a California company called SRI International. The apparatus is the creation of Philip Green, an SRI electrical engineer who has been inventing medical diagnostic equipment since the 1960s. Its goal is to change laparoscopic surgery from an unwieldy operation to an effortless one. To that end, Green wants to create an optical illusion so powerful that surgeons will believe they are operating on an open abdomen. In reality, however, they will work through instruments connected to a computer that will translate their motions and control a device Green calls the telemanipulator, which will do the actual surgery on the patient.
"What we have is a surgeon's console," explains Green. "The surgeon looks into the console and sees a three-dimensional image of the inside of the patient's abdomen." The surgeon then reaches into the console and grasps real instrument handles, which appear to be extensions of the actual surgical instruments inside the patient. "Then he simply carries out surgery as if it's going on in front of him. He feels the tissues, the touch of sutures when he's suturing. . . ."
The device consists, in part, of a laparoscope that uses two computer-chip cameras in place of the surgeon's two eyes to capture the intra-abdominal scene in stereo. The digitized information is displayed on a screen above the surgeon's head and reflected by a mirror placed just above where the patient would normally lie. "You don't have the sense of looking at a monitor," says Green. "You have the sense of looking into a three-dimensional world." The handles of two surgical instruments extend from beneath the mirror. They are, however, only half-instruments attached to the console at some point under the mirror; the other halves are held by the telemanipulator and are inserted through trocars into the actual patient. Their image appears in the mirror. "What the surgeon sees when he looks down," says Green, "appears to be the whole instrument. And when he touches something, he feels resistance."
A computer algorithm takes the motion of this "master" instrument--as controlled by the surgeon--and translates it into directions for the "slave" telemanipulator so that the instrument controlled by the telemanipulator precisely mimics the motions of the instrument in the surgeon's hand. Position and force sensors on the real instrument send position and force information through the computer; the computer translates this information into sensations that can be felt by the surgeon through the handles of the half-instruments.
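The loop described above--master motion out, position and force information back--can be sketched in a few lines of Python. The sketch is purely illustrative: the function, its names, and its numbers are invented for this article and bear no relation to SRI's actual software, which must run continuously and handle far richer sensor data.

```python
# Illustrative sketch of one cycle of a bilateral master-slave loop.
# All names and values are hypothetical, not SRI's implementation.

def teleoperation_step(master_position, slave_force_reading, scale=1.0):
    """Map the surgeon's hand position to a command for the slave
    telemanipulator, and reflect the force measured at the slave
    instrument back to the handles the surgeon holds."""
    # The slave precisely mimics the master's motion (optionally scaled).
    slave_command = [scale * coord for coord in master_position]
    # Force sensed at the real instrument is returned so the surgeon
    # feels resistance through the half-instrument handles.
    handle_feedback = slave_force_reading
    return slave_command, handle_feedback

# One cycle: the surgeon's hand sits at a 3-D position; the slave
# instrument is pressing on tissue with some measured force.
cmd, feedback = teleoperation_step([1.0, 2.0, 0.5], 0.3)
```

In a real system this cycle would repeat hundreds of times a second, and the force reflection would be filtered to keep the coupled master-slave loop stable.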
"You can even feel different textures," says Green. "We also have sounds reflected back--scraping or tapping sounds, for instance. It's amazing how effective that is as auxiliary feedback to enhance your sense of touch. If you move the manipulator across something and it's rough, that vibration will translate back to your hand. If you also hear that noise, it makes the experience more powerful."
The device has not yet been used in actual surgery. But according to Satava, who worked with Green as a clinical consultant, surgeons practicing on dummy humans have found the illusion created by Green's device so convincing that they have tried to reach into the reflected image of the abdomen, attempting to use their free hand to help in the operation. Of course, patients may not be quite as comfortable as surgeons seem to be with this version of reality. But for those who might feel a bit anxious over the thought of a remotely controlled device removing one of their organs, Green has an impressive demonstration. All he does is hold a grape between his thumb and index finger while the scalpel-wielding telemanipulator--controlled remotely by a surgeon--slices the grape neatly into razor-thin sections.
Such a device could also be useful outside the operating room. In Satava's vision, the remote-controlled teleoperator would work perfectly as a fixture of future combat ambulances. Combat surgery is something Satava knows a lot about. He was one of only a small handful of MASH surgeons in Grenada--"definitely an interesting episode," he says--and he put in six months during the Gulf War at the Eighth Evac Hospital. "Ninety percent of casualties occur at the front lines," says Satava. "But about half of those people don't actually have killing wounds. They die waiting to be found, or when found, they can't be stabilized on the route back." With telepresence, surgeons working in base hospitals behind the front lines would be able to operate quickly on soldiers injured in the thick of battle.
Increasing a surgeon's literal reach is one way to enhance performance; increasing hand-eye coordination is another. That's Steve Charles's goal. Charles, a Memphis eye surgeon who has performed more than 15,000 operations, has played a key role in creating a telepresence surgical system designed specifically to enhance dexterity.
Five years ago Charles, who is also a mechanical and electrical engineer, started thinking about ways to electronically control and position the microsurgical instruments he designs, miniaturized power tools that he describes as tiny "electric shavers on a stick" or "hedge trimmers .033 inch in size." He asked different surgical specialists about operations they'd like to do but just physically couldn't. Ear surgeons told Charles they'd like to operate on the semicircular canals, but the canals were too small. Eye surgeons had trouble working right at the surface of the retina. Neurosurgeons said they'd like to work at the base of the brain but had a very hard time doing so. "Many people think laser surgery took care of these problems," says Charles, "but lasers make propagating pressure waves, like a bunch of little tiny explosions. That means you can have remote damage."
The two-part contraption Charles came up with--which was developed by Paul Schenker and his colleagues at the Jet Propulsion Laboratory in Pasadena--would fit comfortably on a desktop. The first part is an aluminum box from which emerges an instrument handle--the master--that's the size of a large pen. The slave arm emerges from another, slightly larger box. The arm is about an inch and a half in diameter and six inches long, and it has seven joints that allow it to mimic any human motion.
Charles can readily list the advantages the microdexterity system will give surgeons, including what he calls position downscaling. "What does an operating microscope do?" he asks. "It magnifies your view, scales up the workplace, so it enhances your visual system. What this does is scale down your positioning motion. You make a one-inch motion, and it makes a half-inch or quarter-inch motion. The net effect is to improve your dexterity like a fine-focus knob." The computer can also sense the involuntary tremor of even the steadiest surgeon's hand and remove it from the signal sent to the slave arm. When the mechanical arm then mimics the surgeon's movement, it does so with a cold-blooded calm no human could ever approach. And lastly, says Charles, the system allows the surgeon to preset any coordinates past which the teleoperator shouldn't go. "If I know from my training, for example, that the tissue I want to cut is only 100 microns deep," he explains, "I can set a depth layer of 100 microns so that as I drill through tissue or bone, I can't end up going too far."
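The three enhancements Charles lists--position downscaling, tremor removal, and a preset depth limit--can be illustrated with a toy Python function. Everything here is a made-up sketch: the scale factor, the simple exponential smoothing used to stand in for tremor filtering, and the 100-micron limit are assumptions for illustration, not the microdexterity system's actual algorithms.

```python
# Toy sketch of the microdexterity system's three enhancements:
# downscale the hand motion, smooth away tremor, clamp the depth.
# All parameters are hypothetical.

def filter_motion(depth_samples, scale=0.25, alpha=0.1, max_depth=100.0):
    """Take raw hand-depth samples (microns), remove tremor with a
    simple exponential low-pass filter, scale the motion down, and
    never let the tool pass a preset depth limit."""
    smoothed = depth_samples[0]
    commands = []
    for sample in depth_samples:
        # Low-pass filter: suppresses the high-frequency tremor
        # present in even the steadiest surgeon's hand.
        smoothed = alpha * sample + (1 - alpha) * smoothed
        # Downscale the motion, then clamp at the preset boundary.
        commands.append(min(scale * smoothed, max_depth))
    return commands
```

A one-inch hand motion thus becomes a quarter-inch tool motion, the jitter is averaged out, and no command ever drives the tool past the layer the surgeon marked off in advance.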
Charles is already talking to major health care distribution companies about marketing his microdexterity system, but he first wants to get the cost down to a level at which every hospital could buy one and every surgeon could use one. That means getting it below $100,000. "I think we're about a year and a half away from the marketplace," he says, "and a half-year away from operating on patients in my own facility."
Ian Hunter is looking much further into the future, with what might be called the ultimate surgical training tool and teleoperator. Last April, Hunter--an instrumentation scientist--moved his laboratory en masse from McGill University in Montreal to MIT, where he is working on combining all the technologies of the information revolution into one virtually unbelievable package.
Years before the move, he and his lab mates built what Hunter calls a teleoperated microrobot. It is a remotely operated device with two limbs, each one of which can move with a precision of 400 billionths of an inch. And it is constructed so that when you manipulate the master controls, your motions are scaled down instantaneously by a factor of up to a million. This enables the microrobot to hold a single living cell between the tips of its two limbs--a skill Hunter has used to do experiments on isolated muscle cells. Not only can the microrobot mimic your motions on a microscopic scale, but you can feel the forces it feels while holding the cell: just as the microrobot's motions are scaled down, the forces it feels are scaled up. Hunter claims this is a scientific first. "By this master-slave system," he says, "we were able to bring the microworld of the muscle cell up to human scale."
Four years ago Hunter went looking for real-world applications for his system. Like Charles, he found one in eye surgery; he met a prominent surgeon who told him that much of his practice involved correcting the damage done by other eye surgeons. "It seemed pretty clear," says Hunter, "that certain areas of microsurgery were ripe for the infusion of microsurgical robots. So we decided to develop an eye surgery system. The technologies would also be applicable to other areas of microsurgery, but you have to focus on something."
Hunter's ophthalmic system has a master-slave telepresence surgeon and a stereo head that observes the surgery for the human surgeon from a point of view just above the microsurgical robot. The view is controlled, however, by the human surgeon, who wears position sensors on a virtual reality headset; those sensors send data to two small computer-chip cameras in the stereo head, and the cameras then faithfully follow the movements of the surgeon's head. "You make a movement with your head," says Hunter, "and the slave camera head makes the same movements and presents you with stereo color images of what it happens to be looking at." Hunter says he already has the technology for an even finer-tuned eye-tracking system so that the slave eyes can follow the motions of the surgeon's eyes, but he hasn't decided whether to incorporate it into his telepresence surgery system.
What's perhaps most impressive about Hunter's system is its speed. Both the slave head and the slave surgeon can move faster than the human eye itself. This means that during eye surgery, the slave surgeon can position itself over a single point on the eye--say, where the surgical incision is to be made--and then stay positioned perfectly over that point, no matter how much the eye flicks around. "A fast eye can move up to about 800 or 900 degrees a second," says Hunter, "and our device can move at well over a couple of thousand degrees a second." It can do this because of a second tracking system in the slave head, which detects the motion of the eye that's under the knife and moves both the slave head and the slave surgeon in response. As a result, the human surgeon, who would be performing the operation remotely and seeing the eye as the slave head sees it, "would not see an eye that was moving but one that was stable," explains Hunter.
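The stabilization trick Hunter describes boils down to a change of reference frame: the slave tracks the eye's motion and adds it to the surgeon's commanded position, so that in the eye's own frame the tool never moves. A minimal Python illustration of that idea--names and coordinates invented for this sketch, not Hunter's code--looks like this:

```python
# Toy illustration of target stabilization: the slave tool's position
# in world coordinates follows the moving eye, so relative to the eye
# the tool appears perfectly still. All names are hypothetical.

def stabilized_tool_position(surgeon_command, eye_position):
    """World-frame slave position = tracked eye position plus the
    surgeon's commanded offset relative to the eye."""
    return [eye + cmd for eye, cmd in zip(eye_position, surgeon_command)]

# The eye flicks from one spot to another; the surgeon's command
# (an offset from the incision point) stays the same, and the slave
# moves so the offset is preserved.
p1 = stabilized_tool_position([1.0, 0.0], [5.0, 3.0])
p2 = stabilized_tool_position([1.0, 0.0], [7.0, 1.0])
```

The demanding part, of course, is not the arithmetic but doing it fast enough--the tracker and slave must outrun an eye that can move 800 or 900 degrees a second.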
By giving the surgeon an inherently stable target on which to work, notes Rick Satava, Hunter's telepresence surgeon should also be able to operate in a battlefield ambulance on the move, bumping along. And, adds Hunter, it could be a boon to open-heart surgery. Because it's impossible to work on a beating, moving heart, surgeons need to stop the heart and then bypass it to perform an operation. "If you were developing a heart surgery robot," he says, "a highly desirable feature would be to make it faster than the heart so you could stabilize it with respect to the heart and do the operation while the heart was still pumping blood for the patient."
All this, however, is mere reality. Hunter's system is able to go beyond--to create a virtual reality surgical training system in which a surgeon can learn to work on a three-dimensional, computer-generated, full-color image of a patient's eye. This virtual eye can be programmed to be so realistic that if the surgeon were to shine a virtual light on its virtual retina, the virtual pupil would constrict.
The same master instrument that controls the slave scalpel in the real surgical system also controls the virtual slave scalpel in the virtual reality system. "When the virtual slave comes in contact with virtual reality tissue," says Hunter, "we calculate the forces that would have been generated if it was real, and can reflect them back to the master. So the surgeon can perform an operation on a real eye or a mannequin eye or an animal eye--or can perform that same operation on the virtual environment."
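The force calculation Hunter describes can be caricatured with the simplest possible tissue model--a spring that pushes back in proportion to how far the virtual tool has penetrated it. The stiffness constant and function below are invented for illustration; a real simulator would use far more sophisticated tissue mechanics.

```python
# Minimal sketch of force reflection from virtual tissue, modeled
# as a Hooke's-law spring: the deeper the virtual scalpel sinks,
# the harder the master handle pushes back. Stiffness is made up.

def reflected_force(tool_depth, tissue_surface, stiffness=2.0):
    """Return the force (arbitrary units) fed back to the master
    handle when the virtual tool is below the tissue surface."""
    penetration = tool_depth - tissue_surface
    if penetration <= 0:
        return 0.0                    # not touching: no resistance
    return stiffness * penetration    # spring force, F = k * x
```

Because the same master handle drives both the real slave and the virtual one, the surgeon feels this computed resistance exactly where the real instrument's force sensors would otherwise report.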
The result is a surgical training system that provides for a surgeon what a flight simulator provides for a pilot, which is--to put it bluntly--a way to learn the craft without risking anyone's life. Rather than sit in a simulator and view an imaginary world outside, however, the surgeon will just don a headset through which he can see his virtual patient, hold onto his master, and then cut away inside a computer. "This will allow surgeons to practice their surgical procedures on imaginary cadavers over and over again," says Satava. "Eventually we will be able to merge the diagnostic information from X-rays, MRI scans, and CT scans, and surgeons will be able to generate a patient-specific model of the anatomy and disease. They'll practice on these virtual patients the night before and perform the actual thing the next day."
Whatever happens, surgery is about to change once and for all. Satava and Green foresee a day in the not too distant future when local hospitals and even physicians' offices will be able to have remotely operated surgical robots, and surgical specialists will do operations from medical centers perhaps hundreds of miles away. "This is a way to do surgery better without having to use your own hands," says Satava. "It just doesn't matter where the patient is anymore. It's rosy both for medical care and for the patient."
Whether this technological fantasy is feasible is an open question. The speed and power of computer and robotic technology are increasing exponentially, and much of what can only be imagined today is likely to become reality someday soon. Still, some surgeons worry that telepresence surgery will be useful only for very specialized operations, like laparoscopy. They're also troubled by the ethical and technological problems that will have to be solved before surgery can be performed without a surgeon present. What if blood squirts on a camera lens, asks one surgeon, or a circuit fries in the robot arm or computer? Advocates of telepresence admit that such problems may crop up but point out that a surgeon can always have a heart attack during an operation, or one too many beers beforehand.
John Siegel, a surgeon who runs the New Jersey Trauma Center at the New Jersey Medical School University Hospital in Newark, thinks that response is a bit simplistic. If something goes wrong during laparoscopic surgery, for instance, a live surgeon can open up the abdominal cavity immediately and go in with his hands. A telepresence surgeon would be at a loss. In addition, surgeons generally operate in teams, so if one surgeon falls ill, another can usually step in. Also, a surgeon's hands can be as crucial a surgical tool as the scalpel or laser beam. "When you're dealing, for example, with major wounds to the abdomen and chest," says Siegel, "you can have many organs injured. In a large number of penetrating injuries from urban warfare, you're working in a pool of blood. A lot of what's done is done by feel rather than instruments. The ability to get your hands in there and feel around, and the speed you often have to move at to get that kind of control, is key."
Still, he says, the potential for telepresence is obvious in the operating room. He just doesn't want it to get there before it's ready. "People shouldn't think this is more than a technique in the workshop," he notes. "To the best of my knowledge, nobody has actually done a full surgical procedure under these circumstances from beginning to end--even on an animal. There's a lot that has to be done between concept and actual implementation in a clinical setting. But that's what research is all about."
Ian Hunter, meanwhile, is not worrying so much about implementing his technology today--"We're talking about years of regulatory testing before this sort of system will be out in the commercial world"--but about "blasting ahead" with innovation. In the short term, for instance, there's his idea for a miniaturized artificial hand for his slave robot. "We now have conventional eye surgery tools being held by the microsurgical slave," he says, "but ultimately what you want is a very sophisticated microhand on the end of the microsurgical robot, so that under control of the human hand you could go in and just grasp the tissue."
Further down the line, Hunter has an even more fantastic idea. Although his surgical robots are small enough to do microsurgery, they are not themselves microscopic. "A major challenge in the future," he says, "is to create microrobots that are truly micro. In other words, devices that could autonomously navigate inside the body, sending out diagnostic information, delivering drugs, and ultimately performing surgery.
"That's not short-term research. However, it's not a fantasy, either."