WINNER: Pacific Northwest National Laboratory’s Portable Ultrasound Unit
INNOVATOR: Rik Littlefield
Imagine a doctor in New York examining a wounded soldier in Bosnia with ultrasound. All the technology needed to perform such a feat--portable ultrasound devices and the means for transmitting their data--is available right now. “There’s just one problem,” says Rik Littlefield: most doctors wouldn’t know how to use the equipment. To sift through transmitted images displayed on a monitor, the doctor in New York would have to master arcane computer keyboard and mouse commands. “It’s nonintuitive and ineffective,” he says. Most doctors just wouldn’t bother to learn how.
Littlefield, a computer scientist at Pacific Northwest National Laboratory in Richland, Washington, has put together a portable ultrasound system that makes the doctor in New York feel as though he is actually on the battlefield. The field operator--who requires only a modest amount of training--positions a probe over the patient’s wound, and the system then mechanically scans inside the patient. The 3-D data are transmitted to a computer back at the hospital. There the doctor holds an input device that looks and feels like the handheld probes doctors ordinarily use on ultrasound patients. As the doctor moves and twists the ersatz probe in midair, motion-tracking sensors relay positioning data to the computer, which responds by changing the view on the doctor’s screen as though the patient were lying there in front of him. Of the two dozen doctors who have tried it, says Littlefield, not one has taken more than five minutes to get comfortable enough to make a diagnosis. “The old system took me hours to learn,” he says, “and I’m supposed to be an expert.”
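The remote-viewing idea--probe motion in the hospital selecting a view of data scanned in the field--can be sketched in a few lines of Python. The nearest-angle lookup and all the names here are our own simplification, not Littlefield’s actual software:

```python
# Toy model: the field unit has already captured a 3-D data set as a stack
# of 2-D slices, each tagged with the angle at which it was scanned. The
# doctor's tracked mock probe reports its tilt, and the display shows the
# stored slice whose scan angle is closest, so the picture follows the
# probe's motion.

def select_slice(slices, probe_tilt_deg):
    """slices: list of (scan_angle_deg, image) pairs from the field scan."""
    return min(slices, key=lambda s: abs(s[0] - probe_tilt_deg))[1]

# Three slices scanned at -30, 0, and +30 degrees.
volume = [(-30, "slice_A"), (0, "slice_B"), (30, "slice_C")]
print(select_slice(volume, 12))   # tilt of 12 degrees is nearest 0: slice_B
```

A real system would interpolate a full 2-D image plane from the 3-D volume using the probe’s position and orientation; the lookup above only illustrates the mapping from probe pose to displayed view.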
Littlefield had a prototype tested last August, and now he is readying a final unit to submit to the U.S. Food and Drug Administration for approval. Although the military will probably be the first to use the units, on the battlefield, rural health care is likely to benefit as costs come down in a few years. “I’ve already started getting calls from doctors in Alaskan bush communities and from Native American reservations,” says Littlefield.
MIT and IBM’s Personal Area Network
INNOVATOR: Tom Zimmerman
It started out as a magic trick. In 1994 the performers Penn and Teller were collaborating with MIT’s Media Lab on technology for their act: they needed a way for Penn to play a set of invisible drums by waving his hands in the air. Graduate student Tom Zimmerman came up with the novel idea of setting up a quasi-static electric field that radiated outward from Penn’s body--making him, in a manner of speaking, a human transmitter. A receiver picked up his motions and used them to control a music synthesizer.
Afterward, Zimmerman and some fellow students were complaining about pagers, cellular telephones, and other so-called conveniences that often require lengthy strings of numbers to be punched in. Wouldn’t it be great, they said, if your pager could automatically transmit the telephone number on its screen to your cell phone without your having to punch it in? Zimmerman thought his static field would fit the bill. He rigged up a rough prototype, called a Personal Area Network, for two people, so that when they shook hands, electronic devices strapped to their bodies exchanged a few bytes of information--about as much as you’d find on a typical business card. Last year, by then a researcher at IBM’s Almaden Research Center in San Jose, California, he added some security features and made the device small enough to fit in a pocket. “I want to replace all the cards you have in your wallet with this one device,” he says.
Unlike the free-flowing electrons of an electric current, the static electric field created by one of Zimmerman’s devices passes right through clothing and other insulators: a tiny transmitter’s electric charge exerts a force on the ions in your body (such as the salt in your sweat), which in turn push against other nearby ions, and so on, until the field envelops your entire body as well as any electronic device you’re holding in your hand. To relay information, the transmitter merely turns the field on and off in code. The catch is that this switching must take place slowly, or it will send out radio waves that interfere with other devices. For this reason Zimmerman’s tool transmits only 300 bits of data a second, about the speed of an obsolete computer terminal. That’s not a problem, he insists, because all it needs to do is identify the person wearing it. “We’re usually touching a machine we’re working with, and the first thing we do is tell it who we are,” says Zimmerman. “You don’t need a lot of data to do that. With only 32 bits, you can identify 4 billion people.”
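Zimmerman’s arithmetic is easy to check in code. The sketch below is our own illustration of the idea, not his actual protocol: it encodes a 32-bit identity as the on/off pattern the field would carry.

```python
# Hypothetical on/off encoding of a 32-bit ID, most significant bit first.
# Each 1 means "field on" and each 0 "field off" for one bit period; at
# 300 bits per second, sending all 32 bits takes about a tenth of a second.

def id_to_bits(user_id):
    assert 0 <= user_id < 2**32       # 2**32 = 4,294,967,296 distinct IDs
    return [(user_id >> i) & 1 for i in range(31, -1, -1)]

def bits_to_id(bits):
    value = 0
    for bit in bits:
        value = (value << 1) | bit
    return value

bits = id_to_bits(123456789)
print(len(bits))                      # 32
print(bits_to_id(bits))               # 123456789
print(round(32 / 300, 3))             # 0.107 seconds per ID at 300 bps
```

The round-trip shows why 300 bits a second is plenty for identification: one ID fits in a tenth of a second, and 32 bits already distinguish over four billion people.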
Me and It, Side by Side
Northwestern University’s Cobots
INNOVATORS: J. Edward Colgate and Michael Peshkin
Auto-assembly workers must often install heavy, unwieldy parts smoothly and precisely, and it’s never been an easy task. Conventional robots aren’t much help--they make lousy teammates and are dangerous for humans to work alongside. So when Northwestern University mechanical engineers J. Edward Colgate and Michael Peshkin were asked by officials at General Motors to devise a better assembly tool, they quickly focused on the issue of guidance. The solution they came up with was virtual walls. A conventional robot would hold the bulky part, but the robot would be programmed to keep the part within a certain area bounded by the invisible walls. A worker putting in a door, seat, or windshield would use the virtual walls as invisible guide rails to maneuver the part into the correct spot, much as a ruler helps you draw a straighter line.
Colgate and Peshkin quickly ran into a couple of problems. “Robots aren’t good at making the walls feel smooth,” Colgate says. “There is a jerkiness or bumpiness to a robot’s motion.” More important, they still felt uneasy about allowing powerful motor-and-gear-driven robots to hoist heavy parts close to human workers. After pondering their dilemma, Colgate and Peshkin came up with a rather unconventional robot, designed specifically to work side by side with humans.
Their cobots--short for collaborative robots--have no strong motors or gears and won’t move unless pushed. Their only moving parts, in fact, are the wheels (the same ones used on Rollerblades) and the motors that steer those wheels along the virtual walls. Inside the walls created by the cobot’s computer, the wheels move smoothly and freely, like the casters of an office chair. At a wall boundary, however, the computer takes over, turning the wheels parallel to the wall and allowing movement only along the periphery.
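The wall behavior can be captured in a tiny sketch. Assume a single straight virtual wall along y = 0 with the allowed workspace at y > 0; the function and its names are our own illustration, not Northwestern’s control code:

```python
# Inside the walls the wheels roll freely with the worker's push; at the
# boundary the computer steers them parallel to the wall, so the push's
# into-wall component is discarded and only motion along (or away from)
# the wall remains.

def allowed_motion(pos, push):
    """pos, push: (x, y) tuples. Returns the motion the cobot permits."""
    x, y = pos
    push_x, push_y = push
    if y > 0:                        # inside the virtual walls: free motion
        return (push_x, push_y)
    return (push_x, max(push_y, 0))  # at the wall: no motion into it

print(allowed_motion((4, 5), (1, -2)))   # (1, -2): inside, so motion is free
print(allowed_motion((4, 0), (1, -2)))   # (1, 0): slides along the wall
```

This is why the cobot needs no strong motors: it never generates force of its own, it only refuses to roll in forbidden directions.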
The Northwestern team has developed a variety of cobots, including a simple one-wheeled machine, first displayed publicly at a robotics conference in April 1996. Since then the engineers have built three-wheeled cobots to help GM assembly-line workers install automobile doors. These cobots, fitted with gripper arms to hold the doors, will be tested this fall.
Stanford University’s Total Access System
INNOVATOR: Neil Scott
It’s easy to see how computers can be so helpful to the disabled: among other things, they can synthesize speech, open doors, and answer phones. Unfortunately, not everyone has the dexterity required to use a mouse or keyboard. And devices designed to get around these problems--those that track the movement of the eyeball or head, or that understand speech--usually must be customized to the individual and to the particular computer, which makes them expensive. “I knew there had to be a better way,” says Neil Scott. “I felt these sorts of aids should become a part of everyday life.”
Scott, an engineer at Stanford’s Center for the Study of Language and Information, and his colleagues have developed what they call a Total Access System, which acts as a universal interpreter between the computer and the person using it. One component attaches to an input device, such as a head tracker or speech recognizer, and converts the device’s electronic signals into the interface’s own standard signal. Another attaches to the computer and converts the standard signal into signals the computer can understand.
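The two-sided conversion amounts to an adapter on each end of a common format. A minimal Python sketch of that structure--the event format, names, and scaling are our invention, not Stanford’s design:

```python
# Device-side wrapper: raw head-tracker tilts become one standard event.
def head_tracker_to_standard(tilt_x, tilt_y):
    return {"action": "move", "dx": tilt_x * 10, "dy": tilt_y * 10}

# Computer-side wrapper: the standard event becomes whatever this
# particular machine expects -- here, an ordinary mouse-move call.
def standard_to_mouse(event):
    if event["action"] == "move":
        return ("mouse_move", event["dx"], event["dy"])
    raise ValueError("unsupported action")

# A head nod reaches the computer as plain mouse movement.
print(standard_to_mouse(head_tracker_to_standard(2, -1)))
```

The point of the middle layer is that swapping the head tracker for a speech recognizer changes only the first wrapper, and moving to a different computer changes only the second--neither side needs to know about the other.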
Scott thinks the dual interface can eventually be made inexpensive enough, and small enough, that a user could carry it in a pocket, approach any computer, and start using it right away--in most cases with greater efficiency than with a mouse or keyboard. And disabled people might not be the only beneficiaries: Scott believes the dual interface could be especially useful for preventing repetitive strain injury. “I get furious when I see people whose lives have been devastated by RSI and who have been out of work for years,” he says. “It doesn’t have to happen.”
University of California’s Human Power Extender
INNOVATOR: Homayoon Kazerooni
Homayoon Kazerooni has a robot that bears an uncanny resemblance to the one Sigourney Weaver uses in the climactic scene of the movie Aliens. In the movie, Weaver climbs into a gigantic mechanical robot whose arms and legs move precisely as she moves her own arms and legs, and proceeds to fight an alien monster on its own terms.
Kazerooni, a mechanical engineering professor at the University of California at Berkeley, is actually more interested in the movie’s earlier scenes, in which workers use the robot to lift heavy objects. For the last 15 years, he has worked on ways to extend the reach and amplify the strength of human workers. “I don’t believe in total automation,” Kazerooni explains. Even for jobs that involve a great deal of manual labor, he says, it makes more sense to keep a human mind in control by augmenting human muscles with mechanical ones. His human power extender--a two-armed, two-legged robot that looks like a 12-foot mechanical stick figure--is intended to shield workers in warehouses and assembly plants from occupational injuries.
Although the prototype robot was completed and tested in June 1996, it’s only a demonstration model. “The technology is very expensive and complicated,” Kazerooni explains, “particularly the legs. The legs are taking me longer than I had expected.” For this reason, Kazerooni hopes that a simpler device he has developed will be ready for commercialization sometime next year: a mechanical arm, suspended from a factory ceiling, that moves in concert with an electronically wired glove worn by a human operator. The glove not only sends signals to the mechanical arm, telling it how to move, but also receives feedback, which it transmits to its human wearer. When the mechanical arm picks up a 50-pound box, for instance, the operator feels as though he’s lifting a 2-pound box. In this way a worker can exert better control over the equipment than is possible with a joystick or a keyboard.
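The force scaling in that example is straightforward to write down. A back-of-envelope sketch--the fixed 2-in-50 fraction and the function are ours, purely for illustration:

```python
# The operator feels about 1/25 of the load (a 50-pound box feels like a
# 2-pound one); the machine supplies whatever force remains.

FEEL_FRACTION = 2 / 50              # pounds felt per pound lifted

def split_load(load_lb):
    """Return (force the operator feels, force the machine adds), in lb."""
    felt = load_lb * FEEL_FRACTION
    return felt, load_lb - felt

print(split_load(50))     # (2.0, 48.0)
print(split_load(100))    # (4.0, 96.0)
```

Because the operator still feels a scaled-down copy of the real load, he senses when the box snags or tips--the feedback that a joystick or keyboard cannot provide.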