Future Tech

Soon you'll be able to reach out and touch someone on the internet

By Brad Lemley, Aug 1, 2000

I'm beating a monkey's face to a pulp. But don't run off to alert PETA— it's really not the way it sounds. The abused ape isn't real. His face is a digital simulacrum, floating on a high-resolution screen, created to help me test a weird computer peripheral called the PHANTOM. It resembles a desk lamp, minus the bulb and shade, and interacts with the computer-generated image. Wherever I move the business end of this gadget, the cursor on the screen moves in sync, denting the virtual simian's cheek, flattening an eyebrow, or crumpling an ear.

In short, the PHANTOM is a computer mouse that has entered into another dimension. A conventional mouse moves a cursor across the screen in many directions— up, down, diagonal, around— but on only one plane. This new tool moves not only along the x and y axes but also along the in-and-out z axis. With it, I can smash in the monkey's nose, which is pointed right at me, by pushing my hand forward; the ball-shaped cursor convincingly shrinks as it glides away. Even more amazing, whenever the ball smacks the monkey's face, I feel the resistance in my hand as a stiff, claylike friction. In less than a minute, I've become so proficient that I can repair the fellow— smooth lumps, fill dents, and round off broken teeth. The manufacturer contends a 5-year-old can master this peripheral. I believe it.
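(For the technically curious: the claylike resistance described above is typically produced by what haptics researchers call penalty-based, or spring-model, force rendering. The Python sketch below illustrates that general idea only; the stiffness value, function names, and roughly 1 kHz loop rate are assumptions for illustration, not details of SensAble's actual software.)

```python
# Illustrative sketch of penalty-based (spring-model) force rendering.
# When the stylus tip penetrates a virtual surface, the device pushes back
# with a force proportional to the penetration depth, which the hand feels
# as stiff, claylike resistance. All names and values here are assumptions.

import numpy as np

STIFFNESS = 800.0   # N/m; larger values feel harder, smaller values feel mushier
SURFACE_Z = 0.0     # the virtual surface sits at z = 0 in device coordinates


def contact_force(tip_position: np.ndarray) -> np.ndarray:
    """Force (in newtons) to command on the device for a given tip position."""
    penetration = SURFACE_Z - tip_position[2]   # how far the tip is inside the surface
    if penetration <= 0.0:
        return np.zeros(3)                      # free space: no resistance at all
    # A virtual spring pushes the tip back out along +z, harder the deeper it presses.
    return np.array([0.0, 0.0, STIFFNESS * penetration])


def haptic_loop(read_position, send_force):
    """Servo loop; haptic devices typically run this near 1 kHz so contact feels crisp."""
    while True:
        tip = read_position()            # x, y, z of the stylus tip, in metres
        send_force(contact_force(tip))   # the motors render the computed resistance
```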

The PHANTOM was designed and built by SensAble Technologies, a seven-year-old company with 50 employees housed in a nondescript brick building in Woburn, Massachusetts. Coupled with a software program called FreeForm, it may herald an important change in the way human beings and computers interact. "The last time the computer-human interface had a substantial improvement was the mouse, which came out of Xerox about 25 years ago," says SensAble marketing director Andrew Hally. "This is the next move up."

Which sounds like the kind of overheated claim marketing directors get paid to make. Sure, it's a snap to use, but do computer users really long to smash, stroke, poke, or otherwise manipulate and feel the objects on their screens? After spending a few hours with the PHANTOM, I started to get a feeling of déjà vu. In the early days of personal computing, some users huffily insisted that arrow keys were more than enough, so who needed that goofy mouse? The mouse evolved from toy to essential tool because it inspired designers to dream up applications for it. An interface based on haptics— the more technically correct term for interactions involving both feeling and position— will likely follow the same path. "There are a lot of problems that you can solve better if you can use your sense of touch in 3-D space," says Hally.

The first widespread commercial application of haptic tools arrived as part of computer-aided design, or CAD, of complex forms. My monkey face originated as a concept model to show toy designers how the PHANTOM could help them with their CAD work. Most of the 600 PHANTOMs sold to date are used by sculptors and engineers for companies such as Adidas, Disney, Hasbro, LEGO, and Honda, and users are ebullient. A mouse-driven CAD program works adequately for designing squarish buildings, but try modeling a titanium hip replacement, a snazzy new sneaker, or an action figure of Jar Jar Binks using only lines, grids, and circles. "To a sculptor, it really is a revolutionary development," says David R. Fish, who crafts clay footwear models for Nike and is in the process of switching to PHANTOM-based prototyping. "You're not restricted by gravity or scale, and it's totally intuitive. I was doing useful work with it within one day."

Not only does the system emulate physical sculpting, it allows techniques that are impossible in real clay, such as pushing out from the inside. If an artist wants to sculpt a nose, she can click the cursor "off," push it inside the digital clay, click it "on," and then pull material toward her. Coupled with 3-D printers, which can manufacture plastic items from a computer-specified form, objects sculpted in cyberspace can become real at the touch of a button.

The $15,000 PHANTOM/FreeForm package is among the first haptic applications on the market, but more ambitious systems are already under development. At the Medical College of Georgia and the Georgia Institute of Technology, doctors and engineers have created an eye-surgery simulator for training students. The digital scalpel feeds back springy resistance as it touches the white of the digital eye; then resistance lessens as the cyber-blade slices through. Physical-skill training of all kinds is "a huge application," says leading haptics researcher J. Kenneth Salisbury, who, while a professor in MIT's mechanical engineering department, helped to develop the PHANTOM with SensAble CEO Thomas Massie. Video game applications are right around the corner. A San Jose, California, company called Immersion is leading the way with its Wingman Force Feedback Mouse. For $99, company literature asserts, gamers can "feel terrain, explosions, the effects of magic, and the impact of combat."
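(Again for the curious: the springy-then-gives-way feel of the simulated scalpel can be approximated with a simple piecewise model in which resistance grows with indentation until a puncture threshold is crossed, then falls off. The sketch below is a hypothetical illustration; the stiffness and threshold values are invented, not taken from the Georgia simulator.)

```python
# Hypothetical piecewise model of the scalpel's force feedback: springy
# resistance while the tissue holds, then a sharp drop once the blade breaks
# through. Constants are invented for illustration only.

TISSUE_STIFFNESS = 600.0    # N/m while the membrane resists the blade
PUNCTURE_DEPTH = 0.003      # metres of indentation at which the tissue gives way
POST_CUT_STIFFNESS = 60.0   # N/m of residual drag once the blade is through


def scalpel_force(indentation: float, punctured: bool) -> tuple[float, bool]:
    """Return (resisting force in newtons, updated punctured flag)."""
    if indentation <= 0.0:
        return 0.0, punctured                              # not touching the eye
    if not punctured and indentation < PUNCTURE_DEPTH:
        return TISSUE_STIFFNESS * indentation, False       # springy resistance builds
    # Past the threshold the membrane is cut; resistance falls off sharply.
    return POST_CUT_STIFFNESS * indentation, True
```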

The haptics revolution is bolstering telerobotics, in which a remote user, watching via video cameras, directs robotic arms engaged in a task too dangerous, precise, or inaccessible for naked flesh. Telerobotics works a lot better with tactile feedback indicating what the arms are doing. Applications for such interactive devices include fighting fires, exploring undersea or on other planets, and handling hazardous materials, but the most promising home may be surgery. Already used for heart operations in the United States and Germany, million-dollar machines allow a surgeon to operate through half-inch-wide incisions by manipulating thin robotic arms. The minimally invasive procedure speeds patients' recovery and someday may extend surgeons' careers by digitally canceling age-related hand tremors. "Eventually such techniques will be applicable in neurosurgery or fetal surgery," says Salisbury, now a professor in both the departments of computer science and surgery at Stanford University.

In the long run, virtually every action that involves real-world touch could be simulated or enhanced via haptic interfaces— and some human activities that have never been touchy-feely could become so. "I've often wondered if you could teach physics more effectively if your students could feel molecular attraction or planetary motion," says Salisbury. Or haptic objects might become part of your daily life, he says: "You could go shopping, feel the weight and texture of what you want to buy, examine it from all sides, turn it on, set it down in your virtual kitchen to see how it looks." Such objects might be analogues of real-world items, but they could just as easily be digital entities, created, traded, modified, passed down to succeeding generations, and ultimately discarded only in cyberspace.

While interacting with a tangible world made only of bytes may seem isolating, Salisbury believes the opposite could prove true. Haptic interfaces linked through the Internet could provide new ways for people to gather. "You could have collaborative physical efforts: shaking hands, playing volleyball, pushing the stones up the pyramid ramp," he says. He even imagines scientists performing cooperative haptic molecular engineering. "Imagine a bunch of people with haptic interfaces putting molecules together. They push and shove, and finally you hear somebody say, 'I think it will go!'"

Someday, we may view haptic interfaces as just one more step we took toward the computer-immersed human. The evolution began with the first computers, as punch cards and then screens engaged the sense of sight. Later, computers gained sound, then touch. Today companies such as Digiscents, TriSenx, and AromaJet are laboring to bring smell and even taste to the digital realm. The omega point is the day when the virtual world becomes as real to humans as the outside, three-dimensional world, with one important distinction. For good or ill, the virtual world's only limitations will be the ones we choose. "It's kind of an obvious play on words," Salisbury says, "but we've really only scratched the surface."

SensAble Technologies' Web site: www.sensable.com.

For information about Kenneth Salisbury's work with haptics, see www.ai.mit.edu/projects/handarm-haptics/haptics.html and mdn.stanford.edu/events/salisbury.html.

Immersion's Web site offers more info about their interactive mouse: www.immersion.com.
