
1998 Discover Technology Awards: Sight

Jul 1, 1998


Conventional TV and computer displays huff and puff to form an image--usually on a screen or monitor--and then your eyes take it in. Tom Furness has chosen a slightly more aggressive strategy. Why not skip the peripheral dribbling and passing, he wondered, and slam-dunk the imaging signal directly into your eyeballs with lasers?

"For 23 years," says Furness, "I worked at Wright-Patterson Air Force Base on virtual cockpits for fighter pilots." Since pilots need to process visual information quickly, projecting instrument readings, maps, and so forth onto the plane's windshield seemed like a good idea. This wasn't easy to do because competing with the bright views blasting directly into the pilots' eyes required a very high luminance. "It was clear to me," Furness says, "that we had to have some kind of paradigm shift." After moving to the University of Washington in 1989, Furness teamed up with engineer Joel Kollin and tried using tiny, harmless lasers--about as bright as daylight--to scan virtual images, line by line, directly onto the retina. "A lot of people said it wouldn't work," he recalls. They didn't think that the illusion of whole images would persist. But it did persist.

The real challenge was packaging the technology--how to make the lasers, and the mechanical scanners that point them, small and precise. In 1993 the university received funding from Microvision, a Seattle corporation, in return for a license to develop the technology. The latest hardware package is the size of a thimble. At its heart is a fast, minuscule oscillating mirror that reflects red, green, and blue laser beams into the eyes. Clipped onto glasses, the device gives you virtual color images that seem to hover at arm's length. By adjusting the brightness, you can make this display look like a transparent overlay on reality, or you can make it obliterate the boring real world. Last November, Microvision delivered retinal displays to the Air Force and two aerospace companies, and it has a contract to make a helmet-mounted version for helicopter pilots. Similar hands-free devices could show, say, a patient's ghostly, transparent X-rays to operating surgeons, or float diagrams from a repair manual in front of greasy-handed mechanics, or lay X-marks-the-spot virtual maps of a minefield in front of soldiers' footsteps. For now, at a cost of $400,000 and more, these displays are beyond the grasp of ordinary consumers, but Microvision's zealous engineers are working on ways to mass-produce them.
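
To make the scanning idea concrete, here is a minimal sketch in Python of how a raster-scanned display could turn an ordinary frame of pixels into a stream of mirror angles and laser intensities. The frame size, sweep angles, and back-and-forth sweep are illustrative assumptions, not Microvision's actual specifications.

```python
# A minimal sketch (not Microvision's firmware) of raster-scanning a frame:
# an oscillating mirror sweeps the beam across each line while the laser's
# intensity is modulated pixel by pixel, so the eye perceives a whole image.

WIDTH, HEIGHT = 640, 480            # assumed frame size
H_SWEEP_DEG, V_SWEEP_DEG = 30, 20   # assumed total mirror sweep angles

def scan_frame(frame):
    """Yield (h_angle, v_angle, (r, g, b)) beam commands for one frame.

    `frame` is a HEIGHT x WIDTH grid of (r, g, b) tuples in 0..255.
    """
    for row, line in enumerate(frame):
        v_angle = (row / (HEIGHT - 1) - 0.5) * V_SWEEP_DEG
        # Sweep left-to-right on even lines and right-to-left on odd lines,
        # mimicking a mirror that oscillates instead of snapping back.
        cols = range(WIDTH) if row % 2 == 0 else range(WIDTH - 1, -1, -1)
        for col in cols:
            h_angle = (col / (WIDTH - 1) - 0.5) * H_SWEEP_DEG
            yield h_angle, v_angle, line[col]   # set red, green, blue laser power here

# Example: a uniform mid-gray frame produces WIDTH * HEIGHT beam commands.
gray = [[(128, 128, 128)] * WIDTH for _ in range(HEIGHT)]
print(next(scan_frame(gray)))   # first beam position and color
```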


FINALISTS

Digital Duo Canon's Optura Digital Video Camcorder Innovator: A. Tajima

It's your toddler's birthday, and Grandma's presenting the cake. Do you pick up the camcorder to capture the action, or grab the 35 millimeter for high-quality stills? Whichever you choose, the other opportunity is gone forever. Now you can relax. Last year Canon introduced a new kind of camera, the Optura, that's good at both videos and stills. It happens to be digital, but that's no accident. Digital videotape, which holds an hour of video or 500 high-quality stills in a cassette the size of a matchbox, is what made this hybrid camera possible.

Although the idea seems obvious, it's not easy to grab crisp still images from a stream of digital video data. The problem lies in the technique of "interlacing," in which the camera scans every other line of the image and then fills in the alternate lines a split second later, in a second pass. Interlacing is good for getting smooth-looking video, but it yields lousy snapshots. So a group led by A. Tajima, chief executive of camera operations at Canon in Tokyo, developed instead a new chip that executes a faster, "progressive" scanning technique, capturing one full frame at a time. "The large-scale integrated circuit was the most challenging," Tajima says. "We reduced the function of five integrated circuits"--signal processing, image compression, expansion, error correction, and reproduction--"into one."
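
The difference between the two scanning schemes is easy to demonstrate. The toy Python sketch below (a made-up scene, not Canon's circuitry) weaves together two half-frames captured a moment apart, the way interlaced video does, and compares the result with a single full-frame capture; the moving edge shows the telltale combing that ruins interlaced stills.

```python
# A toy comparison (made-up scene, not Canon's circuitry) of interlaced versus
# progressive capture. The two interlaced half-frames are grabbed at different
# instants, so a moving edge lands in different places on even and odd lines.

def capture_scene(t):
    """Return a 6 x 12 'image' of a vertical bar that moves one column per step."""
    pos = 3 + t
    return [["#" if col == pos else "." for col in range(12)] for _ in range(6)]

def interlaced_still(t):
    """Weave even lines from time t with odd lines from time t + 1."""
    a, b = capture_scene(t), capture_scene(t + 1)
    return [a[r] if r % 2 == 0 else b[r] for r in range(6)]

def progressive_still(t):
    """Grab one full frame at a single instant."""
    return capture_scene(t)

print("interlaced (note the combing on the moving bar):")
print("\n".join("".join(row) for row in interlaced_still(0)))
print("progressive (clean edge):")
print("\n".join("".join(row) for row in progressive_still(0)))
```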

Tajima's product team then designed a remarkably small, two-pound camera to have the look and feel of a 35mm single-lens reflex. The Optura works, in fact, as if you've simply added video capability to a regular camera.


Eagle-eyed Night Vision Oak Ridge National Laboratory's Micromechanical Infrared Imager Innovator: Thomas Thundat

Have you ever hit a deer in the dark? Many collisions occur because the driver's vision is limited to the range of the headlights. Several years ago Thomas Thundat found a way to peer beyond them.

It started with a chance observation. In 1991 Thundat, a health researcher at Oak Ridge National Laboratory, was examining DNA molecules with an atomic-force microscope. Every so often, the instrument's scanning tip--a whiskerlike projection only one ten-thousandth of an inch thick--curled unpredictably. "It took me a month to figure out why," he says. Heat from a diode was bending the gold-coated tip, just as changes in temperature bend the bimetallic strip in a thermostat. Thundat ran some calculations. His microscopic whisker had so little mass that it responded to temperature changes as slight as a millionth of a degree. If a lens focused the infrared (heat) radiation of a lit match several miles away, the match might give off enough heat to bend such a whisker.
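
The claim is easy to sanity-check with the inverse-square law. The back-of-envelope Python calculation below uses assumed numbers throughout (the match's radiant power, the distance, and the lens size are guesses, not Thundat's figures), but it shows that a small lens would still collect on the order of a nanowatt from a match several miles off--a plausible load for a nearly massless whisker sensitive to millionths of a degree.

```python
# A back-of-envelope check of the lit-match claim. Every number here is an
# assumption for illustration, not Thundat's data: a match radiating ~50 W,
# a 10 cm collecting lens, and "several miles" taken as roughly 5 km.

import math

P_MATCH_W   = 50.0      # assumed radiant power of a burning match
DISTANCE_M  = 5_000.0   # assumed distance to the match
LENS_DIAM_M = 0.10      # assumed diameter of the collecting lens

lens_area = math.pi * (LENS_DIAM_M / 2) ** 2
# The match's power spreads over a sphere of radius DISTANCE_M;
# the lens intercepts its small share of that sphere's surface.
collected = P_MATCH_W * lens_area / (4 * math.pi * DISTANCE_M ** 2)

print(f"collected power ~ {collected:.2e} W")   # on the order of a nanowatt
# Focused onto a nearly massless cantilever that registers microkelvin changes,
# a heat input this small could plausibly produce a measurable bend.
```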

Since we are awash in infrared radiation, Thundat realized, an array of these whiskers, etched into a microchip, would act like a mechanical retina. As the whiskers bent, circuits on the chip would light up pixels on a monitor screen, and you'd be able to see in total darkness. They would be as sensitive as the infrared sensors used in astronomers' telescopes, but they wouldn't have to be cooled in liquid nitrogen.
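
The readout side of such a chip would be simple in principle: measure how far each whisker has bent and turn that into a pixel value. The sketch below is a minimal illustration with assumed deflection numbers, not Oak Ridge's design.

```python
# A minimal readout sketch (assumed numbers, not Oak Ridge's design): map the
# deflection of each whisker in a small array to an 8-bit grayscale pixel, so
# warmer spots in the scene show up as brighter pixels on the monitor.

def deflections_to_pixels(deflections_nm, full_scale_nm=10.0):
    """Convert a 2-D grid of cantilever deflections (nanometers) to 0..255 values."""
    frame = []
    for row in deflections_nm:
        frame.append([
            max(0, min(255, int(255 * d / full_scale_nm)))   # clamp to 8-bit range
            for d in row
        ])
    return frame

# Example: a 3 x 4 patch of whiskers with one strongly heated element.
readings = [
    [0.2, 0.3, 0.2, 0.1],
    [0.3, 8.5, 0.4, 0.2],   # the hot spot bends its whisker far more
    [0.1, 0.2, 0.3, 0.2],
]
for row in deflections_to_pixels(readings):
    print(row)
```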

So far Thundat and colleagues have managed to make infrared images by painstakingly moving the tip of an atomic-force microscope across the focal plane of a lens. Having proved the approach works, they are now looking for funding to build a microchip array of custom-made whiskers that would give them video-quality images. Eventually, Thundat says, this technology will make its way into diagnostic medical instruments, security-system cameras, or night scopes for cars.


The Look of Noise Scripps Institution of Oceanography's Acoustic Daylight Ocean Noise Imaging System (ADONIS) Innovator: Michael J. Buckingham

Oceans are dark because water transmits light poorly. But water conducts sound easily for miles. Wouldn't it be better, then, if you could "see" sound waves? That's the logic behind sonar: send out pings of sound, which are reflected by enemy submarines or fish, and convert the reflections to images on a monitor. The problem is that water conducts sound so well that oceans are teeming with noise--from thundering waves, pattering raindrops, singing whales, sputtering engines. "In the conventional view," says Michael Buckingham, a physicist at Scripps, "noise does you damage" by interfering with sonar signals. A decade ago, Buckingham had a flash of insight: perhaps the noise could be useful. Sending out a sonar signal is like turning on a headlight. You see only what the beam sweeps across. But meanwhile the ocean's ambient noise is bouncing off everything underwater, like the ambient daylight our eyes can interpret. Why resort to sonar when the ocean may be filled with "acoustic daylight," also known as noise? "It was a fairly radical idea," Buckingham says. Fellow researchers greeted it not so much with skepticism as with silence.

Buckingham responded by building ADONIS, the Acoustic Daylight Ocean Noise Imaging System. A dishlike antenna focuses ambient noise on an array of 126 hydrophones, and the signal at each hydrophone controls a corresponding glowing pixel on a computer screen. With the aid of filters and fancy algorithms, Buckingham tunes in to one specific frequency at a time. "The first time we had ADONIS in the water," he says, "it was quite an exciting moment." Right away, his team could make out vague shapes, such as sand-filled barrels placed up to 40 meters away as targets. "We were all stunned."
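
In outline, the display logic is straightforward: band-limit each hydrophone's signal around one frequency, measure its power, and brighten the matching pixel accordingly. The Python sketch below shows that idea with synthetic noise; the sampling rate, analysis frequency, and bandwidth are assumptions, and the real ADONIS processing is considerably more elaborate.

```python
# A rough sketch of the display logic (the real ADONIS filtering and calibration
# are more elaborate): band-limit each hydrophone's signal around one frequency,
# measure its in-band power, and brighten the matching pixel in proportion.
# The sampling rate, analysis frequency, and bandwidth are assumptions.

import numpy as np

FS = 200_000     # assumed sampling rate, Hz
FREQ = 25_000    # assumed analysis frequency, Hz
N_PHONES = 126   # hydrophones in the ADONIS array

def pixel_brightness(signals, fs=FS, freq=FREQ, bandwidth=2_000):
    """signals: (N_PHONES, n_samples) array. Returns a 0..1 brightness per phone."""
    spectra = np.fft.rfft(signals, axis=1)
    freqs = np.fft.rfftfreq(signals.shape[1], d=1 / fs)
    band = (freqs > freq - bandwidth / 2) & (freqs < freq + bandwidth / 2)
    power = np.sum(np.abs(spectra[:, band]) ** 2, axis=1)   # in-band acoustic power
    return power / power.max()                              # normalize for display

# Synthetic example: one hydrophone "hears" a louder reflection off a target.
rng = np.random.default_rng(0)
sig = rng.normal(size=(N_PHONES, 4096))
sig[60] *= 3.0                       # pretend phone 60 faces a sound-reflecting barrel
print(pixel_brightness(sig)[58:63])  # phone 60 stands out from its neighbors
```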

The images don't look that impressive, partly because there are so few pixels. Buckingham plans to improve resolution by adding more pixels, and he thinks he can boost the range of vision to a third of a mile. "As with the first television pictures back in the 1930s," he says, "the important point is that the images exist at all. Now we can develop techniques to improve the quality."


Sawed-off Monitor Westaim's Thick Dielectric Electroluminescent Flat Panel Display Innovator: Xingwei Wu

Wouldn't it be nice to unplug that hot, heavy, power-sucking fat hog of a monitor from your desktop computer, saw off the back, and use the screen alone? Liquid crystal displays give you that nice flat shape, but they are expensive, dim, have a narrow viewing angle, and respond sluggishly, which makes them lousy for video. Less widely known electroluminescent displays seem bright and quick enough to muscle out the fatties, if only there were a way to manufacture them easily, give them good color reproduction, and bring down the price. Over the past seven years, Xingwei Wu's research and development group at Westaim Corporation, of Fort Saskatchewan, Canada, has demolished the first two barriers and made a large dent in the third.

In an EL display, the inside of a transparent screen is coated with layers of phosphors and dielectrics, materials that don't conduct electricity but can become polarized in the presence of an electric field. An alternating voltage applied to this sandwich forces electrons to move, first one way and then the other, making the phosphors glow. To avoid using a high voltage, which breaks down the dielectrics, these layers are made so thin--only a few microns, about a ten-thousandth of an inch--that manufacturers have to use a fussy, expensive technique called vacuum vapor deposition to make them. After Wu arrived at Westaim in 1990, much research led him to substitute a different dielectric that could withstand a higher voltage, allowing him to use layers ten times thicker and less expensive to manufacture. His idea was to build each layer merely by slapping on a thick slurry of paste all at once. "Of course, it wasn't as easy as I thought," he says. After much experimentation, he hit upon a simple procedure similar to silk-screening T-shirts. A rubber squeegee forces the pasty material through a fine metal mesh onto the display screen, where it dries to a uniform thickness. "Those so-called low-tech processes had been forgotten in high-tech industries," Wu says.
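
Why thicker layers demand (and the new dielectric must withstand) a higher drive voltage follows from treating the stack as capacitors in series: the applied voltage divides across the layers roughly in proportion to thickness divided by permittivity. The Python estimate below illustrates that trade-off with made-up thicknesses, permittivities, and drive voltage, not Westaim's actual materials.

```python
# An illustrative series-capacitor estimate (made-up thicknesses, permittivities,
# and drive voltage, not Westaim's materials): the applied voltage divides across
# the layers roughly in proportion to thickness / permittivity, so a much thicker
# dielectric takes a larger share and the panel must be driven at a higher voltage.

def voltage_split(v_applied, layers):
    """layers: list of (name, thickness_in_microns, relative_permittivity)."""
    weights = [d / eps for _, d, eps in layers]
    total = sum(weights)
    return {name: round(v_applied * w / total, 1)
            for (name, _, _), w in zip(layers, weights)}

thin  = [("phosphor", 1.0, 10), ("dielectric", 0.3, 25)]   # assumed thin-film stack
thick = [("phosphor", 1.0, 10), ("dielectric", 3.0, 25)]   # dielectric ~10x thicker

print(voltage_split(170, thin))    # most of the drive voltage reaches the phosphor
print(voltage_split(170, thick))   # the thick dielectric now takes the larger share,
                                   # so a higher, breakdown-tolerant drive is needed
```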

Thin-film displays can be ruined by a speck of dust, but Wu was able to make his first EL display without a clean room at all, "which is incredible to anyone in the industry," says applications engineer Don Carkner. Wu is now putting the finishing touches on a stable blue phosphor to complement the reds and greens already available and to achieve, at last, full color. Since TDEL displays still cost a bit more than conventional displays (each pixel in the display needs its own electronic driver), you can expect thick-film EL displays to appear more often on industrial instruments, in military applications that need mobile, rugged, bright displays, and on medical displays where you don't want limits to the angle of viewing.
