Engineers have spent the better part of fifty years shrinking more and more electronic circuits onto tiny specks called chips. Judging from the proliferation of gadgets and personal computers, they’ve done a pretty good job of it. But they all know that the party can’t last, that sooner or later they’re going to reach some kind of limit—though exactly what that will be, nobody really knows. This year, however, two advances promise to postpone the day of reckoning.
First, researchers made an improvement in how they interconnect the basic component of most electronic circuits, the transistor. When it was invented at Bell Laboratories back in 1947, it was a reasonably simple device: a pea-size sandwich of several different materials with three wires sticking out of it. As engineers began putting more than one transistor on a single sandwich, or chip, they tried to connect the transistors together right on the chip with tiny wires made of copper. Copper is an excellent conductor of electricity, but copper atoms have the nasty habit of diffusing into adjacent layers of silicon dioxide, rendering it useless as an insulator and corrupting the chip. So engineers used aluminum instead, a so-so conductor with well-behaved atoms. By the mid-1990s, however, with engineers cramming a hundred million transistors into an area the size of a contact lens, that decision was proving to be an intolerable nuisance. Whereas the average transistor has shrunk to a length of a couple of millionths of an inch or thereabouts, the aluminum wiring has to be ten times thicker to carry enough electrical current.
Last September, researchers at IBM’s Thomas J. Watson Research Center in Yorktown Heights, N.Y., put the finishing touches on a new chip-making technique that lets them use copper after all. First they make a layer of the chip with millions of copper wires on it. Then, before applying the insulation between the wires, they lay down an ultra-thin coating of some kind of metal—IBM isn’t saying which—that keeps the copper atoms in place. Since the coating is a reasonably good conductor (albeit not as good as copper), engineers can get almost as much current down the wires as if they were pure copper. IBM got another benefit into the bargain: the technique allows them to use copper, rather than tungsten, for so-called vertical wiring, which interconnects the different wiring layers on a chip. That should simplify manufacturing and drive down the cost of chips. “We’re not just changing one thing here,” says Tom Theis, a physicist and leader of the IBM team. “We’re changing everything about the way the industry makes interconnections in chips—the materials, the process—and we’ve had to figure out all the tricks to make all this work together.” IBM expects to roll out its first chips made with copper wiring this year, but eventually, says Theis, all chips will probably be made with copper wiring.
In the second announcement, also in September, Intel Corp. engineers came up with a new method that cuts in half the number of transistors it takes to store a bit—one unit of binary memory. Ordinarily a bit is represented as an electrical charge in a single transistor: if the charge is absent, the bit is a zero; if it is present, the bit is a one. Intel devised a way of also measuring a one-third charge and a two-thirds charge—four distinguishable levels in all—which gives them, in effect, two bits per transistor instead of one. The advance applies only to so-called flash memories—the kind in your cell phone that can remember your caller ID numbers even after the power is turned off—but its impact should be widely felt in the next year or two in gadgets of all sorts, as bigger and cheaper flash memories make their way into digital cameras, answering machines, and cell phones.
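The idea behind two-bits-per-transistor storage can be illustrated with a toy sketch. This is not Intel's actual circuitry, just an illustration of the arithmetic: a cell holds one of four nominal charge levels (zero, one-third, two-thirds, full), and reading a cell means picking the nearest level, which yields a two-bit value. All names here are hypothetical.

```python
# Toy illustration (not Intel's design): a multi-level flash cell holds
# one of four nominal charge fractions, each encoding a two-bit value.
LEVELS = [0.0, 1 / 3, 2 / 3, 1.0]  # nominal charge for values 0, 1, 2, 3

def encode(value: int) -> float:
    """Map a two-bit value (0-3) to the nominal charge stored in the cell."""
    return LEVELS[value]

def decode(charge: float) -> int:
    """Recover the two-bit value by choosing the nearest nominal level."""
    return min(range(4), key=lambda i: abs(LEVELS[i] - charge))

# Even a slightly noisy read of a two-thirds charge decodes correctly:
assert decode(0.70) == 2
```

The point of the four-level scheme is exactly what the article describes: the same single transistor that used to distinguish only "charge" from "no charge" now distinguishes four states, doubling the bits per cell at the cost of needing more precise charge sensing.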
But the real news is—nothing much will change. “People will notice that things are continuing on the present course,” says Theis, “which is very rapid progress.”