Atomic-scale computers that exploit the bizarre rules of quantum physics have the potential to process enormous quantities of data far more quickly than today’s devices. In June, researchers at Yale University announced progress toward this goal, creating the first quantum processor that is built into a conventional silicon chip.
Quantum computers process information using bits that behave like individual atoms, so delicate that even the slightest disturbance can ruin a calculation. Previous experiments required complicated lasers or magnets to keep the system stable, but the Yale team built its processor directly into a chip, much like conventional electronics. The key difference is that quantum bits can take on fuzzy values: not just 1 or 0 but, in a sense, both at once. That fuzziness allowed the device to solve, in a single calculation, a math problem that would take an ordinary computer as many as four steps.
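The article does not spell out the problem the chip solved, but the "one calculation versus as many as four steps" comparison matches a textbook two-qubit search, which can be simulated classically in a few lines. The sketch below, with an arbitrarily chosen "hidden" answer, is only an illustration of that idea, not the Yale team's experiment or code.

```python
import numpy as np

# Illustrative sketch: a classical simulation of a two-qubit quantum search.
# An ordinary computer checking four possible answers one by one could need
# up to four tries; the quantum version finds the answer after one query.

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate on one qubit
H2 = np.kron(H, H)                             # Hadamard on both qubits

marked = 2   # hypothetical "hidden" answer, one of the four values 0..3

# Start in |00>, then spread the two qubits over all four values at once.
state = np.zeros(4)
state[0] = 1.0
state = H2 @ state

# One quantum query: flip the sign of the hidden value's amplitude.
oracle = np.eye(4)
oracle[marked, marked] = -1
state = oracle @ state

# Inversion about the mean amplifies the hidden value's amplitude.
diffusion = 2 * np.full((4, 4), 0.25) - np.eye(4)
state = diffusion @ state

probabilities = state ** 2
print(probabilities)   # the hidden value now has probability ~1 after ONE query
```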
While the Yale research focuses on hardware, a team from MIT and the University of Bristol in England is working out better ways to put quantum computers to use. In October the group described a new algorithm that could rapidly solve the huge systems of linear equations at the heart of many important tasks, including image processing and gene analysis.
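For orientation only: "solving linear equations" here means problems of the form A x = b. The tiny example below uses a standard classical solver on a 3-by-3 system; the quantum algorithm's promised advantage concerns vastly larger systems of this same form, which a short snippet cannot meaningfully demonstrate.

```python
import numpy as np

# A toy instance of the kind of problem meant by "linear equations":
# find the vector x such that A @ x = b.
A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
b = np.array([1.0, 2.0, 3.0])

x = np.linalg.solve(A, b)          # classical solution of the system
print(x)
print(np.allclose(A @ x, b))       # True: the solution satisfies A x = b
```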
Turning the Yale experiment into a useful computer will require adding many more quantum bits and managing how those bits interact. “It just seems so difficult to make a large-scale quantum computer,” says Steven Girvin, a Yale physicist who coauthored the study. “But five years ago I never thought we’d be where we are now.”