Over the past six months, we Earthlings have seen some pretty awe-inspiring images through the James Webb Space Telescope (JWST). Since the telescope's first image was revealed to the public in July 2022, JWST has captured images of ancient galaxies, glittering nebulas and remote exoplanets.
It’s clear these pictures aren’t the work of your average point-and-shoot camera — each one is the result of an impressive array of instruments and technologies, finely tuned to bring us cosmic views so dazzling they could be mistaken for computer-generated graphics.
How the James Webb Space Telescope Works
It all begins when light from a distant object strikes the telescope’s 21-foot-wide, gold-plated mirror, which is composed of 18 hexagonal segments. Dividing the mirror this way made it easier for NASA scientists to launch JWST into orbit, but the segments needed to be calibrated with astounding precision to act as one giant mirror and maintain sharp focus.
That calibration is achieved through “wavefront sensing,” a realignment process engineers must repeat every couple of weeks to ensure the segments don’t drift out of position.
In the words of Lee Feinberg, optical telescope element manager at the NASA Goddard Space Flight Center, “each mirror is aligned to 1/10,000th the thickness of a human hair.”
Unlike its famous forerunner, the Hubble Space Telescope, Webb doesn’t see the same light we do. As an infrared telescope, it sees wavelengths longer than those of visible light, which makes it well suited for observing distant galaxies. That's because the universe is constantly expanding, and as galaxies move away from us, their light grows redder, eventually becoming invisible to human eyes — and to Hubble. But that blind spot is right where JWST thrives.
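The redshift described above follows simple arithmetic: expansion stretches an emitted wavelength by a factor of (1 + z), where z is the redshift. A quick illustrative calculation (not NASA code; the specific numbers are just an example) shows why a distant galaxy slips out of Hubble's view and into Webb's:

```python
def observed_wavelength_nm(emitted_nm, z):
    """Wavelength an observer measures for light emitted at
    `emitted_nm` nanometers from a source at redshift z."""
    return emitted_nm * (1 + z)

# Green light (~500 nm) from a galaxy at redshift z = 10 arrives
# stretched to 5,500 nm (5.5 microns) -- far beyond the visible band
# (~380-750 nm), but well within JWST's infrared range.
shifted = observed_wavelength_nm(500, 10)
print(f"{shifted / 1000:.1f} microns")  # -> 5.5 microns
```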
The Hubble Deep Fields, a set of exposures taken over the past two decades, peered farther into the cosmos than any telescope had before, revealing the stunning abundance of deep space.
However, says Marcia Rieke, a professor of astronomy at the University of Arizona, “it showed that there were lots of galaxies far away, and that Hubble could not see the most distant ones.” Webb, by contrast, has already produced its own deep field image, capturing even fainter galaxies than its predecessor.
JWST Detects a Range of Wavelengths
Rieke is also a member of the NASA team that developed JWST. Specifically, she is the principal investigator for the Near-Infrared Camera, or NIRCam, one of four scientific instruments aboard the telescope. It works alongside another camera, the Mid-Infrared Instrument, or MIRI. Together, the two can detect a wide range of wavelengths, allowing astronomers to view everything from newborn stars to comets and protoplanetary discs.
Each camera has a set of filters (29 for NIRCam and 10 for MIRI) tailored to specific sections of the infrared spectrum. Most images are composites of multiple filters. They also typically involve multiple exposures — in a process called dithering, the telescope’s pointing is shifted slightly between exposures, to cancel out the corrupting effect of cosmic ray strikes and other issues.
That way, Rieke says, “if there’s a bad pixel, it gets filled in with information from good pixels.” The separate images can then be aligned and combined into one crisp shot.
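The fill-in step Rieke describes can be sketched in a few lines. This is a toy model, not the real JWST pipeline: it skips the re-alignment of dithered frames and simply plants one "cosmic ray" on a different pixel in each exposure, then takes a per-pixel median so the good values win out.

```python
from statistics import median

true_sky = [100.0] * 8                 # one row of idealized sky pixels

exposures = []
for i in range(5):
    frame = true_sky.copy()
    frame[i] += 5000.0                 # a "cosmic ray" corrupts a
    exposures.append(frame)            # different pixel each exposure

# Combine: for each pixel, take the median across the five exposures.
# The one bad value is outvoted by the four good ones.
combined = [median(frame[p] for frame in exposures) for p in range(8)]
print(combined)  # -> [100.0, 100.0, 100.0, 100.0, 100.0, 100.0, 100.0, 100.0]
```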
The telescope has a fixed field of view, so to portray a large swath of space it must image one small patch at a time. For one broad survey of the sky, Rieke says, she’ll be using nine filters, each for nine exposures, across dozens of frames. That’s 81 exposures for a single frame, and roughly 7,000 for the entire composite. Again, not your average point-and-shoot.
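The bookkeeping behind those figures is straightforward multiplication. In the sketch below, the frame count of 86 is an assumption chosen to match the article's "dozens of frames" and roughly 7,000 total exposures; only the nine filters and nine exposures per filter come from Rieke:

```python
filters = 9
exposures_per_filter = 9
frames = 86                           # assumed: "dozens of frames"

per_frame = filters * exposures_per_filter
total = per_frame * frames
print(per_frame, total)               # -> 81 6966
```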
Astronomers must also decide how long to expose their images. A quick glance shows enormous differences in brightness between planets, stars, galaxies and other objects, so Webb must be versatile enough to account for them all. NIRCam’s exposure time, for example, ranges from a few thousandths of a second to about 23 minutes. The Space Telescope Science Institute even offers an online exposure calculator, to help users estimate what will give the best result.
One special challenge is viewing dim objects located near brighter ones, like an exoplanet in the neighborhood of a brilliant star. For those scenarios, NIRCam comes equipped with a coronagraph, which is essentially a glass plate with a black dot to obscure the unwanted light.
“You get rid of the glare from the star itself, and you can study its close surroundings better,” Rieke says. “It’s just like putting your thumb up and blocking the sun.”
Another advantage of infrared imaging is that longer wavelengths shine through the fine dust that pervades many galaxies, providing a deeper look into space. But some of JWST’s most breathtaking pictures so far take that dust as their subject. A dreamlike image of the Carina Nebula’s Cosmic Cliffs shows previously invisible stellar nurseries — wisps and pillars of fine particles, contracting to form new stars.
A few Webb images feature what appear to be distorted arcs of starlight, the sort of artifact you might expect from a flaw in standard astrophotography. In fact, these have nothing to do with how NIRCam and MIRI work. They are the result of gravitational lensing: When a galaxy’s gravity is strong enough, it can warp and magnify the light from more distant objects behind it, projecting them around its periphery so they come within view.
Because human eyes can’t see infrared light, JWST pictures don’t technically show the cosmos as we would experience it firsthand. NASA’s production team has to make adjustments, essentially translating the wavelength data into visible colors. But that doesn’t make them phony, and it’s the only way we can peer into these otherwise hidden corners of the universe.
“The representations that everyone oohs and aahs over are what we would call false color images,” Rieke says. “They’re fake in that sense, but the shapes and things are very real.”
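One simple way to picture that translation step — a minimal sketch, not NASA's actual production pipeline — is to assign each infrared filter image to one visible color channel, keeping the original wavelength ordering so the shortest infrared wavelength renders bluest and the longest renders reddest:

```python
def false_color_pixel(short_ir, mid_ir, long_ir):
    """Map three infrared filter brightnesses (0-255) to one RGB pixel,
    preserving wavelength order: shortest -> blue, longest -> red."""
    return (long_ir, mid_ir, short_ir)   # (R, G, B)

# A pixel that is bright only in the longest-wavelength filter
# comes out reddish in the final composite.
print(false_color_pixel(10, 40, 200))  # -> (200, 40, 10)
```

The colors are invented, but the structure — which pixel is bright in which filter — is exactly what the detectors measured, which is why Rieke calls the shapes "very real."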