To many of us, the idea that out-of-control global warming is about to parch forests and drown small island nations seems highly implausible. Climate may be fickle, but it’s rarely that malicious.
Skeptics might do well to scroll back Earth history 800 years or so, however. England, today notorious for its dreary chill, was notable for its wines. Greenland, today an ice sheet fringed with land barely suitable for grazing sheep and reindeer, was then a fertile ground where Viking farmers tilled fields of grain; alas, several hundred years later these same Norse settlers would be driven away by the deepening cold, the last survivors struggling to bury their dead in the rising permafrost.
Turn the clock back 17,000 years more, and much of the Northern Hemisphere looks like the interior of Greenland today: a featureless desert of ice thousands of feet thick. Turn back 5 million years more, and the ice is gone. Greenland, living up to its name, is a verdant, forested land. Earlier still, it is downright tropical, to judge from 80-million-year-old fossils of breadfruit trees typical of South Sea islands. Turn back 300 million years, and the southern part of the globe is locked in an ice age that lasted 60 million years.
It’s no secret that climate is changeable, prone to taking erratic turns without rhyme or seeming reason, as unpredictable as, well, weather. Bizarre behavior is all too normal. Yet the forces behind such changes operate in mysterious ways. Climate brews at the interface of the elements, where the atmosphere meets the ocean and the frequently violent surface of the solid planet. Chance interactions can easily plunge whole hemispheres into deep freeze or turn up the heat until even the high latitudes steam like a tropical hothouse. The mystery is, how?
As researchers have learned to read the dramatic story of climate changes from fossils, rock deposits, and subtle chemical clues in ocean sediments, they have offered a host of explanations. The slow drifting of the continents can open and close straits and seas, altering the pattern of warm and cold ocean currents. Volcanoes can spew out clouds of ash, spreading a cooling sunshade over the planet. The sun itself has been fingered as a suspect in some episodes, including the one that drove the Norse from Greenland; though seemingly a reliable companion, over decades or centuries the sun may flicker and pulse like an old fluorescent tube. Over thousands of years Earth itself wobbles in space, changing the amount of energy it can intercept from the sun.
None of these forces, however, seems sufficient to account for long-term global climate shifts, and researchers have been grasping for an explanation. Curiously, in case after case, they see the hand of carbon dioxide, the same heat-trapping gas that may now be warming the globe. CO2 molecules behave like one-way mirrors for radiation: like greenhouse glass, they let visible sunlight pass through to warm Earth’s surface but absorb the infrared heat the surface radiates back, trapping it before it can escape into space again. Long before engines and industry started spewing carbon dioxide into the atmosphere, researchers believe, Earth’s natural respiration was blowing the gas into the atmosphere and sucking it out in enormous, and sometimes sustained, gulps. During a recent global freeze, for example, the carbon dioxide content of the atmosphere is known to have been 30 percent lower than it is today.
The reservoir for this CO2 is Earth itself. The waters of the deep ocean store tens of times more carbon dioxide than the atmosphere can hold. Rocks such as limestone hold tens of thousands of times more. Geologic processes that released some of that stored carbon dioxide into the atmosphere could have produced periods of prolonged planetary summer. Conversely, geological and biological processes that sucked carbon dioxide out of the atmosphere and locked it away could have set off sustained cold spells. The planet, it appears, changes its temperature by adjusting its carbon dioxide blanket, like a fitful sleeper kicking the bedclothes on and off.
From the very beginning, Earth has suffered wild climate swings. And from the outset, carbon dioxide seems to have played a pivotal role. During the first half of the planet’s 4.6-billion-year history, temperatures were much higher than they had any right to be. The sun formed not long before the Earth, and solar physicists calculate that in its infancy our neighbor star put out barely three-quarters the light and heat it does today. Under these circumstances, the oceans should have frozen solid and remained that way for Earth’s first 2 billion years. Yet the evidence for a thaw is persuasive. Life evolved in some warm, liquid environment when Earth was less than a billion years old. Even earlier, sedimentary rocks were forming from silt settling at the bottom of lakes or seas. Something in the atmosphere protected Earth against the chilling influence of a weak young sun.
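The arithmetic behind that predicted freeze is a standard back-of-envelope exercise. The sketch below computes the "effective temperature" a bare, greenhouse-free Earth would settle at under today’s sun and under a sun at 75 percent output; the solar constant and albedo figures are modern textbook values, not numbers given in the text.

```python
# Stefan-Boltzmann energy balance: T = (S * (1 - A) / (4 * sigma)) ** 0.25
# Illustrative calculation; solar constant and albedo are assumed modern values.
SIGMA = 5.670e-8       # Stefan-Boltzmann constant, W m^-2 K^-4
S_MODERN = 1361.0      # present-day solar constant, W m^-2
ALBEDO = 0.3           # Earth's planetary albedo (modern value)

def effective_temp(solar_constant, albedo=ALBEDO):
    """Equilibrium temperature (kelvins) of an airless planet."""
    return (solar_constant * (1 - albedo) / (4 * SIGMA)) ** 0.25

t_now = effective_temp(S_MODERN)           # roughly 255 K without any greenhouse
t_young = effective_temp(0.75 * S_MODERN)  # the young sun at 75 percent output
```

Even under the modern sun the bare-rock figure sits well below freezing; greenhouse gases make up the difference. A 25-percent-dimmer sun digs that hole nearly 20 degrees deeper, which is why a thicker early greenhouse is invoked to keep the oceans liquid.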
Climatologists are guessing that that something was a concentration of carbon dioxide perhaps a thousand times the present-day level. After all, they argue, volcanic eruptions breathe out large volumes of the gas, and early Earth seethed with volcanic activity. But as is so often the case when unraveling the mysteries of climate, a solution to one puzzle promptly creates a new one. As the sun gradually brightened over hundreds of millions of years, the greenhouse heating from all that carbon dioxide could eventually have brought the oceans to a boil. Since that clearly didn’t happen--for one thing, it would have wiped out life--the question remains: What happened to the CO2? How did Earth shed its layer of insulation as the sun warmed up?
One way of sponging CO2 out of the atmosphere is a process called chemical weathering. In effect, weathering washes carbon dioxide out of the sky and stores it away in rocks. The gas first dissolves in rainwater or wet soil, forming a dilute acid. The acid eats away certain kinds of rocks, chemically transforming the carbon dioxide into bicarbonate ions as it does so. The bicarbonate ions flow down rivers into the ocean. There some of the carbon dioxide captured during the weathering process escapes into the atmosphere. The rest combines with calcium ions dissolved from rocks to form calcium carbonate. The carbonate settles on the ocean floor, ultimately as limestone and other sedimentary rocks. The CO2 remains imprisoned for millions of years.
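The sequence just described can be written out as the standard carbonate-silicate weathering reactions, shown here for a generic calcium silicate (the specific mineral is an illustrative choice, not one named in the text):

```latex
% CO2 dissolves in rainwater, forming a dilute acid
\mathrm{CO_2 + H_2O \;\longrightarrow\; H_2CO_3}
% the acid eats away silicate rock, yielding dissolved ions
\mathrm{CaSiO_3 + 2\,H_2CO_3 \;\longrightarrow\; Ca^{2+} + 2\,HCO_3^- + SiO_2 + H_2O}
% in the ocean, carbonate precipitates; one CO2 returns to the air, one stays buried
\mathrm{Ca^{2+} + 2\,HCO_3^- \;\longrightarrow\; CaCO_3 + CO_2 + H_2O}
```

Of the two CO2 molecules consumed by weathering, one escapes back to the atmosphere when the carbonate forms and one stays locked in rock, which is why the net effect is simply CaSiO3 + CO2 → CaCO3 + SiO2.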
According to a theory worked out by climatologist James Walker of the University of Michigan and other researchers, chemical weathering could have acted as a planetary thermostat to keep Earth from boiling over. As Earth warmed under its CO2 blanket, the evaporation of water from oceans increased, thereby increasing rainfall. The more rainfall, the more chemical weathering. The more weathering, the greater the pace of CO2 distillation out of the atmosphere and into the ocean floor. The less CO2 in the atmosphere, the cooler the planet’s temperature. This weathering cycle, say Walker and company, saved the globe from cooking in its own greenhouse.
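Walker’s feedback loop is easy to caricature in a few lines of code. The sketch below is a toy model, not Walker’s actual equations: the logarithmic greenhouse sensitivity, the exponential weathering law, and every parameter value are illustrative assumptions. Its one honest point is that however much CO2 the run starts with, the loop drags the system back to the same equilibrium.

```python
# Toy version of the Walker weathering thermostat (illustrative only;
# parameter values are arbitrary assumptions, not measured quantities).
import math

def simulate(c0, volcanic=1.0, w0=1.0, sens=3.0, tscale=10.0,
             t_ref=15.0, c_ref=1.0, steps=20_000, dt=0.01):
    """Integrate dC/dt = volcanic input - weathering(C); return final (C, T)."""
    c = c0
    for _ in range(steps):
        temp = t_ref + sens * math.log(c / c_ref)            # warmer with more CO2
        weathering = w0 * math.exp((temp - t_ref) / tscale)  # faster when warm and wet
        c += dt * (volcanic - weathering)
    temp = t_ref + sens * math.log(c / c_ref)
    return c, temp

hot_start = simulate(4.0)    # begin with four times the reference CO2
cold_start = simulate(0.25)  # begin with a quarter of the reference CO2
```

Starting from four times the reference CO2 level or from a quarter of it, both runs settle at the same concentration and temperature--the self-correcting behavior Walker and company describe.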
Other scientists remain unconvinced that weathering by physical processes alone would suffice. While it may have cooled things down, they say, it couldn’t have prevented the planet from getting altogether too toasty. To turn down the thermostat tens of degrees further, they add yet another ingredient to the climatic brew, one that could have boosted the effects of weathering dramatically: namely, life.
When life came ashore, perhaps as early as 3 billion years ago, it wasn’t much to look at--just a crust of bacteria and simple algae. According to earth scientist Tyler Volk of New York University and climatologist David Schwartzman of Howard University in Washington, D.C., however, it could have turned the land surface into a virtual sponge for carbon dioxide. Not that living things themselves could have absorbed sufficient CO2 to lower the temperature; even today, when life carpets most of the world, as much carbon dioxide is stored in the atmosphere as in all living things combined. Instead, the early microbes would have accelerated chemical weathering by creating the first soil. Before bacteria made their debut, rock flakes and grains simply washed away, leaving bare rock that shrugged off rainwater. But some microbes secrete sticky substances that hold on to fine particles like a rug trapping dirt, according to Volk. When it rained, moisture was trapped in these primitive soils. Dissolved carbon dioxide could eat away at the rock particles continuously rather than in short spurts following rainstorms.
On the early Earth, bacteria colonizing the land could have enhanced weathering a hundredfold, says Volk. That might have been enough to cool the planet by an average of 55 degrees or more--from 115 degrees to 60. If so, says Volk, these early microbe-based soils may have been instrumental in cooling the planet enough for higher life-forms to evolve.
Consistent with the controversy that pervades the field, climatologists are in conflict over Volk’s theory as well. According to geochemist Robert Berner of Yale, who has looked at the weathering of fresh volcanic rock in Hawaii before and after it was colonized by algae and lichens, primitive plants do virtually nothing. Life didn’t become a major contributor to chemical weathering, and therefore climate, until about 400 million years ago, according to Berner. That was when more complex plants--seed ferns, horsetails, and the like--began sending long roots into the ground. While the green parts of plants breathe in carbon dioxide from the atmosphere, the roots exhale the gas into the soil and exude organic acids that react with rocks and rainwater, speeding up weathering.
The early climatic record remains necessarily murky; there simply isn’t that much to read. The last 500 million years present a much clearer picture, thanks to a larger inventory of fossils and better-preserved geologic clues, such as rocks scarred by advancing glaciers. But if the story is clearer, it doesn’t necessarily provide more satisfying answers. And even though the sun had reached a steady state, climatic upheavals didn’t become any less erratic. On the contrary, they may have intensified.
Once again, Berner and others identify carbon dioxide as the culprit. By 500 million years ago most of the early carbon dioxide blanket that kept Earth warm under the cool sun was gone. What was left, however, sufficed to set the global thermostat. As the Earth breathed CO2 in and out of rocks, ocean, and air, it occasionally gasped and sputtered, and frequently held its breath. The result was periods of millions of years when ice sheets held the continents in their grip, and equally long intervals when alligators basked in the balmy Arctic summer.
According to Berner, chemical weathering is only one of a number of processes that keep the geochemical carbon cycle going. While weathering followed by sedimentation buries CO2 deep in ocean rocks, for example, the ceaseless drifting of the continents releases it. Chemical weathering loads the seafloor with carbonate rocks; the moving ocean floor, like a giant conveyor belt, trundles those rocks to deep trenches, called subduction zones, around the edges of the ocean floor. There the seafloor bends down into Earth’s interior, heating up as it goes. Carbon dioxide cooks out of the carbonate layers and escapes to the surface, where it burps out of volcanoes, fizzes from mineral springs, or slowly oozes out of the ground. The faster the conveyor belt moves, the more CO2 bubbles up.
Living things can divert carbon dioxide from this cycle. Taking the gas directly from the atmosphere, plants use it to build their tissues and, indirectly, those of the animals that eat them. Some of that carbon dioxide goes back into the atmosphere when living things die and decay. But when the plant and animal remains are buried and turned into coal, oil, or oil shale, or disseminated in sedimentary rocks, the carbon is taken out of circulation.
The amount of carbon dioxide in the atmosphere at any given time, then, results from the interplay of these forces. In periods when plate tectonics sped up, the conveyor belt of moving seafloor operated at a faster than normal pace, causing volcanoes and hot springs to pump more carbon dioxide into the atmosphere. The Earth, in turn, warmed. Conversely, when the seafloor conveyor slowed, or when new varieties of plants evolved and boosted weathering rates, or when coal deposits were forming, carbon was locked away, the concentration of CO2 fell, and global temperatures fell along with it.
To find out how well his carbon dioxide scenario matches the true story of climate, Berner primes a computer model with data about the history of each of those processes. The equations of the model grind away and produce a carbon dioxide curve showing two peaks and two valleys for the past 570 million years. Two major ice ages--one about 300 million years ago, and the most recent one, starting a million years ago--match the curve’s predictions nicely. "Major glaciations," says Berner, "fit right where my carbon dioxide valleys come in. That’s very convincing to me." So is the match of the second peak, at 100 million years ago, with one of the warmest periods in the climate record, when dinosaur herds grazed under the midnight sun in Alaska.
But major pieces of Berner’s carbon dioxide story simply don’t fit. An earlier ice age, 440 million years ago, falls uncomfortably close to one of the high points of the carbon dioxide curve. "That’s a problem," Berner concedes. "We’re trying to get glaciation then."
At best, Berner’s model works well only for long-term climatic shifts. "My model misses short bursts, ones that don’t last more than 10 million years," he says. Take a global heat wave 55 to 50 million years ago, when western North America was tropical as far north as Seattle, and flying lemurs--squirrel-like relatives of monkeys that are now found only in Southeast Asia--were sailing through the greenery on Ellesmere Island, 12 degrees from the North Pole. Berner’s climate model suggests that plate tectonics speeds up during periods of global warming, pumping more heat-trapping gas into the atmosphere. Even if that did occur, the speedup didn’t amount to much. However, that doesn’t mean carbon dioxide wasn’t responsible. Oceanographer David Rea of the University of Michigan, along with his colleague Robert Owen, has come up with yet another geologic process that might cause the planetary crust to cough up its CO2--only faster.
Rea took a hint from the timing of this particular warm spell: it struck just as a new world order was being created, geologically speaking. The North Atlantic was opening, splitting apart Greenland and Norway. Around the globe new midocean ridges were developing--those roiling underwater volcanic rifts that generate new seafloor. Hot, mineral-laden water spewing from the underwater springs along the ridges changes ocean chemistry in a way that drives dissolved carbon dioxide out of the ocean and into the atmosphere.
Indeed, cores of sediments drilled from the deep-sea floor bolster Rea’s case. Nicely matching the evidence that high summer bloomed suddenly 55 million years ago, the cores show that at just the same time, underwater hot springs spurted out iron, manganese, and other minerals at 10 to 100 times today’s rate. The hot-spring surge, says Rea, might have boosted atmospheric carbon dioxide to several times the present level--and average global temperatures to levels never seen since.
Soon after this short burst of summer, Earth seemed to push off its carbon dioxide blanket once again, initiating a long, fitful cooling spell. About 40 million years ago Antarctica began growing the ice sheet that practically hides the continent today. Alaska, subtropical 50 million years ago, was temperate 20 million years later. In east Africa cooling and drying transformed forests into savannas. In Greenland the forests gave way to tundra and then, by 3 or 4 million years ago, to ice.
The processes that drain carbon dioxide from the atmosphere in Berner’s model don’t seem to be at work this time. "Berner’s curve shows a rapid drop in carbon dioxide from 100 to 50 million years ago," says geologist Maureen Raymo of MIT, "but that carbon dioxide curve doesn’t look like the climate curve. It’s really after 40 million years ago that you see the pronounced cooling."
Instead, she thinks the control on carbon dioxide may have been something altogether new: the frantic mountain building of the last 40 million years. In this period the Himalayas and the Tibetan plateau ascended from sea level into the most dramatic mountainscape on Earth. The rise of those barriers would have deflected prevailing winds and unleashed new ones, according to computer models. And the altered winds would have caused dramatic local changes in climate: a drying in Central Asia, and a wetter, warmer regime, marked by violent monsoons, in India.
Those changes might also have caused some local cooling, but they can’t account for the global chill. To explain that, Raymo once again calls on chemical weathering. The heavy rains unleashed by monsoons would have lashed the steep mountainsides relentlessly; landslides would have continually exposed fresh material to attack, causing rock to dissolve much faster than in gentler terrain. With chemical weathering sucking carbon dioxide out of the atmosphere, the climate would have cooled.
Then, about one million years ago, the climate’s whole character changed yet again. Steady cooling gave way to a series of changes that make up the key motif of recent geologic history: the stately cycle of ice ages, in which ice sheets advance in pulse after pulse, separated by warmer periods lasting tens of thousands of years when the ice shrinks back into polar lairs.
In the 1970s, after more than a century of mystification, scientists finally perfected a way to pin down the timing of these glacial cycles, a method that also helps shed light, indirectly, on their cause. In the shells of tiny marine organisms extracted from long core samples of deep-sea sediment, they traced subtle fluctuations in the chemistry of seawater that seemed to reveal how much of the world’s water was missing from the oceans, locked up in ice sheets. The ice-volume signal could be followed for hundreds of thousands of years.
As climate researchers unraveled the rhythm of the ice’s peaks and valleys, they found support for a largely discounted theory first proposed more than 100 years ago and extended in the 1920s by the Serbian astronomer Milutin Milankovitch. Milankovitch had sought the cause of ice ages in subtle astronomical variations: 20,000- and 41,000-year cycles in which the tilt and orientation of Earth’s spin axis change, and a 100,000-year cycle in which the planet’s orbit around the sun stretches into a more extreme ellipse, then relaxes again. These cycles don’t have much effect on the total amount of solar energy reaching Earth, but they do change how the annual sunshine quota is distributed among the seasons, especially at high latitudes. By affecting the strength of summer sunshine--and hence its ability to melt away snow from the preceding winter--the seasonal changes might be enough to make the difference between the creation of a glacier and its retreat. The clincher came when scientists found a clear echo of the orbital cycles in the deep-sea records: a basic cycle of glacial advance and retreat lasting 100,000 years, plus smaller fluctuations of 23,000 and 41,000 years.
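The way several slow cycles stack into one complicated rhythm is easy to visualize. The toy index below simply sums three sinusoids at the periods named above; the relative weights are arbitrary illustrative choices, and real insolation curves are computed from actual orbital elements, not clean sine waves.

```python
# Toy "insolation index" combining the three orbital periods in the text.
# Amplitudes are arbitrary illustrative weights, not real orbital parameters.
import math

PERIODS = {"precession": 23_000, "obliquity": 41_000, "eccentricity": 100_000}

def insolation_index(year, weights=(1.0, 0.8, 0.4)):
    """Sum of one sinusoid per orbital cycle, evaluated at a given year."""
    return sum(w * math.sin(2 * math.pi * year / p)
               for w, p in zip(weights, PERIODS.values()))

# Sample the combined rhythm every 1,000 years across 500,000 years
curve = [insolation_index(t) for t in range(0, 500_000, 1000)]
```

Because the three periods never divide evenly into one another, the combined curve repeats only over very long spans, which is part of why the glacial record looks so irregular even if its drivers are strictly periodic.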
A coincidence in timing, of course, does not a cause make. Geologist John Imbrie of Brown University, who was a key figure in establishing the astronomical link, admits that only the shorter cycles are demonstrably related to changes in solar energy at those periods. "They’re clearly driven--forced--by changes in radiation. Now, with the 100,000-year cycle, that’s not true. The 100,000-year radiation changes are very small, about half a percent. They’re so small and the climatic response is so big that the climate changes can’t be a direct response."
The astronomical theory also falls short in other ways, climatologist Barry Saltzman of Yale points out. For one thing, the ice sheets advanced in step in the Northern and Southern hemispheres. The 100,000-year cycle should have had opposite effects in the two hemispheres: when it brought stronger summer sunshine to northern latitudes, it should have weakened sunshine in southern latitudes, so you might expect glaciation to seesaw between the hemispheres. But the glaciers advanced at the same time, north and south.
That leaves climatologists looking for some global factor that could serve as the primary cause of the ice ages yet still leave a role open for the influence of astronomical cycles. For Saltzman and some other researchers, that factor may be carbon dioxide stored in the deep ocean. In a scheme he developed with his Yale colleague Kirk Maasch, the living things of the ocean, which absorb carbon dioxide, and the ocean currents, which can release the gas back into the air again, could have acted fast enough to set the glacial pace. In Saltzman and Maasch’s model, the growth of ice sheets, the cooling climate, and the oceans’ ability to modulate carbon dioxide concentration all reinforce each other, adding up to a natural oscillator. Through interplays climatologists haven’t fully unraveled, all three might cooperate to keep this climate pendulum swinging, even without the astronomical pacemakers, at a natural period of about 100,000 years. "It matches the 100,000-year astronomical cycle," says Saltzman, "only because external forcing gives it an extra kick that brings it into step."
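The idea of a natural oscillator can be illustrated with a generic self-sustaining oscillator--here a van der Pol equation, chosen as a stand-in for this sketch and emphatically not Saltzman and Maasch's actual model. Left alone, such a system grows a tiny disturbance into a steady swing of its own; in their scheme, a weak astronomical forcing term would merely nudge the phase of the swing into step.

```python
# Generic self-sustaining oscillator (van der Pol), standing in for the
# ice/temperature/CO2 feedback loop. An illustrative toy, not the
# Saltzman-Maasch model; all parameter values are arbitrary.
import math

def run(mu=1.0, forcing=0.0, period=100.0, dt=0.01, steps=200_000):
    """Integrate x'' = mu*(1 - x^2)*x' - x + forcing*sin(2*pi*t/period)."""
    x, v, t = 0.1, 0.0, 0.0   # start barely displaced from rest
    history = []
    for _ in range(steps):
        a = mu * (1 - x * x) * v - x + forcing * math.sin(2 * math.pi * t / period)
        x, v, t = x + dt * v, v + dt * a, t + dt
        history.append(x)
    return history

free = run()  # zero external forcing: the oscillation still sustains itself
```

With forcing set to zero, the small initial displacement grows into a steady oscillation rather than dying away--the signature of a system that swings on its own, needing an external kick only to set its timing.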
Recent ice age, ancient hothouse, or primordial glaciation--Earth’s carbon dioxide blanket seems to be the culprit climatologists most like to blame for unaccountable climate swings. But the planet’s internal processes can’t account for everything under the sun. And recent climatic blips remind us that we can’t take the sun for granted either. Even though it has settled down to a stable middle age, the sun may have its vagaries. The little ice age, which drove the Vikings from Greenland, was a 400-year chill that began around 1450. Curiously, its most intense period corresponded to an odd interval of solar quiescence. From 1645 to 1715, sunspots virtually disappeared from the face of the sun.
Although the causal connection is unclear, measurements in recent decades confirm that the sun tends to dim slightly every 11 years, just as solar activity reaches the minimum part of its cycle and sunspots become scarce. Some scientists speculate that the prolonged disappearance of sunspots in the seventeenth century was a sign of a drastic drop in solar output that could account for the 1- to 2-degree cooling that marked the little ice age. Others, including Tom Wigley of the University of East Anglia in England, give the sun credit for other cold spells as well. A series of minor ice ages over the past 9,000 years, recorded in the heaps of debris left when mountain glaciers advanced and retreated, seems to correspond with changes in the sun’s activity, Wigley discovered. "It all amounts to tantalizing hints that the sun has shaped climate since the last ice age," says Wigley, "though not proof of a physical, causal relationship."
Whatever the sun, the ocean, or the rocks do, they won’t keep human beings themselves from shaping the climate in coming decades. Industrialization will continue to pump up atmospheric carbon dioxide to levels not seen, perhaps, in 100,000 years, warming the planet by more than 5 degrees. But then, in a few centuries, the furnaces and engines will run out of fuel and fall idle. Gradually, carbon dioxide will drain out of the atmosphere once again, to be locked away in the ocean depths like so much buried treasure. After that geologically brief pause, the cycles of glaciation will resume. And one day, perhaps, plate tectonics will surge again, banishing the ice and restoring the hothouse that warmed the dinosaurs.