
The Day Warming Began

Centuries-old temperature records help piece together global climate trends. But how reliable are they?

By Douglas Fox
Nov 4, 2016 (updated Nov 19, 2019)
(Credit: Alison Mackey/Discover/Wellcome Library/London/Wellcome Images/The Evolution of the Thermometer/Henry Carrington Bolton/The Chemical Publishing Co. 1900/W. Derham, Philosophical Transactions (1683-1775) Vol. 22 (1700-1701) p. 527-529 Royal Society Publishing via archive.org)


Upminster Church stood 15 miles east of London, among a hodgepodge of farms and pastures. When William Derham ventured outside at 8 a.m. on Jan. 4, 1699, the first fingers of sunlight were still stretching over the horizon, throwing long shadows behind the church’s bell tower and trees. A breeze blew from the southwest, and clouds crept over the sky. Derham ducked into the cool shade to examine an object that hung there. The local Anglican rector, Derham was better known for his observations of Jupiter and his collections of wasps and grubs than for his talents in the pulpit.

On this day, he had come to take the first reading from his thermometer, still a new technology at the time. Derham’s was custom-built: a 2-foot shaft of glass with a bulb at the bottom filled with alcohol. The Fahrenheit and Celsius scales weren’t yet invented, so he read the temperature on his instrument using his own system: 1 degree for every tenth of an inch. When Derham read the thermometer that winter morning, it was mild — 11.00 inches of alcohol, or 110 degrees — roughly 48 degrees Fahrenheit. He would continue reading the temperature thrice daily for the next seven years, recording it along with the wind, rain, barometric pressure and clouds.

Derham could not have known, but his hobby would one day mark the beginning of something monumental: the Central England temperature record, the earliest thermometer readings now included in the massive datasets that track global warming. Eventually, scientists would use Derham’s readings — among others — to better understand human-caused climate change.

Beginning in 1699, Anglican rector William Derham recorded temperatures in England three times every day for seven years using a custom thermometer. (Credit: Diocesan and Regional Library at Skara/Flickr)

Records Store

The vast majority of the world’s temperature readings don’t go back further than the 1900s, when large-scale burning of fossil fuels was already underway, and with it the release of planet-warming greenhouse gases. Such a short record makes it difficult to measure the natural variations in climate that existed before humans began warming the world, and complicates efforts to tease apart how much of the planet’s warming has been caused by humans versus natural factors.

The Central England record covers a longer period than any other thermometer series, reaching from 1699 to the present. It owes its beginnings to Derham and a handful of other clergymen, physicians and philosophers in 18th-century England who happened to be early adopters of the thermometer. Modern researchers have combined the fragmentary, overlapping records they left behind into a series of annual temperatures averaged over the region, which stretches from England’s south coast 175 miles north to Manchester.

Thermometer readings going back to 1880 or so tell scientists Earth has warmed about 1.5 F since then, mostly due to greenhouse gas emissions. But even earlier readings can provide a record of natural climate variation caused by volcanic eruptions or cycles in ocean circulation. This thermometer record is imperfect, but researchers are working to improve it using new mathematical methods and experiments with some of the same centuries-old thermometers. Unfortunately, this historical data suggests that warming might be worse than we thought.

The First Temperatures

The earliest temperature measurements by Derham and others amounted to a sort of religious inquiry: They documented not only the weather, but also the body sizes of gnats and wasps and the dates on which snowdrop flowers bloomed and thrushes began to sing each spring. “Let us examine them with all our Gauges, measure them with our nicest Rules, pry into them with our Microscopes, and most exquisite Instruments,” wrote Derham in his 1713 book, Physico-Theology. Hidden in the minuscule details of nature, he believed, were insights into the Creator.

By the mid-1700s, scientists had developed the Fahrenheit and Celsius scales. Mass production of standardized thermometers was underway by the mid-1800s. Meteorology emerged as a formal science around 1870, as the United States and other countries established national weather services and started keeping records. But it wasn’t until the 1930s that anyone tried to use this information to measure changes in climate.

Guy Callendar gathered temperature data from around the world over decades. His data, published in 1938, showed a temperature rise that correlated with a rise in atmospheric carbon dioxide. (Credit: Courtesy of the G.S. Callendar Archive/University of East Anglia)

Researchers were already exploring the role of greenhouse gases in Earth’s long-gone ice ages, but an English engineer named Guy Callendar took it one step further. He wondered if carbon dioxide produced from burning coal might be warming the planet in the present. He gathered temperature records from around the world and calculated average annual temperatures with pencil and paper, hand-drawing graphs showing fluctuations over the decades.

His results, published in 1938, suggested that carbon dioxide indeed was having a warming effect. His figures from 147 weather stations around the world showed that average global temperatures increased by 0.59 F from 1880 to 1935 — double what he had predicted based on increasing carbon dioxide.

Callendar’s discovery prompted skepticism, but work by other scientists at the time studying the altered flowering times of plants and the slow retreat of glaciers spurred a continuing interest in how climate had evolved. Among those inspired to learn more was Gordon Manley, a meteorologist at Bedford College in London. He set out to extend temperatures further back in time than Callendar’s dataset.

From the 1930s to the 1950s, Manley built a temperature record for Central England reaching back to Derham’s first measurements. Searching historical archives, he found dozens of temperature series kept by individuals around England. Manley calculated monthly averages and noted the consistent differences between locations, such as the 1 F difference between Exeter, near the coast in southwestern England, and inland locations like Oxford. These overlaps and offsets allowed him to choose temperature sets that seemed reliable based on a consistent reading time each day and a well-functioning thermometer. He even analyzed the topography of the land to understand where the air masses being measured likely originated, and hence to show that two different locations were climatically similar.
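Manley’s overlap-and-offset idea can be illustrated with a short sketch. This is a hypothetical reconstruction under simple assumptions, not his actual procedure: estimate the constant difference between two stations from the months their records share, then shift the fragmentary series onto the reference before merging.

```python
import numpy as np

def splice_series(reference, fragment):
    """Merge a fragmentary temperature series into a reference series.

    Both arguments map (year, month) -> monthly mean temperature (F).
    The station-to-station offset (e.g., the ~1 F Exeter-vs-inland
    difference Manley noted) is estimated from overlapping months;
    the fragment is then shifted by that amount before merging.
    A minimal sketch, not Manley's actual calculation.
    """
    overlap = set(reference) & set(fragment)
    if not overlap:
        raise ValueError("series do not overlap; offset cannot be estimated")
    offset = np.mean([reference[k] - fragment[k] for k in overlap])
    merged = dict(reference)
    for key, value in fragment.items():
        merged.setdefault(key, value + offset)  # keep the reference where both exist
    return merged, offset
```

A constant offset is the simplest defensible assumption over a short overlap; Manley’s real work also weighed reading times, instrument quality and topography before trusting any pair of stations.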

Guy Callendar's data showing the correlation between the rise in atmospheric CO2 levels (top) and temperatures across the globe (bottom). (Credit: The Quarterly Journal of the Royal Meteorological Society, Volume 64, Issue 275, April 1938)

Manley’s Central England record coincides well with the year-to-year rises and falls of temperature proxies: tree rings and written records of when winter ice spread over rivers or harbors and trees sprouted leaves. “It’s been important to have,” Phil Jones, a climatologist at the University of East Anglia in the U.K., says of Manley’s data. Looking at shifts in Manley’s winter temperatures from year to year, he says, gives a good reading of important natural cycles that influence climate, such as the North Atlantic Oscillation, a seesaw in atmospheric pressure over the North Atlantic.

But Jones is less sure that Manley captured slower changes of a few tenths of a degree over decades, which are important for detecting the onset of warming due to the burning of fossil fuels. The mere task of measuring air temperature turns out to be surprisingly difficult. Researchers still study how this was done in the past, and how it can be done better today.

Eliminating Today's Errors

Climate scientists have recognized for years that today’s temperature data is cluttered with little errors — daily temperatures at a site that suddenly shift by a degree or two. The errors suggest that the thermometer was moved, or a new observer began reading it at a different time of day. Equally worrying are urban heat islands: parking lots and roads near some temperature stations that absorb the sun’s rays and raise the nearby temperature.

In 2011, climate change skeptic Anthony Watts published a study that threw this problem into stark relief. He recruited volunteers to photograph and document 1,007 temperature stations across the lower 48 U.S. states and found that nearly three-quarters of them badly failed the National Oceanic and Atmospheric Administration’s own quality standards, often being too close to sunlight-absorbing structures.

That report evoked consternation among scientists, and NOAA removed some of the offending stations. But the agency had already been working on the problem since 2000, building a second temperature system, called the U.S. Climate Reference Network (USCRN), intended to provide a rigorous backup.

More than 100 USCRN stations now operate across the U.S. Each sits on public land off-limits for development, and each includes wind gauges and three duplicate thermometers. Computers automatically comb the data for inconsistent readings that indicate a broken instrument or other problems. A 2015 study found that these USCRN stations do in fact provide more accurate results than the standard ones — but the differences are subtle.
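That redundancy lends itself to a simple consistency check. Below is a simplified sketch of how triplicate readings might be screened; it is not NOAA’s actual quality-control algorithm, and the half-degree tolerance is an assumption chosen for illustration.

```python
def screen_triplicate(t1, t2, t3, tolerance=0.5):
    """Return the median of three simultaneous readings, or None if any
    sensor strays from the median by more than `tolerance` degrees F,
    which would suggest a broken instrument. A simplified illustration,
    not NOAA's actual USCRN procedure.
    """
    readings = sorted((t1, t2, t3))
    median = readings[1]
    if max(abs(r - median) for r in readings) > tolerance:
        return None  # flag the triplet for inspection
    return median

print(screen_triplicate(71.2, 71.3, 71.2))  # 71.2 -- sensors agree
print(screen_triplicate(71.2, 71.3, 78.0))  # None -- one sensor is suspect
```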

Jared Rennie and Ronald Leeper at NOAA’s National Centers for Environmental Information in Asheville, N.C., reported that daily maximum temperatures were 0.86 F warmer at the traditional screened stations than at USCRN stations. That warm bias was partly offset at the daily minimum, when the screened stations read 0.65 F cooler.
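Assuming the daily mean is the conventional average of the daily maximum and minimum, the two biases largely cancel; the back-of-envelope check below makes the net effect concrete.

```python
# Net effect of the reported biases on the daily mean, taken here as the
# conventional (Tmax + Tmin) / 2. Figures are those reported in the study above.
max_bias = +0.86  # F: traditional screened stations read warmer at the daily max
min_bias = -0.65  # F: and cooler at the daily min
mean_bias = (max_bias + min_bias) / 2
print(f"Net bias in the daily mean: {mean_bias:+.2f} F")  # about +0.1 F warm
```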

Setting the Record Straight

Some of the best research on the detailed methods of temperature measurement comes from another part of Europe, the central Alps, where temperature readings began a few decades after those in England. Reinhard Böhm, a climatologist at the Central Institute for Meteorology and Geodynamics in Vienna, spent two decades reconstructing early temperatures from this region, based on several thermometer series that began between 1760 and 1780.

Around 2003, he started to notice something strange. These early measurements suggested that temperatures were relatively warm between 1760 and 1850. But other sources, such as records of advancing glaciers, suggested the opposite, that the Alps were actually cool during that time. The thermometers and proxies gave similar results for the winters — but each summer, the two sets of data seemed to diverge for a few months, with the thermometers reading warmer. “We realized there is a bias” in early thermometers, says Böhm.

He thought he might know the reason: Thermometers often hung outside on north-facing walls, where reflected sunlight might warm the devices slightly above the temperature of the surrounding air. The national weather bureaus were correcting this problem by the 1860s by placing thermometers inside vented wooden boxes called screens, which blocked sunlight but allowed air to circulate through gaps — a method still used today. Böhm set out to test this hypothesis.

Reinhard Böhm reconstructed how temperatures were measured in the 18th century at the Kremsmünster Abbey near Vienna. He placed a thermometer at the same north-facing window in a shaded basket, and a second thermometer was placed in a modern screen box a few yards away. (Credit: Raab Creative Commons 3.0)

He had access to the unaltered site where temperature was measured in the Kremsmünster Abbey, 60 miles west of Vienna, starting in 1767. Böhm placed a thermometer back in that original spot, on the north side of the clock tower in a shaded basket. For eight years, he took daily readings from the old site and from a modern, screened thermometer a few yards away in the garden. 

The north-facing thermometer yielded higher temperatures than the screened one during summer, when it absorbed heat from reflected sunlight, and nearly identical temperatures in winter. Böhm’s theory was correct: He estimated that early thermometers across the Alps gave summer readings that were 0.7 F above the actual air temperature. “This doesn’t sound like much,” he says, “but it is.” With temperatures rising 1.5 F in the 20th century, being off by 0.7 degrees suggests that actual warming since pre-industrial times might be nearly 50 percent greater than assumed, around 2.2 F.
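The arithmetic behind that estimate is simple enough to verify with the article’s round figures:

```python
# Reproducing the article's arithmetic on Böhm's summer bias estimate.
reported_warming = 1.5  # F: 20th-century warming measured against the old baseline
baseline_bias = 0.7     # F: early thermometers read warmer than the true air temp
corrected = reported_warming + baseline_bias       # a too-warm baseline hides warming
increase = 100 * baseline_bias / reported_warming  # relative understatement, in percent
print(f"Corrected warming: {corrected:.1f} F (~{increase:.0f}% greater)")  # 2.2 F, ~47%
```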

Correcting this error did not bring the early thermometers completely in line with proxies — up to 0.9 F of additional warm bias might still persist from other sources, such as differences in the thermometers or in how people read them — “but I think we are nearer to the truth,” said Böhm in 2012. (He died later that year.)

The uncertainty “is almost all in one direction,” agrees Peter Thorne, a climatologist at Maynooth University in Ireland, who has mostly worked on more recent temperature data. “The risk is that we’ve underestimated the global [warming] trend” over the past 200 years.

A Swiftly Warming Past

By the time Callendar died in 1964, he had grown disconcerted by a troubling inconsistency in the data: Even as carbon dioxide continued to rise, his warming trend vanished by the late 1930s. Global temperatures stalled and even cooled for the next several decades, only resuming warming several years after his death. Warming since 1970 now stands at 0.31 F per decade — three times what Callendar reported in 1938. Climate models show it’s enough to shake up ecosystems and make sea levels rise.
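That “three times” figure checks out against Callendar’s own numbers:

```python
# Checking the comparison against the figures given earlier in the article.
callendar_rate = 0.59 / 5.5  # F per decade: 0.59 F spread over 1880-1935 (5.5 decades)
modern_rate = 0.31           # F per decade since 1970
print(f"Modern rate is {modern_rate / callendar_rate:.1f}x Callendar's")  # ~2.9x
```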

Callendar “was lucky” to detect warming in the first place, says Ed Hawkins, a climate scientist at the University of Reading in the U.K. Unbeknownst to Callendar, the period from 1880 to 1935 “probably had a bit of extra warming” unrelated to carbon dioxide, as the world’s climate rebounded from a cooling spate of volcanic eruptions.

But Callendar’s findings have held up well over the long term. In 2013, Hawkins and Jones compared Callendar’s pencil-and-paper estimates of warming, based on 147 stations, with modern estimates based on several thousand stations. “The agreement is quite surprising,” says Hawkins. The two plots showed nearly identical warming from 1880 to 1935.

The first centuries of the thermometer record were intriguing but unreliable, and people like Callendar, Manley and Böhm approached them with the same caution as historians examining ancient manuscripts. But those early temperatures are now a tool unto themselves, helping scientists tease out when humans might have started to warm Earth’s climate — and suggesting that the warming may be greater than first thought.

The biggest pitfall of the early temperature readings from Derham and others may not be their reliability, but that there are so few of them, covering such a small patch of Earth’s surface. Derham’s letters mention other temperature readings collected by people he knew, series that still haven’t been found. Some were likely destroyed in the 1920s by London’s Royal Society, which ironically deemed them “of no further use.” But some of this information, perhaps hidden in a dusty box in some library, may still come to light, adding precious years to our understanding of Earth’s climate history and future.

The First Thermometers

Thermometers in the 17th and 18th centuries varied in length, design, measuring scales and reactive liquids. Alcohol and mercury were liquids of choice. Today, scientists are replicating the thermometers to help gauge the accuracy of their centuries-old readings.

(Credit: Museo Galileo, Florence-Photographic Archives)

Derham’s thermometer probably resembled a thermoscope, invented in 16th-century Italy possibly by Galileo, consisting of a fluid-filled jar and a clear tube topped by a glass bulb. Liquid in the tube rose and fell in response to the bulb's temperature.

(Credit: Museo Galileo, Florence-Photographic Archives)

This alcohol thermometer from mid-18th-century Paris was mounted on a wooden board.

(Credit: Museo Galileo, Florence-Photographic Archives)

An Italian mercury thermometer with a centigrade scale dates to the 18th century.

(Credit: Museo Galileo, Florence-Photographic Archives)

An English mercury thermometer with a Fahrenheit scale also from the 18th century.

(Credit: Museo Galileo, Florence-Photographic Archives)

These alcohol thermometers from mid-17th-century Italy were marked with dots, one showing 15 degrees (foreground) and the other 20 degrees.
