
The Dating Game

By tracking changes in ancient atoms, archeologists are establishing the astonishing antiquity of modern humanity.

By James Shreeve
Sep 1, 1992

Four years ago archeologists Alison Brooks and John Yellen discovered what might be the earliest traces of modern human culture in the world. The only trouble is, nobody believes them. Sometimes they can’t quite believe it themselves.

Their discovery came on a sun-soaked hillside called Katanda, in a remote corner of Zaire near the Ugandan border. Thirty yards below, the Semliki River runs so clear and cool the submerged hippos look like giant lumps of jade. But in the excavation itself, the heat is enough to make anyone doubt his eyes.

Katanda is a long way from the plains of Ice Age Europe, which archeologists have long believed to be the setting for the first appearance of truly modern culture: the flourish of new tool technologies, art, and body ornamentation known as the Upper Paleolithic, which began about 40,000 years ago. For several years Brooks, an archeologist at George Washington University, had been pursuing the heretical hypothesis that humans in Africa had invented sophisticated technologies even earlier, while their European counterparts were still getting by with the same sorts of tools they’d been using for hundreds of thousands of years. If conclusive evidence hadn’t turned up, it was only because nobody had really bothered to look for it.

“In France alone there must be three hundred well-excavated sites dating from the period we call the Middle Paleolithic,” Brooks says. “In Africa there are barely two dozen on the whole continent.”

One of those two dozen is Katanda. On an afternoon in 1988 John Yellen--archeology program director at the National Science Foundation and Brooks’s husband--was digging in a densely packed litter of giant catfish bones, river stones, and Middle Paleolithic stone tools. From the rubble he extricated a beautifully crafted, fossilized bone harpoon point. Eventually two more whole points and fragments of five others turned up, all of them elaborately barbed and polished. A few feet away, the scientists uncovered pieces of an equally well crafted daggerlike tool. In design and workmanship the harpoons were not unlike those at the very end of the Upper Paleolithic, some 14,000 years ago. But there was one important difference. Brooks and Yellen believe the deposits John was standing in were at least five times that old. To put this in perspective, imagine discovering a prototypical Pontiac in Leonardo da Vinci’s attic.

“If the site is as old as we think it is,” says Brooks, “it could clinch the argument that modern humans evolved in Africa.”

Ever since the discovery the couple have devoted themselves to chopping away at that stubborn little word if. In the face of the entrenched skepticism of their colleagues, it is an uphill task. But they do have some leverage. In those same four years since the first harpoon was found at Katanda, a breakthrough has revived the question of modern human origins. The breakthrough is not some new skeleton pulled out of the ground. Nor is it the highly publicized Eve hypothesis, put forth by geneticists, suggesting that all humans on Earth today share a common female ancestor who lived in Africa 200,000 years ago. The real advance, abiding quietly in the shadows while Eve draws the limelight, is simply a new way of telling time.

To be precise, it is a whole smorgasbord of new ways of telling time. Lately they have all converged on the same exhilarating, mortifying revelation: what little we thought we knew about the origins of our own species was hopelessly wrong. From Africa to the Middle East to Australia, the new dating methods are overturning conventional wisdom with insolent abandon, leaving the anthropological community dazed amid a rubble of collapsed certitudes. It is in this shell-shocked climate that Alison Brooks’s Pontiac in Leonardo’s attic might actually find a hearing.

“Ten years ago I would have said it was impossible for harpoons like these to be so old,” says archeologist Michael Mehlman of the Smithsonian’s National Museum of Natural History. “Now I’m reserving judgment. Anything can happen.”

An archeologist with a freshly uncovered skull, stone tool, or bone Pontiac in hand can take two general approaches to determine its age. The first is called relative dating. Essentially the archeologist places the find in the context of the surrounding geological deposits. If the new discovery is found in a brown sediment lying beneath a yellowish layer of sand, then, all things being equal, it is older than the yellow sand layer or any other deposit higher up. The fossilized remains of extinct animals found near the object also provide a biostratigraphic record that can offer clues to a new find’s relative age. (If a stone tool is found alongside an extinct species of horse, then it’s a fair bet the tool was made while that kind of horse was still running around.) Sometimes the tools themselves can be used as a guide, if they match up in character and style with tools from other, better-known sites. Relative dating methods like these can tell you whether a find is older or younger than something else, but they cannot pin an age on the object in calendar years.

The most celebrated absolute method of telling archeological time, radiocarbon dating, came along in the 1940s. Plants take in carbon from the atmosphere to build tissues, and other organisms take in plants, so carbon ends up in everything from wood to woodchucks. Most carbon exists in the stable form of carbon 12. But some is made up of the unstable, radioactive form carbon 14. When an organism dies, it contains about the same ratio of carbon 12 to carbon 14 that exists in the atmosphere. After death the radioactive carbon 14 atoms begin to decay, changing into stable atoms of nitrogen. The amount of carbon 12, however, stays the same. Scientists can look at the amount of carbon 12 and--based on the ratio--deduce how much carbon 14 was originally present. Since the decay rate of carbon 14 is constant and steady (half of it disappears every 5,730 years), the difference between the amount of carbon 14 originally in a charred bit of wood or bone and the amount present now can be used as a clock to determine the age of the object.
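
For readers who want to see the arithmetic, here is a minimal sketch of the standard decay calculation, using the 5,730-year half-life quoted above. The measured fraction of surviving carbon 14 is an invented input for illustration, not a number from the article.

```python
import math

HALF_LIFE_C14 = 5730.0  # years, the half-life quoted above

def radiocarbon_age(fraction_c14_remaining):
    """Years since death, given the fraction of the original carbon 14
    still undecayed (1.0 = just died, 0.5 = one half-life ago)."""
    # Exponential decay: N(t) = N0 * (1/2) ** (t / half_life).
    # Solving for t gives t = -half_life * log2(N / N0).
    return -HALF_LIFE_C14 * math.log2(fraction_c14_remaining)

# A sample retaining one-eighth of its original carbon 14 has sat
# through three half-lives, about 17,190 years.
print(round(radiocarbon_age(0.125)))  # -> 17190
```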

Conventional radiocarbon dates are extremely accurate up to about 40,000 years. This is far and away the best method to date a find--as long as it is younger than this cutoff point. (In older materials, the amount of carbon 14 still left undecayed is so small that even the slightest amount of contamination in the experimental process leads to highly inaccurate results.) Another dating technique, relying on the decay of radioactive potassium rather than carbon, is available to date volcanic deposits older than half a million years. When it was discovered in the late 1950s, radiopotassium dating threw open a window on the emergence of the first members of the human family--the australopithecines, like the famous Lucy, and her more advanced descendants, Homo habilis and Homo erectus. Until now, however, the period between half a million and 40,000 years--a stretch of time that just happens to embrace the origin of Homo sapiens--was practically unknowable by absolute dating techniques. It was as if a geochronological curtain were drawn across the mystery of our species’ birth. Behind that curtain the hominid lineage underwent an astonishing metamorphosis, entering the dateless, dark centuries a somewhat precocious bipedal ape and emerging into the range of radiocarbon dating as the culturally resplendent, silver-tongued piece of work we call a modern human being.
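
The contamination problem is easy to put numbers on. In the sketch below (my own illustration, with assumed contamination levels), a sample old enough to have lost essentially all its own carbon 14 picks up a whiff of modern carbon during excavation or lab handling; the contaminant alone sets a ceiling on the age the method can report, which is roughly where the 40,000-year cutoff comes from.

```python
import math

HALF_LIFE_C14 = 5730.0  # years

def apparent_age_of_ancient_sample(modern_carbon_fraction):
    """Apparent radiocarbon age of a sample with no surviving carbon 14
    of its own, contaminated by the given fraction of modern carbon."""
    # The only carbon 14 the lab measures is the contaminant, so the sample
    # looks as though that fraction of its original carbon 14 survived.
    return -HALF_LIFE_C14 * math.log2(modern_carbon_fraction)

for percent in (1.0, 0.1):
    age = apparent_age_of_ancient_sample(percent / 100)
    print(f"{percent}% modern carbon -> apparent age ~{age:,.0f} years")
# 1% caps the readable age near 38,000 years; even 0.1% caps it near 57,000.
```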

Fifteen years ago there was some general agreement about how this change took place. First, what is thought of as an anatomically modern human being--with the rounded cranium, vertical forehead, and lightly built skeleton of people today--made its presence known in Europe about 35,000 years ago. Second, along with those first modern-looking people, popularly known as the Cro-Magnons, came the first signs of complex human behavior, including tools made of bone and antler as well as of stone, and art, symbolism, social status, ethnic identity, and probably true human language too. Finally, in any one region there was no overlap in time between the appearance of modern humans and the disappearance of archaic humans such as the classic Neanderthals, supporting the idea that one group had evolved from the other.

“Thanks to the efforts of the new dating methods,” says Fred Smith, an anthropologist at Northern Illinois University, “we now know that each of these ideas was wrong.”

The technique doing the most damage to conventional wisdom is called thermoluminescence, TL for short. (Reader take heed: the terrain of geochronology is full of terms long enough to tie between two trees and trip over, so acronyms are a must.) Unlike radiocarbon dating, which works on organic matter, TL pulls time out of stone.

If you were to pick an ordinary rock up off the ground and try to describe its essential rockness, phrases like frenetically animated would probably not leap to mind. But in fact minerals are in a state of constant inner turmoil. Minute amounts of radioactive elements, both within the rock itself and in the surrounding soil and atmosphere, are constantly bombarding its atoms, knocking electrons out of their normal orbits. All this is perfectly normal rock behavior, and after gallivanting around for a hundredth of a second or two, most electrons dutifully return to their normal positions. A few, however, become trapped en route--physically captured within crystal impurities or electronic aberrations in the mineral structure itself. These tiny prisons hold on to their electrons until the mineral is heated, whereupon the traps spring open and the electrons return to their more stable position. As they escape, they release energy in the form of light--a photon for every homeward-bound electron.

Thermoluminescence was observed way back in 1663 by the great English physicist Robert Boyle. One night Boyle took a borrowed diamond to bed with him, for reasons that remain obscure. Resting the diamond upon “a warm part of my Naked Body,” Boyle noticed that it soon emitted a warm glow. So taken was he with the responsive gem that the next day he delivered a paper on the subject at the Royal Society, noting his surprise at the glow since his constitution, he felt, “was not of the hottest.”

Three hundred years later another Englishman, Martin Aitken of Oxford University, developed the methods to turn thermoluminescence into a geophysical timepiece. The clock works because the radioactivity bombarding a mineral is fairly constant, so electrons become trapped in those crystalline prisons at a steady rate through time. If you crush the mineral you want to date and heat a few grains to a high enough temperature--about 900 degrees, which is more body heat than Robert Boyle’s constitution could ever have produced--all the electron traps will release their captive electrons at once, creating a brilliant puff of light. In a laboratory the intensity of that burst of luminescence can easily be measured with a device called a photomultiplier. The higher the spike of light, the more trapped electrons have accumulated in the sample, and thus the more time has elapsed since it was last exposed to heat. Once a mineral is heated and all the electrons have returned home, the clock is set back to zero.
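
In outline the TL arithmetic is simple; the trouble, as becomes clear below, lies in measuring the inputs. Here is a bare-bones sketch with invented dose values (not measurements from any site in this article): the total radiation dose implied by the luminescence burst, divided by the dose the flint soaks up each year, gives the time since the clock was zeroed.

```python
def tl_age(equivalent_dose_grays, annual_dose_grays):
    """Years since the mineral was last heated ("zeroed").

    equivalent_dose_grays: total radiation dose implied by the measured
        burst of luminescence, after calibrating the sample's sensitivity
        with artificial radiation in the lab.
    annual_dose_grays: dose received per year from radioactivity inside
        the flint plus the surrounding soil and cosmic rays.
    """
    return equivalent_dose_grays / annual_dose_grays

# Invented numbers for illustration only: a flint that has accumulated the
# equivalent of 90 grays at 0.0015 gray per year last sat in a fire
# roughly 60,000 years ago.
print(round(tl_age(90.0, 0.0015)))  # -> 60000
```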

Now, our lineage has been making flint tools for hundreds of thousands of years, and somewhere in that long stretch of prehistory we began to use fire as well. Inevitably, some of our less careful ancestors kicked discarded tools into burning hearths, setting their electron clocks back to zero and opening up a ripe opportunity for TL timekeepers in the present. After the fire went out, those flints lay in the ground, pummeled by radioactivity, and each trapped electron was another tick of the clock. Released by laboratory heat, the electrons flash out photons that reveal time gone by.

In the late 1980s Hélène Valladas, an archeologist at the Center for Low-Level Radioactivity of the French Atomic Energy Commission near Paris, along with her father, physicist Georges Valladas, stunned the anthropological community with some TL dates on burned flints taken from two archeological sites in Israel. The first was a cave called Kebara, which had already yielded an astonishingly complete Neanderthal skeleton. Valladas dated flints from the Neanderthal’s level at 60,000 years before the present.

In itself this was no surprise, since the date falls well within the known range of the Neanderthals’ time on Earth. The shock came a year later, when she used the same technique to pin a date on flints from a nearby cave called Qafzeh, which contained the buried remains of early modern human beings. This time, the spikes of luminescence translated into an age of around 92,000 years. In other words, the more advanced human types were a full 30,000 years older than the Neanderthals they were supposed to have descended from.

If Valladas’s TL dates are accurate, they completely confound the notion that modern humans evolved from Neanderthals in any neat and tidy way. Instead, these two kinds of human, equally endowed culturally but distinctly different in appearance, might have shared the same little nook of the Middle East for tens of thousands of years. To some, this simply does not make sense.

“If these dates are correct, what does this do to what else we know, to the stratigraphy, to fossil man, to the archeology?” worries Anthony Marks, an archeologist at Southern Methodist University. “It’s all a mess. Not that the dates are necessarily wrong. But you want to know more about them.”

Marks’s skepticism is not entirely unfounded. While simple in theory, in practice TL has to overcome some devilish complications. (“If these new techniques were easy, we would have thought of them a long time ago,” says geochronologist Gifford Miller of the University of Colorado.) To convert into calendar years the burst of luminescence when a flint is heated, one has to know both the sensitivity of that particular flint to radiation and the dose of radioactive rays it has received each year since it was zeroed by fire. The sensitivity of the sample can be determined by assaulting it with artificial radiation in the lab. And the annual dose of radiation received from within the sample itself can be calculated fairly easily by measuring how much uranium or other radioactive elements the sample contains. But determining the annual dose from the environment around the sample--the radioactivity in the surrounding soil, and cosmic rays from the atmosphere itself--is an iffier proposition. At some sites fluctuations in this environmental dose through the millennia can turn the absolute date derived from TL into an absolute nightmare.
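
That split between internal and environmental dose is what separates a solid TL date from a shaky one. The toy comparison below (my own numbers, not Qafzeh measurements) shows why: a given uncertainty in the environmental dose rate moves the age only slightly when most of the annual dose comes from inside the flint itself, but moves it a great deal when the surroundings dominate.

```python
def tl_age(equivalent_dose, internal_rate, environmental_rate):
    """TL age in years: accumulated dose over the total annual dose rate."""
    return equivalent_dose / (internal_rate + environmental_rate)

def age_spread(equivalent_dose, internal, environmental, env_uncertainty=0.30):
    """Age range implied by a +/-30% uncertainty in the environmental dose rate."""
    youngest = tl_age(equivalent_dose, internal, environmental * (1 + env_uncertainty))
    oldest = tl_age(equivalent_dose, internal, environmental * (1 - env_uncertainty))
    return round(youngest), round(oldest)

# The same 90-gray accumulated dose, with the annual dose split two ways.
print(age_spread(90.0, internal=0.0012, environmental=0.0003))  # mostly internal
print(age_spread(90.0, internal=0.0003, environmental=0.0012))  # mostly environmental
# The first age wobbles by only a few thousand years around 60,000;
# the second swings from under 49,000 to nearly 79,000.
```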

Fortunately for Valladas and her colleagues, most of the radiation dose for the Qafzeh flints came from within the flints themselves. The date there of 92,000 years for the modern human skeletons is thus not only the most sensational number so far produced by TL, it is also one of the surest.

“The strong date at Qafzeh was just good luck,” says Valladas. “It was just by chance that the internal dose was high and the environmental dose was low.”

More recently Valladas and her colleague Norbert Mercier turned their TL techniques to the French site of Saint-Césaire. Last summer they confirmed that a Neanderthal found at Saint-Césaire was only 36,000 years old. This new date, combined with a fresh radiocarbon date of about 40,000 years tagged on some Cro-Magnon sites in northern Spain, strongly suggests that the two types of humans shared the same corner of Europe for several thousand years as the glaciers advanced from the north.

While Valladas has been busy in Europe and the Middle East, other TL timekeepers have produced some astonishing new dates for the first human occupation of Australia. As recently as the 1950s, it was widely believed that Australia had been colonized only some five thousand years ago. The reasoning was typically Eurocentric: since the Australian aborigines were still using stone tools when the first white settlers arrived, they must have just recently developed the capacity to make the difficult sea crossing from Indonesia in the first place. A decade later archeologists grudgingly conceded that the date of first entry might have been closer to the beginning of the Holocene period, 10,000 years ago. In the 1970s radiocarbon dates on human occupation sites pushed the date back again, as far as 32,000 years ago. And now TL studies at two sites in northern Australia drop that first human footstep on the continent--and the sea voyage that preceded it--all the way back to 60,000 years before the present. If these dates stand up, then the once-maligned ancestors of modern aborigines were building ocean-worthy craft some 20,000 years before the first signs of sophisticated culture appeared in Europe.

“Luminescence has revolutionized the whole period I work in,” says Australian National University archeologist Rhys Jones, a member of the team responsible for the new TL dates. “In effect, we have at our disposal a new machine--a new time machine.”

With so much at stake, however, nobody looks to TL--or to any of the other new time machines--as a geochronological panacea. Reputations have been too badly singed in the past by dating methods that claimed more than they could deliver. In the 1970s a flush of excitement over a technique called amino acid racemization led many workers to believe that another continent--North America--had been occupied by humans fully 70,000 years ago. Further testing at the same American sites proved that the magical new method was off by one complete goose egg. The real age of the sites was closer to 7,000 years.

“To work with wrong dates is a luxury we cannot afford,” British archeologist Paul Mellars intoned ominously earlier this year, at the beginning of a London meeting of the Royal Society to showcase the new dating technologies. “A wrong date does not simply inhibit research. It could conceivably throw it into reverse.”

Fear of just such a catastrophe--not to mention the risk that her own reputation could go up in a puff of light--is what keeps Alison Brooks from declaring outright that she has found exquisitely crafted bone harpoons in Zaire that are more than 40,000 years older than such creations are supposed to be. So far the main support for her argument has been her redating of another site, called Ishango, four miles down the Semliki River from the Katanda site. In the 1950s the Belgian geologist Jean de Heinzelin excavated a harpoon-rich aquatic civilization at Ishango that he thought was 8,000 years old. Brooks’s radiocarbon dating of the site in the mid-1980s pushed the age back to 25,000. By tracing the layers of sediment shared between Ishango and Katanda, Brooks and her colleagues are convinced that Katanda is much farther down in the stratigraphy--twice as old as Ishango, or perhaps even more. But even though Brooks and Yellen talk freely about their harpoons at meetings, they have yet to utter such unbelievable numbers in the unforgiving forum of an academic journal.

“It is precisely because no one believes us that we want to make our case airtight before we publish,” says Brooks. “We want dates confirming dates confirming dates.”

Soon after the harpoons were discovered, the team went to work with thermoluminescence. Unfortunately, no burned flints have been found at the site. Nevertheless, while TL works best on materials that have been completely zeroed by such extreme heat as a campfire, even a strong dose of sunlight can spring some of the electron traps. Thus even ordinary sediments surrounding an archeological find might harbor a readable clock: bleached out by sunlight when they were on the surface, their TL timers started ticking as soon as they were buried by natural processes. Brooks and Yellen have taken soil samples from Katanda for TL, and so far the results are tantalizing--but that’s all.

“At this point we think the site is quite old,” says geophysicist Allen Franklin of the University of Maryland, who with his Maryland colleague Bill Hornyak is conducting the work. “But we don’t want to put a number on it.”

As Franklin explains, the problem with dating sediments with TL is that while some of the electron traps might be quickly bleached out by sunlight, others hold on to their electrons more stubbornly. When the sample is then heated in a conventional TL apparatus, these stubborn traps release electrons that were captured perhaps millions of years before the sediments were last exposed to sunlight--teasing date-hungry archeologists with a deceptively old age for the sample.

Brooks does have other irons in the dating fire. The most promising is called electron spin resonance--or ESR, among friends. Like TL, electron spin resonance fashions a clock out of the steadily accumulating electrons caught in traps. But whereas TL measures that accumulation by the strength of the light given off when the traps open, ESR literally counts the captive electrons themselves while they still rest undisturbed in their prisons.

All electrons spin in one of two opposite directions-- physicists call them up and down. (Metaphors are a must here because the nature of this spinning is quantum mechanical and can be accurately described only in huge mathematical equations.) The spin of each electron creates a tiny magnetic force pointing in one direction, something like a compass needle. Under normal circumstances, the electrons are paired so that their opposing spins and magnetic forces cancel each other out. But trapped electrons are unpaired. By manipulating an external magnetic field placed around the sample to be dated, the captive electrons can be induced to resonate--that is, to flip around and spin the other way. When they flip, each electron absorbs a finite amount of energy from a microwave field that is also applied to the sample. This loss of microwave energy can be measured with a detector, and it is a direct count of the number of electrons caught in the traps.
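
The energy of that flip follows from a standard piece of physics the article leaves out: an unpaired electron in a magnetic field resonates when the microwave photon energy matches the gap between its two spin states. A minimal sketch, with a typical laboratory microwave frequency assumed purely for illustration:

```python
# Standard electron spin resonance condition: h * f = g * mu_B * B
PLANCK_CONSTANT = 6.626e-34   # joule-seconds
BOHR_MAGNETON = 9.274e-24     # joules per tesla
G_FACTOR = 2.0                # roughly 2 for a free (unpaired) electron

def resonant_field_tesla(microwave_frequency_hz):
    """Magnetic field at which trapped electrons flip their spins and
    absorb microwaves of the given frequency."""
    return PLANCK_CONSTANT * microwave_frequency_hz / (G_FACTOR * BOHR_MAGNETON)

# At an assumed 9.5 gigahertz, resonance occurs near a third of a tesla.
# Sweeping the field through this value while watching how much microwave
# energy disappears is, in effect, counting the trapped electrons.
print(f"{resonant_field_tesla(9.5e9):.2f} tesla")  # -> 0.34 tesla
```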

ESR works particularly well on tooth enamel, with an effective range from a thousand to 2 million years. Luckily for Brooks and Yellen, some nice fat hippo teeth have been recovered from Katanda in the layer that also held the harpoons. To date the teeth, they have called in Henry Schwarcz of McMaster University in Ontario, a ubiquitous, veteran geochronologist. In the last ten years Schwarcz has journeyed to some 50 sites throughout Europe, Africa, and western Asia, wherever his precious and arcane services are demanded.

Schwarcz also turned up at the Royal Society meeting, where he explained both the power and the problems of the ESR method. On the plus side, teeth are hardy remains, found at nearly every archeological site in the world, and ESR can test a tiny sample again and again--with the luminescence techniques, it’s a one-shot deal. ESR can also home in on certain kinds of electron traps, offering some refinement over TL, which lumps them all together.

On the minus side, ESR is subject to the same uncertainties as TL concerning the annual soaking of radiation a sample has received from the environment. What’s more, even the radiation from within a tooth cannot be relied on to be constant through time. Tooth enamel has the annoying habit of sucking up uranium from its surroundings while it sits in the ground. The more uranium the tooth contains, the more electrons are being bombarded out of their normal positions, and the faster the electron traps will fill up. Remember: you cannot know how old something is by counting filled traps unless you know the rate at which the traps were filled, year by year. If the tooth had a small amount of internal uranium for 50,000 years but took in a big gulp of the hot stuff 10,000 years ago, calculations based on the tooth’s current high uranium level would indicate the electron traps were filled at a much faster rate than they really were. “The big question is, When did the uranium get there?” Schwarcz says. “Did the tooth slurp it all up in three days, or did the uranium accumulate gradually through time?”
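
To see why the answer matters, the toy sketch below (invented numbers, not Schwarcz’s data) compares the two extreme assumptions: all of the uranium arriving shortly after burial, versus uranium trickling in at a steady rate. The measured quantities are identical in both cases; only the assumed history of the internal dose rate changes, and the computed age changes with it.

```python
def esr_age_early_uptake(paleodose, external_rate, internal_rate_now):
    """Age assuming the tooth soaked up all its uranium right after burial,
    so the internal dose rate has sat at today's value the whole time."""
    return paleodose / (external_rate + internal_rate_now)

def esr_age_linear_uptake(paleodose, external_rate, internal_rate_now):
    """Age assuming uranium trickled in at a constant rate, so the internal
    dose rate averaged only half of today's value."""
    return paleodose / (external_rate + internal_rate_now / 2)

# Invented numbers: the same counted paleodose and the same present-day
# uranium content, under two different assumed uptake histories.
paleodose = 120.0       # grays, inferred from the trapped-electron count
external = 0.0010       # grays per year from surrounding soil and cosmic rays
internal_now = 0.0010   # grays per year from uranium in the enamel today

print(round(esr_age_early_uptake(paleodose, external, internal_now)))   # -> 60000
print(round(esr_age_linear_uptake(paleodose, external, internal_now)))  # -> 80000
```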

One factor muddying the big question is the amount of moisture present around the sample during its centuries of burial: a wetter tooth will absorb uranium faster. For this reason, the best ESR sites are those where conditions are driest. Middle Eastern and African deserts are good bets. As far as modern human origins go, the technique has already tagged a date of about 100,000 years on some human fossils from an Israeli cave called Skhul, neatly supporting the TL date of 92,000 from Qafzeh, a few miles away. If a new ESR date from a Neanderthal cave just around the corner from Skhul is right, then Neanderthals were also in the Middle East at about the same time. Meanwhile, in South Africa, a human jawbone from the site of Border Cave--“so modern it boggles the mind,” as one researcher puts it--has now been dated with ESR at 60,000 years, nearly twice as old as any fossil like it in Europe.

But what of the cultural change to modern human behavior--such as the sophisticated technological development expressed by the Katanda harpoons? Schwarcz’s dating job at Katanda is not yet finished, and given how much is at stake, he too is understandably reluctant to discuss it. “The site has good potential for ESR,” he says guardedly. “Let’s put it this way: if the initial results had indicated that the harpoons were not very old after all, we would have said ‘So what?’ to them and backed off. Well, we haven’t backed off.”

There are other dating techniques being developed that may, in the future, add more certainty to claims of African modernity. One of them, called uranium-series dating, measures the steady decay of uranium into various daughter elements inside anything formed from carbonates (limestone and cave stalactites, for instance). The principle is very similar to radiocarbon dating--the amount of daughter elements in a stalactite, for example, indicates how long that stalactite has been around--with the advantage that uranium-series dates can stretch back half a million years. Even amino acid racemization, scorned for the last 15 years, is making a comeback, thanks to the discovery that the technique, unreliable when applied to porous bone, is quite accurate when used on hard ostrich eggshells.
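
As a rough illustration of the uranium-series principle (a deliberately simplified textbook case, not a procedure described in the article): freshly formed cave carbonate takes in uranium but essentially no thorium, so the daughter isotope thorium 230 starts at zero and grows toward equilibrium with its parent at a known rate. Assuming no thorium at formation and uranium isotopes already in equilibrium, the thorium-to-uranium ratio alone gives the age.

```python
import math

TH230_HALF_LIFE = 75_400.0  # years, approximately
LAMBDA_TH230 = math.log(2) / TH230_HALF_LIFE

def uranium_series_age(th230_u234_activity_ratio):
    """Age of a carbonate (stalactite, flowstone) from the activity ratio
    of thorium 230 to uranium 234, under the simplifying assumptions
    stated in the text above."""
    # Daughter growth toward equilibrium: ratio(t) = 1 - exp(-lambda * t).
    return -math.log(1.0 - th230_u234_activity_ratio) / LAMBDA_TH230

# A flowstone whose thorium 230 has grown to 60 percent of its equilibrium
# value formed roughly 100,000 years ago. As the ratio creeps toward 1 the
# clock tops out, which is why the method reaches back about half a
# million years and no further.
print(round(uranium_series_age(0.60)))  # -> roughly 100,000
```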

In the best of all possible worlds, an archeological site will offer an opportunity for two or more of these dating techniques to be called in so they can be tested against each other. When asked to describe the ideal site, Schwarcz gets a dreamy look on his face. “I see a beautiful human skull sandwiched between two layers of very pure flowstone,” he says, imagining uranium-series dating turning those cave limestones into time brackets. “A couple of big, chunky hippo teeth are lying next to it, and a little ways off, a bunch of burned flints.”

Even without Schwarcz’s dream site, the dating methods used separately are pointing to a common theme: the alarming antiquity of modern human events where they are not supposed to be happening in the first place. Brooks sees suggestive traces of complexity not just at Katanda but scattered all over the African continent, as early as 100,000 years before the present. A classic stone tool type called the blade, long considered a trademark of the European Upper Paleolithic, appears in abundance in some South African sites 40,000 to 50,000 years before the Upper Paleolithic begins. The continent may even harbor the earliest hints of art and a symbolic side to human society: tools designed with stylistic meaning; colorful, incandescent minerals, valueless but for their beauty, found hundreds of miles away from their source. More and more, the Cro-Magnons of Europe are beginning to look like the last modern humans to show themselves and start acting human rather than the first.

That’s not an easy notion for anthropologists and archeologists to swallow. “It just doesn’t fit the pattern that those harpoons of Alison’s should be so old,” says Richard Klein, a paleoanthropologist at the University of Chicago. Then he shrugs. “Of course, if she’s right, she has made a remarkable discovery indeed.”

Only time will tell.
