Not Set in Stone

Environmental changes during the Holocene Epoch show the beginnings of dramatic spikes, marking the would-be Anthropocene. Graph courtesy of authors Jan Zalasiewicz et al.

Welcome to the Anthropocene – the “human epoch.” Geological time may seem set in stone, and certainly it has been (as far as we’ve defined it) for the past 12,000 years of the Holocene.

But the Holocene’s relatively stable interglacial climate, so hospitable that it allowed the rise of human civilization, seems to be coming to a close. Geologic epochs are typically defined by distinctive changes in sedimentary layers. So what makes the present geologically different? Long after we’re all gone, a million years into the future, some intelligent life could see clear signs of human activity in the same layers of soil across the globe.

To show up, the disturbance has to be on a massive scale, and in fact it is. Climate change, mass extinctions, soil erosion, cleared land, pollution, radioactive isotopes from nuclear tests, sea level rise: all these human-induced changes are producing clear patterns of change that are being documented in the soil.

“Anthropocene” has been used informally since Nobel Prize–winning chemist Paul Crutzen coined the term, quite unwittingly, in 2002 at a conference where, according to his quote in the Encyclopedia of Earth:

“… someone said something about the Holocene. I suddenly thought this was wrong. The world has changed too much. So I said: ‘No, we are in the Anthropocene.’ I just made up the word on the spur of the moment. Everyone was shocked. But it seems to have stuck.”

In February 2008, an article in GSA Today (a monthly publication of the Geological Society of America) made the case for making the Anthropocene Epoch official among geologists. It nicely tracks the way human activity has made its mark in the geologic record, from mid-Holocene biotic evidence of weed pollen and the remains of cultivated plants in human settlement areas, to a layer of lead pollution that has settled in the polar ice caps and peat bog deposits from Greco-Roman times onward.

The authors write:

Human activity then may help characterize Holocene strata, but it did not create new, global environmental conditions that could translate into a fundamentally different stratigraphic signal.

Change in sea surface pH caused by anthropogenic CO2 between the 1700s and the 1990s.
Credit: Wikimedia Commons

That only began to happen during the Industrial Revolution, which brought dramatic erosion from expanded agriculture and construction, the damming of most major rivers (which changed sedimentary patterns), waves of extinction, the replacement of natural vegetation with agricultural monocultures, ocean acidification, and, of course, a spike in carbon dioxide levels.

A temperature rise of 2 to 11.5 degrees Fahrenheit, as predicted by climate change models, hasn’t been seen since the Tertiary period 66 million years ago, when mammals replaced reptiles as the predominant vertebrates, the article states.

The first step towards making the Anthropocene official is to select a date when it begins — not exactly an easy task, considering that human impact hasn’t been uniform across the globe throughout history. Does it make sense to start at the launch of the Industrial Revolution in the West?

One clear sign that we’re already deep into the Anthropocene is that we not only talk about human destruction of the Earth but also contemplate ways to fix it through geo-engineering. Wired this month ran a review of the book Hack the Planet by Eli Kintisch, about embracing our “God role” since things have changed so much that “stewardship” is no longer an option.

This story has been translated into Portuguese.