The Warmest Year on Whose Record?

In January 1999 the National Oceanic and Atmospheric Administration (NOAA) announced that 1998 was the "warmest year on record." A year earlier NOAA had declared 1997 the "warmest year on record." Then in January 2000 NOAA proclaimed 1999 the "second warmest year on record."

With so many records being broken, something unusual must be happening to Earth's climate. Or must it? When told that a year was the warmest on record, the critical mind might ask, "How long is the record, and how accurate is it?"

Limits to Measuring Temperature in the Recent Past

Meteorologists have estimated the average temperature of our planet back to the year 1860. However, the database on which these estimates are based is incomplete and contains significant uncertainties and inconsistencies. Before 1900 there were no reliable temperature records for more than 50 percent of the globe. Interpretation of the climatic record is further complicated by changes in instrumentation, station location, and observation times, and by the urban heat island effect.

Temperatures in the Southern Hemisphere, which is 80 percent ocean, are particularly hard to capture. Measurements of sea-surface temperature exemplify the problem. Oceans cover 71 percent of Earth's surface, yet accurate estimation of sea-surface temperatures is problematic. Historically, these measurements were made by collecting water in canvas buckets. Since the early 1940s, measurements have instead been taken in the intake pipes that draw in water to cool ships' engines. When the two methods are applied to the same body of water at the same time, their readings differ by 0.5ºF to 1.3ºF. The modern method routinely yields higher temperatures, because water collected in canvas buckets cools by evaporation. Meteorologists attempt to correct for this, but the uncertainty in the correction is as large as the total warming seen in the global record.

Measuring Long-term Temperature: What Does the Record Say?

Prior to 1860, the global record of measured temperatures is so spotty that no meaningful estimates of worldwide climatic conditions can be made. Yet the past is the key to understanding the present. Earth's climate system is complex and poorly understood. Natural changes and trends in temperature exist at all time scales. Without the benefit of a longer perspective, it is impossible to conclude whether the record warmth of the 1990s is truly anomalous or simply part of a natural cycle.

In the past few years a new methodology has been developed that allows us to infer past temperatures from temperature measurements in wells, or boreholes. The method relies on the fact that temperature changes at the Earth's surface propagate into the subsurface and are preserved there. Borehole temperature measurements can therefore be used to reconstruct climatic conditions at the surface for the past several thousand years.

The procedure is simple. A well is drilled and allowed to stand for several months so that the thermal disturbances or temperature changes caused by drilling can dissipate. A thermometer is then lowered into the well and the temperature recorded at different depth intervals. These temperature measurements are combined with information on the thermal properties of the local bedrock in a mathematical analysis that estimates how the ground surface temperature has changed over hundreds or even thousands of years.
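
The physics behind this analysis can be sketched in a few lines of code. The snippet below is only a minimal illustration of the forward problem, assuming a uniform half-space, typical rock properties, and a single step change in surface temperature; the constants, function names, and one-step warming history here are invented for the example, not taken from any published reconstruction, which inverts the full temperature-depth profile with far more care. Even so, the sketch shows why a warming that began a century or two ago leaves a readable signature in the upper few hundred meters of a borehole.

    import math

    # Illustrative sketch only: how a step change in ground-surface temperature
    # shows up at depth in a borehole (classical half-space heat conduction).
    # The diffusivity, geothermal gradient, and warming history are assumed.

    KAPPA = 1.0e-6              # thermal diffusivity of typical bedrock, m^2/s (assumed)
    SECONDS_PER_YEAR = 3.156e7

    def step_perturbation(depth_m, delta_t, years_ago):
        """Temperature perturbation at depth_m caused by a surface warming of
        delta_t degrees that began years_ago years before present:
        delta_t * erfc(z / (2 * sqrt(kappa * t)))."""
        t_seconds = years_ago * SECONDS_PER_YEAR
        return delta_t * math.erfc(depth_m / (2.0 * math.sqrt(KAPPA * t_seconds)))

    def borehole_profile(depths_m, surface_temp_c, gradient_c_per_m, events):
        """Undisturbed profile (surface temperature plus geothermal gradient)
        plus the perturbations from a list of (delta_t, years_ago) surface steps."""
        profile = []
        for z in depths_m:
            temp = surface_temp_c + gradient_c_per_m * z
            temp += sum(step_perturbation(z, dt, yrs) for dt, yrs in events)
            profile.append(temp)
        return profile

    if __name__ == "__main__":
        # A 0.6 C (roughly 1.0 F) warming beginning 140 years ago: strong near
        # the surface, essentially gone below a few hundred meters.
        depths = [10, 50, 100, 200, 400]
        for z, temp in zip(depths, borehole_profile(depths, 10.0, 0.025, [(0.6, 140)])):
            print(f"{z:4d} m : {temp:6.2f} C")

Running this toy model shows the warming signal is largest near the surface and fades to nothing by a few hundred meters of depth, which is why the upper part of a borehole preserves the last few centuries of surface-temperature history while greater depths record earlier climate.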

When the geologic context is understood, the past century's warming is inconsequential. In August 1997, Professor Henry Pollack and his colleagues at the University of Michigan published a study in Geophysical Research Letters. They estimated mean global temperatures for the last 10,000 years from temperature measurements in more than 6,000 boreholes around the world [see the figure].

Pollack and his colleagues found that the modest 1.0ºF temperature rise recorded by meteorological instruments over the last 140 years is also present in the borehole measurements. However, the borehole data showed that present-day climatic conditions are in fact colder than the average that prevailed over the 10,000-year rise of human civilization. The average global air temperature in modern times is 57.2ºF. For 7,000 of the last 10,000 years, the mean planetary temperature was more than 1.0ºF warmer. The warming of the last 140 years is a recovery from a period of unusually cold temperatures in the 19th century.

In 1998 these results were confirmed in the most accurate study of ancient temperatures ever conducted. As part of the Greenland Ice Core Project (GRIP), research scientists from Denmark and the U.S. Geological Survey measured temperatures in two deep boreholes drilled near the summit of the Greenland Ice Sheet. Although these temperatures were not necessarily representative of global conditions, the climatic history inferred from them was largely consistent with the global record obtained by the University of Michigan scientists. The results, published in Science in October 1998, were ignored by major media in the United States. The Greenland and the University of Michigan findings agree in several important respects. Both show that long before man was capable of influencing Earth's climate, natural cooling and warming trends lasting hundreds to thousands of years were present. For instance:

  • The temperature rise seen in meteorological measurements of the last 140 years is a recovery from a cold period in the 19th century.
  • Even after the modest 1.0ºF global warming of the last 140 years, present-day global temperatures remain about 1.0ºF cooler than they were when the Vikings settled Greenland in medieval times.
  • For more than 7,500 of the last 10,000 years, temperatures have been higher than today.
  • For at least 5,000 of the last 10,000 years, the mean planetary temperature was about 1.5ºF warmer than today.

Global Climate Hysteria Redux

Sometimes it is difficult to remember that nature operates on a geologic time scale. Human beings have a tendency to take short-term trends and extrapolate them to ominous doomsday scenarios.

From about 1945 to 1975, average land temperatures in the Northern Hemisphere fell by a very small amount, about 0.4ºF. This led to a wave of speculation concerning global cooling. In 1975 reporter Peter Gwynne wrote in Newsweek, "The central fact is that after three-quarters of a century of extraordinarily mild conditions, the Earth's climate seems to be cooling down." Gwynne went on to warn of "profound climatic change" with "catastrophic famines" and said that meteorologists were "almost unanimous" in their view that a cooling trend would reduce agricultural productivity. The article concluded by warning, "The longer the planners delay, the more difficult will they find it to cope with climatic change once the results become grim reality." The hysteria was taken a step further that same year by Nigel Calder in an article titled "In the Grip of a New Ice Age?" in the National Wildlife Federation's journal, International Wildlife. Calder warned that "the threat of a new ice age must now stand alongside nuclear war as a likely source of wholesale death and misery for mankind."

Conclusion

Although the menace of "global cooling" has abated, the last 25 years have seen no moderation in the media's tendency to focus on the bizarre, the unusual, and the speculative in its coverage of climate science. The geological evidence demonstrating that 20th-century warming is nothing unusual has been ignored, while hysteria over "global warming" has been pushed relentlessly. With important policy decisions depending on an informed public, this is journalistic negligence.

David Deming is an Associate Professor of Geology and Geophysics at the University of Oklahoma.