Temperature Homogenisation Errors

Author: Dr. Michael Chase

[Figure 4.1: Adelaide screens]

“When breakpoints are removed, the entire record prior to the breakpoint is adjusted up or down depending on the size and direction of the breakpoint.”

Extract above from: http://berkeleyearth.org/understanding-adjustments-temperature-data/

Temperature measurements have two classes of non-climatic influence:

  • Transient influences, which are absent at both the start and the end of the record
  • Persistent shifts, which change the start of the record relative to the end

An example of a persistent shift is the switch from the use of a Glaisher stand or thermometer shed in the early years, to the use of a Stevenson screen at some time within the record. Another example of a persistent shift is a station move within the record.

Examples of transient influences are urban growth near a weather station before it is relocated, and deterioration and replacement of the thermometer screen.
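To make the distinction concrete, here is a minimal Python sketch of a synthetic station record containing both classes of influence. The dates and magnitudes are invented purely for illustration; nothing here represents any real station.

    import numpy as np

    years = np.arange(1900, 2021)
    rng = np.random.default_rng(0)

    # Background climate: a gentle trend plus weather noise (magnitudes invented).
    background = 0.008 * (years - 1900) + rng.normal(0.0, 0.3, years.size)

    # Transient influence: urban warming that ramps up from 1950 until a 1980
    # relocation to a rural or airport site, after which the influence is gone.
    urban_warming = np.where((years >= 1950) & (years < 1980),
                             0.02 * (years - 1950), 0.0)

    # Persistent shift: a screen change in 1980 that offsets all later readings.
    screen_shift = np.where(years >= 1980, -0.3, 0.0)

    raw_record = background + urban_warming + screen_shift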

This article asserts that the major temperature homogenisations fail to distinguish between transient and persistent influences, treating all perturbations as persistent. The consequence of that poor assumption is that transient influences, which are predominantly warming, lead to an over-cooling of the early periods of many records, an artifact that grows more severe as the detection of step changes in temperature becomes more sensitive.

Suppose a weather station recorded accurate temperatures around 1900, but those temperatures are now being changed by homogenisation algorithms as a result of everything that happened at, and to, the weather station between 1900 and the present, including its relocation to (say) an airport. Typically, over that period, the recorded temperatures will have been influenced by several non-climatic effects, such as varying amounts of urban warming, various screen and thermometer deteriorations and replacements, and observer errors.

Suppose a village or town was built around the weather station, maybe consisting of just a few nearby buildings, and the weather station was relocated to an airport or rural site. Suppose the screen deteriorated, allowing sunlight to shine on the thermometers, until the screen was replaced. Would a computer be able to use the temperature record, compared to those of neighbours, to properly correct the temperatures all the way back to 1900 so that they reflected only the background climate?

To adjust temperature data back to 1900, so that it represents what would have been measured in the past at the current location of the weather station, with its current equipment, you must have a COMPLETE history of the non-climatic influences on the data being adjusted. Such a complete history is usually lacking: the homogenisation algorithms quantify only step changes in influences, and only when the steps are large enough to detect, or when there is information to suggest that a change is expected.
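As an illustration of what such algorithms actually quantify, here is a deliberately simple, toy step detector in Python. It is not the method used by any particular homogenisation package (those use pairwise comparison with neighbouring stations and far more careful statistics); it only shows that the output of such a detector is the location and size of a step, with no information about how the influence varied before the step. The window and threshold values are arbitrary choices for the sketch.

    import numpy as np

    def detect_step(series, window=10, threshold=0.25):
        """Toy detector: find the index where the mean of the next `window`
        values differs most from the mean of the previous `window` values,
        and report it as a step if the difference exceeds `threshold` (C)."""
        diffs = np.array([
            series[i:i + window].mean() - series[i - window:i].mean()
            for i in range(window, series.size - window)
        ])
        j = int(np.argmax(np.abs(diffs)))
        if abs(diffs[j]) < threshold:
            return None
        # Index of the first value after the step, and the estimated step size.
        # Note that nothing here describes how the influence behaved before the
        # step; only the size of the step itself is quantified.
        return j + window, float(diffs[j])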

The algorithms for step detection and quantification are already quite sophisticated and are being further developed, but there is a barely mentioned elephant in the room: the drastically simple (and often bad) assumption that the history of non-climatic influence consists solely of its step changes. That assumption is only valid if the non-climatic influences do not change with time between the steps.

A significant part of the total non-climatic influence on recorded temperature can be regarded as “thermal degradation”, typically urban growth near a weather station, and deterioration of the (usually wooden) thermometer screen. Such degradation is both warming and time-varying, often growing slowly then ending suddenly with a step change down in temperature when the weather station is moved to a better location, or an old or broken screen is replaced with a shiny new one. It is of course correct to adjust temperatures down in the years leading up to such sudden cooling events, but the assumption of constant influence will often over-correct earlier years, in particular the years before the influence in question began.
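The over-correction can be demonstrated numerically. The sketch below, with invented magnitudes and a flat background climate for clarity, applies the adjustment described in the Berkeley Earth quote at the top of this article (shift the entire record prior to the breakpoint by the size of the step) to a record whose only non-climatic influence is a degradation ramp that ends when the screen is replaced.

    import numpy as np

    years = np.arange(1900, 2001)
    true_climate = np.zeros(years.size)      # flat background, for clarity

    # Spurious warm bias from screen degradation, growing from 1960 and
    # removed by a screen replacement in 1980 (magnitudes invented).
    bias = np.where((years >= 1960) & (years < 1980),
                    0.8 * (years - 1960) / 20.0, 0.0)
    raw = true_climate + bias

    # A detector sees a cooling step at 1980.  Treating it as persistent,
    # the entire record prior to the breakpoint is shifted down by its size.
    step = raw[years == 1979][0] - raw[years == 1980][0]   # about +0.76 C
    adjusted = raw.copy()
    adjusted[years < 1980] -= step

    # Years before the degradation even began are now far too cold.
    print(adjusted[years < 1960].mean())     # about -0.76; it should be 0.0

The adjustment is roughly right for the late 1970s, but it is wrong for 1900 to 1959, which is precisely the over-cooling of the early record described above.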

Currently, in temperature reconstructions produced by homogenisation algorithms, early temperature measurements are being reduced incorrectly as a result of the detection of time-varying warming influences. As the detection algorithms become more sensitive, the invalid cooling of the past increases.

As a systems engineer I find it useful to regard temperature homogenisation as a system: the inputs are raw data, and the outputs are meant to be better representations of the background temperature history. Treating all detected step changes as persistent is a functional design error. The step-change detection algorithms themselves are blameless; the problem lies with an invalid assumption made at the functional design stage.

To avoid over-correcting the past, expert meteorological data analysts should make the final decision about whether, and how, each detected step change is removed from the data. Station history, and the temperature data itself, must be examined to classify each detected step change into one of two groups: persistent changes (such as station moves or equipment changes), or time-varying changes, the latter requiring special analysis to either measure (from the data) or model how the influence varies with time. A sketch of such a time-varying correction is given below.
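As one possible shape for a time-varying correction, the following Python function removes a linear ramp that grows from an assumed start year up to the detected breakpoint, instead of shifting the whole earlier record by a constant. The function name, the linear shape of the ramp, and the idea of reading the step size directly from the series are all illustrative assumptions; in practice the start and shape of the influence would be estimated from neighbouring stations and station history.

    import numpy as np

    def remove_ramp_influence(series, years, start, break_year):
        """Remove a time-varying influence modelled as a linear ramp that grows
        from `start` up to the detected `break_year`, rather than shifting the
        whole earlier record by a constant.  Years before `start` are untouched."""
        corrected = series.astype(float).copy()
        ramp_mask = (years >= start) & (years < break_year)
        # Step size read off at the breakpoint (in practice it would come from
        # comparison with neighbouring stations).
        step = series[years == break_year - 1][0] - series[years == break_year][0]
        ramp = step * (years[ramp_mask] - start) / max(break_year - 1 - start, 1)
        corrected[ramp_mask] -= ramp
        return corrected

In the toy example given earlier, remove_ramp_influence(raw, years, 1960, 1980) recovers the flat background while leaving 1900 to 1959 untouched; real cases would of course be messier.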
