Climate change is one of the biggest concerns of today, and yet much uncertainty remains about the science. The extent of that uncertainty is expressed in the Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) (2007) thus: ‘The complexity of the climate system and the multiple interactions that determine its behaviour impose limitations on our ability to understand fully the future course of Earth’s global climate. There is still an incomplete understanding of many components of the climate system and their role in climate change.’
The IPCC list the main uncertainties as: clouds, the cryosphere, the oceans, land use and couplings between climate and biogeochemical cycles.
While professional scientists discuss these uncertainties among themselves routinely, the view that has become prevalent in the public arena is that climate science is all completely settled with no need for further discussion. The IPCC statement clearly says otherwise. Indeed, the IPCC reports (and many scientific papers) contain numerous caveats and cautions that never reach the public. This article reviews the main uncertainties.
Floods and droughts may affect human life more than a small increase in temperature, but it is energy from the sun that powers all such processes and so uncertainties concerning temperature will be the main concern of this article.
The greenhouse effect – some misunderstandings
The Greenhouse Effect (GHE) is real. If it were not, Earth’s average temperature would be about -18° C instead of +15° C.
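The -18° C figure can be reproduced from the Stefan-Boltzmann law. The sketch below balances absorbed sunlight against emitted IR; the solar constant and albedo values are standard textbook round figures assumed here, not taken from this article.

```python
SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S = 1360.0         # solar constant, W m^-2 (annual average)
ALBEDO = 0.30      # planetary albedo (standard round figure, assumed)

# Sunlight intercepted over a disc is spread over the whole sphere,
# hence the factor of 4, and a fraction ALBEDO is reflected to space.
absorbed = S * (1 - ALBEDO) / 4

# Equilibrium: emitted IR (sigma * T^4) balances absorbed sunlight,
# so T = (absorbed / sigma)^(1/4), about 254-255 K.
T_eff = (absorbed / SIGMA) ** 0.25
print(f"{T_eff - 273.15:.1f} C")   # roughly -18 to -19 C
```

The ~33° C difference between this effective temperature and the observed +15° C surface average is the greenhouse effect.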
The main Greenhouse gases (GHGs) are water vapour (WV) and carbon dioxide (CO2). There are others, like methane and nitrous oxide, but they are less influential. How the GHE operates is not as simple as many imagine. Indeed, real greenhouses operate mostly by trapping warmed air rather than by trapping infrared (IR) radiation; the analogy is misleading.
The GHE cannot be visualised from the viewpoint of an observer on the ground with a pane of glass immediately overhead representing the GHGs. Earth balances the incoming solar radiation by emitting IR radiation to space, not directly from the surface but from the upper troposphere, where the GHGs are sufficiently rarefied to enable the IR photons to escape and where the temperature is low. If there were no fall of temperature with altitude (the lapse rate) and no reduction of air density, there would be no GHE 1.
It can be calculated that if CO2 levels doubled from their pre-industrial value, the temperature at the surface could, theoretically, increase by about 1° C.
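The roughly 1° C figure comes from a standard back-of-envelope calculation, sketched below. The logarithmic forcing approximation and the emission temperature used are widely quoted textbook values assumed here, not figures given in this article.

```python
import math

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
T_EMIT = 255.0     # effective emission temperature, K (assumed)

# Radiative forcing from doubled CO2, using the commonly quoted
# logarithmic approximation dF = 5.35 * ln(C/C0) W m^-2.
dF = 5.35 * math.log(2.0)          # about 3.7 W m^-2

# No-feedback ("Planck") response: linearise sigma*T^4 about T_EMIT,
# giving dT = dF / (4 * sigma * T^3).
dT = dF / (4 * SIGMA * T_EMIT ** 3)   # about 1 C
```

Everything above this figure in model projections comes from feedbacks (WV, clouds, lapse rate), which is where the uncertainties discussed below enter.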
When it is said that the science of climate change is settled, the statement refers mostly to this single fact. But many other factors are involved which are not settled.
Water vapour and cloud feedbacks – uncertainties
While WV is the most powerful GHG, climate models treat it only as a feedback because its amount varies greatly and its ‘residence time’ is short. The argument goes that an increase in temperature, due to anthropogenic increases in CO2, would increase evaporation, and the extra WV, being a GHG, would then ‘amplify’ the warming – perhaps resulting in ‘runaway’ warming. It is this amplification, together with uncertainty about future CO2 emissions, that produces model projections of future temperatures higher than the rise due to CO2 alone.
WV amplification, however, is not confined to future anthropogenic warming. WV feedbacks have occurred back over geological time, amplifying natural changes of temperature. Water vapour amounts also change naturally and very rapidly through variations in sea surface temperature, for example due to the El Niño Southern Oscillation 2 & 3. These changes will increase (or decrease) evaporation from the oceans, affecting the amount of WV in the atmosphere and thus the GHE (although the precipitation balance will be a complicating factor). The reverse also happens: cooling reduces the amount of WV in the atmosphere and is thereby amplified, increasing the cooling (as probably happened after the eruption of Pinatubo). These natural changes of WV are in addition to any anthropogenic WV feedback, making the situation more complex and less easily predictable.
Why the global temperature has not risen over the past decade, when conditions for WV amplification would appear to have been favourable, is obviously a puzzle. Although WV feedback is potentially a real phenomenon, it is clearly also complex and not always inevitable.
There is a further uncertainty involving WV. Under warmer conditions, the additional WV would carry its extra latent heat up into the troposphere and release it as it condensed into cloud droplets, thereby warming the upper atmosphere. This reduces the lapse rate, making the atmosphere at the level where the photons escape warmer and thereby reducing the GHE (the lapse-rate feedback). This effect works in the opposite direction to the WV feedback, and the balance between them is unknown.
Resolving WV uncertainties is important because climate models include a substantial WV positive feedback.
Clouds reflect incoming solar radiation and retain outgoing IR radiation, making them central to Earth’s energy balance. Overall, their present net effect is a cooling one. The question arises as to how clouds may change if it gets warmer (or colder) and how any changes might feed back on the climate. But there is little understanding of how different clouds (Figures 1 and 2) might change in response to changes of global temperature. There are many differing views, but few firm facts. It is also possible that a natural change of clouds might itself change the climate; the influence could work both ways. The IPCC summarise thus:
‘Models exhibit a large range of global cloud feedbacks, with roughly half of the climate models predicting a more negative CRF (Cloud Radiative Forcing) in response to global warming, and half predicting the opposite.’
Both WV and cloud feedbacks are important uncertainties which need to be resolved to improve model projections of future temperatures.
Sun variations
Before the satellite era, the ‘Solar Constant’ 2 & 3 was known only from measurements made on the ground, and small changes could not be detected because of the interfering effects of the atmosphere. Measurements from satellites have since shown that the solar constant changes by only about ±0.05% from its present annual average of 1360 Wm-2, peaking at sunspot maxima. (Owing to Earth’s elliptical orbit, the flux actually received varies by about ±3% over the year, peaking in January.)
This is a measurement across the full solar spectrum. Changes this small probably cannot affect the climate. But it does not end here:
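The annual orbital variation mentioned above follows from simple geometry: flux falls off as the inverse square of the Earth-Sun distance. The sketch below uses the standard value of Earth's orbital eccentricity, which is an assumption, not a figure from this article.

```python
ECCENTRICITY = 0.0167   # Earth's orbital eccentricity (standard value, assumed)
S_MEAN = 1360.0         # annual-average solar constant, W m^-2

# Flux scales as 1/r^2; the Earth-Sun distance r runs from a(1-e) at
# perihelion (early January) to a(1+e) at aphelion (early July).
s_perihelion = S_MEAN / (1 - ECCENTRICITY) ** 2
s_aphelion = S_MEAN / (1 + ECCENTRICITY) ** 2

# Peak-to-peak swing as a fraction of the mean is roughly 4e,
# about 6.7%, i.e. roughly +/-3% either side of the mean.
swing = (s_perihelion - s_aphelion) / S_MEAN
```

This regular orbital swing dwarfs the ±0.05% sunspot-cycle variation, which is why the latter alone is thought too small to affect the climate.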
The SOHO satellite has now shown that, during an 11-year solar cycle, changes in the ultraviolet (UV) bands are much greater, varying by up to ten times the minimum value. UV reacts with oxygen, producing ozone and heating the stratosphere. A change of UV may alter the temperature structure not only of the stratosphere but, research suggests, also of the troposphere beneath, including the jet streams, and thereby atmospheric circulation and temperatures at the surface.
The Sun’s magnetic field and solar wind also vary. These create a shield around the solar system, deflecting incoming Galactic Cosmic Rays (GCR), more being repelled during a sunspot maximum. GCRs are the primary source of ionisation in the troposphere (above 1 km) and changes in their number may affect the number of cloud nuclei available 2, a weaker sun allowing more GCRs to reach Earth, possibly causing more clouds to form. Reading University is studying the relationship between GCRs and the electrical charge on clouds while the CLOUD Experiment at CERN is carrying out tests in a cloud chamber with simulated GCRs.
Again, the IPCC does not hide the uncertainty, saying:
‘More research to investigate the effects of solar behaviour on climate is needed before the magnitude of solar effects on climate can be stated with certainty’.
Instrument records
Measurements have been made by instruments on the surface since around 1850. For the last 50 years, measurements have also been made high into the atmosphere and deep into the oceans, but there is only space here to consider the surface measurements. For the reasons given, we will look just at temperature, starting with Land Air Temperature (LAT).
All meteorological measurements until about the 1970s originated from manual equipment such as liquid-in-glass thermometers housed in Stevenson screens.
Even today, many instruments operated by the world’s National Meteorological Services are still of this type. Errors originate from how the instruments are operated, exposed, maintained and calibrated. The microclimate they operate in will change over time, as may the surrounding country. All of these factors have the potential to give a misleading indication of climate change 2, 3, 4 & 5.
Two methods are used to check the records for ‘homogeneity’ (or being free from non-climatic influences). The first, ‘absolute homogeneity’, uses the station’s metadata (records kept about changes at a site) as a guide for likely errors. The second, ‘relative homogeneity’, looks for inconsistencies between neighbouring stations. In fact the latter is usually the only possibility since most stations lack any metadata. While the relative method can detect large sudden errors, it cannot tell if slowly-changing differences are errors or a real change in climate at one station. Questionable records are rejected even though they might be correct.
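The relative-homogeneity idea can be sketched very simply: build the difference series between a candidate station and the mean of its neighbours, then flag a sudden, sustained shift. The data, split point and 0.5° threshold below are all invented for illustration; operational homogeneity tests are considerably more elaborate.

```python
def difference_series(candidate, neighbours):
    """Candidate minus the mean of its neighbours, year by year."""
    neighbour_mean = [sum(vals) / len(vals) for vals in zip(*neighbours)]
    return [c - n for c, n in zip(candidate, neighbour_mean)]

def step_change(diff, split, threshold=0.5):
    """Crude breakpoint test: does the mean difference before and
    after `split` shift by more than `threshold` degrees?"""
    before = sum(diff[:split]) / split
    after = sum(diff[split:]) / (len(diff) - split)
    return abs(after - before) > threshold

# Candidate warms abruptly in year 5 (e.g. a site move or new screen)
# while its neighbours do not (all values invented):
cand = [10.0, 10.1, 9.9, 10.0, 10.1, 11.0, 11.1, 10.9, 11.0, 11.1]
nbrs = [[10.0, 10.1, 10.0, 9.9, 10.0, 10.1, 10.0, 10.1, 9.9, 10.0],
        [9.8, 10.0, 9.9, 10.0, 10.1, 10.0, 9.9, 10.0, 10.1, 10.0]]
diff = difference_series(cand, nbrs)
flagged = step_change(diff, split=5)   # the ~1 degree jump is detected
```

A test like this catches large sudden errors, but, as noted above, it cannot distinguish a slow spurious drift from a real local change of climate.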
But 71% of the globe is ocean-covered. While Marine Air Temperature (MAT) is measured aboard ships, it is not used widely because it is unreliable. Instead, Sea Surface Temperature (SST) is measured, originally using buckets to collect the water. More recently, the temperature of the engine-room intake water and of the hull have been measured, and moored and drifting buoys have also been introduced. The lack of a need for thermometer screens, the homogeneity of the oceans and the minimal human presence avoid most of the problems encountered in measuring air temperature over the land, making SST a potentially more accurate record.
It would be disingenuous to argue, as some do, that the records of temperature are so unreliable (or even that they have been intentionally falsified) as to make them worthless. Although there is inevitably some uncertainty over their accuracy, they contain a wealth of valuable information.
Calculating global averages
Because in climate studies we are interested in temperature changes, ‘anomalies’ are used rather than simple averages. Temperature measurements from land instruments are first converted into daily, monthly and yearly averages. A mean of each of these averages is then calculated over a 30-year reference period, currently 1961-1990. The anomaly is derived by subtracting the relevant 30-year mean from each daily, monthly and annual average. But we are more interested in anomalies over an area, so these point anomalies are next converted into anomalies over a ‘grid’, typically of 5° x 5° latitude by longitude. To calculate the average anomaly for each grid cell, the anomalies from all the stations within it are simply averaged.
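The two steps just described – point anomalies, then an unweighted average within each 5° cell – can be sketched as follows. The three stations, their readings and their 1961-1990 ‘normals’ are invented purely for illustration.

```python
from collections import defaultdict

# One month's mean temperature per station, with each station's
# 1961-1990 mean ("normal") for that month. All values are invented.
stations = [
    {"lat": 51.5, "lon": -0.1, "temp": 11.2, "normal": 10.4},
    {"lat": 52.2, "lon": -1.3, "temp": 10.9, "normal": 10.5},
    {"lat": 48.9, "lon": 2.4,  "temp": 12.8, "normal": 12.1},
]

def grid_cell(lat, lon, size=5):
    """Index of the size x size degree cell containing (lat, lon)."""
    return (int(lat // size), int(lon // size))

# Step 1: point anomaly = observation minus its 30-year normal.
# Step 2: simple (unweighted) mean of all anomalies in each cell.
cells = defaultdict(list)
for s in stations:
    cells[grid_cell(s["lat"], s["lon"])].append(s["temp"] - s["normal"])
grid_anomalies = {cell: sum(v) / len(v) for cell, v in cells.items()}
```

Note that step 2 gives every station in a cell equal weight, which is exactly the practice questioned in the next paragraph.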
But this procedure may contain flaws. Is it legitimate, for example, to average anomalies from different climates without some adjustment? Should anomalies not be ‘weighted’, for example to allow for differences in the amount of available free water, or of WV in the atmosphere, which will affect the division of incoming energy into sensible and latent heat fluxes? This is not done, but probably should be.
Over the oceans, the conversion of spot readings of SST into average anomalies is calculated in the same way as with LAT. Because the oceans are homogeneous, however, it seems more legitimate to average anomalies within a grid without weighting them, but whether this holds true up to the hemispheric scale is doubtful since the processes in tropical and polar oceans are very different.
Other uncertainties arise; for example, should LAT be merged with SST, given their extremely different natures, processes and speed of change? And again, does one average figure for the whole globe – land and sea and both hemispheres combined – have any useful meaning? Such gross figures hide all the differences between hemispheres, latitudes and seasons that could help explain why temperature changes occurred. The main use that one figure probably has is for the media, public and politicians who want simple answers; but there aren’t any.
Past climates
Hubert Lamb 6, founder and first director of the Climatic Research Unit at the University of East Anglia, wrote:
‘There is some danger at the present stage of developing, at considerable expense, the theory (and elegant mathematically complete theoretical models) of climate without adequate knowledge of the observable behaviour of climate, and of the processes involved in climatic changes, over the last 300 to 3000 years.’
Thirty years later, the IPCC repeat his statement thus: ‘If the behaviour of recent temperature change is to be understood, and the mechanisms and causes correctly attributed, parallel efforts are needed to reconstruct the longer and more widespread pre-instrumental history of climate variability, as well as the detailed changes in various factors that might influence climate.’
While modelling has progressed enormously since Lamb gave this warning, research on past climates has received much less attention. We can be fairly confident about temperature changes over millions of years past because these are thought to be caused by quantifiable changes in Earth’s orbit and tilt – the Milankovitch cycles – plus the various feedbacks (WV, CO2, etc.) associated with them.
But we are less certain about the climate of the last few millennia, which would provide an invaluable benchmark against which to compare measured 20th century temperatures – helping to clarify whether they might be natural. Temperature changes over this period are detected by using ‘proxy’ measurements – tree rings, ice cores, coral, lake and sea sediments, borehole temperatures, cave stalactites and written historical records. Over the last decade a number of proxy data studies have been carried out, several of which are discussed by the IPCC. The proxy data come largely from Europe and the USA, with few from elsewhere, especially the tropics and the southern hemisphere.
Proxy data from the oceans originate mostly from coral, limiting them to shallow warm tropical seas.
Several different statistical methods are used to convert the proxy data into hemispheric and global temperature datasets, all giving somewhat different results. One of the most recent (and post-AR4) papers is by Mann et al. 7 (reference 13 in that paper is also particularly informative). It shows that the much debated Medieval Warm Period (MWP) and Little Ice Age were indeed real, occurring respectively between 950-1250 and 1400-1700, although their global extent has not yet been completely defined because of the paucity of proxy data generally, and from the south in particular. For the same reasons, the reconstructions are inadequate for saying anything certain about the warmth of the MWP relative to the present anthropogenic period. Uncertainty will remain in all of these areas until more complete proxy datasets have been generated; this is no easy task.
Attributing temperature change causes
The IPCC explains:
‘Difficulties remain in attributing temperature changes on smaller than continental scales and over time scales of less than 50 years. Attribution at these scales, with limited exceptions, has not yet been established.’
Models are designed ‘from the top down’, being based on established physical laws, like Newton’s laws of motion and the conservation of mass, energy and momentum. This works well but, as the IPCC explains above, only on continental scales and over long time intervals. Despite these limitations, the IPCC’s firm, unambiguous conclusion is:
‘(Anthropogenic) greenhouse gas forcing has very likely (90% probability) caused most of the global warming over the last 50 years.’
‘Attribution’ is defined by the IPCC as the process of establishing the most likely cause of a detected (measured) change of climate by demonstrating that it is ‘consistent with the estimated (model) response to the given combination of anthropogenic and natural forcings’. But the IPCC acknowledges that this procedure poses ‘considerable challenges’ because different forcings (e.g. solar) can produce the same response as CO2 changes. Nor is attribution helped by the instrument record being too short to define natural variability, while proxy records are still not an adequate substitute. And, as we have seen, models deal only in 50-year periods and continental scales, and cannot handle WV and cloud feedbacks very well, raising questions about how certain attribution can actually be in the present state of knowledge. Yet, sifting through the vast literature on climate, one feels obliged to take the IPCC’s statement on attribution very seriously – although not absolutely unconditionally, for many caveats and unanswered questions remain, as I hope this article has shown.
An alternative route to attribution is by studying the instrument records. These are built ‘from the bottom up’; that is, from observations made daily at individual sites which can then be combined to give large areal values over decades. This provides the opposite viewpoint to models. We know fairly well what actually happened since 1850, plots of the datasets from the University of East Anglia (CRUTEM3) and the Met Office (HadSST2) being shown in figure 3. Numerous interesting points arise that would bear extensive discussion, but comments are restricted due to space.
The large rise of LAT in the NH (top left graph) from 1975 to 2000 (over twice the rise elsewhere) occurred only in the high northern latitudes, over land, in winter, and is possibly related to changes in the Arctic Oscillation. The other graphs all show smaller changes. The recent warming is thus mainly a high northern hemisphere phenomenon, rather than a global one. Although models suggest there could be ‘Polar Amplification’ of any warming, which might explain this high northern localisation, there has, in contrast, been some slight cooling in the high southern latitudes since 1975. (See Fig 9.6 in the AR4.) Moreover, the IPCC notes that the polar climate involves large natural variability on many time scales, complicating the attribution of Arctic changes. Indeed, the Arctic was as warm in the 1920s – 1940s as it is now.
Concluding remarks
This article has attempted to demonstrate the complexity and ambiguity of climate science and the uncertainties which still need to be resolved. It is, however, a rapidly advancing science, and some of the uncertainties described here may well be reduced in time for the IPCC’s Fifth Assessment Report (AR5), due in September 2013. But uncertainty is central to climate science because of its great complexity, as it is for all complex cutting-edge sciences like cosmology and quantum physics. It does not matter much whether there was just one ‘big bang’ or an infinite number, or whether an electron can really be in two places at once; but we live amongst the climate, and here uncertainty does matter.
References
1 Strangeways, I. (2011) The greenhouse effect: a closer look. Weather, 66(2), 44-48.
2 Strangeways, I. C. (2010a) Measuring Global Temperatures: Their Analysis and Interpretation. Cambridge University Press, Cambridge.
3 Strangeways, I. (2010b) Precipitation: Theory, Measurement and Distribution (paperback edition). Cambridge University Press, Cambridge.
4 Strangeways, I. C. (2003) Measuring the Natural Environment (2nd edn.). Cambridge University Press, Cambridge.
5 Strangeways, I. (2010c) Weather Monitoring and Global Cooling. AWE, Issue 23, September 2010, 26-33.
6 Lamb, H. H. (1977) Climatic History and the Future. Princeton University Press, Princeton, New Jersey.
7 Mann, M. E. (and eight other authors) (2009) Global signature and dynamical origins of the Little Ice Age and the Medieval Climate Anomaly. Science, 326, 1256-1260.
IPCC (2007) Climate Change 2007: The Physical Science Basis. Contribution of Working Group 1 to the Fourth Assessment Report (AR4) of the Intergovernmental Panel on Climate Change. Cambridge University Press, Cambridge, 996 pp.
Author
Dr Ian Strangeways was head of Applied Physics at the Institute of Hydrology (now the Centre for Ecology and Hydrology) for 25 years, since when he has been Director of TerraData Ltd, a consultancy in environmental monitoring. This work has taken him to many remote parts of the world where he has witnessed firsthand how the climate is measured. He is currently researching where gaps lie in our understanding of the climate, as summarised in this article. He is also working on improved methods of measuring air temperature and better snowfall measurements in Antarctica. He is a Fellow of the Royal Meteorological Society and has written several books and many articles and papers, as well as giving talks, on these interrelated topics.
www.osedirectory.com/environmental.php
Published: 10th Dec 2011 in AWE International