Advancing the development and use of forecasting technology
Providing predictions for the oceans, the dispersion of pollution, rainfall run-off, climate change and other environmental parameters.
The Met Office is best known for weather forecasting. But, as well as producing forecasts for a huge range of purposes around the globe, it also provides predictions for the oceans, the dispersion of pollution, rainfall run-off, climate change and other environmental parameters.
These forecasts and predictions are made using numerical models which are constantly being developed and improved by a comprehensive research programme. The technology used encompasses everything from weather buoys, satellites and aircraft, to the supercomputers that are an integral part of forecasting and work on predicting climate change. Advancing the development and use of technology is key to the improvement of forecasting techniques.
This article describes some of the science and technology underpinning our work, and the improvements and new developments expected from our current work.
What is numerical weather prediction?
It may be simple enough to guess that it is about to rain if you see clouds gathering on the horizon. But what if you have to produce a forecast for several days, or even weeks in advance, covering a whole country or region, predicting quantities such as wind strength and direction, cloud cover and temperature?
The process begins with making observations of temperature, humidity, winds and other parameters around the world. These are then used as the initial conditions in a numerical forecast model. The behaviour of the atmosphere can be approximated using the equations of fluid dynamics. These include the equations of motion, the thermodynamic and moisture equations, and the continuity equation. These equations form the basis of the numerical weather prediction model. They are solved mathematically, using computer techniques that calculate, step by step, future states of the atmosphere as a set of values on a 3D grid.
Advancing from one state to the next is called the ‘time step’ and the gap between the points on the grid is called the ‘grid length’ or ‘spatial resolution’. Observations from satellites, aircraft and weather balloons, as well as from many other sources, are used to determine the initial conditions from which the forecasts begin.
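To make the ideas of a grid length and a time step concrete, the short Python sketch below steps a single field (temperature) forward on a one-dimensional grid using a simple upwind scheme. It is purely illustrative: the real model solves the full three-dimensional equations with far more sophisticated numerics, and every value here is invented for the example.

```python
# Illustrative only: a toy 1D advection scheme showing the idea of a grid
# length and a time step. Real NWP models solve the full 3D equations of
# motion, thermodynamics, moisture and continuity with far more
# sophisticated numerics.
import numpy as np

GRID_LENGTH = 10_000.0   # metres between grid points ("spatial resolution")
TIME_STEP = 60.0         # seconds between successive model states
WIND_SPEED = 10.0        # m/s, constant advecting wind for this toy example

def step(temperature: np.ndarray) -> np.ndarray:
    """Advance the temperature field by one time step (first-order upwind)."""
    courant = WIND_SPEED * TIME_STEP / GRID_LENGTH
    # Each new value is calculated from the current value and its upwind neighbour.
    return temperature - courant * (temperature - np.roll(temperature, 1))

# Initial conditions: a warm anomaly on a periodic 1D grid of 100 points.
temperature = 280.0 + 5.0 * np.exp(-((np.arange(100) - 50) ** 2) / 20.0)
for _ in range(120):          # two hours of 60-second time steps
    temperature = step(temperature)
```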
The models of the atmosphere and oceans used in the Met Office are part of the Unified Model system, so called because it incorporates global and regional components that use the same code for both short-range weather forecasts and longer-period climate simulations. Various components of the model are run at regular intervals throughout the day, and to tight deadlines, in order to provide up-to-date forecast information.
A global configuration provides forecasts up to five days ahead, and higher-resolution regional and UK configurations provide more detailed forecasts for Europe and the UK up to two days ahead.
Reliable initial data are essential in the production of high quality forecasts and continued efforts to make best use of the available observations are required. Satellites have the advantage of global coverage and have become the primary data source for global forecasts. The use of other sources of data, such as those from an enhanced UK radar network, are also important.
These observations cannot be introduced directly into numerical models and must be passed through an assimilation scheme, which handles them in an optimal way, taking account of their error characteristics, their consistency with other observations and the state of the model. The development over the last few years of new approaches to data assimilation, notably four-dimensional variational data assimilation (4D-Var), allows better use to be made of observational data through improved representation of model errors and of the time evolution of observations.
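The sketch below illustrates the principle behind variational assimilation in its simplest, single-time, linear form: a background state from the model is combined with observations, each weighted according to its assumed errors. Operational 4D-Var also spans a time window and uses the model's own evolution; the tiny matrices here are illustrative placeholders, not real error statistics.

```python
# A minimal sketch of the idea behind variational data assimilation, in its
# simplest single-time linear form. All matrices are tiny illustrative
# placeholders, not real error statistics.
import numpy as np

x_b = np.array([285.0, 290.0, 288.0])        # background (prior model state)
B = np.diag([1.0, 1.0, 1.0])                 # background-error covariance
H = np.array([[1.0, 0.0, 0.0],
              [0.0, 0.0, 1.0]])              # observation operator: which state
                                             # elements the observations "see"
y = np.array([286.5, 287.0])                 # observations
R = np.diag([0.5, 0.5])                      # observation-error covariance

# The analysis minimises a cost function weighing departures from the
# background against departures from the observations by their errors:
#   J(x) = (x - x_b)^T B^-1 (x - x_b) + (y - Hx)^T R^-1 (y - Hx)
# For this linear case the minimum has a closed form:
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)     # gain matrix
x_a = x_b + K @ (y - H @ x_b)                    # analysis (best estimate)
print(x_a)
```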
The Unified Model has been reformulated over the past three years and is now non-hydrostatic, a change that allows vertical motion to be modelled properly. Together with advances in the representation of physical processes, this will allow high-resolution models to be developed in which hills and valleys, features that have a major influence on the weather, are represented in detail.
Observations to improve models and forecasts
Computer models do a very good job of accurately describing the large-scale motion of the atmosphere. But they can’t fully describe many of the small-scale phenomena such as convective clouds which affect the large-scale motion.
We overcome this problem by using a technique called parametrization, which simplifies the physics in order to represent the effects of the small-scale phenomena on the larger scale. This technique is dependent on obtaining detailed measurements of specific meteorological conditions that can be used to develop improved parametrization methods.
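As a flavour of what a parametrization looks like in practice, the sketch below uses the standard bulk formula for the sub-grid transfer of heat from the surface to the air. The transfer coefficient is an assumed, illustrative value; operational schemes derive it from surface type and atmospheric stability, using exactly the kind of detailed measurements described next.

```python
# A minimal example of a parametrization: the sub-grid turbulent transfer of
# heat from the surface cannot be resolved explicitly, so its grid-box-average
# effect is represented by a bulk formula. The transfer coefficient below is
# an illustrative, assumed value.
RHO_AIR = 1.2            # air density, kg/m^3
CP_AIR = 1005.0          # specific heat of air, J/(kg K)
TRANSFER_COEFF = 1.2e-3  # dimensionless bulk transfer coefficient (assumed)

def sensible_heat_flux(wind_speed: float, t_surface: float, t_air: float) -> float:
    """Grid-box average sensible heat flux (W/m^2) from surface to atmosphere."""
    return RHO_AIR * CP_AIR * TRANSFER_COEFF * wind_speed * (t_surface - t_air)

# A land surface 3 K warmer than the air, with a 5 m/s wind:
print(sensible_heat_flux(5.0, 291.0, 288.0))   # ~21.7 W/m^2
```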
There are various ways of taking these measurements. One of the most important has been the use of a range of instruments on specially-equipped aircraft.
A BAe 146-301 aircraft has undergone an extensive conversion for use as an atmospheric observation platform. It is run jointly by the Met Office and the Natural Environment Research Council on behalf of the academic science community.
Specialised forecasts
When a volcano erupts, a nuclear accident occurs, or an airborne disease such as foot-and-mouth breaks out, it is vital to be able to track airborne pollutants.
The Met Office uses a dispersion model, known as NAME (Numerical Atmospheric dispersion Modelling Environment), to track plumes of pollution. It was originally developed after the nuclear accident at Chernobyl in 1986, which highlighted the need to predict the spread and deposition of radioactive material released into the atmosphere.
Over the years, NAME has been applied to a number of atmospheric releases, including radioactivity, oil fires in Kuwait, major industrial fires and chemical spills, and two major volcanic eruptions in Iceland.
Recently, the model has been adapted to run backwards. This is designed to help forecasts of air quality and identify the source or causes of a pollution incident, where an area may be affected by poor air quality but the source of the pollution is unclear.
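NAME is a Lagrangian model, following large numbers of notional ‘particles’ of pollutant as they are carried by the wind and spread by turbulence. The toy random walk below, with invented values, illustrates the principle, including how reversing time and the wind traces air backwards from an affected area towards possible sources.

```python
# A toy Lagrangian random walk, illustrating the idea behind particle
# dispersion models such as NAME: pollutant "particles" are carried by the
# wind and spread by turbulence. Running the same calculation with time (and
# the wind) reversed traces air backwards from a receptor towards possible
# sources. Values are illustrative only.
import numpy as np

rng = np.random.default_rng(0)

def disperse(start_xy, wind_uv, hours, n_particles=1000, direction=+1):
    """Return particle positions after `hours`; direction=-1 runs backwards."""
    positions = np.tile(np.asarray(start_xy, dtype=float), (n_particles, 1))
    for _ in range(hours):
        advection = direction * np.asarray(wind_uv) * 3600.0         # metres per hour
        turbulence = rng.normal(scale=2000.0, size=positions.shape)  # random spread
        positions += advection + turbulence
    return positions

plume = disperse(start_xy=(0.0, 0.0), wind_uv=(5.0, 1.0), hours=12)                   # forward
origins = disperse(start_xy=(0.0, 0.0), wind_uv=(5.0, 1.0), hours=12, direction=-1)   # backward
```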
A European severe weather warning system
There have been many episodes of damaging weather across parts of Europe in recent decades. It has become clear that early warnings of probable damage need to be exchanged across the continent, so that the risk of damage can be minimised.
The network of European Meteorological Services has developed the European Multiservice Meteorological Awareness (EMMA) system to help deal with these risks.
To allow warnings to be received and interpreted easily across the continent, a geographical information system is used. Colour coding shows the expected severity of each warning and the area affected.
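The sketch below shows the kind of mapping involved, using a four-colour awareness scale of the sort used for such warnings (green, yellow, orange, red); the wind-gust thresholds are invented for the example and are not EMMA's actual criteria.

```python
# Illustrative only: mapping a forecast parameter onto awareness colours in
# the style of a multi-level warning scale. The wind-gust thresholds below
# are invented for the example, not EMMA's actual criteria.
AWARENESS_LEVELS = [
    (40.0, "green", "no particular awareness required"),
    (60.0, "yellow", "be aware"),
    (80.0, "orange", "be prepared"),
    (float("inf"), "red", "take action"),
]

def awareness_colour(gust_mph):
    """Return the awareness colour and advice for a forecast wind gust."""
    for threshold, colour, advice in AWARENESS_LEVELS:
        if gust_mph < threshold:
            return colour, advice

print(awareness_colour(72.0))   # ('orange', 'be prepared')
```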
The system will be available to the general public, government authorities and European forecasters, providing a useful tool for the timely preparation of those likely to be affected. Warnings will be easily accessible through the internet, displayed in the user’s language.
Ocean observations are essential for accurate analysis
Ocean observations are essential for accurate analysis and forecasting of ocean conditions, for seasonal prediction, and for understanding and predicting climate variability and change. As part of our work improving ocean observations, the Met Office is a prominent member of Argo, an international programme to deploy 3,000 free-drifting profiling floats that measure the temperature and salinity of the upper 2,000 m of the ocean. The Met Office co-ordinates the UK contribution, and over the last three years more than 100 UK floats have been deployed.
Satellite observations of the ocean are an essential component of the ocean-observing systems, and UK support for the Jason-2 satellite altimeter, which will deliver high-accuracy sea-surface height information, is being channelled through the Met Office. The Global Ocean Data Assimilation Experiment (GODAE) High Resolution Sea Surface Temperature (GHRSST) pilot project aims to deliver a new generation of highly accurate worldwide sea-surface temperature products with a resolution of less than 10 km every six hours. As an important step towards achieving this goal, we are hosting the GHRSST International Project Office, which co-ordinates sea-surface temperature data merging and analysis activities in the UK, Japan, USA, Australia and France.
The greenhouse effect and global warming – is the climate changing?
There is clear scientific evidence that the Earth’s temperature is rising and that most of the warming over the last 50 years has been caused by human activities. These activities have altered the chemical composition of the atmosphere through the build-up of certain greenhouse gases.
The Earth’s climate is driven primarily by heat from the sun. Incoming solar radiation, mostly visible light, warms the Earth’s surface. Infrared radiation is emitted from the surface and atmosphere back to Space. However, some of this radiation is absorbed by the greenhouse gases in our atmosphere (such as water vapour, carbon dioxide and nitrous oxide), thus slowing down the rate of re-emission to Space.
In this way, they trap heat rather like a greenhouse – hence the name ‘greenhouse effect’ – and raise the surface temperature. This is a natural effect, and without it the Earth would be much too cold to support life. However, by changing the balance of gases in the atmosphere, human activities appear to be causing a rise in temperatures which could have serious consequences for future generations.
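The size of this natural greenhouse effect can be estimated with a simple, standard energy-balance calculation: without greenhouse gases, the Earth would settle at the temperature needed to radiate away the sunlight it absorbs, around 255 K, some 33 °C colder than the observed average surface temperature.

```python
# A standard back-of-envelope energy balance: without greenhouse gases the
# Earth would radiate to space at the temperature that balances absorbed
# sunlight, about 255 K, roughly 33 K colder than the observed global mean
# surface temperature of ~288 K.
SOLAR_CONSTANT = 1361.0   # W/m^2 at the top of the atmosphere
ALBEDO = 0.3              # fraction of sunlight reflected back to space
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/(m^2 K^4)

absorbed = SOLAR_CONSTANT * (1 - ALBEDO) / 4          # averaged over the sphere
effective_temperature = (absorbed / SIGMA) ** 0.25    # ~255 K
greenhouse_warming = 288.0 - effective_temperature    # ~33 K
print(round(effective_temperature, 1), round(greenhouse_warming, 1))
```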
Average global surface temperatures have risen about 0.7 °C in the past 100 years. We are confident that this is evidence of global warming because the rise in temperature is much greater than the uncertainties in the record. The evidence indicates that both the land and the oceans have warmed. Scientists at the Met Office are using climate models to understand the causes of global warming.
Temperatures are also monitored in the upper atmosphere as well as near the surface. Between the surface and about 15 km altitude in the tropics, they appear not to be warming as much as would be expected from the evident surface warming there.
A possible reason is that these temperatures are influenced by cooling of the layer immediately above most weather – the lower stratosphere. This cooling is occurring because of reductions in ozone and increases in some greenhouse gases, such as carbon dioxide, at these levels. As climate models do not show this behaviour, we are working to improve our understanding of the physical mechanisms.
As well as simulating observed climate, it is important that we demonstrate that models represent processes accurately. For example, one of the main uncertainties in climate change is the interaction between clouds and climate. Increased high cloud tends to warm the climate, since high cloud absorbs outgoing infrared radiation. Increased low cloud tends to cool the climate, as incoming solar radiation is reflected back to Space from its top.
What caused recent climate change?
Detection of climate change involves showing that a particular change in climate is unusual relative to the natural variations in climate. Attribution involves demonstrating that a detected change is due to a particular cause (for example, human activity).
The recent warming is so large that it is unlikely to be due to natural (unforced) variations in climate. Hence it is likely that we have detected a ‘forced change’ in climate. Potential explanations for the change are natural factors (changes in solar intensity, a reduction in the cooling associated with major volcanic eruptions) or human influences. Model simulations indicate that the natural factors fail to explain the recent warming, whereas human influences – through increases in greenhouse gases – can. Indeed, it is likely that most of the warming over the last 50 years can be attributed to human activity.
Observations of temperature show that on average the globe has warmed substantially over the 20th century but that there have been large regional variations in the amount of warming.
A modelling study investigated the historical impact on the climate system of the combined effect of greenhouse gases, anthropogenic sulphate aerosol, lower-atmosphere and stratospheric ozone, volcanoes and changes in the output of the sun. The study compared this with the climate change that would have been expected if only the natural factors, volcanoes and changing solar output, were important.
How much will the climate change?
Predictions of future climate change are a vital input to the strategic plans made by governments and many private companies. These predictions depend on factors such as politics, population growth, economic growth, technological development – and so on.
The scientific community works with a number of carefully developed scenarios which take into account as many of these factors as possible. These scenarios were developed by the Intergovernmental Panel on Climate Change (IPCC) and are described in detail in the IPCC Special Report on Emissions Scenarios. Currently, the global mean temperature rise over the 21st century is predicted to be 4.5 °C for the highest-emissions scenario and 2 °C for the lowest. The mid-range scenario projects a global mean temperature rise of 3 °C.
Predictions are required at scales of tens of kilometres or less. Global climate models are too expensive to be run at this resolution so we bridge the gap using regional climate models run at a resolution of 25 km or 50 km over areas covering whole continents. Recently, the Met Office’s Hadley Centre has used its climate model to simulate and predict the evolution of the Greenland ice sheet over several thousand years. This experiment is unique in that changes in the ice sheet, such as the height of the ice or whether the ground is covered in reflective ice or dark soil, are fed back into the climate model.
The results show that over the 3,000 years following a quadrupling of atmospheric greenhouse gas concentrations, the ice sheet recedes from most of Greenland. By the end of the simulation, it exists only on the mountainous ground of the east. The fresh water released from this loss would cause a sea-level rise of around seven metres.
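The seven-metre figure can be checked with rough, round-number arithmetic: spreading the fresh water locked up in the Greenland ice sheet over the area of the world ocean.

```python
# Rough check of the seven-metre figure, using approximate round numbers:
# spreading the fresh water from the Greenland ice sheet over the world ocean.
ICE_VOLUME_KM3 = 2.9e6        # approximate volume of the Greenland ice sheet
ICE_TO_WATER = 0.92           # ice is less dense than fresh water
OCEAN_AREA_KM2 = 3.6e8        # approximate area of the world ocean

water_volume_km3 = ICE_VOLUME_KM3 * ICE_TO_WATER
sea_level_rise_m = water_volume_km3 / OCEAN_AREA_KM2 * 1000.0   # km -> m
print(round(sea_level_rise_m, 1))   # ~7.4 m (ignoring ice already below sea level)
```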
Earlier results suggested that if the ice sheet is removed in this way it would not recover, even if greenhouse gas concentrations were significantly lowered. The next task is to understand at what point the meltdown of Greenland becomes irreversible, the so-called trigger point.
A key question in climate research concerns the stability of the thermohaline circulation (THC), a system of large-scale currents in the North Atlantic Ocean, including the Gulf Stream, which carries heat from the tropics to higher latitudes as cold salty water sinks near the pole, drawing warm water north-eastwards.
Recent observations have shown a reduction in the salinity of the sea water deep in the north-west Atlantic, and this has been interpreted by some as an early sign of a weakening thermohaline circulation. A complete shutdown of the THC is considered very unlikely, however; it is classified as a low-probability but high-impact event.
Biological and chemical influences
As the atmospheric concentration of carbon dioxide (CO2) increases, so does the ability of vegetation to take up CO2 from the atmosphere.
However, the increase in CO2 leads to changes in temperature and rainfall, which can affect natural carbon sinks. For example, the coupled climate-carbon cycle model predicts a die-back of vegetation in northern areas of South America, one of the most significant carbon sinks.
Climate change also affects the amount of CO2 emitted by bacteria in the soil. In the ocean, changes in circulation and mixing alter the ocean’s ability to take up CO2 from the atmosphere. In addition, the warmer oceans absorb less CO2. To include all of these feedbacks it is necessary to treat the carbon cycle and vegetation as interactive elements in full global climate modelling experiments.
The climate-carbon model suggests that the land-surface as a whole could switch from being a weak sink of CO2 to become a strong source from about 2050, under a ‘business as usual’ emissions scenario. This result implies that there may be a well-defined ‘dangerous level’ of CO2 – beyond which the land-carbon cycle begins to accelerate climate change – and also that stabilising CO2 below this level may be more difficult than previously thought – because feedbacks lead to less anthropogenic CO2 being absorbed by ecosystems.
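The switch from sink to source can be illustrated with a toy carbon balance: plant uptake grows with CO2 but saturates, while soil respiration rises steadily with warming, so beyond a certain level of warming respiration wins. Every number in the sketch below is invented for illustration and is not a Hadley Centre model parameter.

```python
# A toy land-carbon balance illustrating how a sink can become a source:
# plant uptake grows (and saturates) with CO2, while soil respiration rises
# exponentially with warming. All numbers are invented for illustration.
def land_carbon_flux(co2_ppm: float, warming_deg_c: float) -> float:
    """Net land carbon uptake in arbitrary units; negative means a net source."""
    uptake = 8.0 * co2_ppm / (co2_ppm + 300.0)          # CO2 fertilisation, saturating
    respiration = 1.7 * 2.0 ** (warming_deg_c / 2.0)    # soil respiration rising with warmth
    return uptake - respiration

for co2, warming in [(380, 0.7), (550, 2.0), (750, 3.5), (1000, 5.0)]:
    print(f"{co2} ppm, +{warming} degC -> net uptake {land_carbon_flux(co2, warming):+.2f}")
```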
The chemistry of the atmosphere will change as climate changes. Predicting how these processes will interact is a significant scientific challenge.
Some greenhouse gases, such as methane and ozone, are controlled by chemical reactions in the atmosphere. These reactions depend on emissions of natural and man-made pollution and are strongly affected by climate change. To take account of these complexities, comprehensive models of atmospheric chemistry calculate future concentrations of reactive greenhouse gases, and the results are passed to climate models to calculate their warming effect.
Understanding the risks
One of the ways in which scientists are improving our climate modelling techniques is through the use of ensembles. Ensemble climate modelling uses similar principles to ensemble forecasting to span the uncertainties due to errors in initial conditions and shortcomings in the model formulation.
Global climate model predictions are subject to the considerable uncertainties in the modelling process; ensemble techniques help to quantify the uncertainty present in any model.
As a first attempt, Met Office scientists have produced a 53-member ensemble simulating the long-term response to a doubling of atmospheric carbon dioxide. It was created by varying parameters in the model physics whose values cannot be accurately specified on the basis of theory or observations, and are therefore subject to an uncertainty range.
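The sketch below shows the perturbed-physics idea in miniature: each uncertain parameter is sampled within a plausible range, the model is run once per member, and the spread of results measures the uncertainty. The ‘model’ and parameter names here are invented stand-ins, not Unified Model physics.

```python
# A sketch of the perturbed-physics ensemble idea. The "model" and parameter
# ranges are invented stand-ins, not Unified Model physics.
import numpy as np

rng = np.random.default_rng(42)
N_MEMBERS = 53

# Plausible (min, max) ranges for uncertain physics parameters (illustrative).
PARAMETER_RANGES = {
    "entrainment_rate": (0.5, 3.0),
    "ice_fall_speed": (0.5, 2.0),
    "cloud_droplet_threshold": (1e-4, 2e-3),
}

def toy_climate_model(params: dict) -> float:
    """Stand-in for an expensive GCM run: returns warming for doubled CO2 (K)."""
    return (2.0 + 0.8 * params["entrainment_rate"]
            - 0.5 * params["ice_fall_speed"]
            + 200.0 * params["cloud_droplet_threshold"])

members = []
for _ in range(N_MEMBERS):
    params = {name: rng.uniform(lo, hi) for name, (lo, hi) in PARAMETER_RANGES.items()}
    members.append(toy_climate_model(params))

print(f"spread across {N_MEMBERS} members: {min(members):.1f} to {max(members):.1f} K")
```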
This initiative is being developed to produce ensemble predictions of time-dependent climate changes during the 21st century. This will allow us to produce estimates of the evolving uncertainty range for global and regional climate changes in response to a given greenhouse gas emissions scenario, which are consistent with the uncertainties arising from the range of surface and atmospheric parameters. It will also be possible to cater for key uncertainties associated with other aspects of the climate system, such as physical processes in the ocean, the atmospheric sulphur cycle and the terrestrial carbon cycle.
In future, a climate index will provide a measure of how well each model version reproduces the present climate and recent changes in climate. This will be used to weight the predictions of different model versions and will provide an improved basis for the assessment of climate-related risks.
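One possible form of such weighting is sketched below: members that reproduce the observed present-day climate more closely contribute more to the final estimate. The skill scores and the weighting rule are illustrative assumptions only.

```python
# A sketch of skill-weighting: ensemble members that reproduce the observed
# present-day climate more closely contribute more to the final estimate.
# The skill scores and the exponential weighting rule are illustrative only.
import numpy as np

predictions = np.array([2.4, 3.1, 3.8, 4.6, 5.2])   # warming (K) from 5 members
errors = np.array([0.3, 0.5, 0.4, 1.2, 1.5])        # each member's error vs. observed climate

weights = np.exp(-0.5 * (errors / errors.min()) ** 2)
weights /= weights.sum()                             # normalise to sum to 1

weighted_mean = float(np.dot(weights, predictions))
print(f"skill-weighted estimate: {weighted_mean:.1f} K "
      f"(unweighted mean: {predictions.mean():.1f} K)")
```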
Barry Gromett
The Met Office, Fitzroy Road, Exeter, Devon EX1 3PB Tel: +44 (0) 870 900 0100
www.metoffice.gov.uk
Published: 01st Jun 2006 in AWE International