The industrial revolution caused arguably the greatest societal shift in history. Agrarian societies became industrial, the population exploded and technological development accelerated rapidly. But whilst industry and newfound prosperity solved many existing problems, the fossil fuels that powered it caused unprecedented damage to the environment.
Until the advent of alternative energy sources like nuclear power in the 1950s, industrialisation was powered by coal, initially as a direct fuel and later to generate electricity. Its uncontrolled burning emits a concoction of damaging pollutants like sulphur dioxide, nitrogen oxides, particulate matter (soot) and mercury, to name just a few.
As the industrial revolution took hold, domestic fires and unregulated factories burnt increasingly large amounts, causing the new phenomenon of smog (literally ‘smoke and fog’) to settle over urban spaces. In London, there are several reports of deadly pollution events throughout the 1800s, but it was the infamous ‘Great Smog’ of December 1952 that finally motivated change.
Households had kept their hearth fires lit to ward off the cold and, as always, factories continued to put smoke up the chimney stacks. At the same time, an inversion layer in the atmosphere prevented the smog from dispersing and it quickly became so thick that visibility dropped to just a few metres. Press reports even claimed that cows at Smithfield market died of asphyxiation. Officially, 4,000 people died as a result of that single pollution event, although the true number is thought to be a lot higher.
Cleaning up our act
The ‘Great Smog’ was the impetus the government needed to make its first serious attempt to deal with air pollution: the Clean Air Act of 1956.
The legislation included measures to relocate power stations away from cities and shift domestic energy towards ‘smokeless fuels’. It is no coincidence that coal use in the UK peaked in the same year as the Act’s introduction, falling from an annual consumption of 221 million tonnes then to 49 million tonnes today, despite significant increases in population and energy requirements in the intervening years.
Tightening the rules on what could be emitted and where ensured that the world’s worst air pollution disaster couldn’t happen on that scale in the UK again. Alongside the succession of Clean Air Acts that passed through parliament in the 50s and 60s came the world’s first co-ordinated national air quality monitoring network in 1961, known as the National Survey.
The National Survey monitored black smoke and sulphur dioxide, initially at 500 sites, a number that quickly grew to 1,200 by 1966. The new laws and network did their job; both pollutants have fallen continually since and cows no longer die of asphyxiation.
While lethal smog events from coal are gone for good in the developed world, air pollution has by no means been confined to the history books. Today, transport accounts for most of the problem in cities, emitting nitrogen oxides and particulate matter such as black carbon. Greenhouse gases are also under increasing scrutiny, as were chlorofluorocarbons (CFCs) after their destructive effects on the ozone layer were realised in the 1980s.
It is easy to look back at the ‘Great Smog’ and think that we no longer have anything to worry about, but the invisibility of most current pollution can cause complacency. Recent figures link air pollution to about 40,000 early deaths a year in the UK alone as it increases the risk of cancer, asthma, heart disease and even neurological changes that could lead to dementia.
As well as being a direct hazard to health, greenhouse gases and black carbon – particulate matter left from incomplete combustion of fossil fuels – directly contribute to climate change. As the ominous name suggests, black carbon’s dark colour absorbs sunlight and dulls bright surfaces like snow, making them less reflective. As a result, less sunlight is reflected back into space, warming the Earth to such an extent that black carbon is thought to be the second most important contributor to climate change after carbon dioxide.
Sixty years of the Clean Air Act
Of course, regulation has moved on considerably since the first Clean Air Act. Whilst it initially focussed on controlling the sources of emissions, today we take a pincer approach: in addition to limiting what comes out of stacks and exhausts in the first place, we also set targets for the maximum concentrations of pollutants in the air, based on what are deemed to be acceptable limits.
At the international level, we have seen successful agreements like the Montreal Protocol phasing out CFCs and reducing the size of the ozone hole, as well as current targets like the Gothenburg Protocol on air pollution and, more recently, the COP21 Paris Agreement, all of which target lower emissions, cleaner fuels and, ultimately, fresher air.
In the UK, adherence to these agreements is gauged through air quality monitoring by both local authorities and coordinated national networks.
To keep up with new technologies, the old National Survey has evolved considerably since its inception. Up until the early 1970s, monitoring air was a fairly labour-intensive process, with particulate pollutants sampled by drawing air through a filter and taking it back to a laboratory for analysis. This can only ever give a snapshot of pollution and makes it very difficult to infer pollution levels beyond the immediate monitoring site.
Research into automatic analysers began in the early 1970s and, by 1987, the UK had installed its first automatic network: the excitingly named ‘Statutory Urban Network’. Designed to ensure compliance with European Directives, the networks expanded over the next few years as laws were developed and local authorities integrated their individually managed sites into the national system.
Today there are nine national networks across 300 sites that monitor a range of pollutants, with data available for everyone to see on Defra’s website. The Statutory Urban Network in its modern guise – now called the Automatic Urban and Rural Network (AURN) – is the most comprehensive UK network, with over 120 sites providing hourly levels of nitrogen oxides, sulphur dioxide, ozone, carbon monoxide and particulate matter.
There are a number of techniques used to measure the pollutants, and each station in the AURN must meet the accuracy requirements of the EU Air Quality Directives.
Measuring the oxides of nitrogen is achieved by chemiluminescence, where a reaction of nitrogen monoxide and ozone produces an ‘excited’ form of nitrogen dioxide. As the excited molecule returns to its ground state, light is emitted at an intensity proportional to the concentration of nitrogen monoxide in the sample. Nitrogen dioxide is measured by catalytically converting it to nitrogen monoxide within the instrument and taking the difference between the converted and unconverted readings. A similar process is used for measuring sulphur dioxide, although using ultraviolet light to induce fluorescence instead of a chemical reaction.
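To make the difference method concrete, here is a minimal sketch in Python, using made-up channel readings rather than the output of any particular analyser: the converter channel gives total oxides of nitrogen, and subtracting the direct nitrogen monoxide reading leaves the nitrogen dioxide.

# Illustrative sketch of the NOx 'difference method' (hypothetical values,
# not any particular analyser's firmware). The instrument reports the
# chemiluminescence signal with and without the catalytic converter in line.

def no2_by_difference(no_signal_ppb: float, nox_signal_ppb: float) -> float:
    """NO2 is inferred as total NOx (converted channel) minus NO (direct channel)."""
    return max(nox_signal_ppb - no_signal_ppb, 0.0)

# Example: 24 ppb measured directly (NO) and 59 ppb with the converter (NOx)
print(no2_by_difference(24.0, 59.0))  # -> 35.0 ppb of NO2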
Simple ultraviolet and infrared absorption spectroscopy produce measurements of ozone and carbon monoxide respectively. For particulate matter, a range of techniques is used, including the ‘tapered element oscillating microbalance’ method: air is drawn through a filter at the tip of a glass tube, an electrical circuit keeps the tube vibrating, and its resonant frequency is measured to infer the mass of particles collected.
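The microbalance relies on the simple spring-mass relation between resonant frequency and mass. The sketch below, with an illustrative calibration constant and invented frequencies, shows how a drop in frequency translates into collected particle mass; real instruments apply their own factory calibration.

# Sketch of the physics behind the tapered element oscillating microbalance:
# the filter-tipped tube behaves like a spring-mass oscillator, so collected
# particle mass appears as a drop in resonant frequency. k0 is an
# instrument-specific calibration constant; the value below is illustrative only.

def collected_mass(f_before_hz: float, f_after_hz: float, k0: float) -> float:
    """Mass gained on the filter, inferred from the change in resonant frequency.
    Derived from m = k / (2*pi*f)**2 for a simple harmonic oscillator."""
    return k0 * (1.0 / f_after_hz**2 - 1.0 / f_before_hz**2)

# Illustrative numbers only: frequency falls as particles accumulate.
print(collected_mass(f_before_hz=250.0, f_after_hz=249.8, k0=1.0e4))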
While these measurement techniques are robust, just making the measurements doesn’t fix any problems – it only exposes them. The issue of air pollution was widely reported again earlier this year when Putney High Street broke its hourly limit of 200 micrograms per cubic metre of nitrogen dioxide for the 19th time within only eight days of the New Year. Under EU rules, sites may only exceed this limit for 18 hours a year.
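As a toy illustration of that rule, the compliance check amounts to counting hours above the threshold and comparing against the annual allowance; the hourly readings below are invented.

# Toy check of the hourly NO2 rule described above: a site may exceed
# 200 micrograms per cubic metre for no more than 18 hours in a calendar year.
# The readings below are invented for illustration.

HOURLY_LIMIT_UG_M3 = 200.0
ALLOWED_EXCEEDANCES_PER_YEAR = 18

def exceedance_count(hourly_no2_ug_m3: list[float]) -> int:
    return sum(1 for value in hourly_no2_ug_m3 if value > HOURLY_LIMIT_UG_M3)

readings = [112.0, 245.5, 198.0, 221.3, 187.9, 260.1]  # hypothetical hours
count = exceedance_count(readings)
print(f"{count} exceedances so far; limit breached: {count > ALLOWED_EXCEEDANCES_PER_YEAR}")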
Putney isn’t an isolated case. In February 2014, the EU announced it was taking legal action against the UK as a result of its nitrogen dioxide levels persistently exceeding the limit. Nitrogen dioxide is a common problem and is difficult to reduce because most of it comes from vehicle exhausts. It is not just the UK that’s struggling: at the same time that the UK faced scrutiny, over half of EU member states were also facing fines for breaking agreed limits.
Despite the government’s best efforts, the problem won’t go away soon. Recent figures show that Liverpool, Tyneside, Nottingham, Sheffield and Bristol will not comply with nitrogen dioxide limits until 2025, even though compliance had previously been expected by 2015. For London, Birmingham and Leeds, the government expects limits to be exceeded until at least 2030.
Compliance deadlines have had to be pushed back as our understanding of the emissions from diesel vehicles and older petrol cars has improved. The recent emissions scandal involving Volkswagen demonstrates how difficult it is to acquire accurate long-term emissions data. The issue highlighted how diesel vehicles – not just VWs – produce more emissions under real-world driving conditions than in laboratory tests, making it difficult to establish accurate emissions estimates from traffic volumes alone.
With almost every aspect of modern life involving emissions at some point in the supply chain, monitoring air quality robustly is challenging. There is only a limited budget to support expensive monitoring stations in fixed locations, and it is unfeasible for the government to monitor every pollutant constantly in every conceivable location with elevated concentrations. Having said that, with a comprehensive monitoring network already in place, can the data we produce do more, and is there a better way of doing it?
Improving coverage, lowering cost
In addition to the monitoring network, a suite of models is used to ‘fill in the gaps’ between the monitoring stations in order to predict the evolution of air quality over the whole UK.
The Pollution Climate Mapping model, for example, generates background pollution maps at a 1×1 km spatial resolution and 9,000 representative roadside values for 11 different pollutants.
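The Pollution Climate Mapping model itself is far more sophisticated than this, but the basic idea of filling in the gaps between monitors can be illustrated with a simple inverse-distance-weighted estimate at an unmonitored grid square; the coordinates and concentrations below are invented.

# Toy gap-filling between monitoring stations using inverse distance weighting.
# This is only an illustration of the concept, not the PCM methodology.

import math

def idw_estimate(target_xy, stations, power: float = 2.0) -> float:
    """Estimate the concentration at target_xy from (x, y, concentration) stations."""
    weights, weighted_sum = 0.0, 0.0
    for x, y, conc in stations:
        d = math.hypot(x - target_xy[0], y - target_xy[1])
        if d == 0.0:
            return conc  # target sits exactly on a monitor
        w = 1.0 / d**power
        weights += w
        weighted_sum += w * conc
    return weighted_sum / weights

stations = [(0.0, 0.0, 38.0), (5.0, 1.0, 24.0), (2.0, 6.0, 31.0)]  # hypothetical
print(idw_estimate((2.0, 2.0), stations))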
While no model can ever perfectly simulate reality, models are an important tool in supporting the physical monitoring sites. The data is not only valuable on a scientific front, to calculate likely population exposure rates, but also from an economic point of view. Establishing a fixed monitoring station is expensive: each instrument requires sophisticated equipment and regular maintenance, and some cannot run autonomously because their samples must be sent to a separate laboratory for analysis.
A well-run monitoring network, therefore, needs careful evaluation of the best place to establish stations before they’re installed – a ‘chicken and egg’ situation where the results have to be known before the experiment can be done. The air quality models currently underpin these decisions, but even this may not be enough. At 1 km spatial resolution, these models often lack the accuracy and detail needed to capture representative population exposure and so to suggest the optimum positioning of stations.
Fortunately, there may be a new solution to the problem. A team of scientists from the National Physical Laboratory (NPL) and King’s College London has looked into the possibility of using cheap and robust black carbon monitoring stations to indicate where to put the more sophisticated equipment used to monitor regulated pollutants such as polycyclic aromatic hydrocarbons (PAHs).
Like black carbon, PAHs are a by-product of incomplete combustion of fossil fuels and have also been identified as a highly carcinogenic hazard to health. European legislation imposes a limit and target value for their abundance, requiring constant monitoring in high-risk areas. Currently, stations capture particles containing PAHs on glass fibre filters, which are then taken back to NPL for analysis by gas chromatography-mass spectrometry in the laboratory. This non-automatic process only produces calendar-monthly average concentrations and takes considerable time and effort.
Thus if the black carbon network can offer real-time data to be used as a surrogate measurement for PAH concentrations, in a more portable, automated and cheaper way, it could solve many problems at once. A recent paper from the NPL team demonstrates the method’s promise: the ultraviolet and infrared channels of aethalometers (the instruments used to measure black particulate matter in ambient air) showed good correlations with existing PAH measurements, with annual averages agreeing to within 1.5%. Using this data, decisions on where to put the more accurate (but more expensive) PAH stations can be made more effectively before they are installed, saving time, money and effort.
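The paper’s own statistical analysis is not reproduced here, but the basic sanity check is easy to picture: compare the annual average inferred from the aethalometer channels with the annual average from the laboratory PAH analysis. The figures in the sketch below are invented.

# Toy comparison of a surrogate against a reference: how closely does an annual
# average estimated from aethalometer channels match the annual average from the
# laboratory PAH analysis? Values are hypothetical and the study's regression
# between the two quantities is not reproduced here.

def percent_difference(surrogate_mean: float, reference_mean: float) -> float:
    return 100.0 * abs(surrogate_mean - reference_mean) / reference_mean

surrogate_annual_mean = 1.02   # hypothetical surrogate estimate, ng/m3
reference_annual_mean = 1.01   # hypothetical laboratory annual mean, ng/m3
print(f"{percent_difference(surrogate_annual_mean, reference_annual_mean):.2f}% apart")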
Personalised monitoring
Finding new applications for existing technology is a step in the right direction, but a new generation of technology could start to challenge the status quo of big, relatively expensive monitoring stations.
Terms like the ‘Internet of Things’ and ‘Big Data’ have been touted as the next big thing to revolutionise the way we collect, interact with and use data. The potential applications – anything from wearable sensors supporting personalised medicine to fridges that tell you when you’re out of milk – are so great that the UK government put £40 million towards Internet of Things research in its 2015 Budget.
For air quality, it’s easy to see where the potential lies. New low-cost personal sensors are already being developed, offering anybody the chance to keep track of pollution through a constant readout of the air around them as they move around a city. Linking these measurements together in a central database could, potentially, provide real-time air quality readings for every city street in the country.
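What such a shared measurement might look like is sketched below; the field names, units, coordinates and identifiers are all hypothetical, as no particular schema or service is specified.

# Minimal sketch of a personal sensor record before it is sent to a central
# database. Everything here (field names, units, sensor id) is hypothetical.

import json
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class SensorReading:
    sensor_id: str
    timestamp_utc: str
    latitude: float
    longitude: float
    no2_ug_m3: float
    pm25_ug_m3: float

reading = SensorReading(
    sensor_id="walker-0042",
    timestamp_utc=datetime.now(timezone.utc).isoformat(),
    latitude=51.4613,   # hypothetical point in south-west London
    longitude=-0.2176,
    no2_ug_m3=87.5,
    pm25_ug_m3=14.2,
)

payload = json.dumps(asdict(reading))
print(payload)  # in practice this would be submitted to a central air quality service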
The challenge for technology as it becomes smaller and cheaper, however, is balancing quantity with quality. Ensuring the data generated has been independently tested and shown to be credible is a significant barrier to the commercialisation of innovative sensors. To help tackle the issue, NPL scientists are working as part of a European team to develop a technical protocol and standardised method to test the sensors, enabling cross-comparison and increasing confidence for investors and end-users.
Once personal monitoring systems can be proven to be of sufficient accuracy, the opportunity exists for a new model in how we monitor air quality, with lower costs and better spatial coverage. It could also improve our understanding of the health effects of air pollutants, by enabling personal exposure data to be linked with the health of an individual.
A cleaner future
All things considered, national air quality has come a long way since the deadly December nights of the early 1950s. It is now difficult to imagine pollution so thick that you would struggle to read a road sign. A combination of cleaner technologies and strong policies has taken us far, but it is clear from recently missed targets that there is still a long way to go. The monitoring and modelling network is a critical part of the challenge, providing scientific checks that the policies are working and telling us when limits are being broken.
Although targets may be missed along the way, the future is looking bright. New applications for the existing network – such as using instruments to detect multiple pollutants – have already been shown. Low-cost sensors and ‘citizen science’ projects have the potential for unprecedented spatial coverage if we can link all the observations together and ensure there is sufficient quality assurance.
We are arguably approaching a paradigm shift in how we monitor and manage the quality of our air. Perhaps a future generation in 60 years’ time will look at our current pollution and be almost as amazed as we are looking back at the ‘Great Smog’ 60 years ago.
Published: 11th May 2016 in AWE International