CLIMATE MODELS

The tools of diagnosis

© CNRS Photothèque/Françoise Guichard, Laurent Kergoat

Burning embers
From pale yellow to bright red, this IPCC graph illustrates increases in five risk categories associated with different levels of global warming. First published in 2001, it was updated in 2007 with a considerably heightened level of alert.
  1. Risks faced by certain ecosystems (coral reefs, glaciers, living species, etc.).
  2. Extreme meteorological risks (heat waves, floods, drought, fire, hurricanes, etc.).
  3. Increased regional disparities in vulnerability to impacts (in yellow, certain zones that benefit from warming could contrast with those that suffer from it; in the red zones the negative impact is generalised).
  4. Risks of impact on the economy and markets (same remark).
  5. Risks of major upheaval (accelerating sea-level rise, ocean acidification, extreme heat levels).
Source: IPCC
Expected rise in temperatures during the last decade of the century (2090-2099) given continued economic growth and use of a ‘mix’ of fossil and non-fossil fuels, according to the IPCC scenarios.
Source: IPCC

At a time when the wildest disaster scenarios are circulating, the very best climate forecasting techniques are needed if we are to make the right decisions. Today, a major new climate modelling effort is dispelling a number of uncertainties about Europe’s 21st-century climate and providing the most precise picture yet of the phenomena likely to affect us.

In The Age of Stupid, an environmental film released in 2009, Pete Postlethwaite plays the sole survivor of a climate disaster in the year 2055. He looks out from the top floor of a skyscraper at a devastated world, surrounded by classical sculptures and cultural treasures rescued from the huge floods, and asks himself why mankind had not acted half a century earlier to avert the already looming catastrophe. Although the film is very effective at prompting audiences to imagine possible future climate events, many scientists would certainly take issue with the date chosen for Armageddon.

The end of the world in 2055?

In its fourth report, published in 2007, the Intergovernmental Panel on Climate Change (IPCC) presents climate models that would indeed have embarrassed the film’s director. These show a range of temperature increases based on specific emission conditions determined by factors such as the success of environmental and clean-energy policies and demographic growth. A model is a computer programme that simulates climate change from a precise starting point on the basis of a series of scenarios linked to emission conditions. The more complex the data, the more complex the models become. They can include as many as 30 parameters, such as wind speed, atmospheric humidity, soil moisture, temperature and dew point. A model can consist of as many as a million lines of computer code and take several months to develop, in addition to an even longer analysis time. This is why there are only around 25 global models in existence.
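
As a toy illustration of what such a programme does (not one of the real global models, which run to a million lines), here is a minimal zero-dimensional energy-balance sketch in Python; all parameter values are illustrative assumptions, not IPCC figures:

```python
# Toy zero-dimensional energy-balance model: absorbed sunlight balances
# outgoing infrared radiation, damped by a crude greenhouse factor.
# Every value here is an illustrative assumption.

SOLAR_CONSTANT = 1361.0   # incoming solar radiation, W/m^2
ALBEDO = 0.3              # fraction of sunlight reflected back to space
SIGMA = 5.67e-8           # Stefan-Boltzmann constant, W/m^2/K^4

def equilibrium_temperature(ghg_factor: float) -> float:
    """Mean surface temperature (K) at radiative equilibrium.

    ghg_factor < 1 mimics an atmosphere trapping part of the outgoing
    flux; smaller values mean a stronger greenhouse effect.
    """
    absorbed = SOLAR_CONSTANT / 4 * (1 - ALBEDO)
    return (absorbed / (SIGMA * ghg_factor)) ** 0.25

print(equilibrium_temperature(0.62))  # ~287 K, near today's observed mean
print(equilibrium_temperature(0.60))  # a stronger greenhouse effect warms the surface
```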

In its most optimistic scenario (low greenhouse gas emissions), the IPCC predicts, at best, a temperature increase of 1.8°C in 2090-2100 compared with 1990 (temperatures have already risen by 0.7°C since the beginning of the industrial revolution), with a rise in sea levels of 18 to 38 cm. The most pessimistic scenario (high emissions) predicts, on the basis of our present knowledge, a temperature rise of 4°C and a rise in sea level of between 26 and 59 cm. Whichever scenario proves true, the IPCC predicts increased damage from floods and storms. A 3°C temperature rise would thus cause a 30 % reduction in coastal wetlands. But no scenario depicts a drowned world in 2055…

A European mega model

Climate models of this kind are improving all the time and serve both to refute alarmist scenarios and to combat denial, thereby ensuring that the public is informed correctly. Measuring and modelling techniques have developed progressively over the past two centuries. The very oldest data come from a series of temperature readings taken in central England in the 17th century. In the 19th century, meteorological observations became widespread. In the 1920s, balloons carrying measuring instruments were sent into the atmosphere. Thirty years later, aircraft were used to probe the atmosphere and weather stations were built at the North and South Poles. Today, data are collected by satellite.

The ENSEMBLES project, allocated EUR 15 million in funding under the Sixth Framework Programme (FP6), covers a whole series of major new climate models. Its predictions carry a greater degree of certainty than earlier ones, owing not so much to the precision of the observations as to the quality and depth of the modelling. Climatologists claim to have produced a picture of Europe’s climate at the end of this century that is clearer and more comprehensive than any previous prediction based on data from global climate models.

The median results of an ENSEMBLES prediction based on the A1B emissions scenario (in which our economies achieve a balance between fossil fuels and other energy sources, including renewables) indicate, for example, a temperature rise of 6°C and a 50 % fall in summer rainfall in southwest France in 2080-2099 compared with the 1961-1990 reference period. The degree of detail achieved by these new models is one of the project’s major contributions to climate change modelling. “We are achieving a much higher resolution than anything obtained previously. This marks major progress,” declares Paul van der Linden, the ENSEMBLES director, based at the Met Office Hadley Centre in Exeter (UK). These results are the culmination of five years of hard work at the heart of the obscure science of climate modelling.

Passing the history test

Six IPCC scenarios (out of 40 in all) are currently used in the models. These describe a series of future emission levels and are formulated from socio-economic hypotheses and conjectures about how we manage the climate problem. Some assume ‘business as usual’ and are therefore pessimistic; others assume more active energy policies. They also include estimates of solar radiation and aerosols, and their scope varies depending on whether they include data for one, two, or ten to fifteen pollutants, for example.

The models were developed over several decades at 12 different centres – the seven located in Europe were included in ENSEMBLES, which also counts non-European institutes among its 66 partners. The researchers start by scrutinising the Earth’s surface, usually at a resolution of roughly three to four degrees. The models are then run on a global scale and the values of each grid cell reconstructed. Different models are used depending on the aim of the research: they can be purely atmospheric, purely oceanic, or coupled. In addition to the global characteristics of climate change, scientists can also model its impacts.
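
The gridded structure described above can be pictured as arrays holding one value per cell, as in this hypothetical Python sketch (a real model carries dozens of variables per cell plus many vertical levels; the 3.75° resolution is an assumption in line with the ‘three to four degrees’ quoted):

```python
import numpy as np

RESOLUTION = 3.75                 # degrees per grid cell (assumed)
n_lat = int(180 / RESOLUTION)     # 48 latitude bands
n_lon = int(360 / RESOLUTION)     # 96 longitude bands

# One value per cell for each state variable the model tracks
temperature = np.zeros((n_lat, n_lon))   # surface temperature, K
humidity = np.zeros((n_lat, n_lon))      # specific humidity, kg/kg

print(f"{n_lat * n_lon} cells at {RESOLUTION} degree resolution")  # 4608 cells
```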

Temperature and rainfall data from the past and present are used, for example, to test the model against climate history, using a procedure known as ‘hindcasting’. “We want to know how effective the model is and evaluate it against climate observations such as historical measurements of greenhouse gases,” explains Paul van der Linden. If the test shows an excessive departure from the historical data, the model is deemed weak. As producing a global model takes an enormous amount of time, only certain elements can be tested, and only over a reduced period.
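
A minimal sketch of this hindcasting test, with assumed names and an arbitrary tolerance (the real evaluation compares many variables over long records):

```python
import numpy as np

def hindcast_error(simulated: np.ndarray, observed: np.ndarray) -> float:
    """Root-mean-square departure of a simulated historical series
    (e.g. annual mean temperatures, degC) from the measured record."""
    return float(np.sqrt(np.mean((simulated - observed) ** 2)))

def passes_history_test(simulated, observed, tolerance: float = 0.5) -> bool:
    """Deem the model weak if it strays too far from history.
    The 0.5 degC tolerance is a placeholder, not an ENSEMBLES criterion."""
    return hindcast_error(np.asarray(simulated), np.asarray(observed)) <= tolerance

# Example: five years of simulated versus observed mean temperatures
observed = np.array([13.9, 14.0, 14.1, 14.0, 14.2])
simulated = np.array([13.8, 14.1, 14.0, 14.1, 14.3])
print(passes_history_test(simulated, observed))  # True: departure is only 0.1 degC
```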

Thousands of simulations

A single prediction from a model is insufficient. To improve precision, climatologists run the same model thousands of times with different data, and can also run several models repeatedly on the same data. This method, known as ensemble modelling, generates a more reliable result because the average of several models gives a more precise picture than the result of any one alone. One of the things that sets ENSEMBLES apart from previous projects is this very effect of scale.
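
The idea can be sketched in a few lines of Python; `toy_model` stands in for a real climate model and all the numbers are invented:

```python
import numpy as np

rng = np.random.default_rng(0)

def toy_model(sensitivity: float) -> float:
    """Pretend projection of warming (degC) for one set of inputs."""
    return 2.0 * sensitivity

# Run the 'same model' thousands of times with perturbed inputs
sensitivities = rng.normal(loc=1.0, scale=0.15, size=5000)
projections = np.array([toy_model(s) for s in sensitivities])

# The ensemble mean is more reliable than any single run,
# and the spread across runs measures the uncertainty
print(f"mean warming: {projections.mean():.2f} degC")
print(f"spread (std): {projections.std():.2f} degC")
```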

“It is the biggest project of its kind,” says Paul van der Linden. The team of forecasters, located across Europe, has designed a vast multi-model that combines seven European ensemble models. Its size enables it to provide a degree of detail superior to that of previous attempts. It also differs from its predecessors in producing an ensemble from 15 regional models nested within the seven global models. Researchers studied the potential impact of climate change at 14 sites in Europe. They can also simulate the effects of an average temperature rise of 2°C in Europe on agriculture, health, energy, water resources and insurance. “We have more of a transverse than a top-down approach,” explains Paul van der Linden.

Another innovation is the development of a new scenario, known as E1, tested by an ensemble of global climate models. It assumes that emission-reduction policies succeed and that emission targets are achieved. The approach is the reverse of the IPCC scenarios: results are framed in terms of temperatures first, from which emission levels are then recalculated. This yields a stabilised CO2-equivalent concentration of 450 parts per million in 2140 (the figure decision-makers set as the maximum for temperatures not to rise by more than 2°C). The results imply that emissions must fall back to zero by the end of the century, after peaking in 2010 at around 12 gigatonnes of carbon equivalent. The next IPCC report is set to include a model that runs the E1 scenario.
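
This inverse logic can be illustrated with a crude sketch: fix the target first, then work back to an emissions pathway consistent with it. The straight-line decline below is an assumption for illustration, not the ENSEMBLES calculation; only the 2010 peak of about 12 gigatonnes and the fall to zero by 2100 come from the text:

```python
PEAK_YEAR = 2010
PEAK_EMISSIONS = 12.0   # gigatonnes of carbon equivalent (from the text)
ZERO_YEAR = 2100        # emissions must return to zero by century's end

def e1_emissions(year: int) -> float:
    """Emissions (GtC-eq) under an assumed straight-line decline."""
    if year <= PEAK_YEAR:
        return PEAK_EMISSIONS
    fraction_left = (ZERO_YEAR - year) / (ZERO_YEAR - PEAK_YEAR)
    return max(0.0, PEAK_EMISSIONS * fraction_left)

for year in (2010, 2050, 2100):
    print(year, round(e1_emissions(year), 1), "GtC-eq")  # 12.0, 6.7, 0.0
```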

Refined probabilities

But one cannot ignore the fact that, however often the models are run, they remain projections and will never provide firm certainties. Yet in this respect too the ENSEMBLES team say they have achieved a first. The project produces a series of predictions that are then evaluated to decide which results are more probable than others. The researchers say that, thanks to the multi-model scale of the project, they can measure the probability that any given projection is accurate. “Before, it was possible to say, for example, that the number of storms could increase by 20 % by a certain date. Today we can say that 95 % of the results show that the increase in storms will be between 5 % and 20 %,” explains Gregor Leckebusch, a climatologist at the Berlin Institute of Meteorology.
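
A statement of this kind can be read straight off an ensemble of results by taking a central interval of the distribution, as in this sketch with invented numbers:

```python
import numpy as np

# Projected % increase in storms from each of 2000 hypothetical ensemble members
storm_increase = np.random.default_rng(1).normal(loc=12.0, scale=4.0, size=2000)

low, high = np.percentile(storm_increase, [2.5, 97.5])
print(f"95 % of results fall between {low:.0f} % and {high:.0f} %")
```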

Paul van der Linden believes that the creation of this model constitutes major progress for the climate modelling community. “For the first time, we have probabilities, and researchers can now turn to this database to run their models.”

But the fact remains that there is nothing like reality for testing even the very best models. The summer shrinking of the Arctic ice cap in recent years, for example, appeared only at the extreme end of the previous model scenarios. This raises fears that we are underestimating the rate of climate change and must once again revise the predictions.

Elisabeth Jeffries



Read more

Storm warning

Kyrill storm, early 2007.
© Eumetsat

During the afternoon of 18 January 2007, a violent storm swept across the European continent. The wind continued to gather strength, reaching 202 km/h in Germany before moving on to the Czech Republic. Insurance claims for damaged buildings and uprooted trees ran into several million euros. Power cuts affected thousands of homes in the United Kingdom and Germany. The storm, named Kyrill, formed over Newfoundland, Canada, before crossing the Atlantic and reaching the British Isles and then northern and central Europe. The trajectory and configuration of Kyrill are typical of winter extratropical cyclones, i.e. storms that originate outside the tropics.

These cyclones are generated in regions where temperature differences are very great, such as between Florida and Greenland, and generally form off the North American seaboard. The temperature differences cause air movements and strong winds that can grow into violent storms. Over the past 40 years they have occurred at a rate of around five a year. The question is whether they will increase with temperature changes, a development that would not be without impact on business and governments. According to the ENSEMBLES team, who used the A1B emissions scenario, there is a 90 % chance that storm damage in southern Germany in 2071-2100 will increase by between 13 % and 37 % compared with the period 1961-2000.



Find out more

  • ENSEMBLES
    66 partners – 18 countries (AT-BE-CH-CZ-DE-DK-ES-FI-FR-GR-IE-IT-NL-NO-PL-RO-SE-UK)
    2 non-European countries (AU-US)
    ensembles-eu.metoffice.com