Forecasting models

Reading: the impossible weather equation

The metops (meteo operations) room, the ECMWF’s nerve centre, where the new maps created using the probabilistic model are hung up twice a day. © Rob Hine/ECMWF
ECMWF Research Director Philippe Bougeault. © Alexandre Wajnberg
ECMWF Director Dominique Marbouty. © Alexandre Wajnberg
Forecasts for the Reading region, published on 3 December 2007, giving the brackets within which cloud cover, precipitation, wind speed and temperature are expected to develop, in six-hour steps, over the next 10 days. Each bar is the result of a probabilistic modelling of 50 forecasts, each diverging slightly from the initial conditions. The longer the time range, the longer the bars, as the forecasting uncertainty increases. © ECMWF

What is probably the most important weather centre and the least known to the general public is to be found to the west of London, in Reading. Its models and forecasts serve as a basis for most of Europe’s national meteorological bodies. This collaboration of excellence acts as a magnet for Europe’s best scientists and meteorologists.

No breaks or weekends for these 220 researchers and computer scientists. Like the weather, their machines never stop. The European Centre for Medium-Range Weather Forecasts (ECMWF) is buzzing with very silent activity. Every day 160 million weather observations arrive from across the world. Twice a day these are processed in one of Europe’s most powerful computing centres, producing hundreds of weather forecasting maps for the next fortnight.

Set up in 1975, the ECMWF has not stopped expanding ever since. Financed by around 30 states, it also cooperates with international organisations like the European Organisation for the Exploitation of Meteorological Satellites (EUMETSAT) and the European Space Agency (ESA). The reason this autonomous body is so little known is that it does not work directly with end-users, but rather only with national weather forecasting centres and private meteorological service companies.

Its many missions include research into climate and climate change, the general archiving of weather data, and medium-range forecasting – from two days to two weeks into the future – on a global scale. ECMWF is today the world leader in medium-range weather forecasting, while national meteorological offices take care of local and short-range forecasting.

Medium-range forecasts

“The entire Earth’s atmosphere is taken into account and modelled”, explains ECMWF’s director, Dominique Marbouty. “The idea is to computer-simulate its evolution. Starting from the weather situation at a particular point in time, and applying the laws of mechanics and thermodynamics, we end up with a system of simultaneous equations that cannot be solved exactly, to which we have to find approximate solutions. Hence the use of supercomputers.”

This global vision is predicated on being able to access meteorological observations – temperature, atmospheric pressure, wind speed and direction, hygrometry, clouds, rain, snow – from across the planet. These measurements are gathered from ground stations, by weather balloons and drifting buoys, ships and aircraft, and transmitted to Reading by national centres. Bodies like EUMETSAT, ESA, NASA and others provide satellite data, with precise vertical temperature and humidity profiles, along with cloud positions and types. “Our centre was the first to develop a data assimilation system permitting extensive use of satellite data”, Dominique Marbouty continues. “This scientific choice, which was difficult to make at the time, given the enormous investment, has paid off, and most national centres have followed the trend.”
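To give a feel for what “data assimilation” means in practice, here is a deliberately simplified, one-variable sketch in Python, with invented numbers: a model’s prior guess at a grid point is blended with an observation, each weighted by its estimated uncertainty. The centre’s operational scheme is vastly more elaborate, but the underlying principle is the same.

```python
# Deliberately simplified, one-variable sketch of data assimilation: blend a
# model's prior guess ("background") with an observation, each weighted by its
# estimated error variance. The operational system is far more elaborate;
# all numbers here are invented.

def analysis(background, obs, background_var, obs_var):
    """Return the analysed value combining background and observation."""
    gain = background_var / (background_var + obs_var)  # weight given to the observation
    return background + gain * (obs - background)

# Hypothetical example: the model's first guess at a grid point is 281.0 K,
# a satellite-derived temperature for the same point is 282.4 K.
t_analysed = analysis(background=281.0, obs=282.4,
                      background_var=1.0, obs_var=0.5)
print(f"analysed temperature: {t_analysed:.2f} K")  # falls between guess and observation
```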

Modelling the atmosphere

Of these 160 million daily information items, six million – considered the most pertinent and best distributed across the Earth’s surface – are selected to describe the state of the atmosphere. The idea is to get the most accurate “picture” possible. To this end, the whole of the Earth’s surface has been divided into a grid, and observations are attached to the crossover points. The fact that observations are taken at 91 altitude levels gives an idea of the system’s complexity. The smaller the grid squares, the finer and more precise the description of the current atmospheric situation. Right now the centre works with 25-km grid squares. This makes a lot of grid points at planetary level, but very few for small countries like Belgium or Luxembourg, which are described by only a few points. This situation justifies the existence of the high-resolution models used by national meteorological institutes: their grids are more finely meshed, making their short-range national forecasts more precise and reliable. They are better able to simulate, and therefore forecast, storms and other local-impact weather events.
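As a rough, back-of-the-envelope illustration (our own approximation, not the centre’s actual grid layout), here is what a 25-km mesh with 91 levels implies in terms of sheer numbers of grid points:

```python
# Back-of-the-envelope only (not the centre's actual grid layout): how many
# grid points does a 25 km global mesh with 91 vertical levels imply?
import math

earth_surface_km2 = 4 * math.pi * 6371 ** 2   # ~510 million km^2
grid_spacing_km = 25
levels = 91

points_per_level = earth_surface_km2 / grid_spacing_km ** 2
total_points = points_per_level * levels

print(f"~{points_per_level / 1e6:.1f} million points per level")
print(f"~{total_points / 1e6:.0f} million points in total")
# Roughly 0.8 million points per level and ~74 million in all, each carrying
# temperature, wind, humidity and more, to be advanced at every forecast step.
```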

In this way, in both space and time, the areas covered by the ECMWF and the national institutes are complementary: the forecasts from Reading provide the general framework of the Earth’s atmosphere and the boundary conditions, which national institutes then use to frame their own local, small-scale, short-range forecasts.

Two key forecasting tools

Twice a day, the ECMWF publishes both the current atmospheric condition and its hypothetical evolution up to 15 days ahead. Twenty years ago it went no further than one week. An important and original feature of the ECMWF is that its forecasts come with a reliability coefficient. This is made possible by two complementary approaches, one based on a “deterministic” model, the other on a “probabilistic” one.

The deterministic model is the traditional one: starting with the filtered data, the computers derive the current situation of the atmosphere. This is already valuable news. Just think of air travel, where a hurricane 300 km away will be reached by a passenger aircraft in 25 minutes. But more especially, the current situation is a necessary starting point for any forecasting. Mathematically, the parameters of the current situation provide the initial conditions for solving the simultaneous equations describing the evolution of the atmosphere. In this way weather forecast maps are produced for one, two, three and up to 15 days ahead.
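In outline, a deterministic forecast is nothing more than repeatedly applying the model’s equations to the current state. The Python sketch below uses invented toy dynamics in place of the real equations of the atmosphere, purely to show the structure of the calculation: analysed state in, step-by-step evolution out.

```python
# Toy sketch of a deterministic forecast: start from the analysed state and
# repeatedly apply the model's tendencies. The dynamics below are invented
# stand-ins for the real laws of mechanics and thermodynamics.

def tendency(state):
    x, y = state
    # Invented, gently damped toy dynamics (per hour), nothing meteorological.
    return (-0.02 * x + 0.05 * y, -0.05 * x - 0.02 * y)

def forecast(initial_state, dt_hours=6, steps=40):
    """40 six-hour steps = a 10-day forecast of the toy system."""
    state = initial_state
    for _ in range(steps):
        dx, dy = tendency(state)
        state = (state[0] + dt_hours * dx, state[1] + dt_hours * dy)
    return state

# The analysis (the current situation of the atmosphere) is the indispensable
# starting point; everything else follows from stepping the equations forward.
print(forecast(initial_state=(1.0, 0.5)))
```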

The reliability of these forecasts decreases with the number of days ahead. But certain forecasts are more reliable than others. For example, forecasts during a period of fine weather are more reliable than those made during very unstable atmospheric conditions. The probabilities attached to any forecast – that is, what are the chances of the forecast conditions actually occurring? – are determined using the probabilistic model.

The probabilistic model

The basic idea is as follows: because these equation systems are highly sensitive to the initial conditions – the atmosphere is a chaotic system – and because the initial data are relatively imprecise, forecasters vary these data to produce 50 variants of the initial atmospheric situation, each slightly different from the next, which they then introduce into the simultaneous equations. These give 50 different forecasts. Comparing them provides very valuable information: where the results overlap (in x days’ time in a particular region), the reliability of the forecast is high. Where the forecasts contradict one another or vary considerably, their reliability is lower.
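The following Python sketch illustrates the principle on a classic chaotic toy system (the Lorenz 1963 equations), not on a real atmospheric model: fifty nearly identical starting states are stepped forward, and their spread – small at first, large later – plays the role of the forecast’s reliability.

```python
# Sketch of the ensemble idea on a classic chaotic toy system (Lorenz, 1963),
# standing in for the atmosphere. It is not a weather model; it only shows how
# 50 slightly perturbed starting states diverge, and how their spread can be
# read as a measure of forecast reliability.
import random

def lorenz_step(x, y, z, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def run(member, steps):
    state = member
    for _ in range(steps):
        state = lorenz_step(*state)
    return state

random.seed(1)
analysis = (1.0, 1.0, 1.0)                        # the best-estimate starting state
members = [tuple(v + random.gauss(0, 0.01) for v in analysis)  # 50 perturbed copies
           for _ in range(50)]

for steps in (100, 500, 2000):                    # short, medium and long lead times
    xs = [run(m, steps)[0] for m in members]
    print(f"after {steps:4d} steps the ensemble spread in x is {max(xs) - min(xs):6.2f}")
# The spread starts tiny and grows with lead time: where the members still
# agree, the forecast is reliable; where they scatter, it is not.
```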

This comparison also provides the limits within which certain forecast parameters (temperatures, winds, pressures, etc.) can be expected to lie. In this way the probabilistic model can serve to exclude certain weather conditions in x days’ time – for example, storm winds above a particular speed. This type of information is very valuable for many applications in the transport and manufacturing industries, in agriculture, etc. If a boiler needs repairing, it is not important where between 5°C and 11°C the lowest temperature is going to lie, provided one is certain that it is not going to freeze.
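A small Python sketch of that last point, on invented numbers: given fifty ensemble values of the night’s minimum temperature, one can read off both the bracket and whether frost can be ruled out.

```python
# Invented numbers: turn 50 ensemble values of a night's minimum temperature
# into a bracket and a statement about whether frost can be ruled out.
import random

random.seed(7)
ensemble_t_min = [random.gauss(8.0, 1.5) for _ in range(50)]   # hypothetical, in deg C

low, high = min(ensemble_t_min), max(ensemble_t_min)
frost_members = sum(1 for t in ensemble_t_min if t <= 0.0)

print(f"minimum temperature expected between {low:.1f} and {high:.1f} deg C")
print(f"members predicting frost: {frost_members} of 50")
# If no member reaches 0 deg C, frost can be ruled out with high confidence,
# even though the exact minimum within the bracket stays uncertain.
```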

Meteorological research

As Research Director Philippe Bougeault explains, “the big uncertainties come from the fact that we do not precisely know the atmospheric equations and the physics of the models. We are therefore trying all the time to improve our numerical models, in particular with a better understanding of clouds and how they affect the weather, and with a better representation of the phenomenon of radiation.”

Tightening the grid plays an important role here. “Our objective is to increase our spatial resolution by 60 % every five years. Between now and 2015, the grid will be reduced down to 10-km squares, at 91 different altitudes. This will require faster calculation methods and a considerable step-up in computing performance. Optimising the forecast model algorithms is something very specific to the ECMWF.”
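A quick calculation (our own assumptions about how the cost scales, not the centre’s figures) shows why this target is as much a computing problem as a scientific one:

```python
# Rough calculation, our own assumptions: what does "60 % finer every five
# years" imply for the grid spacing and, very roughly, the computing cost?
# Halving the spacing quadruples the horizontal points, and shorter time steps
# are generally needed too, hence the cube as a crude cost estimate.

spacing_km = 25.0
factor_per_5_years = 1.6          # a 60 % increase in resolution

for years in (5, 10):
    f = factor_per_5_years ** (years / 5)
    print(f"after {years:2d} years: ~{spacing_km / f:4.1f} km grid, "
          f"~{f ** 2:.1f}x the horizontal points, ~{f ** 3:.0f}x the cost")
# ~15.6 km after five years and ~9.8 km after ten, consistent with the 10 km
# target for 2015, at roughly four times the computing cost every five years.
```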

One thing the centre is working on right now is a non-hydrostatic project. “This will involve taking greater account of the variable ascension speeds of local hot air masses, in particular inside large clouds. Until now we have omitted vertical accelerations in order to simplify the equation systems.” Finally, and always in an attempt to come closer to reality, the current tendency is to introduce new parameters into the atmospheric models. For example, scientists are increasingly keen to include the effects of waves, which act as a brake on displacements of air masses, and of particles suspended in the air (aerosols), which change the temperature by interacting with the sun’s radiation. Another such parameter is the nature of the ground cover: the plant cover has seasonal effects, as its colour influences the ground temperature, and therefore that of the air.
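For readers who want to see what this “hydrostatic” simplification looks like on paper, the standard textbook contrast is sketched below (our notation, not the centre’s exact formulation): hydrostatic models drop the vertical acceleration and keep only the balance between gravity and the vertical pressure gradient; a non-hydrostatic model restores it.

```latex
% Sketch in standard textbook notation: w is vertical velocity, p pressure,
% \rho density, g gravity.
\begin{align*}
  \text{full vertical momentum equation:}\quad
    \frac{\mathrm{D}w}{\mathrm{D}t} &= -\frac{1}{\rho}\frac{\partial p}{\partial z} - g \\
  \text{hydrostatic approximation:}\quad
    0 &= -\frac{1}{\rho}\frac{\partial p}{\partial z} - g
    \;\;\Longleftrightarrow\;\;
    \frac{\partial p}{\partial z} = -\rho g
\end{align*}
```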

Satellites – the key to the future

“We are investing a lot of energy in obtaining as much data as possible via the satellites. They offer a continuous stream of measurements, of known reliability and which can be automatically processed. They have become essential. Without them, our forecasting range in the southern hemisphere would be cut back from five days to two, with the same level of reliability.

“Right now, over 45 satellite instruments are examining our atmosphere. We want to measure the temperature of air masses directly by observing their radiation. Right now we are using 300 wavelengths – 300 different “channels” – and we are aiming to up this to 4 000 in a few years’ time. We are also placing a lot of hope in a new instrument, the Lidar – a laser-based radar operating from the satellites. By measuring the Doppler effect caused by the backscattering of light by moving particles, this will give us direct access to wind speeds at every altitude. This will be a real first.”
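The size of the effect being chased here can be illustrated with one line of arithmetic (illustrative values, not the specifications of any particular instrument): light backscattered by particles moving along the line of sight comes back shifted in frequency by roughly twice the wind speed divided by the wavelength.

```python
# Illustrative values only, not the specifications of any particular instrument.
# Light backscattered by particles moving along the line of sight returns with
# a frequency shift of roughly 2 * v / wavelength.

wavelength_m = 355e-9      # an ultraviolet laser wavelength, chosen for illustration
wind_speed_ms = 10.0       # line-of-sight wind component, in m/s

doppler_shift_hz = 2 * wind_speed_ms / wavelength_m
print(f"Doppler shift: {doppler_shift_hz / 1e6:.1f} MHz")    # about 56 MHz
# A 10 m/s wind shifts the return by a few tens of MHz, a minute fraction of
# the laser frequency itself (~845 THz), which is what makes the measurement hard.
```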

Finally, using the ECMWF’s meteorological archives, which are the largest in the world, past data can be taken and re-run with today’s models and computers. This “re-analysis”, which serves to validate current models a posteriori, shows that weather events like the famous storm of 1953, which did so much damage in the Netherlands and England – and which was not forecast at the time – would today be forecast several days ahead.

Marrying weather and climate

The state of the oceans and their interaction with the atmosphere also need to be taken into account, as water temperatures influence those of the air. This makes it possible to anticipate certain phenomena, like the famous El Niño, on a seasonal or even annual scale rather than a daily one. Here we are moving away from the area of weather fluctuation and towards that of climate and its long-term variations.

The incidence of major weather events like hurricanes or heatwaves is statistically rising. Meteorological models will need to be able to forecast them, and the resolution of climate models will need to improve to reflect the expected increase in these events. In other words, meteorological and climatological approaches are increasingly converging. We are on the verge of a conceptual revolution, that of the unified meteo-climate system. This will present an extraordinary scientific and technical challenge.

Alexandre Wajnberg





Special projects

Specific regional needs are covered by ad hoc collaborations. Two examples are:

Arpège, Action de Recherche Petite Échelle Grande Échelle (small-scale, large-scale research action): five-day forecasts around the globe, by Météo France. Same model and same equations as at the ECMWF, developed jointly but used differently: the grid squares vary in size and are focused on France. In the short range, the weather in the mid-Pacific does not call for the same precision.

Cosmo, COnsortium for Small-scale MOdelling: the application of Germany’s Lokal Modell in cooperation with Switzerland, Italy, Greece, Poland and Romania. Grid squares of just 7 km and 45 levels above the Alps finely model Alpine meteorology, taking account of the effects of this particular relief.


