Thursday 28 April 2011

From the weather in 1994 to the climate now

The most successful users of grid computing are - without doubt - the particle physics community. The Worldwide LHC Computing Grid stores and processes the terabytes of data generated by the experiments at CERN.

It is all very impressive - even for people like me who don't understand what a Higgs boson is.

I am not a particle physicist. Back in the mid '90s - when work on the Large Hadron Collider project was beginning - I was at the Department of Meteorology at the University of Reading.

So it was strange to see a familiar name from way-back-then appear in International Science Grid This Week, asking what the climate science community could learn from CERN.

Back in 1994, the meteorologists occupied one end of some re-purposed Second World War huts on the outer edge of campus. On one wall of the huts, opposite the portacabin that served as a coffee room, were the results of one of the field's latest innovations: ensemble forecasts.

Weather forecasts take the best estimate of the state of the atmosphere at one moment in time and calculate how this will change over the next few days.

That best estimate of the state of the atmosphere - called an analysis - is based on observations gathered by the World Meteorological Organisation.

The observations are scattered unevenly over the planet. For entirely practical reasons, there are many more observations over land than over the sea, and many more at the surface than further up in the air. Parts of the world are not covered at all. The best estimate will never be perfect.

Weather forecasting uses huge amounts of computer power and an individual forecast must be completed within a few hours or it is useless. You do not want to be told that a storm is coming after it has arrived.

Until the mid-'90s, the most powerful computers available were only able to run one forecast before time ran out.

Weather is chaotic: small features can grow rapidly into huge storms. If such a feature is misrepresented in the analysis - as happened in 1987 - that one forecast can be very, very wrong.

By the mid-'90s, enough computer power was available to produce ensembles of forecasts - each run from a slightly different initial estimate. Sometimes the members of the ensemble stayed close together, sometimes they diverged. For the first time, we could estimate how reliable a forecast would be.
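The idea is easy to sketch with a toy model. The snippet below is my own illustration - nothing to do with the operational forecast system - and integrates the Lorenz (1963) equations, a classic three-variable chaotic system, from ten slightly perturbed copies of the same "analysis". The spread between the copies grows with lead time, which is exactly the signal the ensembles give you about how far ahead a forecast can be trusted.

    # Toy illustration of ensemble forecasting (not the real ECMWF system):
    # run the chaotic Lorenz-1963 model from slightly perturbed initial
    # states and watch the ensemble spread grow with forecast lead time.
    import numpy as np

    def lorenz63(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
        """Right-hand side of the Lorenz (1963) equations."""
        x, y, z = state
        return np.array([sigma * (y - x), x * (rho - z) - y, x * y - beta * z])

    def forecast(initial_state, steps=2000, dt=0.01):
        """Integrate forward with a simple fourth-order Runge-Kutta scheme."""
        state = np.array(initial_state, dtype=float)
        trajectory = [state.copy()]
        for _ in range(steps):
            k1 = lorenz63(state)
            k2 = lorenz63(state + 0.5 * dt * k1)
            k3 = lorenz63(state + 0.5 * dt * k2)
            k4 = lorenz63(state + dt * k3)
            state = state + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
            trajectory.append(state.copy())
        return np.array(trajectory)

    # The 'analysis': our best (imperfect) estimate of the current state.
    analysis = np.array([1.0, 1.0, 1.0])

    # The ensemble: the same model run from slightly different initial estimates.
    rng = np.random.default_rng(0)
    members = [forecast(analysis + 1e-3 * rng.standard_normal(3)) for _ in range(10)]

    # Ensemble spread (standard deviation across members) at each lead time:
    # small spread suggests a trustworthy forecast, large spread does not.
    spread = np.std([m[:, 0] for m in members], axis=0)
    for step in (0, 500, 1000, 1500, 2000):
        print(f"step {step:5d}: spread in x = {spread[step]:.3f}")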

The ensembles were produced by the European Centre for Medium-Range Weather Forecasts, a short distance up the road from the Meteorology department. This pioneering work is as relevant to the long-term forecasting of climate as it was to the short-term forecasting of weather.

Tim Palmer was one of those behind the ensembles. He is now a professor at Oxford and one of the foremost experts on weather and climate prediction. The talk he gave at the American Geophysical Union Fall Meeting in 2010 is one of the clearest descriptions of the successes and failures of the field.

This week's edition of International Science Grid This Week carries an article by Professor Palmer advocating 'a CERN for Climate Change'.

Human activity is changing the climate - whatever you might read elsewhere on the Internet - but the details of the changes are still not understood.

If those who have studied our climate say we need a CERN-like organisation to understand it, then we should be listening.

[Small edit for clarity, 3 May]
