Jul 17, 2007
Climate models prove more reliable
Climate models are numerical representations of the Earth's climate system and are used to predict how the climate will change in response to various man-made factors, like greenhouse gas emissions. A team in the UK has now shown that the outcome of such models depends far more on the input parameters - such as cloud properties - than on the computer hardware or software used, contrary to what many scientists had feared. The result will allow researchers to employ climate models, such as "climateprediction.net", with much more confidence than before.
Climateprediction.net is a global experiment that uses computing time donated by the general public. Previous results from the model have shown that average temperatures could eventually rise by 11°C - even if carbon dioxide levels in the atmosphere are limited to twice those found before the industrial revolution. Such levels could be reached by the middle of the 21st century, unless drastic cuts in greenhouse-gas emissions are made.
Christopher Knight of the University of Manchester, together with colleagues at Oxford University and the UK Met Office, analyzed over 57,000 climateprediction.net model runs. In each, the model was first run with low, pre-industrial carbon dioxide levels. The researchers then doubled the carbon dioxide level and looked at the effect of this increase on global temperature.
Each of the model runs was slightly different, with perturbed input parameters and varying initial conditions. Moreover, since different members of the public ran these models, the computers they used varied in make and age, and some had more memory than others.
By comparing the factors that went into each model run with the amount of global warming seen after the carbon dioxide increase, the team was able to determine which factors were most important. "Fortunately for the climateprediction.net project, the effects of which computer the model was run on were very small," team member Sylvia Knight of Oxford University told environmentalresearchweb.org. "However, the effects of some of the numbers - for example those to do with how to represent a cloud in the computer model - turned out to be very important."
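The kind of comparison described above can be illustrated with a toy variance-attribution sketch. This is not the team's actual analysis - the parameter names, hardware labels and numbers below are entirely synthetic - but it shows the principle: group the ensemble's climate-sensitivity results by one factor at a time and ask how much of the total spread that factor explains.

```python
# Toy sketch (synthetic data, invented names): attributing the spread in
# modelled warming to input parameters versus the computer used.
import random
from statistics import mean, pvariance

random.seed(0)

# Hypothetical ensemble: each run has a cloud-parameter setting and a
# hardware type. The cloud parameter strongly shifts the warming, while
# hardware adds only tiny numerical noise - mirroring the reported finding.
cloud_effect = {"low_entrainment": 7.0, "default": 3.0, "high_entrainment": 2.0}
hardware_effect = {"PC-A": 0.0, "PC-B": 0.01, "PC-C": -0.01}

runs = []
for _ in range(5000):
    cloud = random.choice(list(cloud_effect))
    hw = random.choice(list(hardware_effect))
    warming = cloud_effect[cloud] + hardware_effect[hw] + random.gauss(0, 0.3)
    runs.append((cloud, hw, warming))

def explained_fraction(runs, factor_index):
    """Fraction of total variance explained by grouping on one factor
    (between-group variance divided by total variance)."""
    values = [r[2] for r in runs]
    total = pvariance(values)
    groups = {}
    for r in runs:
        groups.setdefault(r[factor_index], []).append(r[2])
    grand = mean(values)
    between = sum(len(v) * (mean(v) - grand) ** 2
                  for v in groups.values()) / len(values)
    return between / total

print(f"cloud parameter explains {explained_fraction(runs, 0):.1%} of the spread")
print(f"hardware type explains   {explained_fraction(runs, 1):.1%} of the spread")
```

Run on this synthetic ensemble, the cloud parameter accounts for nearly all of the variance while the hardware label accounts for almost none - the qualitative pattern the researchers report for the real ensemble.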
The result means that the "distributed" climate modelling approach is much more reliable than previously thought. "This is exciting as it gives us real confidence in using the amazing computer time that people across the world are willing to donate to understand the climate," said Knight. "This computer time gives far more computing power than any supercomputer and will allow a much more thorough use of climate models to understand climate change."
The result has also highlighted the relative importance of specific aspects of the models in deciding how sensitive they are to carbon dioxide. "This will allow us to focus future improvements of these models for predicting the effect of carbon dioxide on climate," added Knight.
Climateprediction.net is currently distributing an experiment that "hindcasts" the 20th century before forecasting the 21st century under different scenarios of greenhouse gas emissions, solar, volcanic and other activity. It is still possible to join the project if you have a personal computer at home, work or school. Other future experiments include a more detailed look at climate change for certain parts of the world, such as Southern Africa, and a more in-depth study of ocean response. "The more people donate computer time to the project, the more complete the results will be!" said Knight.
The researchers reported their work in PNAS.
About the author
Belle Dumé is a freelance science and technology reporter based in Paris, France.