Arctic glaciers and ice caps cover an area of 402,000 square km, roughly 55% of the world's total glacier and ice-cap area. But they're punching above their weight when it comes to sea level rise - although Greenland's ice sheet is four times larger, it contributes roughly the same amount of melted ice to the world's oceans. That's according to Jon Ove Hagen of the University of Oslo, Norway, speaking at the EGU meeting in Vienna.
For example, from 2006 to 2010 around 200 gigatonnes of ice per year melted from the Greenland ice sheet, while the equivalent figure for glaciers and ice caps in the Arctic was 160 gigatonnes. That said, there is considerable variability around the Arctic region, with some glaciers and ice caps losing mass rapidly and a few growing slightly.
As part of the ice2sea programme, Hagen and colleagues have taken continuous GPS measurements on two fast-flowing outlet glaciers of the Austfonna ice cap in northeastern Svalbard since April 2008. The data indicate that the ice is now moving between two and three times faster than four years ago.
What's more, around 30-40% of the total ice mass loss is due to calving. Hagen said the ice cap is exhibiting unstable dynamics and the study shows the importance of monitoring calving.
Michael Mann of Penn State University, US, is used to attacks from climate contrarians. But his latest work, as he told environmentalresearchweb at the EGU 2012 Assembly in Vienna, has received more interest from dendroclimatologists who "feel our paper [in Nature Geoscience] exposes a problem with their approach".
The research indicates that growth rings from trees at the far north of their range may not have picked up the fast cooling caused by major volcanic eruptions in the past. Such trees are particularly sensitive to temperature change, which is why they are used so often in palaeoclimate reconstructions. But there's a snag - temperature drops of a couple of degrees may push them outside their growth range. That could mean a year without growth and a missing growth ring. Not only does this fail to record the temperature drop but it can also "smear" the chronology, explained Mann.
Mann, however, feels his research shows dendroclimatologists are doing a "good job" at reconstructing long-term temperature changes in the past - it's only detection of short-term cooling responses to volcanic eruptions that is an issue.
"Ironically this points to some past work [on climate sensitivity] as biased on the low side and maybe contrarians don't like that," he said. Mann believes the work indicates that climate sensitivity - the increase in temperature for a doubling of atmospheric carbon dioxide - is closer to 3 °C than 2 °C.
• Mann was awarded the EGU's Hans Oeschger Medal.
This year's EGU General Assembly in Vienna is not currently experiencing heat extremes. But worldwide the number of local monthly record-breaking temperature extremes is now, on average, five times what would be expected if the climate were stable. This means that four out of five recent records would not have taken place without climate change, said Stefan Rahmstorf of the Potsdam Institute for Climate Impact Research, Germany, in a meeting presentation so early in the morning that he dubbed the attending delegates "heroic".
Rahmstorf calculated the monthly heat record ratio - the number of records divided by the number expected in a stationary climate - for the last 131 years of temperature observations. The global mean ratio was roughly five, but parts of Africa and South America experienced 20 times more records than expected.
The tropics show a high number of records because these areas normally experience small temperature variance, even though the warming trend in the region is relatively low, Rahmstorf explained. The Arctic also has a high record ratio, but this is due to its larger temperature trend.
Rahmstorf's analysis showed that the increase in heat extremes can be explained by a simple stochastic model - a linear trend of increasing temperature combined with uncorrelated noise.
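That model, and the record-ratio calculation above it, can be illustrated with a toy simulation (a sketch of the same idea, not Rahmstorf's actual analysis - the trend value, noise level and run count below are arbitrary illustrative choices). In a stationary series of independent draws of length n, the expected number of record highs is the harmonic number H(n) ≈ ln(n) + 0.577; adding a warming trend inflates the count, and the ratio of the two is the record ratio.

```python
import random

def count_records(series):
    """Count running record-high values in a sequence."""
    records, best = 0, float("-inf")
    for x in series:
        if x > best:
            records += 1
            best = x
    return records

def simulate(n_years=131, trend=0.0, sigma=1.0, n_runs=2000, seed=42):
    """Mean number of record highs for a linear temperature trend plus
    uncorrelated Gaussian noise - the simple stochastic model described
    in the text. trend is in units of sigma per year (illustrative)."""
    rng = random.Random(seed)
    total = 0
    for _ in range(n_runs):
        series = [trend * t + rng.gauss(0.0, sigma) for t in range(n_years)]
        total += count_records(series)
    return total / n_runs

stationary = simulate(trend=0.0)   # close to H(131) ~ ln(131) + 0.577 ~ 5.5
warming = simulate(trend=0.02)     # hypothetical trend of 0.02 sigma/year
print("stationary climate:", stationary)
print("with warming trend:", warming)
print("record ratio:", warming / stationary)
```

The stationary run recovers the textbook harmonic-number expectation over 131 years, and any positive trend pushes the mean record count - and hence the ratio - above it.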
Changes in the wind circulation around Antarctica are causing the region's ice shelves to lose ice, explained David Vaughan of the British Antarctic Survey to journalists at the EGU 2012 assembly in Vienna. But the mechanism is different, depending on where the ice shelves are located.
As detailed in a publication in Nature today, of which Vaughan is a co-author, in West and East Antarctica ocean-warming caused by wind changes is melting ice shelves from below. And where ice shelves have thinned, glaciers inland have accelerated as the buttressing effect of the ice shelf is removed.
On the eastern Antarctic Peninsula, on the other hand, it looks like wind-induced atmospheric warming is the culprit - it's melting snow on the surface of the ice shelves. Some thinning in this region is also due to loss of air compacting the ice.
The team used laser measurements from NASA's ICESat satellite for 2003-2008 to look at 54 - almost all - of Antarctica's ice shelves. Warm ocean currents were pinpointed as melting 20 of the ice shelves, mainly in West Antarctica. Indeed the researchers ascribed the majority of Antarctica's ice loss to ocean change.
For the world's glaciers and ice caps to catch up with the temperatures of the last ten years, they need to lose 38% of their ice volume, on average, and 30% of their area. That's equivalent to 228 mm of sea-level rise over the next few decades, even without any additional climate change, according to Sebastian Mernild of Los Alamos National Laboratory, US, who presented his work at the EGU 2012 meeting in Vienna.
Mernild studied the mass balance of 124 glaciers and 19 ice caps worldwide, using three averaging methods. The results compared well with earlier studies, which incorporated fewer glaciers, he said.
In Central Europe, Svalbard and Greenland, glaciers and ice caps were more out of balance with their surroundings than the global average. Mernild reckons that glaciers in the Alps are likely to lose most of their mass by 2100.
If recent climate trends continue, by around 2040 glaciers and ice caps will lose at least half of their volume.
As the first big nuclear accident in the vicinity of a good measurement network, the events at Japan's Fukushima Dai-ichi power plant in March 2011 enabled scientists to find out more about the spread of radioactive dust and its associated health risks. That's according to Masatoshi Yamauchi of the Swedish Institute of Space Physics, speaking to the press at the EGU 2012 meeting in Vienna.
The incident contaminated an area more than 100 km in diameter with radioactive materials, releasing 10-20% of the radioactivity of Chernobyl. Yamauchi and colleagues used measurements of the atmospheric electric field at Kakioka, 150 km southwest of the plant, in combination with a radiation dose measurement network and soil samples to assess dust transport.
After the first problems at the plant on March 11th, the potential gradient measurements at Kakioka dropped by an order of magnitude; ionising radiation increases atmospheric electrical conductivity and decreases potential gradient.
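The inverse relation between ionising radiation and potential gradient can be sketched with Ohm's law for the atmosphere's fair-weather circuit (a textbook aside, not drawn from Yamauchi's presentation): assuming the air-Earth current density $J_c$ stays roughly constant, the vertical potential gradient is

$$E = \frac{J_c}{\sigma},$$

so ionisation that multiplies the local conductivity $\sigma$ by ten divides the measured potential gradient $E$ by roughly ten - consistent with the order-of-magnitude drop recorded at Kakioka.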
The potential gradient also dropped on March 14th and March 20th. Yamauchi believes the March 14th drop was due to contamination by surface winds, which left radioactive fallout suspended near the Earth's surface. This is potentially a health risk, especially for children as they breathe closer to the ground.
The March 20th drop was probably down to transport by a relatively low-altitude wind followed by rain, which caused the dust to settle on the ground.
Yamauchi and colleagues recommend that all nuclear power plants be surrounded by a network of potential gradient measurement stations.
Field work can be tough - especially when there's no power, bad roads and your transect line is too steep to use your favourite measurement methods. That's what happened to Yit Arn Teh of the University of St Andrews, UK, when he studied methane and nitrous oxide fluxes in the Kosñipata Valley in Manu National Park, southeastern Peru.
As Teh explained in a talk at the EGU meeting, the gradient of much of his route, which started at an altitude of 3500 m and headed down to the Amazon basin, was too steep to use eddy-covariance sampling. Nothing daunted, Teh went "back to basics" and used chambers to gather his data.
Teh was hunting for the "missing" sources and sinks in the tropics that regional atmospheric budget discrepancies indicate might be located in South America. Along his transect lay Puna grasslands at the top of the mountain, cloudforest at around 2800 m, then mid-elevation forest followed by foothills.
The findings? Altitude had a massive effect. At lower elevations the ecosystems were a sink for methane, but at higher elevations they acted as a methane source. The situation for nitrous oxide was more complex - low, mid and high elevations acted as a source, but the cloudforest at around 2800 m acted as a sink for the gas.
Now Teh would like to work with colleagues in modelling and remote sensing to build on these findings.
The thousands of delegates congregating in Vienna this year will find the EGU making further efforts to "green" the meeting - badge lanyards are made from bamboo fibre rather than PET and the conference schedule is smaller to save paper. It seems only appropriate, since many of the sessions at the conference will focus on the cryosphere (shrinking), climate (warming), natural resources (under pressure) and energy. But are such measures just a drop in the ocean, especially as environmental issues appear to have fallen down the priority list for many governments?
Indeed, governments received a call for action within the first half hour of the conference opening, with Millie Basava-Reddi of the International Energy Agency Greenhouse Gas R&D programme (IEAGHG) stressing the need for investment in carbon storage - her talk was delivered by session chair Michael Kühn because her flight was delayed.
While the G8 nations would like to see 20 carbon capture and storage projects up and running by 2020, the IEA target is 100 by 2020 and 3,400 by 2050. The agency's latest assessment, however, indicates that while 20 projects by 2020 are feasible, its own roadmap isn't, with just 50 projects likely by 2025. Worldwide there are currently 14 large-scale integrated projects in operation or execution; 2011 saw 74 large-scale projects in at least the planning stage. Basava-Reddi called on governments to allow for long project lead times - up to fifteen years - and to help provide up-front investment.
The challenges for carbon capture and storage in many cases mirror those for other subsurface technologies such as geothermal energy. Indeed Kühn's group at the Helmholtz Centre Potsdam, Germany, is researching how brine extraction from saline aquifers could help reduce the pressure rise induced by the addition of carbon dioxide, whilst at the same time providing geothermal heat.
There are a large number of issues in geothermal energy that need substantial research efforts, explained Adele Manzella of CNR Institute for Geosciences and Earth Resources, Italy. The upper 3 km of the Earth's crust could provide 60,000 times our current power consumption; the only snag is where and how to access that power. The up-front costs are high and it's hard to forecast production, especially since there is a lack of data on geothermal potential. But once systems are set up the energy produced is cheap compared to other types of renewable energy, since power is provided 24 hours a day.
The European Energy Research Alliance has set up a Joint Programme on Geothermal Energy, said Manzella. Areas under study include assessing Europe's resources for geothermal power, how to mitigate induced seismicity in reservoirs, and high-performance drilling.