Recently in EGU 2011 Category
by Liz Kalaugher at the EGU General Assembly in Vienna
At last year's EGU meeting several late-breaking sessions covered Iceland's Eyjafjallajökull volcano, which was still erupting. This year there was plenty of time to schedule sessions well in advance and, when it came to ash, more of an emphasis on its effects on the ground rather than the consequences of airborne ash for aircraft safety.
As a result of the eruption, farmland 15 km south of the volcano was coated with an ash layer more than 10 cm deep. Pierre Delmelle from the University of York, UK, believes he and his colleagues are the first to study the physical effects of ash on soil. There's usually more attention paid to chemical effects but Delmelle reckons that for the Icelandic volcano, the physical effects may be just as important.
Delmelle found that, when fine ash was ploughed into the soil, its permeability to water decreased, probably because of a change in pore size distribution. In the absence of ash, the soil exhibited a hydraulic conductivity - a measure of the ease with which water can flow through it - of around 0.9 mm/s. The figure for soil containing fine ash that incorporated sulphates and fluorides, however, was just 0.25 mm/s.
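To put those figures in context, here's a rough back-of-the-envelope comparison assuming simple Darcy flow under a unit hydraulic gradient; the 100 mm of water is an arbitrary illustrative amount, not a number from the study:

```python
# Darcy's law: flux q = K * i, where K is hydraulic conductivity and
# i is the hydraulic gradient (taken as 1 here for simplicity).

K_clean = 0.9   # mm/s, soil without ash
K_ashed = 0.25  # mm/s, soil with ploughed-in fine ash

water = 100.0   # mm of water to drain (illustrative only)

t_clean = water / K_clean   # seconds to drain through clean soil
t_ashed = water / K_ashed   # seconds to drain through ash-affected soil

print(f"Clean soil: {t_clean:.0f} s, ash-affected soil: {t_ashed:.0f} s")
print(f"Drainage takes {K_clean / K_ashed:.1f}x longer with ash")
```

Under these assumptions drainage slows by a factor of about 3.6 - enough to matter in poorly draining soils, as the next paragraph notes.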
Water flow through soil is important for agriculture as it affects the distribution of nutrients, and soil moisture is a key factor for healthy plant growth. While such a reduction in permeability is unlikely to be hugely detrimental to the well-draining soils in Iceland, in volcanic areas such as Indonesia, or following a super-volcano eruption, it could lead to water-logging.
With regard to chemical effects, the main concern for soil is the presence of fluoride in the ash, which can harm plants, livestock and people when it gets into the food chain. Delmelle found that ash from the second phase of the volcano's eruption - from 18th April until the end of May - contained eight times more soluble fluoride than ash emitted in the first phase, between 14th and 18th April. This initial ash had less than 200 mg of soluble fluoride per kg.
Delmelle believes that the steam present during the first phase of the eruption scavenged fluoride from volcanic gases. In the second, water-free, phase the ash was able to take up this fluoride instead.
However, it seems that ash from the two phases contained similar levels of acid-soluble fluoride, particularly fluorapatite. Since acid conditions occur in the guts of cows, sheep and humans, this is potentially an issue of concern, although Delmelle does not believe that it will cause diseases such as fluorosis in Iceland.
By Liz Kalaugher at the EGU General Assembly in Vienna
Oxygen minimum, or "dead", zones are found below just two per cent of the surface of the world's oceans but they're responsible for roughly one-quarter to one-half of marine nitrogen removal. Once oxygen levels drop, standard lifeforms cannot survive and bacteria that respire using nitrogen compounds rather than oxygen can take over.
It's been hard to measure the precise threshold for this changeover, but now a new oxygen sensor that's one hundred times more sensitive has revealed that it takes place at much lower oxygen levels - just 0.3 microM - than scientists believed.
Using the sensor, Tage Dalsgaard of Aarhus University, Denmark, and colleagues found oxygen concentrations of less than 0.01 microM (0.3 microgrammes per litre) over a distance of 2500 km along the coast of Chile and Peru. Previous best estimates had indicated levels of 1-2 microM, Dalsgaard told a press conference at the EGU General Assembly. The team only found nitrogen-removing processes taking place when oxygen levels were less than 0.3 microM; these reactions occurred at a greater rate deeper into the dead zone.
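As a sanity check on the units, a molar dissolved-oxygen concentration converts to a mass concentration via the molar mass of O2 - a simple illustration, not part of the study's analysis:

```python
# O2 has a molar mass of 32 g/mol, so 1 microM (1 umol/L) of dissolved
# oxygen corresponds to 32 micrograms per litre.
M_O2 = 32.0  # g/mol

def umol_to_ug_per_litre(c_umol_per_litre):
    """Convert a dissolved O2 concentration from umol/L to ug/L."""
    return c_umol_per_litre * M_O2

print(umol_to_ug_per_litre(0.01))  # detection-level concentration: ~0.3 ug/L
print(umol_to_ug_per_litre(0.3))   # nitrogen-removal threshold: ~10 ug/L
```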
Some of the most extensive oxygen minimum zones are found in the Eastern Tropical North Pacific, Eastern Tropical South Pacific and the Arabian Sea. The zones form when nutrient-rich waters from the depths rise to the surface and enable a bloom in plankton growth. Once the plankton reach the end of their lives, decomposition of their bodies as they sink to the depths consumes a large amount of oxygen. In a typical oxygen minimum zone in the open ocean, the top 50 m of water are oxygenated, the next 250 m contain little oxygen and levels of the gas rise again towards the seafloor.
Although the zones are a natural phenomenon, climate change is likely to reduce oxygen levels further. Indeed, Caroline Slomp of the University of Utrecht told reporters that low oxygen is the third major ocean problem of climate change, after temperature rise and acidification. That's because oxygen is less soluble in warmer water, and warmer surface waters don't mix so well with those beneath. Increased levels of nitrogen entering coastal seas from activities such as fertiliser use are also creating dead zones close to shore.
Not only are these areas suffering stress because of low oxygen, explained Lisa Levin of Scripps Institution of Oceanography, but they're increasingly likely to be exploited as fishing activities move outwards from continental shelves to continental slopes, oil and gas exploration continues and extraction of resources such as diamonds and phosphates begins in new sites. Levin is keen that we understand more about the resilience of ecosystems in these areas before further exploitation occurs.
"Oxygen minimum zones play key roles in ocean biogeochemistry and are an important repository of microbial animal biodiversity," she said.
With this in mind, Levin and colleagues did fieldwork off the coast of Goa, western India. Their aim was to see how oxygen availability affects the recovery of sediment-dwelling organisms after disturbance. Introducing colonization trays containing soft sediment to the sea-floor at three different depths revealed that recolonization was strongly oxygen-dependent.
The tray on the seabed at 542 m, where oxygen levels were lowest, was not colonized at all. Levin says that this was no surprise as the background community did not contain any animals. At 800 m, where oxygen levels were ten times higher, only a few colonizers - mainly worm species - moved in. And on the seafloor at 1147 m, where oxygen levels were ten times higher again, there was much more extensive colonization. This time the incomers were mainly from one opportunistic polychaete worm species (Capitella) that is known as a pollution indicator.
By Liz Kalaugher, EGU General Assembly in Vienna
Although they have a common goal - lowering the carbon footprint of energy systems - carbon capture and sequestration (CCS) and geothermal energy could one day end up in competition for both suitable geological sites and funding. Frank Schilling of the Karlsruhe Institute of Technology, Germany, believes there's a solution; he reckons that the two technologies could be combined to the benefit of both.
"Our storage capacity is limited so we must use the resource wisely," he told the press at the EGU Assembly in Vienna.
Not only could the two technologies share expertise in drilling technology and reservoir management, he believes, but geothermal could enhance the storage potential of CCS. A typical geothermal energy system removes hot water (around 40 °C or higher) from thermal aquifers around 1000 m below the ground, extracts the heat and returns cold water to the depths.
Since this cold water is denser than the hot water it replaces, it occupies less volume, potentially freeing up pore space for storing carbon dioxide. In turn the addition of carbon dioxide could prevent any problems for the sub-surface caused by the introduction of negative pressure.
For geological formations where there are multiple barriers at different depths, an alternative combined system could see hot water removed from the thermal aquifer and carbon dioxide pumped in. Following heat extraction, the cold water could be returned to a higher level - the resulting negative pressure gradient would make leakage of carbon dioxide from below less likely. According to Schilling, halving the effective pressure on the caprock doubles the security of the system or doubles the storage space.
by Liz Kalaugher at the EGU General Assembly in Vienna
Spring 2011 has seen the largest-ever degree of ozone loss over the northern hemisphere, journalists at the EGU General Assembly in Vienna heard this morning.
This year about 40% of the ozone column above the Arctic has disappeared, breaking the previous record of 30%. The cause? An unusual persistence of cold temperatures in the stratosphere into March, allowing longer lifetimes for the polar stratospheric clouds that enable conversion of pollutant gases into ozone-destroying chlorine.
Dubbed "mother-of-pearl" clouds because of their attractive appearance, polar stratospheric clouds form at temperatures below -78 ° C. The chlorine they help create, meanwhile, can only destroy ozone in the presence of sunlight, which reappears in the polar spring.
The ozone layer acts, according to Geir Braather of the World Meteorological Organization, "like a suncream with factor 70": it cuts by 70% the amount of short-wave ultraviolet rays reaching the Earth's surface. So any disruption of this protection could have implications for humans.
As weather systems cause the polar vortex to shift, ozone-depleted air masses can move above Europe, Russia and North America. Indeed in 2005, when the second-largest ozone decrease took place, the ultraviolet index in March in one European country was five, bringing a sunburn time of 20-30 minutes for the fair-skinned. While this is not above summertime levels, it is unusual for spring and the researchers feel that people should be informed.
To date, air affected by the record-breaking ozone loss has hovered over Canada, eastern Russia and Scandinavia but has not extended down to the heavily-populated regions of Germany and central Europe, although this situation could change. The polar vortex is currently over central Russia and is forecast to be stable until April 9th.
At the south pole, where stratospheric temperatures are typically colder, springtime ozone loss of around 50% occurs each year. Fortunately, while the resulting ozone-depleted air sometimes reaches the southern tip of Chile, it generally does not extend above heavily-populated areas.
The more variable temperatures in the Arctic mean that some winters see ozone loss of just 5 or 10% whereas a "normal" winter could see 30% loss. Although this year's ozone loss has been unprecedented, it was not unexpected - scientists had predicted that such cold conditions in the stratosphere would lead to increased ozone loss.
While the Arctic was warmer than average at ground level this winter, temperatures in the stratosphere were colder. And when it comes to the stratosphere, the cold winters have been getting colder. "We don't know what's driving this long-term change," said Markus Rex of the AWI, Germany, who will be publishing his analysis of ozonesonde data in a Nature paper. Greenhouse gases could be a factor, but that's by no means certain.
Speaking on behalf of the World Meteorological Organization, Braather was keen to stress that this year's Arctic ozone loss record was not because the Montreal Protocol isn't working. Set up in 1987, this agreement has seen levels of ozone-depleting gases such as chlorofluorocarbons and halons above the Arctic fall by 10% of the decline needed to return them to the 1980 benchmark level. Outside the poles the ozone layer is projected to recover by around 2030-2040. In the Antarctic recovery is expected by 2045-2060 and the picture is one or two decades rosier for the Arctic.
by Liz Kalaugher at the EGU General Assembly, Vienna
Back in 1895, the sudden collapse of the Altels cold (high-altitude) hanging glacier brought around five million cubic metres of ice crashing down onto the valley below. The event, the largest known ice avalanche in the Alps, killed six people and 170 cows, as well as causing the valley's entire summer harvest to fail.
Although there were various theories as to the cause - chief amongst them the increase in summer temperatures over the preceding years - the exact mechanism that led a roughly semicircular region of ice to detach from the bedrock beneath was unclear. But now Jerome Faillettaz from ETHZ in Switzerland and colleagues have used a new numerical tool to show that the avalanche must have been due to a local decrease in the friction coefficient between the ice and bedrock, probably because of meltwater entering via a crevasse.
Applicable to landslides and rockfalls as well as ice on steep slopes, the tool uses a simple "blocks and springs" approach for modelling gravity-driven instability. As the blocks begin to slide, the brittle springs fail; the model accounts for factors such as creep, friction and glacier boundary conditions.
The researchers found that while altering the glacier geometry in the model or changing the support provided by the peak's side glacier did not cause ice break-up, a uniform change in friction coefficient across the whole of the glacier base made all of the ice fall. But when the team initiated a progressive decrease in friction in just one area of the glacier, a crown crevasse opened up and the ice failed with a similar semicircular pattern to the 1895 event.
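The flavour of such a model can be captured in a few lines. The sketch below is an illustrative toy of my own construction, not the authors' code: a one-dimensional chain of blocks on a slope, each held by basal friction and by brittle springs linking it to its neighbours. A local drop in the friction coefficient - standing in for meltwater reaching the bed - makes that patch detach while the rest of the slope holds:

```python
import numpy as np

N = 20                                  # number of blocks along the slope
g, m = 9.81, 1.0                        # gravity (m/s^2), block mass (kg)
theta = np.radians(30)                  # slope angle
drive = m * g * np.sin(theta)           # downslope driving force per block
normal = m * g * np.cos(theta)          # normal force per block
k = 1.5                                 # max force each brittle spring can carry (N)

def cascade(mu):
    """Iteratively fail blocks whose basal friction plus spring support
    from intact neighbours can no longer hold the driving force."""
    failed = np.zeros(len(mu), dtype=bool)
    changed = True
    while changed:                      # repeat until no new block fails
        changed = False
        for i in range(len(mu)):
            if failed[i]:
                continue
            # springs only transmit support from neighbours that are intact
            support = sum(k for j in (i - 1, i + 1)
                          if 0 <= j < len(mu) and not failed[j])
            if drive > mu[i] * normal + support:
                failed[i] = True        # the brittle spring snaps, block slides
                changed = True
    return failed

mu = np.full(N, 0.9)    # friction high enough that the slope is stable
mu[8:12] = 0.2          # local friction drop, e.g. meltwater at the bed

print(np.where(cascade(mu))[0])   # only the low-friction patch detaches
```

With these (entirely illustrative) parameters the four weakened blocks fail but the cascade stops at their stable neighbours, echoing the localized, crown-crevasse-bounded failure the real model produced.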
The research isn't only relevant to the past - it's likely that climate change will affect the stability of cold hanging glaciers around the world. Ice from the Glacier de Taconnaz on Mont Blanc could, for example, crash down onto the resort town of Chamonix. Faillettaz says that we need detection methods such as seismic monitoring to provide an early warning of such disasters.
Indeed Faillettaz and colleagues have trialled the use of seismic geophones on the Weisshorn cold glacier, which breaks up on a roughly ten-to-fifteen-year cycle and can disturb road and train access to Zermatt, Switzerland. The scientists believe they can detect acceleration of the ice in this way up to two weeks before any fracture. Although it's also possible to spot surface acceleration visually, the technique doesn't work in the bad weather conditions when such events often take place.
Faillettaz is cautious about the general applicability of the seismic method, however. In the case of the Weisshorn, trapped water is not believed to be a factor behind the break-up; it may be harder to use seismic techniques to predict an Altels-type failure.