The news on the cryosphere hasn't been good at this year's EGU meeting, with poor prognoses for mountain glaciers, Arctic ice shelves and permafrost.
The situation seems particularly bleak for the ice shelves in Canada's far north. Luke Copland of the University of Ottawa, Canada, has been studying the northern coast of Ellesmere Island. Records from an expedition to the region in 1906 indicate that the ice shelves had an area of 10,000 square km. But by early 2008 there were only five ice shelves left, with a total area one-tenth of the figure a century ago. And by September of 2008 the Markham ice shelf had disappeared, leaving four ice shelves with a total area of just 750 square km.
Copland says there are several reasons for the changes: winter temperatures rising by one degree C per decade, which prevents the ice shelves from rebuilding over the winter; the decreased extent of sea ice, which used to protect the outside edges of the shelves from tides and waves; and very warm summers and high winds in 2005 and 2008. He reckons it would take centuries to rebuild the ice shelves even if temperatures stabilize at current levels, and he would be surprised if the Arctic ice shelves survive.
Smaller and thinner than those in the Antarctic, these northern hemisphere ice shelves have become disconnected from the glaciers that created them. This has allowed freshwater pools around 40 m deep and known as epishelf lakes to build up behind the ice shelves on top of oceanic water, creating unique ecosystems. These ecosystems will disappear if the remaining Arctic ice shelves go.
Meanwhile Wilfried Haeberli of the University of Zurich, Switzerland, reported that he believes the disappearance of all mountain glaciers by 2050 is a realistic scenario. The Pyrenees, for example, have lost 90% of their ice cover over the last few decades. This will have big implications for water supply, as rivers such as the Rhone and Rhine gain most of their water from glacier melt in July and August. "Most of my students will experience the loss of most of the European glaciers by the middle of the century," said Haeberli. "There is practically no hope for glaciers in the mountains."
Plans are underway in Europe for the first icebreaker built for scientific purposes and equipped with deep-sea drilling gear. If the ship, dubbed the Aurora Borealis, makes it off the drawing board, it will participate in the Integrated Ocean Drilling Program (IODP) and help to boost the amount of core data available from the Arctic.
The 24-country IODP already has two dedicated vessels - Japan's brand new Chikyu drill ship, which uses riser drilling, and the modernised US riserless ship the JOIDES Resolution. At the moment the EU and Canada are contributing to the project with "mission-specific" operations in areas that are hard for the other vessels to reach, such as shallow waters and ice-covered oceans.
"We can drill in the Arctic with the technology we have," said Catherine Mevel of the European Consortium for Ocean Research Drilling. "But this will be easier with a dedicated ship."
The Aurora is planned to be 200 m long and 50 m wide, with two moon pools to enable drilling and scientific measurements of the ocean even under heavy ice cover. Earlier Arctic drilling expeditions, such as ACEX, used three ships, including a Russian nuclear icebreaker, whereas the Aurora would be able to complete missions without support from other vessels.
At the moment a group is investigating the specifications and funding options for the ship, which would cost around $50-60 m a year to run.
A dedicated workshop on scientific drilling at Vienna University will follow the EGU meeting this week.
Earlier in the week delegates at the EGU meeting in Vienna heard Ray Bates call for climate scientists to define more precisely what they mean by climate feedback. Yesterday, Caitlin Buck of the University of Sheffield, UK, stressed the importance of detailing the uncertainty in climate change predictions. Speaking at a press briefing in advance of two conference sessions on the topic, Buck explained why she believes that seeking single-number estimates for future climate, for example the temperature in Austria in July 2100, is unhelpful.
"All pieces in the jigsaw are themselves uncertain," she said. "We should quantify those uncertainties and present them in easy ways. If we don't keep track of the uncertainty, we're losing the fuzziness at the edge of the information. At each step of the way we are losing the most extreme things that might happen and those are the things we really want to know about to help us make good decisions."
Rather than a single number for the "most likely" predicted temperature change by 2100, Buck would like to see scientists using bar charts of temperature change versus probability when they present their results to policymakers and the public.
At the same briefing, Ron Prinn of MIT explained how he has used a "wheel of fortune" approach to communicate to members of the US Congress the link between emissions cuts and the odds of different amounts of temperature rise. He drew up a separate wheel for each emissions-cuts policy; each wheel was divided into temperature-rise segments whose size reflected the relative probability of that rise occurring. For example, a wheel for a "business as usual" scenario had much larger segments for the higher temperature rises than the wheel for a policy of stringent emissions cuts. "This shows the value of the policy even if there's lots of uncertainty in the data," he said.
A team in Germany has calculated that underground coal gasification combined with an above ground electricity plant and storage of the resulting carbon dioxide back underground can both cut carbon emissions and compete with other energy technologies in the European market. Thomas Kempka and colleagues at the research centres of GFZ Potsdam and RWTH Aachen see the technology as a bridging measure until use of renewable technologies becomes more widespread.
The technique works by drilling two wells into a deep coal seam. The first, an injection well, introduces oxygen and water vapour into the seam, encouraging gasification of the coal at high pressure to produce hydrogen, methane, carbon monoxide and carbon dioxide, which leaves the seam via the production well. The methane and hydrogen can be used as fuel directly, or to create methanol, and also to run a combined cycle electricity plant. Carbon dioxide removed at the surface can then be pumped back down and stored in coal seams where gasification has been completed.
The team reckons the maximum amount of carbon capture achievable in this way is 86%. This would create an energy generation technique 20% cheaper than nuclear electricity generation but with similar emissions. But even a 50% capture rate would bring coal emissions down to those of natural gas.
There has already been considerable research into carbon storage in empty coal seams and into coal gasification, which gives the researchers a head-start, although they reckon it will be 15-20 years before large-scale use of the process. The team has calculated that Europe would have enough coal for this Underground Coal Gasification-Carbon Capture and Storage technique to fulfil all its energy needs for 68 years, although they believe it is important to maintain a mix of sources.
Kempka and colleagues are currently using a 25 square km sample area in Germany that contains 7 coal seams for theoretical studies. They have calculated that to run a 600 MW power plant requires nine square kilometres of coal seam with an average thickness of 1.5 m. This would give a runtime of 20 years, with each seam being used up after about three years and becoming available for carbon dioxide storage. Storage during the first three years, meanwhile, could take place in saline aquifers.
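The quoted seam dimensions can be checked with a rough back-of-envelope calculation. The coal density and calorific value below are typical textbook figures assumed for illustration - neither appears in the article - so the result is only an order-of-magnitude sketch:

```python
# Back-of-envelope check of the quoted figures: 9 square km of seam,
# 1.5 m thick, feeding a 600 MW plant for 20 years.
area_m2 = 9e6          # 9 square km of coal seam (from the article)
thickness_m = 1.5      # average seam thickness (from the article)
density = 1300         # kg/m^3 - ASSUMED typical hard-coal density
calorific = 24e6       # J/kg  - ASSUMED typical calorific value

# Thermal energy contained in the seam
seam_energy_J = area_m2 * thickness_m * density * calorific

# Electrical energy delivered by a 600 MW plant running for 20 years
plant_W = 600e6
runtime_s = 20 * 365.25 * 24 * 3600
output_J = plant_W * runtime_s

print(f"thermal energy in seam:  {seam_energy_J:.2e} J")
print(f"plant output over 20 y:  {output_J:.2e} J")
print(f"implied chain efficiency: {output_J / seam_energy_J:.0%}")
```

Under these assumptions the implied end-to-end efficiency comes out near 90%, which is higher than a real combined-cycle chain achieves, suggesting the 600 MW figure may be thermal rather than electrical, or that the capacity factor is below one; the numbers are nonetheless consistent to within a small factor.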
Several aspects of the technique require further investigation, said Kempka at the EGU meeting, including environmental issues such as subsidence, aquifer pollution from compounds released during the gasification process, the safety of the carbon storage, and whether carbon dioxide can dissolve pollutants and transport them elsewhere underground.
While biofilms can corrode metals and speed up rock dissolution, it looks like these assemblies of bacteria could actually help to protect nuclear waste. Jean-Louis Crovisier of the Centre National de la Recherche Scientifique in Strasbourg, France, has found that the bacterium Acidithiobacillus thiooxidans can reduce the amount of elements such as strontium and caesium dissolving from nuclear glass into water in the laboratory.
Speaking at the EGU meeting, Crovisier explained that scientists who believe biofilms damage glass and concrete-based materials are finding bacteria or biofilms in holes and assuming that they have caused the hole. He reckons that's equivalent to finding early humans living in caves and deducing that they made the cave.
The real test, according to Crovisier, is comparing changes in materials in the presence and absence of bacteria. He has found that both Pseudomonas bacteria and Acidithiobacillus thiooxidans lowered the concentration of caesium and strontium entering water from a nuclear glass, compared with sterile conditions. This indicates that the presence of a biofilm is actually protective, potentially trapping the elements.
As to whether biofilms could be used to protect nuclear waste in the field - it's still early days. Research so far has only taken place in the laboratory and much more is needed before any application - for example, scientists would need to find out the effects of a consortium of bacteria rather than just one species at a time. The species making up a biofilm would also be likely to change over the hundreds of years that the waste remains radioactive.
There's currently a video called "Don't panic: flaws in catastrophic global warming forecasts" circulating amongst the policy community. That's according to Ray Bates of University College Dublin, Ireland, who gave the Vilhelm Bjerknes medal lecture at the European Geosciences Union meeting. The video claims that whereas most scientists assume that an unknown but stable system is dominated by negative feedbacks, climate scientists are an exception - they assume that the climate is dominated by positive feedbacks. As a result, the video reckons, there's no need to worry about climate change.
Bates reckons that this confusion hasn't been helped by the fact that climate scientists aren't always clear in their definition of what they mean by feedback. In fact, there are at least six different definitions of feedback and, depending on the definition used, the feedback in some systems may be either positive or negative.
"I believe there is an urgent need for us as climate scientists to agree on an accepted set of definitions for climate feedback," said Bates. "That will make it easier both for ourselves and for others outside to read our literature."
There's been widespread agreement that it's important to keep climate change from exceeding dangerous levels; the European Union has come up with 2 degrees C as a target. What's less clear is the emissions reductions scenario we will need to keep below this guideline.
And it's possible that the complex array of different emissions scenarios has delayed agreement. For example, in 2007 the G8 nations said they aimed to halve global emissions of carbon dioxide by 2050, while economist Nicholas Stern in his Stern review of the economics of climate change suggested 25% cuts by 2050 and an eventual reduction target of 80%.
With that in mind, a couple of groups of researchers are aiming to simplify the science involved to help policymakers come to decisions, as they will need to do at the climate negotiations in Copenhagen in December.
Chris Jones of the UK's Met Office and colleagues have developed the concept of the cumulative warming commitment. Jones, who detailed the work at the EGU meeting, sees it as a "framework for keeping policy simple" and a way of "taking the climate uncertainty out of the equation a bit".
The concept relies on the fact that warming is very sensitive to the cumulative amount of emissions but not to the pathway by which that amount of carbon was introduced into the atmosphere. The cumulative warming commitment provides a measure of the peak amount of warming per unit of cumulative carbon emissions. Reductions scenarios can be compared and analysed by integrating the total amount of emissions produced and converting this to a peak warming.
Of course, the 64 million dollar question is, what is the size of the cumulative warming commitment? Jones says that the value is uncertain, depending on factors such as climate-carbon cycle feedback, climate sensitivity and ocean heat uptake. But following analysis of data from C4MIP (the Coupled Carbon Cycle Climate Model Intercomparison Project), he and his colleagues have agreed on the most likely figure.
Using this most likely value indicates that the maximum cumulative amount of carbon we can emit and still keep to the 2 degrees C target is 1 trillion tonnes. "We are halfway there already," said Jones.
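The framework reduces to a single multiplication: peak warming equals the cumulative warming commitment times total carbon emitted. The article does not quote the "most likely" value directly, but it is implied by the 2 degrees C target and the 1 trillion tonne budget, as this sketch shows:

```python
# Cumulative warming commitment (CWC): peak warming scales with the total
# carbon emitted, regardless of the emissions pathway.
target_warming_C = 2.0   # EU 2 degrees C guideline (from the article)
budget_TtC = 1.0         # maximum cumulative emissions for that target (from the article)

# The CWC value implied by those two figures (not quoted directly in the article)
cwc = target_warming_C / budget_TtC   # degrees C per trillion tonnes of carbon
print(f"implied CWC: {cwc} degrees C per TtC")

# "We are halfway there already" - peak warming already committed by ~0.5 TtC emitted
emitted_TtC = 0.5
committed_C = cwc * emitted_TtC
print(f"peak warming already committed: {committed_C} degrees C")
```

The point of the framework is visible in the arithmetic: only the cumulative total matters, so any pathway summing to the same tonnage gives the same peak warming.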
By focusing on the emissions budget as a whole, rather than the speed and timing of emissions cuts, the concept enables governments to make social and political decisions on how best to achieve their carbon goals. For example, a country might choose to delay action until later and then make faster cuts, although Jones stressed that could have detrimental economic implications.
Further details of the work will appear in a paper in Nature next week.
Meanwhile, Nathan Gillett of Environment Canada has been developing the carbon-climate response metric, a measure of temperature change (rather than peak temperature) divided by cumulative carbon emissions. Again using C4MIP models, Gillett found that values for his CCR stabilised at 1.0 - 2.1 degrees C per trillion tonnes of carbon emissions for 2050-2100.
What's more, Gillett says that the constancy of the value means that it's possible to estimate CCR from observations by working out the carbon dioxide-induced warming caused by cumulative emissions over a given period. For the 1990s, he and his coworkers' best estimate of CCR, using a carbon dioxide-attributed warming of 0.48 degrees from a total 0.94 degrees of warming, is 1.5 degrees per trillion tonnes, which is in good agreement with the models.
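The observational estimate is a simple ratio. The cumulative-emissions figure the team used is not quoted in the article, but it can be recovered from the two numbers that are:

```python
# Observational CCR estimate for the 1990s: CO2-attributed warming
# divided by cumulative carbon emissions to date.
co2_warming_C = 0.48   # CO2-attributed warming, degrees C (from the article)
ccr = 1.5              # resulting CCR, degrees C per TtC (from the article)

# Cumulative emissions implied by those two figures (derived, not quoted)
implied_emissions_TtC = co2_warming_C / ccr
print(f"implied cumulative emissions by the 1990s: {implied_emissions_TtC:.2f} TtC")
```

The derived figure of roughly 0.32 trillion tonnes sits comfortably within the 1.0 - 2.1 degrees C per TtC range the C4MIP models give, which is the agreement Gillett refers to.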
More details of Gillett's work will appear in Nature in a couple of weeks.
The ozone hole that forms above Antarctica each spring will start to recover in about 20 years. That's according to David Hofmann of the US National Oceanic and Atmospheric Administration Earth System Research Laboratory, who spoke at a press briefing at this year's EGU meeting in Vienna, Austria.
Hofmann and colleagues have been taking ozone measurements from balloons above Antarctica since 1986. Their calculations of the rate at which ozone disappears as the sun rises above Antarctica in early September indicate that there are no signs yet of ozone hole recovery.
Indeed, Hofmann reckons that recent fluctuations in the size of the ozone hole each year have been down to meteorology. But on the plus side, for the last 6 or 7 years there's been no evidence of a continuing decline - ozone loss rates have stabilized and the "patient isn't getting any worse".
"If you're sick you're very happy when you start to get better but it still takes a long time," said Hofmann, continuing the human patient analogy.
When the sun first makes an appearance in the region on September 7th it kicks off the ozone-destroying properties of halogen molecules that have congregated on cold cloud surfaces during the winter months. By mid-October the resulting reactions have destroyed almost all the ozone in the band 14-21 km above the Earth. Later in the season, as the stratosphere warms and the clouds disappear, the ozone mostly returns.
During this first month of sunlight, Hofmann's team found that the ozone concentration declines by roughly 12% a day. "It's like uncompounding interest," he said. And, in the band 16-18 km high, the ozone loss rate peaks at 15-20% a day.
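A constant percentage loss per day is exponential decay, which is why a 12% daily rate wipes out almost all the ozone by mid-October. The 38-day span below is an estimate of September 7th to mid-October, and treating the loss rate as constant over the whole period is a simplification:

```python
# Exponential decay from a constant daily loss rate: after n days at a
# fractional loss r per day, a fraction (1 - r)**n of the ozone remains.
daily_loss = 0.12   # ~12% a day (from the article)
days = 38           # ASSUMED span, roughly September 7 to mid-October

remaining = (1 - daily_loss) ** days
print(f"fraction of ozone remaining after {days} days: {remaining:.3f}")
```

Under these assumptions less than 1% of the ozone in the band survives the first month of sunlight, matching the observation that almost all of it is destroyed by mid-October.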
Between 1986 and the early 2000s, this loss rate increased but since then it has stayed about the same. That's probably due to a gradual decline in the amount of halogens in the atmosphere, following legislation restricting the use of chlorofluorocarbons, but the gases have a long lifetime.
As for the future, the team's model suggests that spring ozone loss rates will be roughly constant for 15 to 20 years. Then the ozone hole will begin to recover and will approach normal levels in 2060-2070.
"Three or four years ago I would have said we will probably detect the patient getting better in two to three years and recovering in 30-40 years," said Hofmann. But now the scientists have realised that recovery will be slower. "For us scientists who have to go to Antarctica and measure stuff it's job security," he added.
As generally seems to be the case, climate change complicates the picture further. As the troposphere warms, the stratosphere will cool to maintain the Earth's energy balance. This will create more clouds in the stratosphere and more activated chlorine to destroy ozone. "Global warming will probably keep the ozone hole going a couple of years longer," said Hofmann.