
April 2011 Archives

Almost all of the world's demand for electricity, transportation and heating energy could be met from renewable sources such as wind, solar and geothermal power by 2050, WWF International has said, in yet another study concluding that this goal is achievable. It follows on from the '100% by 2050' global study by Delucchi and Jacobson, now published in Energy Policy, and several EU-focused '100% by 2050' studies, e.g. www.roadmap2050.eu, www.rethinking2050.eu and www.pwc.co.uk/eng/publications/100percentrenewable_electricity.html

WWF's report, produced with researchers at Dutch organizations Ecofys and the Office for Metropolitan Architecture, says the share of oil, coal, gas and nuclear in the global energy mix could be cut to 5% by 2050, while energy-saving measures cut total demand by 15% from 2005 levels, starting from an assumed baseline of 520 EJ/a. The programme would require €3.5 trillion p.a. by 2035 to modernise buildings and electricity grids and to expand wind farms and solar parks. It would take until 2040 to pay off.

"This is insurance against the volatility of oil and gas prices and climate change," Stephan Singer, editor of the study and director of energy policy at WWF, told Bloomberg. He said it could be done using currently available technologies, but new technologies that aren't currently close to commercialisation could make it possible to get 100% of the world's energy by 2050.

Ecological issues are brought to the fore: there would be close scrutiny of hydro impacts, and bio-energy use would be strictly controlled. Singer also noted that achieving the ramp-up in energy efficiency and renewable power would require behavioural changes, including eating less meat, using more public transport and electrifying cars. New financing models will be needed to promote investments that generate long-term gains rather than immediate profits. "Sufficiency must be part of the solution - technology is not the sole provider. The global middle classes and the global rich of this world are not a blueprint model for the poor." 'The Energy Report': www.wwf.org.uk/researchcentre/researchcentre_results.cfm?uNewsID=4565

Quite apart from these essentially political prescriptions, there is a lot to chew on in this study and in the other similar exercises that have emerged recently. Time was when renewables were seen as interesting but marginal. Now their supporters are claiming that they could dominate the energy scene. Their projections may be seen as very technologically and economically optimistic, especially given the low starting point - renewables supply only about 18% of world power at present, and that's including hydro. But progress in the last few years has been very dramatic. We are already approaching 200 GW of wind and 40 GW of solar PV globally, with prices falling rapidly. And many new, improved technologies are emerging.

However, while some countries and regions - notably the EU, USA and China - are pushing ahead with quite radical technology deployment programmes, overall the pace is relatively slow and there are many institutional biases and hurdles to overcome. But that's nothing new - most new technologies face similar constraints. So perhaps it's not surprising that the energy scenarios emerging from the oil industry do not have renewables supplying anywhere near as much as those mentioned above.

For example, in 2008 Shell produced a set of 2050 scenarios with renewables, including biofuels, reaching only about 37% by then in its 'Scramble' scenario and 30% in its more focused 'Blueprints' scenario - fossil fuels continued to dominate. Although, interestingly, nuclear was well behind renewables in both. See: www-static.shell.com/static/public/downloads/brochures/corporatepkg/scenarios/shellenergyscenarios2050.pdf

Shell has now produced a new set of energy scenarios, mostly running up to 2030, in which the future is seen as even more challenging: growth in energy demand rises dramatically, with coal and especially gas rising to meet it, while oil stays more or less static. Biomass and nuclear build up equally, but most electricity-supplying renewables less so: Shell stresses their technological and economic challenges.
www.shell.com/home/content/media/newsandmediareleases/2011/scenariossignalssignposts14022011.html and www-static.shell.com/static/aboutshell/downloads/aboutshell/signals_signposts.pdf

BP has also issued its own 2030 projections, with coal and oil levelling off, gas rising, and fossil fuels overall still dominating, although renewables, while small, expand, especially biofuels: 'Renewables (including biofuels) account for 18% of the growth in energy to 2030. The rate at which renewables penetrate the global energy market is similar to the emergence of nuclear power in the 1970s and 1980s': www.bp.com/liveassets/bpinternet/globalbp/globalbpukenglish/reportsandpublications/statisticalenergyreview2008/STAGING/localassets/2010downloads/2030energyoutlook_booklet.pdf

Exxon Mobil (Esso in UK) also had a go, with similar results: www.exxonmobil.com/corporate/files/corporate/energyoutlookslides.pdf

Some of the difference between the oil company scenarios and those from green NGOs and academics is due to fossil fuel lobby assumptions about demand and the role of gas (including shale gas), with CCS also seen as coming on line in a big way. For example, in its new report Shell says: 'Allowing natural gas rather than coal to grow to meet power demand is the surest, fastest and most comprehensive way there is to reduce CO2 emissions over the crucial next 10 years. Strong development of CCS programmes should help support such a strategy as part of a long-term vision for low-carbon energy supply.'

It's interesting to compare these oil industry views with those of the ostensibly independent International Energy Agency. In its new Energy Technology Perspectives report on 'Scenarios and Strategies to 2050', it too sees CCS as playing a major role. It includes a 'Blue Map' scenario in which, by 2050, renewables provide almost 40% of primary energy supply and 48% of power generation globally. And the IEA's Blue Map 'HI REN' Scenario has renewables supplying 75% of electricity by 2050. That's some way from 100%, but it does suggest that the oil companies may be over-cautious, at least on renewables. www.iea.org/techno/etp/etp10/English.pdf


Despite globally increasing awareness of climate change and a number of regional climate policies, growth in greenhouse gas emissions continues unabated. Emissions are not spread equally across nations. In fact, it is popular wisdom that the US and China are the big elephants in the room - in other words, the world's biggest emitters of CO2. In a seminal paper just published in PNAS, Glen Peters, Jan Minx and colleagues investigate the temporal change in consumption-based CO2 emissions across world regions. The key result: emissions embedded in products traded from developing countries, including China, to OECD countries exceed the emission reductions pledged by those countries under the Kyoto Protocol. In other words, from a consumer perspective, the seemingly climate-friendly countries of Europe lose part of their cutting-edge image.

There are other interesting insights. For example, the ratio of emissions from Annex B countries (with Kyoto pledges) to non-Annex B countries (no Kyoto pledges) was roughly 2:1 in 1990 but had more or less equalized by 2008, indicating a tremendous change in the way our world economy is structured. As Jan Minx, from the Department of Economics of Climate Change at the Technical University Berlin and co-author of the PNAS article, explains: "Most of the change in global emission patterns is mirrored in China and Russia: while Chinese emissions increased dramatically in the last two decades, significantly also fuelling increasing consumption in OECD countries, emissions from Russia and Ukraine fell significantly after 1990." In this regard, one can understand inter-temporal carbon emission patterns, production- and consumption-based, as signifiers of global structural change. While it is now China and Russia, other countries such as India could be of equal importance in the coming decades. The picture is completed by the exploding importance of global trade: whereas 4.3 Gt CO2 was embedded in international trade in 1990 (20% of global emissions), the figure was 7.8 Gt CO2 in 2008 (26%). Hence, emissions embedded in trade grew faster than emissions from other sources. The recent geo-history of carbon can then be summarized as the rise of China plus the ongoing globalization of production processes.
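As a rough back-of-envelope check of those trade figures (a sketch only, assuming the quoted percentages are shares of total global CO2 emissions), the implied totals and growth rates can be computed directly:

```python
# Back-of-envelope check of the trade figures quoted above, assuming the
# percentages are shares of total global CO2 emissions.

traded_1990, share_1990 = 4.3, 0.20   # Gt CO2 embedded in trade, and its share
traded_2008, share_2008 = 7.8, 0.26

total_1990 = traded_1990 / share_1990   # implied global emissions, ~21.5 Gt
total_2008 = traded_2008 / share_2008   # ~30 Gt

growth_traded = traded_2008 / traded_1990 - 1                               # ~81%
growth_other = (total_2008 - traded_2008) / (total_1990 - traded_1990) - 1  # ~29%

print(f"Implied global emissions: {total_1990:.1f} Gt (1990) -> {total_2008:.1f} Gt (2008)")
print(f"Trade-embedded emissions grew {growth_traded:.0%}; all other emissions grew {growth_other:.0%}")
```

On these numbers, trade-embedded emissions grew nearly three times as fast as the rest - precisely the structural shift the authors highlight.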

Glen P. Peters, Jan C. Minx, Christopher L. Weber and Ottmar Edenhofer (2011). Growth in emission transfers via international trade from 1990 to 2008. PNAS, published ahead of print April 25, 2011, doi:10.1073/pnas.1006388108

This recent report from Lawrence Berkeley National Laboratory presents evidence that California home buyers and sellers value photovoltaic solar systems installed on residential homes. The concept seems like it should obviously be true - an otherwise equivalent house with an additional feature should sell for more money. However, various homeowner associations around the United States often have arcane rules about what a homeowner can and cannot do to their home, and installation of PV panels is frowned upon in some neighborhoods - a tremendous shame. If someone is willing to pay for PV panels, which are still not the most cost-effective way to cut spending on grid-based electricity, then that homeowner should not have artificial barriers put in his or her way.

This and future studies will provide evidence that PV panels add value to homes, just as other features that do not provide or save energy (e.g. granite countertops and tile floors) do. There are enough difficulties in developing and deploying new energy technologies; we do not need attitudes to be one of them. There is very little that is inherently more beautiful about a normal roofing material than about a PV panel or even a solar hot water panel. If you get the chance, tell your homeowner association to remove barriers to energy-efficient and energy-producing systems. I for one made sure to move into a neighborhood that didn't have a homeowner association.

Tidal assessment


I have reported in previous blogs on some of the dozens of tidal current devices of various types and scales under development in the UK and elsewhere. Some of the projects are now well established, having been fully tested at sea, and some have been deployed at full scale - notably Marine Current Turbines' 1.2 MW SeaGen. However, most are still at relatively early stages of development, with claims about potential energy outputs and generation capacities still unproven, and some are just speculative design concepts and proposals.

When looking at novel proposals, care clearly has to be taken to assess the credibility of the claims being made. For example, generation capacities are sometimes claimed which could only be achieved, given the size of the device, at very high water speeds: the power output is proportional to the square of the turbine radius (which sets the swept area) and to the cube of the water speed. At the TidalToday.com Tidal Summit last year, Peter Fraenkel from MCT showed a chart comparing some existing devices on the basis of swept area. Not surprisingly, MCT devices, being well developed at full scale, came out on top, but some of the other rankings were interesting.
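To see how strongly the cube law can inflate headline ratings, here is a minimal sketch of the standard kinetic power formula, P = ½ρCpAv³, with illustrative (assumed) values for the power coefficient and rotor size:

```python
import math

def tidal_power_mw(radius_m, speed_ms, cp=0.4, rho=1025.0):
    """Kinetic power captured by a tidal rotor: P = 0.5 * rho * Cp * (pi r^2) * v^3.
    Cp (power coefficient) and rho (seawater density, kg/m^3) are illustrative."""
    swept_area = math.pi * radius_m ** 2
    return 0.5 * rho * cp * swept_area * speed_ms ** 3 / 1e6   # W -> MW

# The same assumed 8 m-radius rotor at different current speeds:
for v in (1.5, 2.0, 2.5, 3.0):
    print(f"{v:.1f} m/s -> {tidal_power_mw(8.0, v):.2f} MW")
```

Doubling the current speed multiplies the output eightfold, which is why a capacity quoted for a rare 3 m/s spring tide says little about performance at more typical flows.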

While SeaGen obviously won overall, amongst the other, albeit less developed, front runners the double-rotor Atlantis AK1000 came out best, followed by Tidal Generation's device, while OpenHydro didn't do well. The now abandoned Stingray oscillating hydrofoil did quite well, as did its effective follow-up, the double-hydrofoil Pulse Tidal see-saw design. Hammerfest Strom's turbine did even better. But the chart didn't cover any of the ducted turbine designs, or Voith's new turbine, rated at 1 MW, or the new multi-turbine Hydra Tidal Morild II, rated at 1.5 MW, which has just been installed for testing off Norway. That will have the largest rated capacity so far.

Projected capacity or claimed energy outputs are of course only part of the issue. What really matters is whether the devices are viable in engineering and economic terms. On the former, all we have to go on is their success to date - and, perhaps inevitably, in some cases there have been problems. For example, the Atlantis AK1000 suffered blade failure and is having to be re-installed at EMEC in the Orkneys, and OpenHydro's test device, installed in the Bay of Fundy, lost all its blades.

You'd expect problems during development - that's how technology improves - but some device concepts may be more fundamentally limited. For example, in the new NATTA DVD on tidal energy, Peter Fraenkel argues that propeller-type devices will always be best, as the wind and indeed hydro industries have clearly shown, and he was fairly dismissive of other approaches. Pulse Tidal has claimed that its hydroplane will be better in shallow water, since its swept area is larger than that of a propeller of similar scale. But it will only extract energy efficiently in the middle of its up/down traverses, much less at the end of each cycle, so the output may be lower. There could also be problems with multi-turbine designs due to wake interaction between the rotors, and rotor efficiency losses with ducted designs.

Clearly there is still some way to go before all the engineering issues have been resolved (by testing at sea) so that we can have a more solid basis for cost comparisons. But MCT is doing very well, and the tidal field is certainly an exciting one - bursting with innovative ideas. Let's hope for some more successes.

In addition to operational and economic issues, if tidal current devices are to be used on a wide scale then a key issue will be their environmental impacts. Most studies of tidal current turbines so far have suggested that impacts will be low. Even large arrays will not impede flows significantly, and the rotor blades will turn slowly - slower than wind turbines, and much slower than the turbines in tidal barrages, lagoons or hydro plants - so they should not present a hazard to marine life such as fish. Certainly experience with MCT's SeaGen has not indicated any problems: for example, no seal deaths have been attributed to the turbines since their installation in 2008. A sonar system has been used to detect the approach of any marine mammals and shut the turbines down. However, all structures put in the sea will have some impact, and this needs to be - and already is - carefully assessed when considering possible locations.

There can also be interactions between tidal projects if they are located near each other. The Energy Technologies Institute (ETI) is developing a model of the UK's tidal energy resources to improve understanding of these interactions. The Tidal Modelling Project will investigate the interaction between tidal energy extraction systems located at different positions around the UK, and how energy extraction at one site might affect the energy available and the nature of the tidal resource at other sites.

Of course, if we have tidal projects sensibly sited at various points around the coast then there can also be positive interactions in terms of overall power availability, since high tide, and maximum tidal flow, will occur at different times at each site.
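A quick numerical sketch (with assumed phase lags between sites) illustrates the smoothing effect: tidal current power varies roughly as the cube of the flow speed through each semidiurnal cycle, so summing sites whose peak flows are offset in time removes the zero-output slack periods that a single site would have.

```python
import numpy as np

# Illustrative only: three tidal sites with assumed 2-hour phase lags.
# Semidiurnal period ~12.42 h; normalised power taken as |sin(wt)|^3.
t = np.linspace(0.0, 24.84, 2000)        # two tidal cycles, in hours
omega = 2.0 * np.pi / 12.42
offsets_h = [0.0, 2.0, 4.0]              # assumed timing differences between sites

site_power = [np.abs(np.sin(omega * (t - d))) ** 3 for d in offsets_h]
aggregate = sum(site_power) / len(offsets_h)

print(f"Single site: min {site_power[0].min():.2f}, max {site_power[0].max():.2f}")
print(f"Three sites: min {aggregate.min():.2f}, max {aggregate.max():.2f}")
```

The single site drops to zero at every slack tide; the three-site aggregate never does, giving firmer, if lower-peaked, overall output.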

The Tidal Energy DVD can be obtained from NATTA: www.natta-renew.org. A short taster is at www.youtube.com/watch?v=nsntWXR63Sc

In March, in the wake of the Fukushima disaster, the German government shut down all of Germany's oldest nuclear plants and, like many other countries, set a review of safety and policy in motion. Then in April, the Secretary of State for the Environment and Nuclear Safety, Jürgen Becker, told Reuters: "A decision has been taken to shut down eight plants before the end of this year and they definitely won't be reactivated. And the remaining nine will be shut down by the end of the decade."

This policy was then backed by the German Association of Energy and Water Industries, BDEW, which said that nuclear should be phased out by 2020 or at the latest by 2023. It called on the government to set everything in motion to speed up the transition toward a stable, ecologically responsible and affordable energy mix without nuclear energy. 'The catastrophe at the Fukushima reactors marks a new era and the BDEW therefore calls for a swift and complete exit from using nuclear power.'

The association represents about 1,800 utilities, among them the operators of the country's 17 nuclear reactors, which, when all were running, generated 26% of Germany's electricity. The two biggest operators, E.ON AG and RWE AG, opposed the decision, but were outvoted.

The new approach is likely to be popular. According to a public opinion survey conducted by GfK Marktforschung in April, after Fukushima, only 5% of German consumers now consider nuclear energy to be a viable option in the longer term - down from 10% in January - with only 4% saying it is necessary to invest in this type of energy generation for reasons of climate protection. By contrast, there is widespread and growing support for further investment in the expansion of renewable energy, with 86% backing solar (January: 83%) and 80% wind energy (January: 72%).

Can they do it? German Environment Minister Norbert Röttgen told der Spiegel that he was confident that it could be done given the rapid growth of renewables and the potential for energy saving, but 'everyone will have to invest in the energy turnaround. The expansion of renewable energy, the power lines it requires and the storage facilities will cost money. That has to be clear. But after the investments are made, the returns will follow - I don't doubt that.'

He went on: 'First we'll have to focus on retrofitting buildings. The €460 million ($653 million) currently budgeted for that program won't be enough. But every euro in government subsidies will trigger seven or eight euros in private investment, which also translates into tax revenues. Everyone can benefit in the long term, from citizens to the economy to the environment.'

In terms of renewables, there would be no need to cover Germany with wind farms: 'We achieve the biggest capacities by replacing smaller wind turbines on land with more powerful ones and by generating wind energy in the North and Baltic Seas.' He concluded: 'The events in Fukushima marked a turning point for all of us. Now we jointly support phasing out nuclear energy as quickly as possible and phasing in renewable energies.'

Germany already gets 17% of its power from renewables, and the potential for expansion is certainly there long term. In addition to backing a nuclear phase-out, last year's 'Energiekonzept' review, produced by the Federal Environment Ministry, BMU, looked to renewables supplying 35% of electricity by 2020, 50% by 2030, 65% by 2040 and 80% by 2050. It also planned major increases in grid integration with the rest of the EU. It saw offshore wind as a major growth area, with 25 GW wanted in place by 2030; at present Germany has around 27 GW of wind, but mostly on land, plus around 16 GW of solar PV. In addition to a large hydro contribution, including pumped storage facilities, major new geothermal and biomass projects are on the way, with biogas seen as a key new option, replacing imported natural gas. The review also called for primary energy consumption to be halved by 2050 and, overall, aimed for a 40% CO2 reduction target by 2020.

With nuclear to be removed by around 2020, the renewables expansion programme and energy-saving initiatives will have to be accelerated - the original plan assumed the then-planned gradual nuclear phase-out. It won't be easy. But the political will now seems to be there to try. And it's argued that although the initial capital cost may be relatively high, the overall cost will fall, as renewables begin to replace expensive imported fossil fuel and prices fall under the feed-in tariff system. www.wupperinst.org/uploads/txwiprojekt/EEGExpand_report.pdf

What about elsewhere? The UK gets only 18% of its electricity from nuclear, so a phase-out ought to be easier than in Germany - if the political will were there. We have the world's best renewable resources, after all - far more than Germany. France gets 74% of its electricity from nuclear, so it's a harder nut to crack, but opposition to nuclear has mounted (42% are now in favour of a phase-out, as against 55% opposed) and a 2006 study suggested a phase-out was technically possible by 2040: see www.ieer.org/reports/energy/france/

Japan got 29% of its electricity from nuclear before the accident, and imports most of the rest of its energy, but it now has a massive incentive to change. There have been huge anti-nuclear protests in Japan - 17,500 people marched on April 10th. As I mentioned in a previous blog, a 2003 study saw a transition to 100% renewables as feasible - and green energy technology has moved on a lot since then: www.energyrichjapan.info

And globally, there are now scenarios showing that 100% from renewables by 2050, or even earlier, is possible: www.stanford.edu/group/efmh/jacobson/Articles/I/susenergy2030.html and www.wwf.org.uk/researchcentre/researchcentre_results.cfm?uNewsID=4565 Germany may be showing the way, but it is not alone. For example, Denmark, which has no nuclear plants, has set out a vision for energy supply in 2050, aiming to become independent of coal, oil and natural gas by then. www.kemin.dk/en-US/Sider/frontpage.aspx

Some EU countries (e.g. non-nuclear Austria) already get 50% or more of their power from renewables, and there are scenarios suggesting that the EU as a whole could get to almost 100% renewables by 2050, at reasonable cost: see e.g. www.roadmap2050.eu and www.pwc.co.uk/eng/publications/100percentrenewable_electricity.html

A non-nuclear future? There are still many who see it as both impossible and undesirable. China may have halted its nuclear programme temporarily and, like most countries, is reviewing its policies. But although programmes may slow, the non-nuclear vision has yet to be widely accepted. So what happens in Germany is likely to be crucial. If the Germans can do it, others may follow.

After the Fukushima disaster, Electricite de France CEO Henri Proglio said 'Nuclear is a formidable source of energy.' Maybe a nuance there lost in translation? Regardless, views do seem to have changed. In a public opinion poll in France just after the Japanese disaster, 55% said they were not in favour of a proposal by France's main green party to drop nuclear power, but 42% were in favour. This in the most pro-nuclear country in the world.

As you might expect, in anti-nuclear Spain a 40-year-old plant built to the same design as Fukushima's reactor 1 became engulfed by calls for a shutdown. In anti-nuclear Germany all the old plants were shut, and there were massive anti-nuclear demonstrations, a swing to the Green Party in the regional elections, and talk of closing all the plants by 2020 - a position now backed by the German Association of Energy and Water Industries (BDEW). Even in allegedly pro-nuclear UK, support for nuclear fell 12% (from 47% to 35%). A safety review followed in the UK, and stress tests were carried out across the EU. Meanwhile China halted all new nuclear projects and initiated a review of policy.

Given that the core containments seem mostly to have held so far, the Fukushima accident wasn't seen as being of the same order as Chernobyl, although it involved several reactors rather than just one, as well as several spent fuel storage ponds, including material stored above the reactors, some of which was evidently scattered around. But we have yet to see exactly what the final impact will be. http://www.nytimes.com/2011/04/06/world/asia/06nuclear.html?_r=1

At the time of the quake and tsunami in Japan, there were 3,400 tons of spent fuel in seven storage pools at Fukushima, some of it still very active, plus 877 tons of active fuel in the cores of the reactors - a total of 4,277 tons of nuclear fuel at the site. The storage pool above reactor 4 alone contained 135 tons of spent fuel. For comparison, the Chernobyl reactor held about 180 tons when the accident occurred in 1986, and about 6% of that was released into the atmosphere. We don't yet know what percentage was released into the air, land and sea at Fukushima - it will presumably be much lower in percentage terms - but leakages are still ongoing.

Chernobyl recounts

Although there were deaths due to the explosions, so far no radiation deaths have been reported at Fukushima, and some commentators have argued that it will remain so. This hopeful view was buttressed by a new report from the United Nations Scientific Committee on the Effects of Atomic Radiation, published in February, which says that the known death toll from Chernobyl was just 28 fatalities among emergency workers, plus 15 fatal cases of child thyroid cancer by 2005 - some of which might have been avoided if iodine tablets had been taken (as they now have been in Japan). And it says: 'To date, there has been no persuasive evidence of any other health effect in the general population that can be attributed to radiation exposure.' It doesn't speculate about future deaths 'because of unacceptable uncertainties in the predictions', but previous IAEA/WHO reports have talked of around 4,000. www.unscear.org/docs/reports/2008/AdvancecopyAnnexDChernobyl_Report.pdf

For a very different view, see 'Chernobyl: Consequences of the Catastrophe for People and the Environment', published in 2010 by the New York Academy of Sciences and authored by Russian biologist Dr Alexey Yablokov; Dr Alexey Nesterenko, a biologist and ecologist in Belarus; and Dr Vassili Nesterenko, a physicist and, at the time of the accident, director of the Institute of Nuclear Energy of the National Academy of Sciences of Belarus. Its editor is Dr Janette Sherman, a physician and toxicologist involved in studying the health impacts of radiation.

It concludes that, based on records now available, some 985,000 people died, mainly of cancer, as a result of the Chernobyl accident, between when the accident occurred in 1986 and 2004. More deaths, it projects, will follow. www.wagingpeace.org/articles/dbarticle.php?articleid=141

High estimates like this raised some hackles, including some bitter comments from the Guardian's George Monbiot, a new convert to the nuclear cause. However, there are clearly differing views on the impacts of radiation, one being that some low-level emitters, if ingested or breathed in, can cause much more damage than is usually assumed. A review of the New York Academy's report in the journal Radiation Protection Dosimetry concluded that it makes clear 'that international nuclear agencies and some national authorities remain in denial about the scale of the health disasters in their countries due to Chernobyl's fallout. This is shown by their reluctance to acknowledge contamination and health outcomes data, their ascribing observed morbidity/mortality increases to non-radiation causes, and their refusal to devote resources to rehabilitation and disaster management.'

The debate over numbers will no doubt continue. One might hazard a guess that the truth lies somewhere between the extremes. For example, the independent 2006 TORCH report estimated the final death toll as likely to be 30,000-60,000, which seems more credible. www.chernobylreport.org

However it's clearly an area of continuing dispute, which will no doubt become even more fraught given the estimate by Prof. Chris Busby that there could eventually be over 400,000 deaths from Fukushima: http://llrc.org/fukushima/fukushimariskcalc.pdf

While predictions like this may be provocative, the reality seems to be that we just don't know for sure, or at least can't agree, and it's that uncertainty that may be the most worrying thing to many people.

What next?

We now await the various safety reviews and policy responses. Some other countries have already moved quickly (some say too quickly) to change, or reassert, existing policies. But it seems unlikely that the UK government will make radical changes. Although he accepted that there could now be more financial problems, Energy Secretary Chris Huhne was still upbeat. He told the House of Commons on 24th March: 'We have to put an emphasis on safety. That is why we commissioned Dr Mike Weightman's report.' However, although he said that 'we will have to wait to see its results and base the debate on the facts', he added, 'I do not anticipate that it will lead to enormous changes.' And later he was quoted as saying: 'There is no intention for us to do anything but learn the lessons... for example, about the back-up for cooling.'

That seems to be the very minimum necessary, and even that would surely require the government to reconsider the conclusions of its 'Nuclear Justification' exercise; the still-to-be-completed 'Generic Design Assessment' for the new plants has already been extended. One thing seems clear: if we do still go ahead, with all the proposed new plants and their waste stores at sea level on the coast, the extra cost of improving safety, tightening regulation and upgrading insurance cover and evacuation procedures could be large.

Huhne has said that: 'We can do the 80% reduction in emissions by 2050 without new nuclear, but it will require a big effort on carbon capture and storage and renewables.' It could well be that the balance has now tipped and that this alternative approach, with energy efficiency also included, could be preferable- and could cost less. U.S. Energy Secretary Steven Chu recently claimed that wind and solar may compete with fossil fuels, without subsidies, within the next decade. Can the same be said of nuclear?

Road transport is entering a phase of major structural shifts. The age of cheap oil seems to be ending, while at the same time climate change casts major doubt on the societal benefit of our current mobility. A number of alternative fuels and technologies are entering the stage, and most of them promise lower GHG emissions and reduced oil dependency. While it is true that electric cars, lighter and more efficient vehicles and - perhaps - some sorts of biofuels can contribute to meeting our global challenges, their success crucially depends on reliable, effective and efficient policy frameworks. The US and the EU have implemented policies regulating road transport and its GHG emissions. However, these policies are adapted to fossil fuels, not to alternative fuels. There is a considerable risk that these policies will be ineffective at decarbonizing alternative fuels. A publication from our group, just published in Energy Policy, addresses this issue.

So what is at stake?

The key observation is that the GHG emissions of alternative fuels usually do not occur at the end of the pipe but upstream, and they vary. Ultimately, by looking at the end product - electricity, hydrogen or biofuels - you cannot know the real carbon footprint. Does the electricity come from coal power plants or solar panels? The life-cycle emissions of biofuels vary depending on the agricultural production process, refining, and direct and indirect land use changes, and can possibly exceed those of gasoline.

Current regulation, however, regulates cars in terms of GHG intensity per km (e.g. in the EU and California). Awkward constructions are used to include electric cars in this regulation, usually via some sort of default parameters. But the carbon footprint of electricity varies a lot. And, in some countries, consumers can choose their electricity provider, and hence their carbon footprint.
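The point is easy to see numerically. In this hedged sketch (all numbers are illustrative assumptions, not figures from the paper), the same electric car is driven under three different grid mixes; its regulated per-km default value would be identical in each case, while its real footprint varies by an order of magnitude:

```python
# Illustrative only: the same EV under different grids.
# gCO2/km = energy use (MJ/km) x carbon intensity of electricity (gCO2/MJ).

ev_energy_mj_per_km = 0.6        # assumed efficient electric car (~0.17 kWh/km)

grid_gco2_per_mj = {             # rough, assumed carbon intensities
    "coal-heavy grid": 250,      # ~900 gCO2/kWh
    "average mixed grid": 110,   # ~400 gCO2/kWh
    "mostly renewables": 15,     # ~55 gCO2/kWh
}

for grid, intensity in grid_gco2_per_mj.items():
    print(f"{grid}: {ev_energy_mj_per_km * intensity:.0f} gCO2/km")
```

A default per-km parameter written into vehicle regulation can therefore be wrong by a factor of ten either way, depending on where - and from whom - the electricity is bought.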

As a consequence, cars are better regulated in terms of energy efficiency (MJ/km). That creates a level playing field across both car technologies and fuels, and addresses the one issue car manufacturers can do something about: the energy cars require. Such a measure, if unbiased by property-based metrics, would induce not only technological innovation but also additional pressure towards lighter materials and smaller cars, reversing harmful trends of the last two decades.

A complementary measure can then address all GHG emissions. Our analysis reveals that biofuel mandates are completely inadequate for reducing the carbon content of bio/agrofuels. Indeed, if pushed into markets without differentiation, the cheapest and most harmful biofuels are often preferred.

The Californian Low Carbon Fuel Standard, and also parts of the RFS2 and EU legislation, do better by requiring GHG emission thresholds for alternative fuels, or at least biofuels. However, indirect land use effects are addressed either not at all (the US and, up to now, the EU) or inadequately (California). Furthermore, rebound effects can easily compromise the efficiency of these instruments.

Instead, some sort of price instrument, together with quantity regulation of GHG emissions from road transport, would be most effective and efficient. More on this issue in the next blog.

F. Creutzig, E. McGlynn, J. Minx, O. Edenhofer (2011) Climate policies for road transport revisited (I): Evaluation of the current framework. Energy Policy 39(5): 2396-2406

by Liz Kalaugher at the EGU General Assembly in Vienna

At last year's EGU meeting several late-breaking sessions covered Iceland's Eyjafjallajökull volcano, which was still erupting. This year there was plenty of time to schedule sessions well in advance and, when it came to ash, more of an emphasis on its effects on the ground rather than the consequences of airborne ash for aircraft safety.

As a result of the eruption, farmland 15 km south of the volcano was coated with an ash layer more than 10 cm deep. Pierre Delmelle from the University of York, UK, believes he and his colleagues are the first to study the physical effects of ash on soil. There's usually more attention paid to chemical effects but Delmelle reckons that for the Icelandic volcano, the physical effects may be just as important.

Delmelle found that, when fine ash was ploughed into the soil, its permeability to water decreased, probably because of a change in pore size distribution. In the absence of ash, the soil exhibited a hydraulic conductivity - a measure of the ease with which water can flow through it - of around 0.9 mm/s. The figure for soil containing fine ash that incorporated sulphates and fluorides, however, was just 0.25 mm/s.

Water flow through soil is important for agriculture as it affects the distribution of nutrients, and soil moisture is a key factor for healthy plant growth. While such a reduction in permeability is unlikely to be hugely detrimental to the well-draining soils in Iceland, in volcanic areas such as Indonesia, or following a super-volcano, it could lead to water-logging.

With regard to chemical effects, the main concern for soil is the presence of fluoride in the ash, which can harm plants, livestock and people when it gets into the food chain. Delmelle found that ash from the second phase of the volcano's eruption - from 18th April until the end of May - contained eight times more soluble fluoride than ash emitted in the first phase, between 14th and 18th April. This initial ash had less than 200 mg of soluble fluoride per kg.

Delmelle believes that the steam present during the first phase of the eruption scavenged fluoride from volcanic gases. In the second, water-free, phase the ash was able to take up this fluoride instead.

However, it seems that ash from the two phases contained similar levels of acid-soluble fluoride, particularly fluorapatite. Since acid conditions occur in the guts of cows, sheep and humans, this is potentially an issue of concern, although Delmelle does not believe that it will cause diseases such as fluorosis in Iceland.


EGU 2011: Oxygen - how low can it go?


By Liz Kalaugher at the EGU General Assembly in Vienna

Oxygen minimum, or "dead", zones are found below just two per cent of the surface of the world's oceans but they're responsible for roughly one-quarter to one-half of marine nitrogen removal. Once oxygen levels drop, standard lifeforms cannot survive and bacteria that use nitrogen rather than oxygen as fuel can take over.

It's been hard to measure the precise threshold for this changeover, but now a new oxygen sensor that's one hundred times more sensitive than its predecessors has revealed that it takes place at much lower oxygen levels - just 0.3 microM - than scientists had believed.

Using the sensor, Tage Dalsgaard of Aarhus University, Denmark, and colleagues found oxygen concentrations of less than 0.01 microM (0.3 micrograms per litre) over a distance of 2,500 km along the coast of Chile and Peru. Previous best estimates had indicated levels of 1-2 microM, Dalsgaard told a press conference at the EGU General Assembly. The team only found nitrogen-removing processes taking place when oxygen levels were less than 0.3 microM; these reactions occurred at a greater rate deeper into the dead zone.
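For readers wanting to check the units: microM here means micromoles of O2 per litre, and molecular oxygen has a molar mass of about 32 g/mol, so the conversion to mass concentration is a one-liner (a sketch using the figures quoted above):

```python
# Converting the quoted O2 concentrations from micromolar to micrograms per litre.
O2_MOLAR_MASS = 32.0   # g/mol, i.e. 32 micrograms per micromole

for label, umol_per_l in [("detection level", 0.01), ("nitrogen-removal threshold", 0.3)]:
    print(f"{label}: {umol_per_l} microM = {umol_per_l * O2_MOLAR_MASS:.2f} micrograms/L")
```

That reproduces the 0.3 micrograms per litre quoted for the 0.01 microM detection level, and puts the 0.3 microM threshold at about 10 micrograms per litre.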

Some of the most extensive oxygen minimum zones are found in the Eastern Tropical North Pacific, Eastern Tropical South Pacific and the Arabian Sea. The zones form when nutrient-rich waters from the depths rise to the surface and enable a bloom in plankton growth. Once the plankton reach the end of their lives, decomposition of their bodies as they sink to the depths consumes a large amount of oxygen. In a typical oxygen minimum zone in the open ocean, the top 50 m of water are oxygenated, the next 250 m contain little oxygen and levels of the gas rise again towards the seafloor.

Although the zones are a natural phenomenon, climate change is likely to reduce oxygen levels further. Indeed, Caroline Slomp of the University of Utrecht told reporters that low oxygen is the third major problem of climate change, behind temperature rise and acidification. That's because oxygen is less soluble in warmer water, and warmer surface waters don't mix so well with those beneath. Increased levels of nitrogen entering coastal seas from activities such as fertiliser use are also creating dead zones close to shore.

Not only are these areas suffering stress because of low oxygen, explained Lisa Levin of Scripps Institution of Oceanography, but they're increasingly likely to be exploited as fishing activities move outwards from continental shelves to continental slopes, oil and gas exploration continues and extraction of resources such as diamonds and phosphates begins in new sites. Levin is keen that we understand more about the resilience of ecosystems in these areas before further exploitation occurs.

"Oxygen minimum zones play key roles in ocean biogeochemistry and are an important repository of microbial animal biodiversity," she said.

With this in mind, Levin and colleagues did field-work off the western coast of Goa. Their aim was to see how oxygen availability affects the recovery of sediment-dwelling organisms after disturbance. Introducing colonization trays containing soft sediment to the sea-floor at three different depths revealed that recolonization was strongly oxygen-dependent.

The tray on the seabed at 542 m, where oxygen levels were lowest, was not colonized at all. Levin says that this was no surprise as the background community did not contain any animals. At 800 m, where oxygen levels were ten times higher, only a few colonizers - mainly worm species - moved in. And on the seafloor at 1147 m, where oxygen levels were ten times higher again, there was much more extensive colonization. This time the incomers were mainly from one opportunistic polychaete worm species (Capitella) that is known as a pollution indicator.

By Liz Kalaugher, EGU General Assembly in Vienna

Although they have a common goal - lowering the carbon footprint of energy systems - carbon capture and sequestration (CCS) and geothermal energy could one day end up competing for both suitable geological sites and funding. Frank Schilling of the Karlsruhe Institute of Technology, Germany, believes there's a solution: he reckons the two technologies could be combined to the benefit of both.

"Our storage capacity is limited so we must use the resource wisely," he told the press at the EGU Assembly in Vienna.

Not only could the two technologies share expertise in drilling technology and reservoir management, he believes, but geothermal could also enhance the storage potential of CCS. A typical geothermal energy system removes hot water (around 40 °C or higher) from thermal aquifers around 1,000 m below the ground, extracts the heat and returns cold water to the depths.

Since this cold water is denser than the hot water it's replacing, it potentially provides more pore space for storing carbon dioxide. In turn the addition of carbon dioxide could prevent any problems for the sub-surface caused by the introduction of negative pressure.
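A rough sense of scale (a sketch using handbook densities for pure water; real aquifer brine under pressure would differ) shows how much pore space the temperature swing alone could free up:

```python
# Assumed densities of pure water at roughly the two temperatures involved.
rho_hot, rho_cold = 992.2, 999.7   # kg/m^3 at ~40 C and ~10 C

mass = rho_hot * 1.0               # mass of 1 m^3 of produced hot water, kg
vol_reinjected = mass / rho_cold   # its volume once cooled and reinjected, m^3
freed = 1.0 - vol_reinjected       # pore space freed per m^3 circulated

print(f"Pore space freed: {freed * 100:.2f}% of each cubic metre circulated")
```

That's under 1% per cycle - small, but over the large volumes a geothermal plant circulates in its lifetime it is not negligible, and it comes on top of the pressure-management benefit Schilling describes.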

For geological formations where there are multiple barriers at different depths, an alternative combined system could see hot water removed from the thermal aquifer and carbon dioxide pumped in. Following heat extraction, the cold water could be returned to a higher level - the resulting negative pressure gradient would make leakage of carbon dioxide from below less likely. According to Schilling, halving the effective pressure on the caprock doubles the security of the system or doubles the storage space.


by Liz Kalaugher at the EGU General Assembly in Vienna.

Spring 2011 has seen the largest-ever degree of ozone loss over the northern hemisphere, journalists at the EGU General Assembly in Vienna heard this morning.

This year about 40% of the ozone column above the Arctic has disappeared, breaking the previous record of 30%. The cause? An unusual persistence of cold temperatures in the stratosphere into March, allowing longer lifetimes for the polar stratospheric clouds that enable conversion of pollutant gases into ozone-destroying chlorine.

Dubbed "mother-of-pearl" clouds because of their attractive appearance, polar stratospheric clouds form at temperatures below -78 ° C. The chlorine they help create, meanwhile, can only destroy ozone in the presence of sunlight, which reappears in the polar spring.

The ozone layer acts, according to Geir Braather of the World Meteorological Organization, "like a suncream with factor 70": it cuts by 70% the amount of short-wave ultraviolet rays reaching the Earth's surface. So any disruption of this protection could have implications for humans.

As weather systems cause the polar vortex to shift, ozone-depleted air masses can move above Europe, Russia and North America. Indeed in 2005, when the second-largest ozone decrease took place, the ultraviolet index in March in one European country was five, bringing a sunburn time of 20-30 minutes for the fair-skinned. While this is not above summertime levels, it is unusual for spring and the researchers feel that people should be informed.

To date, air affected by the record-breaking ozone loss has hovered over Canada, eastern Russia and Scandinavia but has not extended down to the heavily-populated regions of Germany and central Europe, although this situation could change. The polar vortex is currently over central Russia and is forecast to be stable until April 9th.

At the south pole, where stratospheric temperatures are typically colder, springtime ozone loss of around 50% occurs each year. Fortunately, while the resulting ozone-depleted air sometimes reaches the southern tip of Chile, it generally does not extend above heavily-populated areas.

The more variable temperatures in the Arctic mean that some winters see ozone loss of just 5 or 10% whereas a "normal" winter could see 30% loss. Although this year's ozone loss has been unprecedented, it was not unexpected - scientists had predicted that such cold conditions in the stratosphere would lead to increased ozone loss.

While the Arctic was warmer than average at ground level this winter, temperatures in the stratosphere were colder. And when it comes to the stratosphere, the cold winters have been getting colder. "We don't know what's driving this long-term change," said Markus Rex of the AWI, Germany, who will be publishing his analysis of ozonesonde data in a Nature paper. Greenhouse gases could be a factor, but that's by no means certain.

Speaking on behalf of the World Meteorological Organization, Braather was keen to stress that this year's record Arctic ozone loss does not mean the Montreal Protocol isn't working. Set up in 1987, this agreement has seen levels of ozone-depleting gases such as chlorofluorocarbons and halons above the Arctic fall by 10% of the amount that would bring them back to the 1980 benchmark level. Outside the poles, the ozone layer is projected to recover by around 2030-2040. In the Antarctic, recovery is expected by 2045-2060, and the picture is one or two decades rosier for the Arctic.


Detailed proposals for the government's Renewable Heat Incentive (RHI) have emerged, after a long delay. Heating accounts for 47% of total UK final energy consumption and 46% of carbon emissions, so it's not a marginal issue. The RHI covers biomass, ground source and water source heat pumps, solar thermal and bio-methane, and participants will receive quarterly payments for 20 years from the date they enter the scheme.

It will be introduced in two phases. The first phase is expected to start after the regulations have been processed by Parliament in July. It will offer long-term tariff support for non-domestic projects, i.e. for the big heat users - the industrial, business and public sectors - which contribute 38% of the UK's carbon emissions. But under this phase there will also be grant support of around £15m for households through an interim 'Renewable Heat Premium Payment' for 'well-developed' projects, essentially as a 'test drive' for the more diffuse domestic sector.

The second phase of the scheme will see households also able to apply for long-term tariff support, this transition being timed to align with the 'Green Deal' loan system, which is intended to be introduced in October 2012.

Unlike the feed-in tariffs for renewable electricity, which are paid for through higher energy bills, the Renewable Heat Incentive will be paid for from taxes - in all, £860m of government funding has been allocated. The proposed tariffs (still subject to consultation) range from 1.9p/kWh for small biomass projects to 8.5p/kWh for solar thermal, but these prices will be 'degressed' (i.e. reduced) in stages over time to match the development of the market and avoid a boom-and-bust situation. Details of that are promised soon.
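For a feel of what those tariffs mean in cash terms, here is a hedged sketch (the annual heat loads are assumed purely for illustration; the real scheme has tiered tariffs, metering rules and the degression just mentioned):

```python
# Illustrative RHI payment arithmetic; heat loads and flat tariffs are assumed.
tariff_p_per_kwh = {"small biomass": 1.9, "solar thermal": 8.5}
annual_heat_kwh = {"small biomass": 80_000, "solar thermal": 1_500}   # assumed

for tech, tariff in tariff_p_per_kwh.items():
    annual_gbp = annual_heat_kwh[tech] * tariff / 100.0   # pence -> pounds
    print(f"{tech}: ~GBP {annual_gbp:,.0f}/year, "
          f"~GBP {annual_gbp / 4:,.0f} per quarterly payment for 20 years")
```

On these assumptions a sizeable biomass boiler earns around £1,500 a year, while a single domestic solar thermal panel earns far less - which is why the upfront-capital question matters.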

The RHI only meets the extra costs above those of installing conventional systems, not the total costs. Even so, it should yield a rate of return on outlay of about 12%. So it could be quite popular - if participants have the upfront capital available. However, DECC says that 'We do not intend to allow agents, such as installers, suppliers or other third parties, to apply for support from the scheme on an applicant's behalf.' So you are on your own!

And there are some other limits. DECC says 'by domestic installations, we mean installations where a renewable heating installation serves a single private residential dwelling only. This does not include multiple residential dwellings served by one renewable heating installation (e.g. district heating) nor residential dwellings which have been significantly adapted for non-residential use'.

Rural areas won't get special treatment, despite DECC accepting that 'a higher proportion of rural than urban areas tend to lack access to the gas grid and organisations not connected to the gas grid, for example small rural businesses, tend to have higher heating costs due to the use of more expensive fuels'. It simply says that 'those off the gas grid will have the potential to benefit most from the RHI' and 'those in rural off-gas grid areas may have better access to biomass in particular and not face the same installation and biomass fuel supply barriers as those in urban areas.'

The RHI will operate via Ofgem, which will provide accreditation and carry out equipment inspections. In that context there are some interesting technical conditions and requirements, e.g. heat pumps must have a COP of 2.9 or above (but air-sourced units will not be supported initially) and biomass sources must meet eco-eligibility criteria. As a condition of receiving support, participants will also be required to maintain their equipment to ensure it is working effectively: Ofgem may check this periodically.

All biomass, ground and water source heat pumps and solar thermal plants of 45 kWth capacity or less will need to be certified under the Microgeneration Certification Scheme (MCS) or equivalent schemes. The MCS, not too popular in some circles, will be upgraded.

The RHI will only support 'useful heat', with Ofgem determining eligibility according to RHI regulations. In outline, acceptable heat uses are said to be 'space, water and process heating where the heat is used in fully enclosed structures'. The heat must be supplied to meet an 'economically justifiable heating requirement', i.e. a heat load that would otherwise be met by an alternative form of heating, e.g. a gas boiler. This should be an 'existing or new heating requirement, i.e. not created artificially, purely to claim the RHI'. The only exception is bio-methane injection into the gas grid, with no specifications on how it is then used.

Heat used for cooling counts towards the renewables targets under the EU Renewable Energy Directive and therefore, provided it meets all other eligibility criteria, will be eligible for RHI support; passive solar and exhaust-air heat pumps will not.

Renewable heating systems that replace an existing renewable heating system will be eligible for RHI support, despite the risk that some people may therefore scrap old but viable systems to get the RHI. More commonly, renewable heating capacity is likely to be expanded, and extensions are eligible for the RHI up to the (joint old/new) total capacity threshold.

And finally, the RHI is not intended as a mechanism to support 'innovative technologies in development or early deployment' - but, happily, deep geothermal is allowed.

The delay until Oct 2012 for the domestic sector scheme raised some eyebrows and some saw the solar tariffs as too low, but otherwise the RHI was generally welcomed.

It could involve some quite big schemes. Public sector and not-for-profit organisations, such as schools, hospitals and charities, can use the RHI, and DECC says 'the support provided by the RHI will also enable communities to come together to find local solutions tailored to local energy needs. The opportunities are many, from setting up anaerobic digestion plants using local waste to establishing community-owned biomass co-operatives sourcing fuel from sustainable local woodlands'.

It adds 'In some situations, district and community renewable heating, whether as a central boiler for an apartment building, or as a network of pipes delivering heat from a central installation to a number of local households or businesses, can be a cost-effective alternative to installing individual heating systems in properties. By supporting this sort of application, the RHI will encourage investment and give developers confidence to install centralised plant'.

So we could be seeing some local district heating networks like those elsewhere in Europe, powered using renewable sources. But the RHI won't pay for the pipes! http://www.decc.gov.uk/en/content/cms/news/rhi_wms/

by Liz Kalaugher at the EGU General Assembly, Vienna

Back in 1895, the sudden collapse of the Altels cold (high-altitude) hanging glacier brought around five million cubic metres of ice crashing down onto the valley below. The event, the largest known ice avalanche in the Alps, killed six people and 170 cows, as well as causing the valley's entire summer harvest to fail.

Although there were various theories as to the cause - chief amongst them the increase in summer temperatures over the preceding years - the exact mechanism that led a roughly semicircular region of ice to detach from the bedrock beneath was unclear. But now Jerome Faillettaz from ETHZ in Switzerland and colleagues have used a new numerical tool to show that the avalanche must have been due to a local decrease in the friction coefficient between the ice and bedrock, probably because of meltwater entering via a crevasse.

Applicable to landslides and rockfalls as well as ice on steep slopes, the tool uses a simple "blocks and springs" approach for modelling gravity-driven instability. As the blocks begin to slide, the brittle springs fail; the model accounts for factors such as creep, friction and glacier boundary conditions.
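A toy version of that idea (my own minimal sketch, not the authors' model) already shows the essential mechanism: blocks held by basal friction shed load onto their neighbours when a patch of the bed is lubricated, and the brittle links bounding that patch eventually snap.

```python
import numpy as np

# Minimal illustrative "blocks and springs" picture: rigid blocks on a slope,
# held by basal friction and linked by brittle springs. All values assumed.
n = 20
slope = np.radians(45.0)
weight = 1.0 * 9.81                              # per-block weight, N
drive = weight * np.sin(slope)                   # downslope pull per block
normal = weight * np.cos(slope)                  # normal force per block

mu = np.full(n, 1.2)                             # friction: stable everywhere at first
mu[8:12] = 0.2                                   # assumed meltwater-lubricated patch

spring_strength = 10.0                           # force at which a brittle spring snaps
deficit = np.maximum(drive - mu * normal, 0.0)   # force friction can no longer hold

# Crude symmetric picture: the lubricated patch hangs on the two springs at its
# edges, each carrying half of the total unsupported force.
boundary_load = deficit.sum() / 2.0
if boundary_load > spring_strength:
    print(f"Edge springs fail ({boundary_load:.1f} N > {spring_strength:.1f} N): "
          "the patch detaches, opening a crown-crevasse-like scar")
else:
    print("The neighbouring ice holds the lubricated patch in place")
```

The real tool adds creep, time-dependent friction and two-dimensional geometry, but the qualitative result is the same: a purely local drop in basal friction is enough to carve out and release a coherent slab.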

The researchers found that while altering the glacier geometry in the model or changing the support provided by the peak's side glacier did not cause ice break-up, a uniform change in friction coefficient across the whole of the glacier base made all of the ice fall. But when the team initiated a progressive decrease in friction in just one area of the glacier, a crown crevasse opened up and the ice failed with a similar semi-circular pattern to the 1895 event.

The research isn't only relevant to the past - it's likely that climate change will affect the stability of cold hanging glaciers around the world. Ice from the Glacier de Taconnaz on Mont Blanc could, for example, crash down onto the resort town of Chamonix. Faillettaz says that we need detection methods such as seismic monitoring to provide an early warning of such disasters.

Indeed, Faillettaz and colleagues have trialled the use of seismic geophones on the Weisshorn cold glacier, which breaks up on a roughly ten-to-fifteen-year cycle and can disturb road and train access to Zermatt, Switzerland. The scientists believe they can detect acceleration of the ice in this way up to two weeks before any fracture. Although it's also possible to spot surface acceleration visually, that technique doesn't work in the bad weather conditions in which such events often take place.

Faillettaz is cautious about the general applicability of the seismic method, however. In the case of the Weisshorn, trapped water is not believed to be a factor behind the break-up; it may be harder to use seismic techniques to predict an Altels-type failure.