
IOP A community website from IOP Publishing

October 2009 Archives

Most countries in the EU now use guaranteed-price Feed-In Tariffs (FITs) to support renewable energy projects, with different prices fixed for each type of technology. FITs have proved very effective at getting capacity installed rapidly at relatively low cost. For example, Germany has installed 25 gigawatts (GW) of wind generation capacity so far under a FIT scheme, whereas the UK, with its competitive Renewable Obligation Certificate (ROC) trading scheme, has achieved only 4 GW, with some of that actually being supported by grants (for offshore projects). And this in a country with a far better wind regime than Germany.

Moreover, it has cost consumers more. With a FIT, the developer of a new wind farm can get loans from banks at low interest rates, since the income stream is defined years ahead. Under the ROC system, since they have no idea what the future income stream will be, banks will only offer high rates – so consumers have to be charged more. Thus consultants Ernst & Young note that, in 2005/2006, the UK's ROC system cost consumers 3.2 pence/kilowatt hour (p/kWh), whereas in 2006 the German Feed-In Tariff cost consumers only 2.6p/kWh – despite having a much bigger wind capacity in areas with generally much less wind than the UK, and despite the fact that the German FIT has also supported a lot of expensive solar PV projects, very many more than the UK.

With the UK committed to getting 15% of its total energy from renewables by 2020, which means renewables would have to supply maybe 30% of its electricity, something had to be done. The UK government remains wedded to the market-orientated ROC system, though it has made some changes to it – e.g. creating "technology bands" with different numbers of ROCs for each type of technology. That may help to some extent, making it a bit more like a FIT. But the government eventually conceded that a fixed-price FIT system might be better for small-scale projects. There was some debate about how small "small" should be, but a ceiling of 5 MW was chosen – large enough to include some small community projects.

The government's proposals were for a fixed "Clean Energy Cashback" payment from the electricity supplier for every kilowatt hour (kWh) generated (the "generation tariff"), i.e. including self-generated power you use yourself, plus a guaranteed minimum payment, additional to the generation tariff, for every kWh exported to the wider electricity market (the "export tariff"). The export tariff will be market determined – it's currently at 5p/kWh for electricity delivered to the grid. Proposed generation tariff levels were set at 36.5p/kWh for retrofitted solar PV systems up to 4 kW and 28p/kWh for systems up to 10 kW, while wind projects would get 30p/kWh for turbines below 1.5 kW and progressively less for larger units, down to 4.5p/kWh for turbines between 500 kW and 5 MW. Hydro projects would get 4.5–17p/kWh depending on size. Anaerobic digestion and biomass were also eligible (getting up to 9p/kWh), as was AD-fired combined heat and power (11p/kWh), but not landfill gas or sewage gas, which are deemed already commercially viable.

As with the German FIT, UK FIT prices will be reduced, or "degressed", in annual stages to reflect expected cost reductions as the technology develops and the market for it builds – but only for some of the technologies. The annual degression was set at 7% for all solar PV projects, 4% for wind turbines below 1.5 kW and 3% for those in the 15–50 kW range. The rest would have no price degression.
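To see what degression means in cash terms, here is a minimal Python sketch. It assumes the percentage cut is applied once a year and compounds, which is how such schemes typically work; the starting tariff and rate are the proposed PV figures above.

```python
def degressed_tariff(initial_tariff, annual_degression, years):
    """Tariff (p/kWh) available to a project entering the scheme
    after a given number of annual degression steps.
    Assumes the cut compounds annually."""
    return initial_tariff * (1 - annual_degression) ** years

# Proposed retrofit-PV tariff of 36.5p/kWh with 7% annual degression:
for year in range(4):
    print(year, round(degressed_tariff(36.5, 0.07, year), 1))
```

On these assumptions the 36.5p tariff falls to roughly 29p/kWh after three annual steps, which is why developers care so much about when degression begins.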

In some ways it is quite generous, given that, if you can afford to install the equipment, you will get money from your electricity supply company even for the power you generate and consume yourself. They are in effect paying you not to use or buy their power! They will presumably pass the extra costs on to their consumers across the board – who will in effect be subsidising those who can afford the equipment. While many commentators have supported FITs for large efficient projects like wind farms, using them for expensive options like PV and micro wind does seem likely to raise some problems of social equity. The high extra cost to consumers has already led to capacity caps being imposed for PV in FIT schemes elsewhere, e.g. in Spain. But the overall scale of the UK FIT is quite small – it is expected to lead to a total of 8 TWh of generation by 2020, about 2% of UK electricity by then. So maybe the extra cost of PV will not matter.

Initial reactions were mixed. The Renewable Energy Association (REA) commented: "The 2% figure is really lacking in ambition. The potential for microgeneration is much, much larger." The UK's largest solar panel provider Solarcentury said that the FIT would "accelerate the market a bit, but we will not grow explosively as has been the case in other countries such as Germany, France and Spain". It put the rate of return under the UK FIT at only about 4%. However, the Guardian's (23 July 2009) rough estimates of FIT yields suggested that a 15 kW wind turbine at a good site could get a 12% p.a. return, but at 7%, solar PV was only a marginally attractive investment. Well, yes, but banks currently offer less on money invested, so those with money may well see FITs as worth the trouble. For an interesting survey of consumer attitudes to investing in FITs, visit
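Rough yield figures like these are straightforward to reproduce. The sketch below is a back-of-envelope annual-return calculation; the system size, load factor and installed cost are illustrative assumptions of mine, not figures from the article.

```python
def simple_annual_return(capacity_kw, load_factor, tariff_p_per_kwh, capital_cost_gbp):
    """Crude gross annual return: tariff income divided by installed cost.
    Ignores maintenance, inflation, export payments and discounting."""
    annual_kwh = capacity_kw * 8760 * load_factor  # 8760 hours in a year
    annual_income_gbp = annual_kwh * tariff_p_per_kwh / 100  # pence -> pounds
    return annual_income_gbp / capital_cost_gbp

# Hypothetical example: a 2.5 kW PV array at a 10% load factor,
# on the proposed 36.5p/kWh tariff, costing an assumed £12,000 to install.
print(f"{simple_annual_return(2.5, 0.10, 36.5, 12000):.1%}")
```

With these assumed numbers the gross return comes out in the 6–7% range, broadly consistent with the single-digit estimates quoted above; the point is how sensitive the answer is to the assumed capital cost and load factor.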

In its subsequent submission to the consultation exercise the Renewable Energy Association suggested that "all technologies should benefit from the same rate of return," which should ideally be 10%. REA argued that "the 5–8% proposed is simply insufficient". It added: "Although 8% might be adequate for some householders, it is not sufficient to engage the commercial sector."

But what about community-scaled projects? Friends of the Earth felt that: "For community-scale or larger on-site projects the rates [tariffs] are inadequate." The REA later concurred. In its submission to the consultation exercise it said: "The Tariff levels for wind appear to decline dramatically over 500 kW," which meant that, according to one community scheme developer it asked: "There will be no schemes over 500 kW at the current proposed Tariff levels." ROCs would be used instead, though REA said: "It is clear that ROCs have not been effective at stimulating community schemes – hence the need for user-friendly Tariffs. We hope government will want to ensure the success of community schemes."

Of course the FITs run for 25 years and should help community schemes a bit. But clearly there were a lot of disagreements about the details – especially on PV. The FIT as it stands is predicted to yield only around 0.5% of UK electricity from PV by 2020. Friends of the Earth felt that the "degression for solar PV is quite aggressive" at 7% per year (much higher than in Germany), and that since the bonus payment for export to the grid will fluctuate with the "market price", it will be discounted by banks providing debt for projects financed under the feed-in tariff. The We Support Solar lobby group claimed that adding 10p/kWh to the tariff would deliver more than six times more capacity.

And then REA piled in with a long list of suggested upgrades: "Tariffs, generation and export, need to be index-linked to ensure that they retain their value for their full life. Tariff degression should not be applied until the third anniversary of the scheme, to ensure a robust start. The generation tariff, as well as the export reward, should be exempt from income tax for household installations. Enhanced Capital Allowances should be extended to all renewable technologies to support their growth in the commercial sector. Onsite renewable technologies should be exempt from assessment for business rates, council tax and stamp duty. Existing installations should be eligible for the tariffs." And it added: "In line with BWEA, we recommend increasing the cut-off band for 15–50 kW to 15–100 kW, and moving the 50–250 kW band to 100–500 kW." Quite a shopping list! And all before April 2010, when the scheme is supposed to start.

Moreover, widening the agenda, REA noted that several technologies had been omitted from the consultation document and said: "Tariffs should be set for geothermal, gasification and pyrolysis, biofuels, and wave and tidal energy from the outset." And finally it said that it was "concerned with the low level of awareness about the scheme. It is vital to communicate with potential investors to ensure that proposals are effective from the perspective of a range of key investors. Important groups we have spoken to, including commercial companies, were not aware of the scheme or unclear on key aspects of the design proposals".

It will be interesting to see how the government responds.

DECC Consultation (now closed):

REA's input:

The Atlantic keeps cropping up when we try to understand why glaciers change. If you look at the right kind of map, you can see that the Atlantic, including the Arctic, is an enormously long inlet in the shore of the world ocean. (The Bering Strait, between Alaska and Siberia, is too shallow to make a difference.)

The ocean is hot near the equator and not so hot near the poles. The heat flows down the temperature gradient, which drives the ocean currents. The water has to go somewhere once it has got where it is going and has surrendered the heat to the atmosphere. So it sinks. The sinking has a natural explanation: now that the water is colder, it is also denser.

At this point, the first two of several intertwined complications alter the picture. First, as it surrenders heat, the north Atlantic also surrenders water vapour. The warmer the water, the faster it evaporates. Second, during evaporation the salt stays behind, making the ocean water denser than it was to begin with, and that makes it yet more likely to sink.

The water that sinks gets back where it started from by flowing southwards at depth. Eventually it finds its way out of the Atlantic inlet and wells up, or is dragged up by the wind, in broad regions of the low-latitude ocean, where it is reheated by the Sun and thus begins the cycle over again. This is the essence of the meridional overturning circulation, or MOC. The Atlantic part of this circulation, predictably abbreviated AMOC, is a sensitive part of the machine because its northern end is where most of the Northern Hemisphere overturning happens. There is nothing comparable in the north Pacific.

The water vapour released from the north Atlantic adds greatly to the poleward deliveries of vapour through the atmosphere, and helps to make the northern shores of the Atlantic snowy. The glaciers around the north Atlantic today, and the much bigger ice sheets we had during the ice ages, owe a lot to the AMOC.

But now we come up against the third of the intertwined complications. After the water vapour has condensed and fallen on the bordering landmasses, it flows back sooner or later to the ocean, where it dilutes the salt that helps the AMOC through the crucial sinking part of its cycle. In principle, a large enough return flow of fresh water from rivers and glaciers could reduce the density of the surface waters sufficiently to stop them from sinking, in which case the whole AMOC would stop.

And now, enter the fourth of the intertwined complications: in one word, us. If we heat the whole system by enough to shrink the Greenland Ice Sheet significantly, flooding the north Atlantic with fresh water, we raise the prospect of just such a switching-off of the AMOC.

All the climate models suggest that, if the AMOC collapsed, the northward heat transfer would also be greatly reduced and the shores of the north Atlantic would suffer cooling. But fears of a new ice age being triggered by a collapse of the AMOC, itself triggered by a collapse of the Greenland Ice Sheet, are not realistic.

In the first place, these collapses would happen in a context of global warming, and again the climate-model evidence shows that they would not suffice for the job. Secondly, we have lots of palaeoclimatic evidence for abrupt changes in the AMOC, which are leading candidates to explain Dansgaard-Oeschger transitions during the last ice age, and the cold snap 8,200 years ago. They didn't last all that long, and they were all reversible. Thirdly, models of ancient climates suggest persuasively that the AMOC is not implicated as a mechanism for starting ice ages.

And finally, the models agree that, without actually collapsing, the AMOC is nevertheless very likely to weaken over the next century. Even decanting the Greenland Ice Sheet into the ocean would not switch it off. But several metres of sea level rise, and a weaker AMOC in a warmer world, are enormous problems in themselves. That they are not harbingers of a colder world is not a good reason for relaxing.

On Saturday, more than 5,200 events took place worldwide, all of them promoting the number 350 – understood as the upper limit, in parts per million, for carbon dioxide in the atmosphere. We are already above it. The slideshow on the website gives an amazing display of the actions that took place. Altogether, the action day seems to have been a huge success, mobilizing an incredible number of people and raising awareness.

For example, in Berlin several hundred people – "climate pirates" – gathered, all disguised as Angela Merkel, to push the German chancellor towards progressive climate policies.

This and similar actions were reported by newspapers all over the world. Andrew Revkin of the New York Times gives voice to critics who point out that the number 350 is unrealistic and only promotes fatalistic attitudes. Others think the focus should be on ways towards climate protection rather than arguing over numbers.

Both opinions are justified but, to some extent, miss the point.

Given the risk of very high damage to humanity, it is a questionable attitude to aim for a target that leaves Bangladesh drowned, so to speak. In fact, this is itself a fatalistic position which seems intellectually hard to justify. Though the 350 target is very ambitious, it may not be impossible. The two camps arguing for either 350 ppm or less ambitious targets are motivated more by belief systems that frame model assumptions than by hard facts – and it may be impossible to know precisely.

It is true that the 350 campaign provides no precise solutions. However, you can probably ask any climate activist, and she or he can come up with a list of legislation and actions that would lead towards climate protection. There is not so much a knowledge barrier (1), but there are political barriers. In the words of the current US president, in the context of current US climate legislation (also this Saturday): "The closer we get, the harder the opposition will fight and the more we'll hear from those whose interest or ideology runs counter to action."

He also gets the fatalistic thing right: "There is also another myth we must dispel. And it is one far more dangerous than any attack made by those who wish to stand in the way of progress – and that's the idea that there is little or nothing we can do.

"That is the pessimistic notion that our politics are too broken and our people too unwilling to make hard choices. Implicit in this argument is the sense that we've lost something important – that fighting American spirit, that willingness to tackle hard challenges and the determination to see those challenges to their end."

The point of the 350 campaign seems to be to raise awareness and increase political pressure towards action, and that may be exactly the right thing to do.

There is still an argument against the 350 number: It is a number!

Number targets are always one-dimensional and reduce complex circumstances to something overly simple. Focusing on narrow numbers invites gaming behaviour, i.e. agents just head for the numerical target and ignore the context. For example, we don't want to protect our climate by clouding our skies with air pollutants. In other words, it is not only about ppm levels, it is about sustainability. In that sense, leaving space to live for happily raging sea otters may be alright, too.

(1) and neither is cost the real barrier, but this is a different issue

Wind farms can reduce local bird numbers by up to half, according to a new study by the Royal Society for the Protection of Birds (RSPB) in conjunction with Scottish Natural Heritage. It looked at 12 upland wind farms in the UK during the breeding season, covering a dozen species ranging from common ones such as skylarks to rare ones such as hen harriers. The research, published in the Journal of Applied Ecology, found that seven species showed "significantly lower frequencies of occurrence close to the turbines". The breeding populations of buzzard, hen harrier, golden plover, snipe, curlew, wheatear and meadow pipit were reduced by up to half within 500 m of the turbines. The study suggested that the most likely cause of the decline is that birds are less likely to live and breed near wind farms because of the noise and development. Collisions with turbines were also suggested as a possible cause, but were thought to be less likely.

The British Wind Energy Association concurred with that: "This study shows there is a potential problem with displacement, but it is not yet proved that there is a problem with bird mortality rates." It added: "Wind farms and turbines are the most benign form of energy generation and the industry has found that wind farms simply do not pose a threat if they are properly sited and follow procedure. The threat of global warming could be a far greater threat to the population of birds than wind farms."

The RSPB has recently backed wind farms as long as they are properly sited, and recommended a spatial planning approach as used elsewhere in the EU. The Daily Telegraph (26/9/09), however, claimed that the RSPB's pro-wind stance had caused "many members to leave in protest because of concern about the developments ruining the view in remote areas and contributing to the decline of birds".

Nevertheless, James Pearce-Higgins, senior conservation scientist with RSPB Scotland and lead author of the study, told the paper that the RSPB still supported wind farms – but developments should not be put in the wrong areas, where they can harm birds. "There is an urgent need to combat climate change, and renewable energy sources, such as wind farms, will play an important part in this. However, it is also important to fully understand the consequences of such development, to ensure that they are properly planned and sited. Our results emphasise the need for wind farms to avoid areas with high densities of potentially vulnerable species such as curlews and golden plover, and help offer a way forward by informing the likely extent of positive habitat management which may help to offset the impacts of development."

It's not just birds, though: bats can also have problems, and it seems in larger numbers. But researchers at Aberdeen University, funded by the People's Trust for Endangered Species, are currently making good progress on using radar to deter bats from colliding with turbine blades.

But collisions may not actually be the problem. A study of bat deaths at a local wind farm by the University of Calgary reported by Science Daily found that the majority of migratory bats in this location were killed because a sudden drop in air pressure near the blades caused injuries to the bats' lungs known as barotrauma. Although the respiratory systems in birds can withstand such drops, the physiology of bats' lungs does not allow for the sudden change of pressure.

TransAlta, Canada's largest publicly traded provider of renewable energy, initiated a follow-up study at the same site to determine what could be done. It tested a revised operating procedure – slowing the turbine blades to near motionless in low-wind periods – which was found to reduce bat deaths by up to 60% without significantly reducing the energy generated by the wind farm. Prof. Robert Barclay, who led the University of Calgary study, commented: "Biologically, this makes sense as bats are more likely to fly when wind speeds are relatively low."

Ref: Baerwald et al. 2009 "A Large-Scale Mitigation Experiment to Reduce Bat Fatalities at Wind Energy Facilities", Journal of Wildlife Management 73(7): 1077. DOI: 10.2193/2008-233

Science Daily.

On a recent trip to Brazil to understand water-resources management with respect to biofuel crops and other agriculture, I learnt much more about Brazilian energy policy. While in some cases water-resource management is in its infancy, in many others great strides have been made in reducing water usage.

The total water flow in a sugar-cane-ethanol distillery is approximately 22 m3 per tonne of sugar cane processed, but new plants can be designed to withdraw only 1 m3 per tonne of cane. Most distilleries withdraw less than 5 m3 per tonne. Additionally, Dedini (the Brazilian vertically integrated company that sells industrial components and turn-key sugar-cane processing plants) has a plant design that can take green sugar cane, harvested by machine and without burning, and actually produce clean, fresh water instead of requiring it as an input.

Speaking of machine harvesting, the state of Sao Paulo, where more than 50% of cane is grown, has mandated that all sugar cane be harvested mechanically (i.e. by tractor) by 2014. The reason is to improve air quality during the harvest season, when the cane would otherwise be burnt before manual harvest. The cane is burnt to remove the leaves and "trash" from the sugar-cane stalk, leaving the main stalk with the sucrose and fibre. So much cane was being produced and harvested in a relatively small region that air quality was becoming unsafe for residents.

This practice of mechanical harvesting was actually used as one of the criteria guiding the designation of agri-ecological zones for future sugar-cane agriculture. To prevent any perception of "food v fuel" arguments for sugar cane in Brazil, the government has now set up the zones where sugar cane should be expanded. A sugar-cane developer cannot receive government support via low-cost loans unless expanding into these agricultural zones, and agriculture is generally too expensive without this government support. So, because mechanical harvesting is assumed not to be possible on land with slopes greater than 12°, that slope limit was used as the criterion. The other two criteria for determining the agricultural zones were climate (rainfall, temperature profile, etc.) and soil quality. The vast majority of sugar cane in the south and central parts of Brazil requires no irrigation, except possibly some during initial planting; the areas in the new zones are anticipated to require little irrigation (~200–300 mm/yr) if any at all, and it is not clear that the economic payoff will induce investment in irrigation infrastructure.

Another, more social, aspect of Brazilian energy policy is the promotion of small farmers throughout Brazil growing various crops for vegetable oils for biodiesel. The "pro-alcohol" ethanol programme was seen to leave out the small farmer, as sugar cane is a large-scale industrial crop. Because sugar cane to ethanol is estimated to have a much higher energy return on energy invested than many oils to biodiesel, it is not apparent whether people expect monetary returns similar to those of the industrial-scale ethanol industry. Nonetheless, there is an attempt to include more rural communities and farmers in Brazilian energy policy – for better or for worse.

Thus, from air quality to soil quality, Brazilian energy policy is promoting its cash crop of sugar cane. As the current land area used for Brazilian sugar cane is approximately 8–9 million ha, and the land used for cattle pasture is 180–200 million ha, we don't have to worry about Brazilian biofuel development as a specific driver of removal of the Amazon rainforest – the Amazon is clearly restricted under the agri-ecological zoning for sugar cane. On the other hand, increasing pressure for beef may have a part to play, as policing such a vast area is difficult to impossible. But there is a push to increase the density of cattle on land in Brazil to prevent expanded land clearing for pasture.

In summary, there is sufficient land zoned for sugar cane for Brazil to produce approximately 4–5 times as much ethanol as is produced today (~6.2 billion gallons in 2008). There is also generally a better climate (rain and temperature) and better soils for first-generation, and possibly second-generation, biofuel feedstocks than in North America or Europe. Thus, it is important that the developed world understand its own agricultural practices for energy-related biomass and determine whether domestic water and soil resources are better preserved by importing from and investing in Brazil or by investing at home. But then perhaps this brings up a new set of domestic social sustainability questions ...
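The round numbers above are easy to sanity-check. In this sketch the midpoint figures in the land-use comparison are my own interpolation of the quoted ranges, not numbers from the trip.

```python
# Ethanol scale-up implied by the zoned land: 4-5x the 2008 output.
current_bn_gal = 6.2  # ~2008 Brazilian ethanol output, billion gallons
low, high = 4 * current_bn_gal, 5 * current_bn_gal
print(f"{low:.1f}-{high:.1f} billion gallons/yr at full build-out")

# Land-use context: cane area vs cattle pasture (midpoints of quoted ranges).
cane_mha, pasture_mha = 8.5, 190.0
print(f"cane occupies ~{cane_mha / pasture_mha:.0%} of the pasture area")
```

So full build-out would mean roughly 25–31 billion gallons a year, while today's cane area is only a few percent of the pasture area – which is why pasture intensification, not cane, dominates the land-use question.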

In glacier monitoring, one of the things we worry about is undersampling. The measurements are sparse, and we have to interpolate, that is, make plausible guesses about the glaciers we can't measure. Gaps in coverage mean that there is always a chance that new measurements in remote areas will change the picture. One of these areas is the Subantarctic islands, scattered across the Southern Ocean and holding about 8,000 km2 of glacier ice in all. Our knowledge of this ice has been fragmentary until recently. Could the Subantarctic be an exception to the global rule of glacier shrinkage?

The knowledge base is beginning to improve, and we can now say that the answer is "No". For example, with Étienne Berthier, of the Laboratoire d'Études en Géophysique et Océanographie Spatiales in Toulouse, I am writing a chapter on the Subantarctic for a book about GLIMS, the Global Land Ice Measurements from Space initiative. Étienne kindly sent me an April 2009 ASTER satellite image of the west coast of Kerguelen, in the southern Indian Ocean.

Glaciers in Kerguelen, 1963-2009 (1963 outline in yellow)

The small protruding glacier tongue in the lower part of the picture belongs to Glacier Pierre Curie, which now ends a kilometre from the sea but in 1963 had a calving front about 600 m wide. The two stubby tongues in the upper part are Glacier Pasteur, whose calving front was 1700 m wide in 1963 but is now only barely in contact with tidewater. In another few years, it will have retreated away from the shoreline of Anse des Glaçons (the cove of ice floes, a place name which will provoke nostalgia one day).

The retreat of these two adjacent outlets of the Cook Ice Cap doesn't count, nowadays, as startling news. Berthier and his co-authors recently reported that Cook Ice Cap shrank at 2.4 km2/yr, half a percent per year, between 1963 and 2001. At this link you can watch an animation of the shrinkage of Glacier Ampère, on the opposite side of the ice cap from Pasteur and Pierre Curie. But if you picked any two neighbouring glaciers almost anywhere in the world, the odds are that they would have shrunk at something like that rate, or perhaps a bit less. So now we know that Kerguelen was not one of the out-of-the-way places where a surprise was awaiting us. The Kerguelen glaciers even follow the widely-observed tendency of accelerating shrinkage (that is, faster recently than earlier).
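The two quoted figures can be cross-checked against each other: if 2.4 km2/yr corresponds to half a percent per year, the implied ice-cap area is about 480 km2. A quick sketch of the arithmetic (my own consistency check; the published area may differ slightly):

```python
rate_km2_per_yr = 2.4    # reported shrinkage rate of Cook Ice Cap
fraction_per_yr = 0.005  # "half a percent per year"

# Area implied by the two figures being mutually consistent:
implied_area_km2 = rate_km2_per_yr / fraction_per_yr
# Total area lost over the 1963-2001 study period:
total_loss_km2 = rate_km2_per_yr * (2001 - 1963)

print(round(implied_area_km2, 1))  # implied ice-cap area, km2
print(round(total_loss_km2, 1))    # cumulative loss, km2
```

That works out to an ice cap of roughly 480 km2 losing on the order of 90 km2 over 38 years – a substantial fraction, even before allowing for the acceleration noted above.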

Another out-of-the-way place about which we now know a lot more is Heard Island, in the Indian Ocean southeast of Kerguelen at 53° South. It has tidewater glaciers mainly because it rises to 2,755 m above sea level. My other co-author, Shavawn Donoghue of the University of Tasmania in Hobart, finds that Heard Island's 30 glaciers are dwindling just as are those of Kerguelen. Six have parted company with the sea during the decades since the first air photos in 1947, leaving a dozen still delivering icebergs to the ocean. Gotley Glacier, which drains the summit crater of Big Ben, is still standing in the sea as it has done for as far back as we have information.

Calving glaciers are more challenging than ordinary ones when it comes to documenting change. Those that manage to advance far down a fiord can misbehave spectacularly. More often, however, as with Gotley and its 11 neighbours, the icebergs break off as soon as the ice reaches sea level, so the calving front doesn't change much.

Change in a previously unknown region that turns out to be globally typical – "Subantarctic Glaciers Not Surprising" – is difficult to sell as a motive for political action. What we are after here, apart from a conversation piece for your next cocktail party or trivia game, is something that will inject the necessary urgency into the deliberations of the politicians and policymakers. They will assemble in Copenhagen this December for the most important negotiating session in the history of the human race, the fifteenth Conference of the Parties to the UN Framework Convention on Climate Change. Faced with the evidence, they seem to have got the scientific message, but it hasn't really clicked yet. Graphs that fall off the bottom of the page haven't done the trick. Neither Gotley's continued stillstand nor Pasteur's impending loss of tidewater status is likely to make the communications breakthrough, but you never know.

Everyone usually likes animal stories and I couldn't resist this news item from the Seattle Post-Intelligencer, 7 Oct., as part recycled by

A helicopter equipped with radiation-detecting equipment has been used to scan almost 4,000 hectares of the USA's Hanford nuclear site in search of radioactive rabbit droppings. The helicopter was able to map each of the slightly radioactive stools with GPS coordinates. Liquid wastes containing radioactive caesium and strontium salts were stored in underground tanks at Hanford, which rabbits routinely burrowed into. They developed an appetite for the radioactive salts, which resulted in slightly radioactive droppings. Hanford was a plutonium production complex with nine nuclear reactors and associated processing facilities, which played a pivotal role in US defence for more than 40 years. An estimated 50 million gallons of liquid wastes from Cold War plutonium production processes – laced with radioactive caesium and strontium salts – were dumped in a 13.7 sq. mile area south of central Hanford's 177 underground radioactive waste tanks. That dumping ended more than 40 years ago. The site is now undergoing environmental cleanup managed by the US DoE. The droppings will be put into landfill at the Hanford site.

The UK also has had problems – at Sellafield in Cumbria. At one point, I recall, it had to employ a marksman to dispatch birds that picked up doses and were at risk of contaminating local gardens. But now, perhaps more seriously, the Nuclear Installations Inspectorate (NII) and the Environment Agency are worried about the potential for a 'major event' arising from part of Sellafield's 'high hazard, high risk' area. One of the high-risk plants is B30, the original fuel storage pond, which is open and known among workers as 'Dirty 30'. NII inspector Mark Foy told the Whitehaven News (7/10/09): 'We are concerned that the risk of a major event caused by further degradation of legacy plants, or increased time at risk due to deferrals, is far too high. We have written to Sellafield Ltd to advise that every effort should be given to addressing and reducing the risks at the earliest opportunity.' Let's hope that while they are sorting it out the birds stay away.

Moreover, even if they or other species don't live in that area, there are still evidently risks from the fallout from the Chernobyl disaster back in 1986, which famously contaminated land in Cumbria and Wales. In all, the UK Dept. of Health issued control orders covering more than 200,000 sheep. Sheep with higher than the permitted level of radiation had to be marked with a special dye that did not wash off in the rain, and had to spend months grazing on uncontaminated grass before being passed as fit to go into the food chain. But even now problems remain. The Council of the EU recently noted that 'Radioactive caesium contamination of certain [agricultural] products originating in the third countries most affected by the Chernobyl accident still exceeds the maximum permitted levels of radioactivity laid down in Regulation (EC) No 733/2008.'

The half-life of caesium-137 is around 30 years, so we still have some time to go before the activity level has fallen to half what it was initially. As far as is known, the risks from this type of low-level contamination are small, but as we contemplate on the one hand trying to clear up all the old nuclear sites, and on the other building new plants, it may be wise to think about the long-term implications of relying on nuclear energy. Particularly since there have been continuing worries that the UK Nuclear Installations Inspectorate does not have the staff to deal with the proposed nuclear expansion programme.
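To put a number on that, the standard half-life formula gives the fraction of caesium-137 still present after a given time. This is a toy calculation for illustration, not part of the original reports:

```python
def remaining_fraction(years_elapsed, half_life_years=30.0):
    """Fraction of a radioactive isotope remaining after the given time."""
    return 0.5 ** (years_elapsed / half_life_years)

# Chernobyl (1986) to the time of writing (2009) is 23 years of a
# roughly 30-year half-life, so nearly 60% of the original
# caesium-137 activity is still present.
print(round(remaining_fraction(23), 3))  # → 0.588
```

On these figures, activity will not fall to half its initial level until around 2016.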

According to a report in the Observer (21/6/09), the NII has evidently had enough problems responding to the 1,767 safety incidents that occurred between 2001 and 2008, about half of which were subsequently judged by inspectors as serious enough 'to have had the potential to challenge a nuclear safety system'. Clearly it's not just animals that have problems with nuclear power...

Of course, we and animals can also have problems with other energy technologies – for example, as I explored in an earlier blog post, birds and bats sometimes collide with wind-turbine blades. I'll be looking at recent developments on that front in a new post soon.

Dynamic thinning is even more exciting than basal lubrication, but recent work by Hamish Pritchard and colleagues shows up the difference. For basal lubrication of the flow of glacier ice, just water will do, but it is now clearer than ever that for dynamic thinning what you need is warm water. Or rather, and this is not the same thing, if you supply warm water what you get is dynamic thinning.

Dynamic thinning is thinning over and above what can be expected from an imbalance between the ability of the climate to generate meltwater and the ability of the glacier to replenish the stock by flow. There is a simple reason for getting excited when we observe dynamic thinning. It means that the ice must be moving faster, and shedding more mass at its terminus, than it did before the dynamic thinning began.

Most glaciers are thinning at present. Strictly speaking, we mean that their surfaces are getting lower, but we can correct for the possibility that part of that is due to change of the bed elevation. Measuring the change of surface elevation is within the grasp of several methods. In particular, the ICESat laser altimeter on which the Pritchard work relies can give very accurate estimates.

In Greenland the most thinning you can expect from the climate – snowfall minus melting – is a few m/yr. In Antarctica, less than that is usual, and there are plenty of outlet glaciers down there where even today there is no melting even at sea level, and all of the loss is by discharge across the grounding line. Dynamic thinning, and the implied faster flow and faster discharge, are signs of trouble.

What Pritchard and colleagues have shown is that dynamic thinning was widespread around the margins of Greenland and Antarctica during 2003–2007, and is on the increase. Both of these findings are scary. There ought not to be any dynamic thinning in a well-regulated world. Perhaps their most telling finding is that there is a definite difference between fast-flowing tidewater glaciers and slower parts of the ice-sheet margins.

Roughly, the slow-flowing margins are showing just run-of-the-mill accelerating loss with no evidence for dynamic thinning. We are learning rapidly about basal lubrication as one of the reasons for the run-of-the-mill acceleration. But if it were the main or only reason then there would be no difference between the land-terminating and the tidewater margins. In fact, all hell is busting loose on the tidewater glaciers. Pritchard and colleagues have now documented dynamic thinning in the north of Greenland as well as the south, and in the east of Antarctica as well as the west – but only on tidewater outlets, and only on fast-flowing tidewater outlets.

These results point squarely at the ocean as the culprit. The better the ocean is at melting the base of a fringing ice shelf, or at sapping a grounded calving front, the faster does the glacier go in its efforts to maintain the supply of ice, and the greater the resulting dynamic thinning inland. At this point, our explanations of what is happening begin to get rather hand-waving. Measuring the temperature of the ocean near to tidewater glaciers is an extreme challenge, and we know very little about whether it is changing. But there are indeed some signs that warmer water is getting at the ice.

For example, David Holland and colleagues have shown dramatic maps, based on measurements by fishing boats, of the arrival of warm water in Jakobshavn Fiord in western Greenland. The warm Irminger Current, deriving from the Gulf Stream, curls clockwise around southern Greenland. Between 1996 and 1997, it flooded Jakobshavn Fiord with water 2–4 °C warmer than what was there before. In 1997, the floating tongue of Jakobshavn Glacier, the largest outlet of the Greenland Ice Sheet, began to disintegrate, and simultaneously the dynamic thinning of the glacier began. By 2007 it was 30 m thinner 70 km inland. At 15 km inland, it was more than 200 m thinner. It is still thinning, and nobody can tell when it will stop thinning.

Not all of the dots connecting global warming to the dynamic thinning of Jakobshavn Glacier have yet been joined up, but I don't know anybody who is betting on this being the wrong explanation, or, after the demonstration by Pritchard and colleagues, on Jakobshavn being an isolated example. This makes the world look even more complicated, and for the moment at least it makes predicting the future contribution of glaciers to sea-level rise even harder.

Everyone is in favour of energy efficiency. Using energy more efficiently is usually seen as the cheapest way to reduce or avoid carbon emissions. Certainly, given that energy has until recently been relatively cheap in countries like the UK, we have paid little attention to conserving it, so there are substantial energy-saving gains to be made. However, once we have picked the easy and cheap "low-hanging fruit", the next wave of energy savings is likely to be harder and more expensive to achieve. Energy-efficiency advocates, such as Amory Lovins, argue that, as with many other technologies, energy-saving measures will get cheaper as the market for them expands. But it is not clear whether this is the case for all energy-efficiency systems. And even if it is, we then come to a potentially much larger problem – the so-called "rebound effect". Put simply, if some commodity or service gets cheaper, we are likely to use more of it.

If, for example, a consumer cuts energy bills by a few hundred pounds each year by investing in domestic energy saving (e.g. better building insulation), they are likely to spend that money on other energy-intensive goods and services, such as a holiday abroad by jet plane, thus wiping out some or all of the energy and carbon savings. The same goes for a driver who replaces a car with a fuel-efficient model, only to take advantage of its cheaper running costs by driving further and more often.

There has been a long debate on exactly what the scale of this and other types of rebound or "take back" effect might be in practice. A report last year from the UK Energy Research Centre concluded: "For household heating, household cooling and personal automotive transport in developed countries, the direct rebound effect is likely to be less than 30% and may be closer to 10% for transport." Moreover, "direct rebound effects for these energy services are likely to decline in the future as demand saturates", although it warned that "indirect effects mean that the economy-wide reduction in energy consumption will be less".
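The direct-rebound arithmetic itself is simple. As a sketch (the figures here are illustrative, not taken from the UKERC report):

```python
def net_saving(gross_saving_kwh, rebound_fraction):
    """Energy actually saved once a direct rebound effect has eaten
    into the gross (engineering-calculated) saving."""
    return gross_saving_kwh * (1.0 - rebound_fraction)

# An efficiency measure that nominally saves 1,000 kWh/yr delivers
# only about 700 kWh/yr if 30% of the saving is taken back as extra
# consumption, or about 900 kWh/yr at a 10% rebound.
print(net_saving(1000, 0.30), net_saving(1000, 0.10))
```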

In terms of how to respond to rebound effects, the report points out: "Carbon/energy pricing can reduce direct and indirect rebound effects by ensuring that the cost of energy services remains relatively constant while energy efficiency improves. Carbon/energy pricing needs to increase over time at a rate sufficient to accommodate both income growth and rebound effects, simply to prevent carbon emissions from increasing. It needs to increase more rapidly if emissions are to be reduced." That could be painful for consumers, already hit as they are by rising fuel bills.

One proposal to achieve reduction in energy used without relying on direct pricing is personal carbon rationing – with consumers being allocated carbon credits to limit their overall consumption. These credits would be tradeable, so that consumers who managed to use less could sell any excess to those who were less frugal. The annual allocations could then be gradually reduced.

Last year, Defra conducted a pre-feasibility study into Personal Carbon Trading but concluded that it was an idea ahead of its time, although the government remains interested in the concept and in further research. But costs were seen as a possible major problem. Defra said: "Estimates of the likely set-up costs of the type of scheme explored ranged between £700 m and £2 bn, and the running costs £1–2 bn per annum." However, proponents have argued that retail loyalty-card schemes have demonstrated that large-scale schemes can be set up and operated without large costs.

There was also a hint from Defra of political concerns about pressing ahead with a mandatory rationing system at present: "There is scepticism that such a scheme would be fair, that government could be trusted to manage it or that it would deliver emissions reductions. In addition there was little evidence that people would be likely to trade – a crucial element of the scheme."

The debate has continued. While enthusiasts say, not unreasonably, that personal rationing schemes would have an immense educational value, making people aware of their carbon debts, some people are likely to see personal carbon rationing as yet another unwarranted government imposition. But the Institute for Public Policy Research (IPPR), in its new report Plan B? The Prospects For Personal Carbon Trading, says that since "unlike food rations during the war, carbon credits would be tradeable" and "that could give an edge". Even so, the IPPR admits that it would be a drastic and expensive move, "costing in the region of £1.4 bn a year to administer the millions of carbon accounts that would be needed. It is also likely to be unpopular."

A recent UKERC report from the University of Oxford-based Environmental Change Institute seems to dispute that last point, arguing that the scheme could be made attractive if it was framed "as a budgeting process to give individuals ownership and control over their emissions and because it is familiar to many individuals through other aspects of their personal administration such as income management". It adds: "Consumers are already familiar with complementary currencies such as Air Miles or loyalty points; therefore it is likely that they are prepared for understanding currencies other than money."

But it identifies "trust" as a potential issue: "Limits need to be set by a trusted authority and be transparent in how they are calculated to help public acceptability, and enable forward planning by consumers. They also need to be set at an appropriate level and with the appropriate aids to make it possible to live within the limits. The public should be convinced that scientific rationale for emissions reduction is sound and for the collective benefit."

It doesn't help that the newspapers are awash with reports of fraud and corruption in the wider carbon-trading world. For example, there is currently an international police inquiry into an alleged £1 bn emission-trading scam (Observer, 4 October).

Certainly, with a system involving tens of millions of consumer transactions, there could be a lot of potential for fraud, illicit carbon permits and black-market fuel, all of which could be socially divisive. For example, with the value of the credits gradually rising, the poor might be tempted to sell off their credits to the rich and then have to buy in unregistered energy or fuel illegally – a recipe for a spiv economy, requiring heavy regulation and policing to keep a check on rogue traders and fake credits.

So the result, in the worst case, could be increased emissions: the rich and energy-profligate would simply buy in credits from the poor to escape the overall cap, while the poor would try to buy in dirty "off-list" energy. Of course, in the best case, at least for the climate, given the increasing cost, the rich may cut back to some extent, and/or invest in efficiency or self-generation, while the poor may decide to do without some energy services in order to continue selling off their credits. In that case you might get some reduction in emissions – but at what social cost? To avoid negative outcomes, there would plainly need to be massive support for the take-up of cheap, low-energy systems by consumers, especially poor consumers. In this context it is worth noting that the government's new Feed-In Tariff for micro renewables (below 5 MW) is expected to yield a contribution of only about 2% of UK electricity by 2020.

It might of course be possible to do better than that, but it's not clear that operating at the individual consumer level via caps and rationing is the best way ahead. It's far easier to put caps on the relatively few energy generation and supply companies. As far as individuals are concerned, what we need is to ensure that money saved through energy efficiency is invested in renewable energy, so that we avoid the rebound effect and actually capture all of the carbon savings. Whether personal carbon rationing/trading would do this effectively and equitably remains uncertain.

IPPR Report

ECI/UKERC report

During my first ever field season, I studied a small drainage basin on Devon Island in the High Arctic of Canada. There was a glacieret, a tiny glacier, in the upper part of the basin, and it was mostly pink. As I had been taught that glaciers are whitish or brownish, depending on how much sediment they contain, this pinkness surprised me.

Later I learned that coloured ice is not all that unusual. Glacier ice can be intensely blue if it is free of air bubbles, which are the source of the whiteness. It can be black, or at least look black, if it is floating on water and is transparent. But pink? In northwest Greenland there is a 70-km stretch of the margin of the ice sheet called the Crimson Cliffs. It turns out that the pink colour comes from the carotene manufactured by bacteria that contrive to get a living from the glacier surface. Relatives of this bacterial carotene explain the pinkness of flamingos (they will insist on eating shrimps, which eat the pink bacteria), and the carrotiness of carrots.

Biologists are a bit like glaciologists in that they are willing to study almost anything, so bacteria on glaciers are not new to science. But are they any more than an arresting curiosity?

In recent years, microbiologists have become quite excited about evidence for a so-called deep biosphere. The distribution and abundance of hydrocarbon molecules in deep environments suggest that there must be organisms manufacturing some of the molecules. These deep environments include the beds of glaciers. From the Arctic, the Alps and elsewhere, strong circumstantial evidence has accumulated for ecosystems consisting of distinctive subglacial microbes. Last year D'Elia and colleagues showed photomicrographs of bacteria and possible fungi from ice that had accreted from the water of Subglacial Lake Vostok in Antarctica. The microbiologists seem to be satisfied that they are not looking at samples contaminated by near-surface organisms.

Apart from provoking us to rethink the meaning of "life", subglacial microbes have implications. Wadham and colleagues explore the question of what they might have done to the climate if they were active beneath the ice sheets of the last ice age. There was plenty for them to eat, in the form of overridden rotting vegetation, which some of them would have converted to methane. When the ice sheets waned, the methane, a greenhouse gas, could have had substantial climatic impact when released to the atmosphere. Apparently the release would have had to be episodic to have made a big difference. This is pure, though constrained, conjecture – but what fascinating conjecture it is.

There is a potentially enormous payoff if we can develop an understanding of how organisms can thrive at the beds of glaciers. They may help to stretch the envelope of hospitability yet further, because the most impressive glacier in the known universe is one that is not on the Earth's surface at all.

Europa, one of the Galilean satellites of Jupiter, has an outer shell consisting mainly of water ice. We cannot be sure of the thickness of the shell, but it is probably a few tens of kilometres and may be as little as just a few kilometres. The most interesting things about this lithospheric glacier may be that, firstly, it is undoubtedly floating on an ocean of what is almost certainly liquid water, and secondly, that there is compelling evidence of resurfacing. That is, images of the surface of Europa show features that make sense only if the underlying ocean has on occasion managed to rupture its ice cover and spill out onto the surface of the satellite. So Subglacial Lake Vostok, and the beds of glaciers generally, are intensely interesting from the standpoint of the search for extraterrestrial life.

Remember that oxygen, a deadly poison, is irrelevant, and the one so-far-universal common denominator of known life-hosting environments is liquid water. Ice will not do, and nor will steam. The beds of glaciers have what is needed, and they host life. The first of these two assertions appears to be as true on Europa as it is on Earth.

There is one way we could save ourselves, and that is through the massive burial of charcoal.
James Lovelock

Converting biomass into charcoal-like char, which can be used to improve soil fertility while also trapping carbon, certainly has major attractions. But a key issue is whether, in net climate terms, the loss of (some) biomass for direct conversion to energy is balanced by the gain from CO2 entrapment and extra CO2 absorption by more fertile soils – especially if the combustion route also uses geo-sequestration, i.e. carbon capture and storage (CCS).

A parametric study of bio-sequestration by Malcolm Fowles at the Open University suggested that, from a global-warming perspective, we should displace coal with biomass if the latter's conversion efficiency is much over 30%. Otherwise we should sequester carbon from biomass rather than generate energy.
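The shape of that trade-off can be sketched with a back-of-the-envelope calculation. All the input figures below are illustrative assumptions chosen for round numbers, not Fowles's actual parameters:

```python
# Illustrative round numbers - NOT Fowles's model inputs.
BIOMASS_CARBON_T = 0.50         # tonnes of carbon per tonne of dry biomass
BIOMASS_ENERGY_MWH = 5.0        # thermal MWh per tonne of dry biomass (~18 GJ)
COAL_EMISSION_T_PER_MWH = 0.25  # tC emitted per MWh(e) of coal generation
CHAR_CARBON_RETAINED = 0.75     # fraction of biomass carbon locked into char

def carbon_displaced(plant_efficiency):
    """tC of coal emissions avoided per tonne of biomass burned for power."""
    return COAL_EMISSION_T_PER_MWH * plant_efficiency * BIOMASS_ENERGY_MWH

def carbon_sequestered():
    """tC locked into the soil per tonne of biomass pyrolysed to char."""
    return BIOMASS_CARBON_T * CHAR_CARBON_RETAINED

# Break-even efficiency: burn the biomass if the plant beats this,
# otherwise make char. With these made-up numbers it comes out at 30%.
breakeven = carbon_sequestered() / (COAL_EMISSION_T_PER_MWH * BIOMASS_ENERGY_MWH)
print(round(breakeven, 2))
```

The point is the structure, not the numbers: the break-even efficiency moves with every input, which is why a fuller analysis could shift the balance either way.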

However, this was only a preliminary study, and he felt that a more comprehensive analysis might shift the balance further towards bio-sequestration. He did not include carbon savings from hydrogen and other pyrolysis products or, crucially, from reduced soil emissions – which are, after all, hard to assess. Costs were not included in his model either, although qualitatively and intuitively he felt bio-sequestration should be cheaper than geo-sequestration by CO2 capture and storage. (Fowles, M. (2007), "Black carbon sequestration as an alternative to bio-energy", Biomass and Bioenergy 31: 426–432, doi:10.1016/j.biombioe.2007.01.012.)

Clearly, though, there are a lot of unknowns – for example, the permanence of bio-sequestration: how long will the carbon stay trapped in the soil? Some say thousands of years, based on historical examples of charcoal use. But that was in traditional "no-till" agricultural contexts: farming methods would now have to change if we wanted to avoid releasing the stored carbon.

There are also strong views about the likely impact if biochar production was adopted on a wide scale. While some see it as a major way to deal with climate problems, the fear of vast agri-business plantations worries some people, Guardian correspondent George Monbiot especially, although even he accepts that there could be niche uses.

Biochar can be produced by pyrolysis at around 500 °C, either slowly (over days – the traditional approach, e.g. in kilns), which results in roughly equal amounts of biochar (about 35% of the original biomass), liquid and gaseous fuels; or rapidly (e.g. flash pyrolysis, in seconds), which gives less biochar (about 15%) and fewer gaseous products, but more liquid "bio-oil" (about 75%). In addition there is high-temperature (800 °C) gasification, which typically, over hours, yields a low proportion of solids (only about 10% biochar) but a high proportion of gaseous products (about 85%).
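Those three routes can be summarised as rough product splits by mass. The fractions below are loose readings of the figures just quoted, padded so each route sums to one:

```python
# Approximate product fractions by mass for each conversion route,
# taken loosely from the figures quoted above (illustrative only).
ROUTES = {
    "slow pyrolysis": {"char": 0.35, "liquid": 0.35, "gas": 0.30},  # ~500 °C, days
    "fast pyrolysis": {"char": 0.15, "liquid": 0.75, "gas": 0.10},  # ~500 °C, seconds
    "gasification":   {"char": 0.10, "liquid": 0.05, "gas": 0.85},  # ~800 °C, hours
}

def char_yield(route, feed_tonnes=1.0):
    """Tonnes of char produced from a given mass of dry feedstock."""
    return ROUTES[route]["char"] * feed_tonnes

# Slow pyrolysis of 100 t of dry biomass leaves roughly 35 t as char;
# gasification of the same feed leaves only about 10 t.
print(char_yield("slow pyrolysis", 100), char_yield("gasification", 100))
```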

Clearly, with fast pyrolysis or gasification the processing throughputs can be larger, but slow pyrolysis gives you more biochar in the mix. For example, BEST Energy in Australia has developed a slow-pyrolysis approach called Argichar, in which between 25% and 70% by weight of the dry feed material is converted to a high-carbon char, while also generating syngas: see


How much carbon sequestration might be achieved? Globally, according to Professor Tim Lenton, from UEA: "Biochar has the potential to sequester almost 400 billion tonnes of carbon by 2100 and to lower atmospheric carbon dioxide concentrations by 37 parts per million." How does that compare to other approaches, like Carbon Capture and Storage? Biochar production removes CO2 from the air, while CCS aims to remove it from the exhaust gases of power plants – in large quantities. According to Bruce Tofield, from the Low Carbon Innovation Centre, UEA: "In the UK biochar might yield a few million tonnes CO2 saving with current biomass sources – CCS needs to aim for over 100 m tonnes."

However, that doesn't mean turning biomass into biochar is a bad idea, and some environmentalists are quite enthusiastic. In The Renewable World (Green Books), a new book from the World Future Council, Herbie Girardet and Miguel Mendonca are very keen on techniques for improving soil fertility and biological carbon-dioxide absorption, and talk of "carbon farming". They note that "by pyrolysing one tonne of organic material which contains about half a tonne of carbon, about half a tonne of CO2 can be removed from the atmosphere and stored in the soil, while the other half can be used as carbon neutral fuel". However, they add that "a major question that needs an urgent answer is how enough organic matter can be made available to produce significant amounts of biochar. Opponents argue that farming communities in developing countries may be forced to produce fast-growing tree monocultures on precious agricultural land to produce biochar to counter climate change for which they are not even responsible". But they point to sewage as an example of a less contentious feedstock.

There are no doubt many other niche sources of biomass like this, as well as novel sources like algae, although there may also be competing uses (e.g. sewage gas is one of the cheapest renewable sources of electricity). But then we are back to the question of which approach is most effective at reducing carbon dioxide.

The Royal Society's recent review of Geoengineering commented: "It remains questionable whether pyrolysing the biomass and burying the char has a greater impact on atmospheric greenhouse gas levels than simply burning the biomass in a power plant and displacing carbon-intensive coal plants." It concludes: "Biomass for sequestration could be a significant small-scale contributor to a geoengineering approach to enhancing the global terrestrial carbon sink, and it could, under the right circumstances, also be a benign agricultural practice. However, unless the sustainable sequestration rate exceeds around 1 GtC/yr, it is unlikely that it could make a large contribution. As is the case with biofuels, there is also the significant risk that inappropriately applied incentives to encourage biochar might increase the cost and reduce the availability of food crops, if growing biomass feedstocks becomes more profitable than growing food."

That is a point picked up by James Bruges in the new Schumacher Society report The Biochar Debate (Green Books). He argues for a global Carbon Maintenance Fund rather than simply awarding carbon credits. But that is rather getting ahead of ourselves: first we have to see if the biochar option makes sense. The Royal Society pointed out that so far there has not been enough research on the topic. Defra has commissioned the UK Biochar Research Centre (UKBRC) to review the impacts of biochar. Hopefully that will provide some answers.

More at