
IOP A community website from IOP Publishing

May 2011 Archives

USA Today is not known for providing the most in-depth analysis of US news, but a recent article I read while traveling caught my attention. The article discussed the 'high' gasoline prices of $4/gallon in the United States and the economic hardship caused by spending more money on energy. I will not discuss the many reasons, some beyond personal choice and some not, why relatively lower gasoline (or petrol) prices in the US cause economic difficulty compared with the higher prices in the EU.

The interesting quote in the article came from a woman living in New Jersey who worked in Washington, DC. She commuted 230 miles (one way) twice a week for work, and had recently traded in her Mercedes-Benz for a Nissan Versa compact car. The higher transportation costs are clearly imposing new priorities on her. Her quote was as follows:

"We're the victims of circumstance, but I don't necessarily understand what the circumstance is…"

This was actually quite a refreshing quote to me, in that it signals that at least one American recognizes she does not need to single out the oil industry, the government, or even consumers like herself for blame. It also presents some evidence of a self-awareness lacking in most Americans. The fact that she chose a more fuel-efficient car for her long commute signifies that she realizes, via her consumption patterns, that she is a large part of the solution. Demand for oil is increasing, largely now from Asian economies, while new oil production struggles to replace declines in existing production, drawing on lower-quality, higher-cost (lower net energy) resources. We need our government to level with the public that alternatives to conventional petroleum, including unconventional resources such as oil sands and oil shale (if the latter can ever be produced economically in the US), all cost more.

The only way worldwide production will even struggle to stay level is to keep production costs near current levels. And it is these current oil prices near $100/bbl that are forcing the US economy to reconfigure itself. Part of this reconfiguration is consumers switching to lower-consumption lifestyles. Part is higher efficiency standards from the government. And yes, part of the solution is likely more offshore drilling, not because it is likely to decrease gasoline prices, but because it will demonstrate that US production can no longer significantly affect the oil price. Even Energy Information Administration (EIA) projections estimate that significantly expanded US offshore oil production would affect gasoline prices on the order of single cents per gallon, an amount lost in the noise of any meaningful impact. Most EIA and International Energy Agency (IEA) projections also now estimate constant or slightly declining oil consumption in the US. Part of this trend is due to demand destruction via price, and part is due to transitioning to other fuels such as electricity and ethanol. At the moment (and perhaps always), these non-fossil transportation alternatives are also more expensive than gasoline from $100/bbl oil.

Future choices by US consumers and by governments from the local to the federal level need to consider resilience to energy prices, not just efficiency, so that energy price impacts are mitigated rather than amplified.

More information has begun to emerge about what actually happened at Fukushima. It is now clear that full fuel meltdown did occur, perhaps even starting before the tsunami hit, although we are still some way from knowing what the longer-term impacts will be and what should be done to avoid a recurrence. Indeed, the global agency that covers nuclear safety, based on the 72-nation Convention on Nuclear Safety (set up after the 1986 Chernobyl meltdown), has decided to delay reporting on its assessment of the accident until August next year, saying that 'the lessons-learned process cannot be completed until sufficient additional information is known and fully analyzed.' This makes the UK safety review, to be completed by this coming September, seem a little hasty, but then the government evidently wants to get on with processing the nuclear consents programme: the Generic Design Assessment has been delayed until the NII safety review is complete.

However, we do have some information. Japan's Nuclear Safety Agency has claimed that the release of radioactive materials from Fukushima was equal to 10% of that from Chernobyl. That was why it reset the accident classification to level 7 on the UN's International Nuclear Events Scale, i.e. involving a "major release of radioactive material with widespread health and environmental effects requiring implementation of planned and extended countermeasures". Safety agency official Hidehiko Nishiyama said, however, that the two events were markedly different. "In Chernobyl, there was acute exposure to a high level of radiation, and 29 people died from it. This is not the case in Fukushima. In Chernobyl, reactors themselves exploded. In Fukushima... the reactors themselves have stayed intact, although we are seeing some leakage." There are likely to be disagreements about this assessment. For example, there were substantial releases into the sea, which may have very significant long-term effects.

There are also likely to be disagreements about who was to blame. UK nuclear consultant John Large has produced a report for Greenpeace looking at the accident, and says that much of the blame must fall on the operating company, TEPCO. He says they 'would be well aware of the ways and times over which an unattended and un-cooled reactor core would run its inevitable course to a fuel melt and, thus, pose a threat to security of the primary and secondary containments. It follows that TEPCO would also have been aware and would have had, surely, plans and procedures, including spare equipment, with which the fuel melt could have been managed within the known timeframes to stability and a safe resolution'.

However, 'whatever plans TEPCO had in place, if it had any at all, have failed. In fact, certain of TEPCO's actions in the aftermath of the explosions have been confused and, some might opine, lacking discipline of purpose to the extent that expedient decisions have been made without proper forethought and judiciousness to avoid knock-on consequences: for example, the injection of seawater may have resulted in salt deposits sufficient to foul cooling flows in the lower regions of the RPV [reactor pressure vessel]; the liberation of hydrogen from seawater is more rampant than from freshwater and radiolysis of oxygen from the cooling water could provide stoichiometric conditions and ignition with hydrogen in the absence of air in the containments; and the latest and most recent announcement to deploy a nitrogen purge to the Unit 1 reactor seems yet another ill-explained and unjustified desperate measure'.

He went on 'The situation relating to the violent destruction of the Unit 4 spent fuel pond is even more surprising. This is because it is a relatively straightforward calculation to predict the boil-down time to when the fuel is uncovered (several days) at which the risk of hydrogen generation and deflagration occurs, so just why the simple and obvious expedient of providing cooling water via a temporary pump (i.e. a fire tender) was not implemented by TEPCO in a timely manner is baffling. In other words, the station blackout that occurred at Fukushima Dai-ichi was a prescribed event for which TEPCO should have had in place procedures and countermeasures - obviously, adequate plans and countermeasures were not in place so, in this respect, the nuclear safety culture at Fukushima Dai-ichi was fundamentally flawed'.

He concluded 'If it is the case that, at Fukushima Dai-ichi, TEPCO failed then, it follows that the Japanese nuclear safety regulator NISA also failed because it permitted TEPCO to operate a hazardous nuclear complex in an unsafe way and without adequate emergency plans with which to counter the inevitable. If this is correct, then the Japanese nuclear safety culture is fundamentally flawed which means, because the same nuclear safety rules, limits and conditions are almost universally adopted internationally, that the demonstration and regulation of nuclear safety worldwide is equally and, perhaps, irrevocably flawed'.

Dr Large's May 2011 update is even more forthright: TEPCO must have known about the meltdowns early on, a claim backed by Greenpeace.

While debates on who or what was at fault will no doubt continue, the nuclear industry has clearly suffered a major blow, with nuclear looking to be unreliable and costly. In terms of reliability, the recent accident isn't the first time the Fukushima plants have been in the news. As US energy expert Paul Gipe has pointed out, several of the reactors were shut down from 2002 to 2005 for safety inspections as a result, evidently, of TEPCO's falsification of inspection and repair reports.

He notes that the Fukushima plants generated, on average, 30 TWh per year. He says: 'The key word here is "on average". Despite nuclear power's reputation as reliable base load generation, the Fukushima plants were anything but reliable over the four decades that the plants were in operation. Annual generation was surprisingly erratic'.

He went on 'Take Unit 6, the most modern unit, for example. In 2004 generation dropped from 4.6 TWh in 2003 to 1.1 TWh, and both were a far cry from the reported generation in 1997 of more than 9 TWh. Similarly, Unit 5's generation fell from 6.2 TWh in 1999 to 1.6 TWh in 2000.'

And on cost: with the 150,000 or so evacuees not likely to be able to return home until perhaps December, or even January, there is talk of $130bn in compensation/damages, but that's just the start. The clean-up could cost $300bn or more. Who will insure nuclear plants now? Who will invest in them?

It's perhaps not surprising, then, that following on from Germany's decision to speed up its nuclear phase-out, Italy has now frozen its plans for new nuclear plants, as has Switzerland. Meanwhile, Thailand is contemplating calling off five planned plants, while Malaysia, which was to have its first plant operating by 2020, is set to abandon the plan. A US project in Texas has lost financial backing, maybe the first of several as investors look at the liability risks. It does seem that the nuclear renaissance is unravelling.

Last year the IEA forecast that nuclear energy's share of global primary energy demand would rise from 6% in 2008 to 8% in 2035. But, as the FT reported, Nobuo Tanaka, executive director of the International Energy Agency, has warned that the role of nuclear power in global energy supply may be smaller than previously forecast, following the events in Japan: "Building nuclear power or expanding nuclear power may mean more costs or more delay. That means the nuclear option may not play as big a role as we predicted."

The climate science rap


By Michael Banks

Well, it had to come, didn't it? There have been quite a few science raps over the last few years, touching on nuclear physics, the American astronomer Edwin Hubble and even the Large Hadron Collider at the CERN particle-physics lab, so it seems about right that there is now one about climate change.

The rap video for 'I'm a climate scientist' was produced by the Australian current-affairs television programme Hungry Beast.

Featuring lines such as "climate change is caused by people, Earth unlike Alien has no sequel", the video shows a raft of climate scientists doing their best Beastie Boys impressions.

I will let you decide whether using rap as a means of communicating climate science is a worthwhile endeavour.

There has been much opposition to shale gas extraction via the new techniques of directional drilling, hydraulic fracturing and pressurised gas collection/release, with dramatic footage of water from domestic taps catching light, due to dissolved gas, in areas of the US where shale gas 'fracking' projects are under way. It certainly makes for powerful videos.

However, some say this is not new and happened long before shale gas extraction started. Maybe so, but there are other eco issues, including ground water contamination with chemicals that are forced deep underground, along with water and sand, to fracture the shale and release its gas.

In addition there's the sheer volume of water needed for fracking. Moreover, shale gas is still a fossil fuel, so burning it produces CO2, and any leakage from fractured strata could be serious for the climate, since CH4 is a much more powerful greenhouse gas than CO2.

The Tyndall Centre has called on the UK government to put a moratorium on shale gas operations in the UK until the environmental implications are fully understood. It was also worried that a rush to exploit shale gas could divert effort away from developing a long-term sustainable low-carbon economy.

Similarly, in an RIIA report, 'The Shale Gas Revolution: hype or reality?', Paul Stevens, senior research fellow at Chatham House, commented: 'in a world where there is the serious possibility of cheap, relatively clean gas, who will commit large sums of money to expensive pieces of equipment to lower carbon emissions?'

Global shale gas reserves are put at 456 trillion cubic metres, compared with 187 trillion cubic metres for conventional gas, according to a 2010 World Energy Council report. Over 60% of shale gas deposits are in North America and Russia. Stevens says producing shale gas using horizontal 'fracking' costs $3 or less per million British thermal units (Btu), whereas conventional gas drilling is said to cost about $10 per million Btu.

So what hope is there for renewables- until shale gas runs out? But there are some uncertainties about whether the shale gas revolution will be as big as some predict, and Stevens says 'if it fails to deliver on current expectations- and we will not be sure of this for some time- then in ten years or so gas supplies will face serious constraints'.

Certainly some say the energy costs of extraction may make it counterproductive as the reserves thin out. So there could be a boom and bust scenario.

And the eco costs may be large. A leading critic is Professor Robert Howarth of Cornell University. In a draft paper last year he claimed that hydraulic fracturing may contribute significantly to greenhouse gas (GHG) emissions. He said that there were significant GHG emissions from the well drilling, water trucking, pipeline laying and associated forest felling. Combining the effects of combustion, production, distribution and leaked methane gives the fuel about the same GHG footprint as coal: 33 grams of CO2, compared with 31.9 grams for coal.

The analysis was partly based on a methane leakage estimate of 1.5% of natural gas consumed, a figure assumed by the federal government; methane, of course, has much more climate impact per gram than CO2. Although he admitted that it was only a preliminary assessment, it drew fire from America's Natural Gas Alliance, which dismissed his assertions as 'preliminary and speculative and not backed by hard data'. It added: 'Natural gas is twice as clean as coal and is available here in America in significant abundance today. Alongside the development of renewables, natural gas has a key role to play in transitioning our nation to a low-carbon economy.'

Howarth attracted similar criticisms when, earlier this year, more of his results were widely quoted: in a new paper he claimed that generating electricity from shale gas produces at least as much climate impact as coal-fired power, and perhaps more.

To make sensible comparisons you have to take account of the fact that methane's residence time in the atmosphere is much shorter than that of CO2, but Howarth did offer data for both 20-year and 100-year horizons: 'Compared to coal, the footprint of shale gas is at least 20% greater and perhaps more than twice as great on the 20-year horizon, and is comparable over 100 years'.
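The 20-year versus 100-year comparison comes down to the global warming potential (GWP) applied to leaked methane. The sketch below is a minimal illustration, not Howarth's actual inventory: the combustion and leakage figures are invented placeholders, and the GWP factors (72 over 20 years, 25 over 100 years) are the IPCC AR4 values, which differ from the higher ones Howarth used.

```python
# Illustrative lifecycle footprint of a gas, in CO2-equivalent terms.
# All input numbers are hypothetical; only the structure of the
# calculation reflects the 20-year vs 100-year argument in the text.

def gas_footprint(combustion_co2, methane_leaked, gwp):
    """CO2-equivalent emissions: direct CO2 plus leaked CH4 weighted by GWP."""
    return combustion_co2 + methane_leaked * gwp

combustion = 50.0  # assumed direct CO2 from burning the gas (g/MJ)
leakage = 0.3      # assumed fugitive methane (g CH4/MJ)

# The same leak looks far worse on the 20-year horizon (GWP 72)
# than on the 100-year horizon (GWP 25):
print(gas_footprint(combustion, leakage, gwp=72))   # 71.6
print(gas_footprint(combustion, leakage, gwp=25))   # 57.5
```

With these made-up inputs, the short-horizon footprint is roughly 25% higher than the long-horizon one, which is why the choice of horizon dominates the shale gas versus coal debate.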

Howarth has articulated what seems to be a common US view: 'My strong belief is that shale gas has been promoted far beyond the objective evidence of what it can and cannot do. It is time to step back, and objectively analyse whether this is a reasonable energy technology for our future. It is also time to analyse how environmental issues associated with the technology might be reduced, and at what cost.'

Some US cities, including New York and Pittsburgh, are trying to halt local drilling for shale gas, with Philadelphia calling for at least a temporary ban on new wells in the watershed that serves the city. There have also been concerns in Texas.

Fracking in the EU

So far it's mainly been a N. American issue. But successes in the USA have led to prospecting across Europe. A report by IHS CERA said that unconventional gas reserves, including shale gas, in Europe could total 173 trillion cubic metres. Interestingly, France imposed an interim moratorium on shale gas extraction earlier this year, and may ban it entirely, after the government backed a draft bill that would outlaw the controversial process.

It's estimated that the UK could meet around 10% of its current gas needs from shale gas, if it can be extracted at a commercial rate. And with exploratory drilling having started last year in Lancashire, the Energy and Climate Change Select Committee has been looking into its implications for the UK. The results should emerge soon, but in its evidence to the Select Committee, Scottish and Southern Energy plc said that, while shale gas was a viable if relatively small option for the UK (compared to the US), 'there is a concern that with limited capital for investment in the energy industry, significant development of policy incentives to encourage development of shale gas resources in the UK, alongside uncontrolled growth in gas-fired generation could decrease investor certainty on UK policy direction towards renewables, CCS and/or nuclear. Although this would lead to a short-term gain in carbon emission reductions, it would be to the detriment of the long-term decarbonisation of the UK power sector'.


Most environmental groups have clearly made up their minds. Jenny Banks, climate and energy policy officer at WWF-UK, called on the British government to halt shale gas exploration. 'It would be ridiculous to encourage shale gas when in reality its greenhouse gas footprint could be as bad as or worse than coal. We need to reject this source of gas, and have a clear plan to move away from our dependency on fossil fuels and harness the full potential of renewable technologies.'

Friends of the Earth offered a quite nuanced view to the Energy and Climate Change Select Committee: 'available data suggests that the carbon footprint of shale gas is smaller than that of coal used in electricity production, although it is higher than that of conventional gas. Therefore if shale gas was to displace existing coal electricity generation then there would be a net carbon reduction. However, as some coal is being displaced anyway via the LCPD, new shale gas would more than likely be displacing other types of electricity generation such as renewables'.

However there are other views. For example, a new report from the Global Warming Policy Foundation, which is headed by Lord (Nigel) Lawson, argues that the eco-hazards 'are much smaller than in competing industries'. It claims that 'a single shale gas well uses in total about the same amount of water as a golf course uses in three weeks'. Perhaps not the best comparison! It concludes 'A surge in gas production and use may prove to be both the cheapest and most effective way to hasten the decarbonisation of the world economy, given the cost and land requirements of most renewables'.

The debate continues.


'When storms come, some build walls, some are thrown by the wind, others build windmills'. So said Chinese philosopher Lao Tzu. But sometimes you can overdo it.

In my last blog I reported on criticisms, basically from anti-wind-power groups, that there was not enough reliable wind. However, there are, perhaps perversely, also criticisms that there is sometimes too much wind, which means that, under current market and grid-balancing arrangements, compensation is paid for the loss of potential earnings. For example, six Scottish wind farms were paid a total of nearly £900,000 to stop producing energy for several hours between April 5th and 6th, because the grid network could not absorb it.

£308,000 went to Scottish Power's Whitelee wind farm in East Renfrewshire, £265,000 to RWE npower's Farr wind farm south of Inverness, £140,000 to SSE Renewables' Hadyard Hill in South Ayrshire, and £130,000 to Scottish Power's Black Law in Lanarkshire, while the Millennium wind farm in the Highlands and Beinn Tharsuinn, north of Alness, got £33,000 and £11,500 respectively.

The Renewable Energy Foundation (REF) said the payments ranged up to 20 times the value of the electricity that would have been generated if the turbines had kept running; that peak was for the Farr wind farm. The payments include compensation for the loss of the Renewables Obligation subsidy element. Dr Lee Moroney, planning director for the REF, which has criticised subsidies to renewables in the past, said: 'The variability of wind power poses grid management problems for which there are no cheap solutions. However, throwing the energy away, and paying wind farms handsomely for doing so, is not only costly but obviously very wasteful. Government must rethink the scale and pace of wind power development before the costs of managing it become intolerable and the scale of the waste scandalous.'

The National Grid said: 'On the evening of the 5th into the 6th of April, the demand for power was low but the nuclear generating plants in Scotland were running as expected. There was also heavy rainfall, which meant hydro power plants were operating well too.' A fault in the transmission system also meant the surplus energy could not be transferred to England.

DECC described the incident as 'unusual' and said 'In future we need greater electrical energy storage facilities and greater interconnection with our EU neighbours so that excess energy supplies can be sold or bought where required.'

The Scottish government said 'electricity generated by renewables accounted for 27.4% of Scotland's electricity use. National Grid is responsible for balancing the supply of electricity from all sources across the grid to match demand and generators will sometimes be required to reduce output as part of that process. At the same time, the Scottish and UK governments have been working with the National Grid and others in the industry to strengthen grid capacity and address access constraints'. That will certainly be necessary if Scotland is to achieve the heroic new target, promoted by the SNP during the recent Scottish elections, of getting 100% of electricity from renewables by 2020!

The basic problem is that turbines can be, and have been, built much faster than connectors, although payment for curtailment is common to all plants with similar supply and grid-balancing contracts: fossil-fired standby plants get paid when idle too. But under the ROC system wind gets larger basic payments, and some projects have, it seems, negotiated good curtailment deals. Crucially, under current policies, the nuclear output is not curtailed. Maybe it should be, making room for wind. But then nuclear would be (even more?) uneconomic and, if it got a curtailment fee, (even more?) subsidised.

This problem will only get worse as more wind, plus other renewables, and more nuclear are put on the grid. The Climate Change Committee (CCC) has just outlined a possible strategy in which, by 2030, nuclear and renewables would each have about a 40% share. This would require an additional 2-3 nuclear reactors on top of those developers are already planning to build. The CCC does look to smart grids and storage to help deal with grid balancing, but much more will be needed, ideally including new market and contractual arrangements. Even so, it is hard to see how we can avoid expensive curtailment events during low-demand periods if we have large amounts of inflexible nuclear and variable wind both seeking to feed into the grid. Perhaps the new government White Paper on the Electricity Market Reforms, expected maybe in June, will have some answers.

The overall renewable energy target seems likely to remain in place, unless we renegotiate it down with the EU, as a new report from the right-of-centre Policy Exchange suggested. However, the CCC report suggested cutting back on offshore wind targets, as did the Policy Exchange, which claimed offshore wind was too expensive. The offshore wind target has already been drastically cut from the 30-32GW envisaged at one time to around 13GW by 2020, as spelt out in last July's UK National Renewable Energy Action Plan. Will even that now be viewed as excessive? And if so, where is the compensating expansion? The new Renewable Heat Incentive is fine as far as it goes, although most of it won't start until next year, and it's just for heat. As for electricity, PV solar has just been savaged with FiT tariff cuts for large projects, the Marine Renewables Deployment Fund has been axed, and last year Lord Marland said 'there should be no dramatic increase' in on-land wind generation above current targets. The White Paper should make for interesting reading.


In a recent post I reported on current climate policies for road transport. These policies are challenged by the increasing market shares of alternative fuels and technologies, whose carbon footprint is increasingly determined by upstream processing, recovery and general-equilibrium effects. Notably, biofuels such as corn ethanol induce deforestation, which is itself a major source of up-front CO2 emissions (ignoring other effects such as biodiversity loss). Not all biofuels, however, have the same carbon footprint. How, then, can biofuels for transport and their life-cycle greenhouse gas emissions be effectively regulated?

As pointed out in the corresponding paper, fuel mandates, but also the Californian Low Carbon Fuel Standard and the European Fuel Quality Directive (FQD), are insufficient because they fail to account for a series of general-equilibrium effects. The European Commission, in fact, is currently reviewing its stance and may revise the FQD this summer. But what kind of measure could improve the biofuel conundrum? The twin paper by Flachsland et al. studies a potential role for emissions trading in the transport sector as well. This is an effective measure in terms of creating a level playing field across all fuels. In fact, if a cap were global in scope and covered all relevant sectors, we could sleep quietly: wherever emissions appear they will be accounted for, and it does not matter whether they are induced by biofuels or not. It matters only that they get a price tag, and that people try to reduce emissions. However, a global cap is not in sight, and the inclusion of agricultural emissions also seems utopian at present. Hence, we are still stuck with the GHG accounting problem for biofuels (and with potentially relevant general-equilibrium effects for other fuels, such as electricity) when transport is included in the European Emission Trading Scheme.
One possibility is to treat biofuels by default as having at least gasoline-equivalent emissions (see DeCicco, 2009). Fuel providers could then prove that their feedstock has negligible indirect effects and low fertilizer (N2O) and processing emissions, gaining credits if certified. This would reverse the burden of proof. Properly implemented, it can be seen as a precautionary principle in action: better to save the forests and their carbon stock than to be sorry big time.
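The DeCicco-style default-baseline scheme can be sketched in a few lines. Everything here is hypothetical: the 94 g CO2e/MJ gasoline baseline and the certified values are invented placeholders, chosen only to show how reversing the burden of proof turns certification into credits.

```python
# Sketch of the default-baseline ("burden of proof reversed") accounting
# described above. All numbers are hypothetical illustrations,
# not regulatory values.

GASOLINE_BASELINE = 94.0  # assumed lifecycle g CO2e per MJ for gasoline

def biofuel_rating(certified_emissions=None):
    """A biofuel counts as gasoline-equivalent unless certified lower."""
    if certified_emissions is None:
        return GASOLINE_BASELINE          # no certificate: default applies
    return min(certified_emissions, GASOLINE_BASELINE)

def credit(volume_mj, certified_emissions):
    """Credit (g CO2e) earned by a supplier whose fuel is certified."""
    return volume_mj * (GASOLINE_BASELINE - biofuel_rating(certified_emissions))

print(biofuel_rating())       # 94.0 - uncertified fuel gets the default
print(credit(1000, 40.0))     # 54000.0 - certified at 40 g/MJ over 1000 MJ
```

The design point is that an uncertified fuel can never score better than gasoline, so suppliers have a direct financial incentive to document genuinely low indirect effects.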

Pöyry's 'North European Wind and Solar Intermittency Study' (NEWSIS) has found that "The creation of an offshore 'super grid' and a major upgrade of energy interconnections are not the silver bullet solutions to Europe's energy needs". It says that improved connectivity would only partially alleviate the volatility of increased renewable energy generation. Basically, it claims that "Wind and solar output will be highly variable and will not 'average out'", even over wide areas. The study covers NW Europe: England, Wales, Scotland (but, oddly, not Ireland, North or South), France, Belgium, Luxembourg, Austria, Poland, Germany, the Czech Republic, Switzerland, Denmark, the Netherlands, Sweden and Norway.

It assumes existing wind and solar expansion plans go ahead but says that, as a result, "by 2030, wholesale market prices in some countries will have become highly volatile and driven by short term weather patterns", although it adds that "countries with large amounts of hydro - in particular the Nordics - are much less affected by increased price volatility."

Crucially it says that "heavy reinforcement of interconnection doesn't appear to offset the need for very much backup plant, however. This surprising observation comes from the fact that weather systems - in particular high pressure 'cold and calm' periods in winter - can extend for 1000 miles, so that periods of low wind generation are often correlated across Europe. Hence interconnection helps when it is windy in one country and still in another, but when it is calm across many countries together, interconnection is much less helpful".

So it concludes that "interconnectors are not a complete solution". But actually no one said they were: there would also be a need for backup, storage and demand-side management, plus inputs from other renewables. The report does say that "in many ways 'demand-side' solutions are most suited to matching the needs of intermittency" and it looks at key aspects of flexible demand, e.g. flexibility from non-heat sources (such as washing machines and tumble dryers) and optimised charging of electric vehicles, including, for example, "the speed of response, the timescales for being turned off, speed of deployment and likely behavioural characteristics". But it claims that, while demand-side involvement may be attractive, "the wide range of likely deployment patterns and technological developments will further complicate investment decisions - and quite possibly slow deployment".
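The dispute over whether wind output will "average out" is, at bottom, a question about correlation: if two countries' outputs rise and fall together, interconnecting them adds little firm capacity. The toy check below uses invented load-factor series for two hypothetical neighbours; a Pearson coefficient near 1 would support Pöyry's point, one near 0 the supergrid advocates'.

```python
# Toy correlation check on two made-up wind load-factor series.
# Real studies use years of hourly metered data; these six values
# are invented purely to show the calculation.

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x)
    vy = sum((b - my) ** 2 for b in y)
    return cov / (vx * vy) ** 0.5

uk = [0.1, 0.4, 0.8, 0.6, 0.2, 0.1]   # hypothetical hourly load factors
de = [0.2, 0.5, 0.7, 0.5, 0.1, 0.2]   # a strongly correlated neighbour

print(pearson(uk, de))  # high (near 1): pooling these two helps little
```

The same calculation run between, say, northern Europe and North Africa would be the quantitative version of the "wider footprint" argument made further down.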

Overall, then, a pretty negative vision, coming to very different conclusions from those of most other studies, apart from those of the UK Renewable Energy Foundation (who loved it!) and some anti-wind utilities (who helped with it). For example, a new German academic study, published in Recent Advances in Energy and Environment, concluded that pan-EU supergrid links could halve the need for backup.

An earlier EWEA Tradewind study found that, for the 2020 medium scenario (200 GW, 12% wind penetration), aggregating wind energy production from multiple countries strongly increased the capacity credit (the amount of capacity that can be relied on to meet peak demand), almost doubling it to 14%, which they say corresponds to approximately 27 GW of firm power in the system.
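The Tradewind figures can be sanity-checked with back-of-envelope arithmetic; the sketch below simply applies the stated 14% capacity credit to the 200 GW scenario, landing close to the roughly 27 GW of firm power the study reports.

```python
# Back-of-envelope check of the Tradewind numbers quoted above.

installed_gw = 200        # 2020 medium scenario
capacity_credit = 0.14    # roughly doubled by cross-border aggregation

firm_gw = round(installed_gw * capacity_credit, 1)
print(firm_gw)  # 28.0, close to the ~27 GW the study reports
```

The small gap between 28 GW and the reported 27 GW presumably reflects rounding in the study's own figures.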

And the EREC/Greenpeace Energy [R]evolution '24/7' report concluded that "during the last 30 years, the potential power production from wind during winter time throughout Europe in the Energy [R]evolution scenario would have only dropped below 50GW 0.4% of the time, equivalent to once a year if the average duration of the event is 12 hours".

But then, unlike Pöyry, they were all looking across the whole of Europe, North and South - a wider footprint - and in some cases (e.g. Gregor Czisch's seminal supergrid work) the windy east and North Africa as well, with a full supergrid network linking in wind, solar, biomass, geothermal, hydro and other renewables.

Czisch's study is soon to be published by the IET. The pan-EU grid balancing issue is followed up in more detail in a recent report from Greenpeace: of the grids.pdf

No one is suggesting that there will not be a need to balance wind variations, or that at times there will be relatively little wind in many places, but most studies seem to agree that this problem can be dealt with at low cost. Those who are basically hostile to significant reliance on wind power tend to make much of this issue, often focusing on just one country and on what they see as hopelessly low on-land wind turbine load factors. For example, the Scottish environmental charity the John Muir Trust (JMT) recently released a report, produced by Stuart Young Consultancy, claiming that, rather than the 30% load factor often cited, between Nov 2008 and Dec 2010 UK on-land wind farms operated below 20% of capacity more than half the time and below 10% of capacity over one third of the time. Overall they achieved a 24% average load factor: 27.18% in 2009 and 21.14% in 2010.
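The quantities being argued over here are simple to define. A load factor is average output divided by installed capacity, and the JMT-style claims are just fractions of time spent below a capacity threshold. A minimal sketch, using made-up half-hourly readings rather than the actual metered data either side relies on:

```python
# Illustrative sketch (not the JMT methodology itself): load factor and
# time-below-threshold statistics from a series of metered wind output.

def load_factor_stats(output_mw, capacity_mw):
    """Return (load factor, fraction of time below 20% and below 10% of capacity)."""
    n = len(output_mw)
    lf = sum(output_mw) / (n * capacity_mw)
    below20 = sum(1 for p in output_mw if p < 0.20 * capacity_mw) / n
    below10 = sum(1 for p in output_mw if p < 0.10 * capacity_mw) / n
    return lf, below20, below10

# Hypothetical half-hourly readings (MW) for a 100 MW fleet:
sample = [5, 12, 30, 55, 18, 8, 3, 40, 25, 9]
lf, b20, b10 = load_factor_stats(sample, capacity_mw=100)
```

For this invented sample the load factor is 20.5%, with output below 20% of capacity 60% of the time - which illustrates why a modest average load factor and long spells of low output are entirely compatible, a point both sides of the dispute gloss over.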

Average wind speeds have certainly fallen in recent years, but this may just be a short-term climate variation. The JMT report, however, saw the recent low average load factors as fundamental, and challenged industry claims that periods of widespread low wind were 'infrequent'. It claimed that the average frequency and duration of a 'low wind event' was once every 6.38 days for 4.93 hours, and the analysis found that there were 124 times when winds dropped so much that just 4% of expected output was generated.
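A 'low wind event' of the kind the JMT analysis counts is just a contiguous run of readings below some fraction of capacity. A sketch of how such events might be identified (the 4% threshold comes from the report; the scan logic and data here are illustrative, not the report's own method):

```python
# Sketch: find contiguous runs where output falls below threshold * capacity.
# Returns (start_index, run_length) pairs; threshold and data are illustrative.

def low_wind_events(output_mw, capacity_mw, threshold=0.04):
    """List each run of readings strictly below threshold * capacity."""
    events, start = [], None
    for i, p in enumerate(output_mw):
        if p < threshold * capacity_mw:
            if start is None:
                start = i          # a low-wind run begins here
        elif start is not None:
            events.append((start, i - start))
            start = None
    if start is not None:          # run continues to the end of the series
        events.append((start, len(output_mw) - start))
    return events
```

Event frequency and average duration - the 'once every 6.38 days for 4.93 hours' figures - then follow directly from the count and lengths of the runs over the period analysed.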

The report noted: 'Very low wind events are not confined to periods of high pressure in winter. They can occur at any time of the year.' During each of the four highest peak demands of 2010, wind output reached just 4.72%, 5.51%, 2.59% and 2.51% of capacity, according to the analysis. It concluded that wind behaves in a "quite different manner" from that suggested by average output figures or wind speed records.

The report concluded: 'It is clear from this analysis that wind cannot be relied upon to provide any significant level of generation at any defined time in the future. There is an urgent need to re-evaluate the implications of reliance on wind for any significant proportion of our energy requirement.'

Dr Lee Moroney, the Renewable Energy Foundation's planning director, said: 'Experience is teaching us that wind power is not only highly variable over short timescales, but also from year to year and even in regions which have previously performed well. This finding has important economic implications for the conventional generators acting in the support role for wind. These face radical uncertainty about income from one year to the next.'

However, Jenny Hogan, director of policy for Scottish Renewables, said: 'We have no confidence in these unofficial figures. Last time Stuart Young completed research on wind farm output an independent analysis showed serious discrepancies. He claimed the load factor for wind for the period of November 2009 to November 2010 was 22%, however GL Garrad Hassan, an independent consultancy firm, found on average it was in fact 24.8%. We recognise this is lower than the 30% average load factor. However, this was anticipated as it had been an exceptionally calm year.'

She added: 'Yet again the John Muir Trust has commissioned an anti-wind farm campaigner to produce a report about UK onshore wind energy output. It could be argued the trust is acting irresponsibly given their expertise lies in protecting our wild lands and yet they seem to be going to great lengths to undermine renewable energy which is widely recognised as one of the biggest solutions to tackling climate change - the single biggest threat to our natural heritage'.

Stuart Young, the JMT report's author, is, it seems, the chairman of the Caithness Windfarm Information Forum - described on its website as a 'group of people concerned about the proliferation of windfarms in Scotland'.

'Analysis of UK Wind Generation' is at