
IOP A community website from IOP Publishing

April 2013 Archives

UK Marine Renewables



A report by RenewableUK says the Electricity Market Reform could act as a springboard for the growth of wave and tidal energy, or could undermine investor confidence in marine power at a crucial stage of the industry's development. It also highlights challenges such as delays in obtaining grid connections for wave and tidal projects, and the high cost of transmission charges.

The report, "Conquering Challenges, Generating Growth", lays out the progress made so far: 12 full-scale single devices with a capacity of 9 megawatts deployed in UK waters generating clean electricity - more than the rest of the world combined. It notes that commercialisation of the tidal sector is just around the corner, with the deployment of the first arrays (multiple devices) beginning in 2014, and an expected increase to 100-200MW of wave and tidal capacity installed by 2020. Major engineering firms such as Siemens and Alstom are working with the UK and Scottish Governments, universities and electricity companies to develop British marine power. The Crown Estate has awarded leases for more than 1.8 gigawatts of capacity at nearly 40 sites in UK waters. The British Isles have 50% of the total European wave energy resource and 25% of the tidal energy resource - these technologies could generate up to 20% of the UK's electricity needs. Based on independent research, RenewableUK estimates that wave and tidal energy could be worth £6.1bn to the UK by 2035, creating nearly 20,000 jobs - up from the 1,000 employed now.

However, this growth could be stifled if the Government fails to get the details of Electricity Market Reform right. The most crucial factor is the level of financial support the technologies will receive. The report states that the initial strike price for the first generation of tidal arrays should be set at £280 to £300 per megawatt hour. For wave technology, the initial strike price should be £300 to £320/MWh. This will catalyse the marine energy industry, leading to economies of scale and learning through experience, which will lower the strike price for the second generation of arrays in 2018. Also, under EMR, contracts would only last for 15 years - the report argues that this must be extended to 20 years to give investors an adequate return; otherwise the strike price would have to be higher. It's notable that there is talk of EDF being offered 40-year CfD contracts for its nuclear projects.

RenewableUK's Wave and Tidal Manager, David Krohn, said: "The wave and tidal energy industry has reached an exciting period as it moves from single device demonstrator projects to the first small proving arrays. The world's leading projects are being developed in UK waters thanks to a comprehensive package of support granted by the UK and Scottish governments, which has ensured that the UK leads the world in wave and tidal energy. However, there are significant hurdles that need to be overcome to ensure the sustained growth of the industry. It's time to get real about the potential risks so that we can work with Government and others to find the solutions as early as possible. Wave technology in particular will need tailored capital support in the coming years if we are to maintain pole position in this promising and strategically important sector. It is essential that Electricity Market Reform provides a level of support that will allow the most cost effective projects to be taken forward." www.renewableuk.com/en/publications/index.cfm/wave-and-tidal-energy-in-the-uk-2013

It is true that wave and tidal stream technology is still relatively expensive, but costs are falling, thanks to support via the revised Renewables Obligation and special development grants, amounting in all to 25-30p/kWh. By contrast, onshore wind gets 10p/kWh or less and offshore wind 15-17p/kWh. Max Carcas, previously a device developer with Pelamis, told the Financial Times (28/1/13): 'While 25-30p/kWh seems quite expensive, it is often forgotten that there has never been a new energy technology that has been economic out of the box. The cost of generating from wind and solar energy has fallen by about 80% since the mid-1980s. The fact that the opening costs of marine energy are lower than many preceding energy technologies puts this sector in a very good position to be competitive in the longer term.'

So it is good to hear that the Crown Estate is investing up to £20m in two new wave or tidal energy projects. The funding will be used to construct arrays (multiple devices). Projects with an installed capacity of at least 3 MW are eligible to apply, as long as they are expected to reach a final investment decision by March 2014. RenewableUK said: 'The funding will accelerate a crucial step forward - from successful individual devices to the deployment of full-scale arrays in the water. It will also help to attract the right level of private investment to commercialise the sector.'

In parallel, two tidal power projects in north Wales and Scotland are to receive grants worth £10m each. Sea Generation Wales, a joint venture between Siemens and RWE, will develop a 10MW project off Anglesey, with five of MCT's 2MW SeaGen-type generators running by 2015, while MeyGen, a joint venture between Morgan Stanley and International Power, an independent power generation company, will develop an 86MW project in the Pentland Firth between the northern Scottish mainland and the Orkney Islands. This is the first phase of a 400MW array which will feature turbines developed by Atlantis Resources and Tidal Generation Limited, the partly Rolls-Royce-initiated project now owned by Alstom. More should follow. For example, Siemens is planning an MCT SeaGen project at Kyle Rhea, the strait between Skye and the Scottish mainland, and expects to be operating tidal farms of 20MW to 50MW by 2020.

But it's not all good news. Neptune Renewable Energy's Proteus tidal stream ducted vertical-axis turbine has been found to be technically flawed and therefore not commercially viable. The full-scale demonstrator was deployed in the Humber estuary in January 2012 and has been subject to much testing and a number of modifications. But it became apparent that the device would not be able to achieve a high enough level of electrical output, despite indications to the contrary from earlier work done at one-fortieth and one-tenth scale. The plan was to supply the Deep marine attraction in Hull with around a third of its power.

The company said: 'Since November, a significant amount of work has been carried out, some independently, to establish the reasons for the technical problems and to understand whether the company was facing issues of adjustment and tuning, rather than a challenge to the overall concept of using a vertical axis turbine within a duct in estuarine locations. This work has included looking at an alternative lift turbine, rather than a drag turbine.' But having looked at the options they have decided to abandon the project and liquidate the company. Source: www.neptunerenewableenergy.com/

More positively, Pulse Tidal has secured a site for a 1.2MW commercial demonstration of its oscillating double hydrofoil system at the SW Marine Energy Park, Lynmouth. That's a very novel design, so it will be interesting to see how it fares.

Biomass battles



The 'Dirtier than Coal' report from Friends of the Earth, Greenpeace and RSPB argued that burning trees would produce more net CO2 than burning coal, in part due to the delay before the CO2 is reabsorbed by new planting. See my earlier blog: http://environmentalresearchweb.org/blog/2013/02/biomass-burning--worse-than-co.html and www.rspb.org.uk/Images/biomassreporttcm9-326672.pdf

It referred to a report by North Energy Associates (NEA) and Forestry Research (FR) produced for DECC, which tried to answer the question: 'Is it better to leave wood in the forest or harvest it for timber, other wood products (e.g. panel boards) and/or fuel?' They concluded that: 'Management of UK forests for wood production can contribute to UK carbon objectives e.g. to 2050...Using wood for bioenergy can also reduce carbon emissions, compared to burning fossil fuels for energy....These results suggest that policy should support managing UK forests to produce wood for products and bioenergy'.

However, the NEA/FR report specifically rejected the 'whole tree burning' scenario picked by the FoE/Greenpeace/RSPB report's researcher Tim Searchinger, in favour of using trimmings, offcuts and other wastes with no other market: www.gov.uk/government/uploads/system/uploads/attachment_data/file/48346/5133-carbon-impacts-of-using-biomanss-and-other-sectors.pdf

The Biomass Energy Centre, which is a government backed advice and information agency, has produced a critique of the Searchinger study, which it says 'appears to have been neither peer-reviewed nor submitted to any journal for formal publication, contains no new research but numerous factual errors and misinterpretation'. For example, he 'chooses just one scenario from the peer reviewed study by FR and NEA for DECC, amongst the hundreds examined,' allowing him 'to allege that biomass is "dirtier than coal", ignoring all the other scenarios that show carbon saving benefits in the form of lower GHG emissions ranging from marginal to very substantial'.

The Biomass Energy Centre says: 'The DECC study shows that there are many ways of using forests and that, for managed forest in the UK, almost all of the scenarios provide considerably greater GHG emissions reductions than simply leaving the trees unharvested. It also shows that there is an unrealistic scenario, selectively picked by Searchinger, which is slightly less good under certain circumstances, in GHG emissions terms, than leaving them unharvested. This scenario does not make economic sense, and does not represent UK practice. In addition, the UK government sustainability requirements demand genuine GHG emissions reductions, so this scenario would not attract government support in the form of ROC payments and, hence, there is no incentive to power generators to use it'.

The Biomass Energy Centre points out that UK sawmill practice basically involves the selection and use of timber of sufficient size and quality for construction or joinery purposes, and it backs this approach as being both economically and environmentally sound. The leftovers have many uses, including for paper, wood-based panels and sometimes for fuel. They insist that the use of whole trees, including all of the roundwood, for energy in the UK is simply not an economically realistic approach. What may be viable is the use of forestry thinnings, chips, sawdust, offcuts, etc. In addition, 'Broadleaf trees also have a significant proportion of branches (conifers typically much less), while conifer tops taper to wood of small diameter. If these types of wood are not used for wood-based panels (for which they are not always suitable), paper or energy, they will simply be disposed of, and will still break down to carbon dioxide. Consequently the appropriate use of these types of wood for energy, displacing fossil fuel, is beneficial.'

They note that 'the research undertaken by FR and NEA for DECC and related research by FR clearly demonstrates that the production of mixtures of sawn wood, wood-based panels, paper and fuel (including a proportion from small, young thinnings as whole trees) results in significant overall greenhouse gas benefits'.

Clearly, although they don't meet the re-absorption delay issue head on, they don't see using marginal and leftover wood as a problem. They also claim that, even if biomass fuel prices rose, that would be unlikely to divert high-value timber from other markets, so it would not be a problem for other uses of wood, including those which ensure continued carbon sequestration. www.biomassenergycentre.org.uk

One-time Greenpeace and FoE senior energy campaigner Stewart Boyle, now active in the biomass industry, also joined in the critique. He produced a heartfelt blog for the Renewable Energy Association claiming that the NGOs had got it badly wrong. He also saw the co-firing of wood chips in coal plants like Drax as a helpful interim step, and was worried about DECC's attitude. www.r-e-a.net/blog/modelling-our-way-to-biomass-paralysis-22-03-2013

Not everyone thinks that importing wood chips from North America to co-fire in large old inefficient coal plants like Drax is a good idea. The High Renewables scenario for 2050 produced for British Pugwash avoided all biomass imports: www.britishpugwash.org/recent_pubs.htm

But in the interim some might be condoned, while we get busy developing smaller scale high efficiency Combined Heat and Power plants, linked to district heating networks, and fed by UK sourced biomass or biogas. Short Rotation Coppice still has its advocates, while AD biogas production from farm and food wastes has a lot of attractions.

The latter would avoid land use conflicts, but if we want to go for green gas and biomass in a big way, such conflicts may not be avoidable. The Pugwash 70% renewables scenario had 10% of UK land area being used for biomass by 2050, with the implication that farming practices and diets might have to change. But 2050 is a long way off, and by then we may well be generating (and storing) synthetic green gases, using the excess electricity that wind, wave and tidal projects will generate when energy demand is low to power electrolysers. Germany has started doing that now, in part as a way to balance variable renewables like wind and PV solar. But this 'power to gas' idea also helps avoid excessive land use for biogas production, with an extension being to use CO2 captured from the air to convert electrolytic hydrogen into methane. See, for a UK input by ITM: http://www.itm-power.com/wp-content/uploads/2013/04/Platts-April13.pdf

Meanwhile, although burning whole trees is out of the question, we do need to see what biomass can offer. UKERC's estimate for SRC was 4% of UK electricity, and that's just for starters. http://www.ukerc.ac.uk/support/tiki-index.php?page=Biomass+Resources+and+Uses

Heat pumps v CHP/DH



There has been a long and interesting debate over whether Heat Pumps or Combined Heat and Power plants linked to district heating networks are the best option for efficient low carbon home heating.

In theory a heat pump, working like a refrigerator in reverse, can deliver heat with around three times the energy value of the electricity fed in to run it, though in practice they may not always achieve these high levels of return, especially in cold damp weather (Roy et al, 2010). But heat pumps do offer a way of upgrading low-grade heat, from whatever source, including the air, ground, water, direct solar and geothermal, and if they are run using electricity from renewable energy sources, their carbon emissions will be low.

Steam-raising thermal power plants, by contrast, are much less efficient. However, they can be operated in Combined Heat and Power (CHP) mode, so that some of the heat that would otherwise be wasted in the conversion process is captured for use in district heating (DH) networks. CHP/co-generation can increase the energy conversion efficiency to 80% or more, compared with the roughly 35% typical of conventional steam-raising plants, and so it makes a lot of sense, as long as there is a suitable local heat load, e.g. a city or urban area. It has been claimed that CHP plants linked to district heating networks are far more efficient than heat pumps, especially small domestic-scale heat pumps.

Heat pumps have an energy output/input Coefficient of Performance (COP) of maybe 3, but since they use some of the heat that would otherwise be wasted, CHP plants linked to DH can deliver a COP equivalent of up to 9 or more, depending on the grade of heat that is required (Lowe 2011). And if CHP plants use a renewable source of heat, like geothermal or biomass, their carbon emissions, already low per kWh, should be even lower, and cost less per tonne of CO2 saved than heat pumps (Kelly and Pollitt 2009). Though there is some debate on this; it may depend on the carbon content of the electricity used by the heat pump and the grade of heat that you want out (Woods 2011; MacKay 2013).

It is true that installing district heating (DH) mains can be disruptive, more so than installing heat pumps in individual houses. But once installed and linked to the central heating radiators of houses and other buildings, unlike with domestic heat pumps, there is no in-house device to maintain. Moreover, once installed, DH pipes can be fed with heat from any source as it becomes available, including solar and geothermal energy, and the network becomes a major infrastructure asset. That helps make large-scale solar heating linked to heat stores viable in Denmark - it's claimed to be much more economic than individual house solar heating, if you already have the DH network.

Note also that heat can actually be sent quite long distances without significant losses: for example, Oslo's district heating network is fed via a 12.3 km pipe from a waste-burning plant on the city outskirts; in Denmark there's a 17 km link from a CHP plant to the city of Aarhus; and heat is delivered via a 200 MW capacity heat main to Prague from a power station 65 km away.

With a CHP/DH system at Odense claimed to have a COP equivalent of nearly 12, compared with maybe 3 for a heat pump, surely CHP/DH wins hands down? Well no, sadly it's more complex than that. These systems are not operating in a vacuum. If there is a lot of spare electricity available at night (e.g. from a large nuclear and/or wind programme) then heat pumps can use it, and in off-gas-grid areas, which are also unlikely to be candidates for DH, domestic heat pumps can be very useful.

It has also been argued that, while CHP may convert 1 unit of combustible fuel into, typically, 0.5 units of heat and 0.3 units of electricity, and so is overall 80% efficient, a heat pump converts 1 unit of electricity into ~2 or more units of heat, so it's 200% efficient. Ah, but where did the electricity come from? If it's from a fossil or nuclear plant running at 35% efficiency, you are back down to 70%. And retaliating further, the CHP buffs say that, in theory, since CHP delivers heat that would otherwise be wasted, its COP is infinite, since it gives us heat for no new fuel input!
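The arithmetic in that argument can be checked with a short calculation (a sketch only, using the illustrative figures quoted above: a 0.5/0.3 heat/electricity split for CHP, a 35% power-station efficiency, and a heat-pump COP of 2):

```python
# Delivered useful energy per unit of primary fuel, on a
# fuel-in / energy-out basis, using the figures from the text.

def chp_efficiency(heat_frac=0.5, elec_frac=0.3):
    """CHP: 1 unit of fuel -> heat_frac units of heat plus
    elec_frac units of electricity, both counted as useful."""
    return heat_frac + elec_frac  # 0.8, i.e. 80% overall

def heat_pump_efficiency(cop=2.0, plant_efficiency=0.35):
    """Heat pump fed by a thermal power station: 1 unit of fuel
    -> plant_efficiency units of electricity -> cop * that in heat."""
    return cop * plant_efficiency  # 0.7, i.e. 70% overall

print(f"CHP overall:                    {chp_efficiency():.0%}")
print(f"Heat pump on fossil electricity: {heat_pump_efficiency():.0%}")
```

With a COP of 3 rather than 2, the heat pump route reaches 105% of the fuel input and overtakes the CHP figure, which is why the comparison is so sensitive to the assumed COP and to where the electricity comes from.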

It certainly can get complicated. For CHP operation, taking heat out from the near-final stages of a power turbine reduces its electrical generating efficiency slightly, but to confuse things further, steam is sometimes taken off at several stages, at different temperatures and times. So there is no one efficiency figure: CHP plants can vary the heat-to-power ratio depending on market and weather conditions. And of course you can use heat pumps with heat from CHP! And big heat pumps can be good in some locations, taking heat from rivers, lakes or even the sea, as is done in Sweden. The debate goes on...

References

Kelly, S. and Pollitt, M. (2009) 'Making Combined Heat and Power District Heating (CHPDH) networks in the United Kingdom economically viable: a comparative approach', Energy Policy Research Group, Cambridge University: EPRG Working Paper 0925: www.eprg.group.cam.ac.uk/wp-content/uploads/2009/11/eprg09251.pdf

Lowe, R. (2011) 'Combined heat and power considered as a virtual steam cycle heat pump', Energy Policy, Volume 39, Issue 9, pp 5528-34 http://dx.doi.org/10.1016/j.enpol.2011.05.007

MacKay, D. (2013) 'Sustainable Energy - Without the Hot Air', online book, p147 et passim http://www.withouthotair.com/

Roy, R., Caird, S. and Potter, S. (2010) 'Getting warmer: a field trial of heat pumps', The Energy Saving Trust, London.

Woods, P. and Zdaniuk, G. (2011) 'CHP and District Heating - how efficient are these technologies?', CIBSE Technical Symposium, De Montfort University, Leicester, UK, 6-7 September

EGU2013: tree spotted in poster halls


It's not often you see vegetation at the Austria Center Vienna, particularly inside the poster halls. But this year Rolf Hut of Delft University of Technology in the Netherlands positioned one of his research subjects, a tree, next to his poster display.

Whilst an unexpected encounter with plants can be pleasant for conference delegates, for those interested in measuring the moisture content in the top 5 or 6 cm of soil by satellite, vegetation can be a problem. The water it contains may be a source of noise in the radar backscatter signals they need, particularly as plants' water content tends to fluctuate during the day.

And that's where the tree comes in. Hut and colleagues are measuring the natural vibration frequency of trees in order to assess changes in their water content. Assuming the tree has a constant stiffness, any alteration in this frequency indicates a change in mass, and hence water content.

By 'plucking' the tree, which had accelerometers attached to its trunk, Hut was able to show delegates the principle of his technique. The oscillation data provided by the accelerometers enabled calculation of the tree's eigenfrequency, or natural frequency. Leaving the tree in the wind would also see it start to oscillate at its natural frequency, Hut said.
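The frequency-to-mass step can be sketched with a simple model (an assumption on my part; the text only says the stiffness is taken as constant). Treating the trunk as a simple harmonic oscillator, f = (1/2π)√(k/m), so with k fixed, mass scales as 1/f². The function name below is illustrative, not from Hut's work:

```python
def mass_ratio(f_before, f_after):
    """For a simple harmonic oscillator, f = (1/(2*pi)) * sqrt(k/m).
    With constant stiffness k, mass scales as 1/f**2, so the ratio of
    masses follows directly from the measured natural frequencies."""
    return (f_before / f_after) ** 2

# A 5% drop in natural frequency implies roughly 11% more mass
# (extra water in the tree): (1/0.95)^2 ~= 1.108
print(mass_ratio(1.0, 0.95))
```

This is why the constant-stiffness assumption matters: any change in k (e.g. wood drying out) would be misread as a change in water mass.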

Hut's colleague Bouke Kooreman has been testing the approach in Ghana, where good satellite data are available.

Ultimately, understanding how vegetation water content changes during the day and its effect on radar backscatter could not only help remote soil moisture measurements but also provide a new technique for measuring plant water stress remotely.

Hut, whose work was featured in the session on 'Innovative techniques and unintended use of measurement equipment', has also used the Kinect motion detector for the Xbox 360, which incorporates a 3-D scanner, to improve the accuracy and efficiency of determining in situ soil moisture content.

EGU 2013: climate change hard to reverse


It's early days, but scientists are developing techniques to remove carbon dioxide from the atmosphere, either directly through technologies such as artificial trees or, less directly, by biomass burning with carbon capture and storage. Even if these methods are implemented, however, the Earth will feel the temperature effects of climate change for centuries to come.

That's according to Andrew MacDougall of Canada's University of Victoria, who gave a press conference at the European Geosciences Union's General Assembly in Vienna. His simulations using the University of Victoria Earth-System Climate Model indicate that without any artificial carbon removal, and assuming that fossil fuels run out, around 60-75% of near-surface warming will remain 10,000 years into the future.

With a middle-of-the-road scenario for carbon dioxide removal, however, a 20th century-like climate could be restored by the late 24th or early 25th century, MacDougall found. But simulated surface air temperature would still be above the pre-industrial temperature by the end of the 30th century, even for the fastest carbon removal scenario he modelled, as oceans gradually release their stored heat.

Restoring climate will require removal of more carbon from the atmosphere than was originally emitted by man, MacDougall said. In some scenarios, 115-190% of anthropogenic emissions will need to be sequestered. Currently land and the oceans remove around half of the carbon dioxide man emits to the atmosphere each year. Once atmospheric carbon levels fall, this stored carbon will start to emerge. In addition, melting of permafrost as temperatures rise has released methane and carbon dioxide to the atmosphere. "There's no easy process to put this back in," said MacDougall.

The simulations indicate that it's much easier to return ocean pH levels to normal than temperatures. But sea-level rise from melting of the Greenland ice sheet seems largely irreversible - while atmospheric carbon levels of less than 350 ppm could stabilise the ice sheet, water from the oceans would only be refrozen into the ice very slowly.

MacDougall simulated carbon concentrations that followed the representative concentration pathways RCP 2.6, 4.5, 6.0 and 8.5 used in the IPCC's forthcoming fifth assessment until they reached their peak, in 2050, 2150 and 2250, respectively. Then he reduced carbon concentrations at the same rate that they had increased, as well as restoring pasture and croplands to their pre-industrial area.


EGU 2013: stormy times ahead


Thunderstorms are getting stronger and more frequent, according to Eberhard Faust of Munich Re, speaking at a press conference at the European Geosciences Union General Assembly in Vienna.

In 2011 losses from thunderstorms east of the Rockies reached a record value of $47 billion, with two cities hit by outbreaks. For comparison, Hurricane Sandy caused losses of $60 billion.

Together with scientists from the German Aerospace Center (DLR), Faust examined data for severe US thunderstorm losses east of the Rockies from March to September each year from 1970 to 2009. Both the mean level of loss and the variability went up. Some have ascribed this rise to an increase in the value of building stock. But by correcting for socio-economic changes, Faust found that the change was due to altered thunderstorm activity.

Faust sees this increase in thunderstorm activity as due to changes in climate, which have boosted humidity at low levels of the atmosphere and increased seasonal aggregated potential convection energy. These storm changes are consistent with the modelled effects from manmade climate change, he said, but he currently can't make a call on whether they are down to natural climate variability or to man.

To come up with the results, which are published in Weather, Climate and Society, Faust and colleagues looked at thunderstorm severity potential, a measure of the potential energy in the atmosphere available for convection, and the strength and direction of the wind between ground level and 6 km. The team counted occurrences of thunderstorm severity potential of more than 3000 J per kg. The mean value of occurrences increased by a factor of two between 1970-1989 and 1990-2009, while the standard deviation, a measure of variability, changed by a factor of 1.5.


EGU 2013: Texan wind farms raise temperatures


When Liming Zhou of SUNY at Albany, US, and colleagues found a link between Texan wind farms and warmer temperatures during summer nights, many argued that the effect was simply because the wind farms were sited on top of mountain ridges. But now, by comparing temperatures above wind farms with those for similar wind-farm-free ridges nearby, Zhou is confident that the raised temperatures he found are caused by operation of the wind turbines.

Speaking at the European Geosciences Union General Assembly in Vienna, Zhou explained how he and his colleagues looked at an area in West-Central Texas containing four of the world's largest wind farms between 2003 and 2011. The average temperature increase about 1.1 km above the wind turbines at night in summer was up to 1 °C, as measured by the MODIS instruments onboard satellites. During the day, the presence of wind turbines did not seem to affect temperatures. In winter, when the wind turbines were generally operating at lower speeds, the night-time warming effect was less pronounced.

Zhou believes that at night-time the turbulence from wind turbine operation brings warmer air higher in the atmosphere to lower levels. During the day, the atmosphere is much less stable so the wind turbines do not have as great an effect.

Zhou stressed that the temperature effects of the wind farms are small and local, compared to the global temperature changes being caused by burning of fossil fuels. Zhou's latest results, which check out the mountain ridge effect, are in review for publication.

EGU Abstract: Assessing Possible Climatic Impacts of Large Wind Farms Using Satellite Data

EGU 2013: a bumpy ride for transatlantic flights


If, like me, you're a nervous air passenger, the news from today's European Geosciences Union General Assembly wasn't good. Speaking at a press conference just 15 minutes after the publication of his paper, Paul Williams of the University of Reading, UK, revealed how climate change is likely to bring stronger and more widespread clear air turbulence for transatlantic flights.

A doubling of carbon dioxide concentrations, which could well occur by the 2050s, would increase the average strength of clear air turbulence by 10-40%, Williams and colleague Manoj Joshi of the University of East Anglia, UK, calculated. The amount of airspace containing significant turbulence would also increase by 40-170%; Williams said the most likely figure would be 100%.

To come up with these results, the pair employed the GFDL-CM2.1 climate model to simulate 20 years' worth of data for pre-industrial and doubled carbon dioxide concentrations, using 21 separate measures of turbulence. They analysed clear air turbulence during the winter, when it's at its most intense, along the North Atlantic flight corridor, one of the busiest in the world, with 300 flights in each direction each day.

This turbulence isn't just a problem for scaredy-cats. It can injure, or even kill, passengers and aircrew, it can damage planes, for example breaking off engines or parts of the wing, and it currently costs society about $150 million each year in injuries, damages and investigations. As a result of the additional turbulence, airlines may have to reroute flights, boosting fuel consumption, increasing air pollution, and potentially causing delays and increasing ticket costs.

So why the increase? As climate changes, the atmosphere is warming above our heads as well as at ground level, explained Williams, leading to higher wind speeds. Or, to put it another way, a stronger jetstream is destabilising the atmosphere and the random fluctuations in upwards and downwards winds push against aircraft wings.

The atmosphere strikes back

Worse still, unlike the turbulence caused by clouds or storms, clear air turbulence is hard to detect. You can't see it, and satellites or aircraft electronic systems don't pick it up. Some ground-based radar systems can detect it, but only the very powerful ones, said Williams, as could radiosondes on balloons, which measure the amount of wobble along a single trajectory.

In their calculation of the amount of airspace containing significant turbulence, Williams and Joshi used the industry definition of moderate-or-greater turbulence, which produces an acceleration of the plane of 5 m/s² or more, a force equivalent to about half a g. This is enough to bounce the aircraft around, make it hard to walk and knock over drinks, said Williams.

"Aviation is partly responsible for changing the climate in the first place," he said, in a University of Reading press release. "It is ironic that the climate looks set to exact its revenge by creating a more turbulent atmosphere for flying."

The study is published in Nature Climate Change.

A silly tidal idea?



In a press article, one-time Welsh Secretary Peter Hain puts the case for the new privately funded tidal barrage across the Severn proposed by Hafren. He said it would 'make a greater contribution to tackling climate change than any other green energy project', supplying 'fully 5% of the UK's electricity (16.5 TWh per year) of clean, low carbon, predictable and therefore base load energy'. www.clickonwales.org/2013/01/new-severn-barrage-would-exploit-two-way-tides.

In fact, it would not generate 5% of UK electricity. 16.5TWh is 4.5% of 2011 demand, and when the barrage might be working (it will take ~10 years to build), demand may have risen above the 2011 level of 365TWh, even given efforts to cut it. Moreover, it is not base load. Not all the annual output could be used. Due to the daily lunar cycle, peak output would often occur when there was no demand for it, and, at other times, there may be high demand when there is no output, the peaks also shifting by 50 minutes or so each day and varying in size with the spring/neap cycle.
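The article's correction of Hain's 5% claim is simple arithmetic, using the 2011 demand figure of 365 TWh quoted above:

```python
# Back-of-envelope check: what share of UK electricity demand would the
# barrage's claimed 16.5 TWh/year of output actually represent?
barrage_output = 16.5   # TWh per year, Hafren's claimed annual generation
uk_demand_2011 = 365.0  # TWh, UK electricity demand in 2011 (from the article)

share = barrage_output / uk_demand_2011 * 100
print(f"Share of 2011 demand: {share:.1f}%")  # about 4.5%, not the claimed 5%
```

And if demand rises by the time the barrage is operating, the share shrinks further.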

Operating two ways, on both the ebb and flood cycles, as proposed by Hafren, does extend the period during which power can be produced (they say up to 16 hours per day), but the two-way turbines needed will be expensive, wear out faster and suffer more breakdowns. That all adds to the cost, which overall is put at £25 billion, to be raised, it seems, from sovereign wealth funds.

Major capital-intensive projects like this can have long payback times, but also run for long periods, so the economic viability depends on the financing arrangements. Most studies have suggested that it would be hard to finance a big barrage given the rates of return expected in the private sector. Hain, however, says it can be done and even claims the barrage 'will reduce overall consumer electricity bills by 3.5% a year on average over its life. So it would be the cheapest electricity source in the UK, 50-75% cheaper than coal, gas, wind or nuclear for over 100 years.' But, tellingly, he adds, 'an electricity contract will have to be negotiated containing the usual price support mechanism for all renewable energy projects over 30 years'. So there would be a long interim subsidy, much like the nuclear industry is seeking.

Is it worth it? The Sustainable Development Commission's study of the previous barrage concept found that it would only save around 0.9% of UK emissions, and the new version would be no better. You would get far better results from almost any other green energy investment. Frontier Economics' study for WWF showed that big barrages would cost much more than any other supply option, even nuclear: http://assets.wwf.org.uk/downloads/frontiereconomicsbarrage_repo.pdf It would also impose massive environmental costs.

Hafren have tried to address the environmental impacts by using a larger number of slower-speed turbines and by operating two ways, on a lower head. But it still blocks the entire estuary and its eco-impacts will still be huge, despite the modifications, which add to the capital and operating costs.

The 150-page proposal they submitted to the government did not fare well. The government had indicated that, although it would not provide financial support for large barrages, it was open to companies to develop proposals for private projects. A meeting with David Cameron had reputedly gone well and it was said Lord Heseltine had backed the idea, as one of his proposed major infrastructure projects. However Greg Barker, energy and climate change minister, has now indicated that Hafren had not provided enough information to be convincing. Too many questions remained over the project's affordability, environmental impact and its effect on the Port of Bristol, upstream of the proposed barrage. He was quoted by the BBC as saying 'The information that the department has seen so far doesn't allow us to assess if the proposal is credible'.

Labour MP John Robertson was less charitable. He said the amount of information provided by Hafren Power was 'embarrassing'. In a session of the Energy and Climate Change Select Committee, which has been looking at the issue, he told the minister: 'I'm surprised you haven't just thrown it out completely'. There is a 41-page version of Hafren's case at www.hafrenpower.com

Barker made clear that it was unlikely that the hybrid bill required for the project would be considered before the election in 2015, but that, although Hafren's evidence was not sufficiently compelling, the government had left the door open.

For the moment it looks a very unlikely contender. Unless very large energy storage facilities are built, or major new undersea interconnectors constructed to export the excess power to the continent, a big multi-GW barrage would just not fit in the UK energy system. It's even worse in that respect than large nuclear plants: they are only 1.6 GW. Most environmental groups have opposed large barrages strongly, on the basis of their eco-impact. But the strategic case against them has also been recognised. For example, the Green Party has come out against the proposed new variant, preferring less invasive lagoons and tidal reefs/fences instead: https://docs.google.com/file/d/0B5-HODLqTnX7OXAyR2x2XzBYZ0E/edit?pli=1

A recent report by Regen SW and consultancy firm Marine Energy Matters says that tidal lagoons, tidal fences, tidal stream technology, wave and wind power could be far less harmful to the environment, and provide up to 14 GW of low carbon energy capacity, at least twice that of the proposed Hafren Barrage. The UK's potential tidal current resource alone is put at about 8 GW. There is also an interesting new proposal for a 250 MW tidal lagoon off Swansea: www.tidallagoonswanseabay.com

A multi-technology, multi-site approach would be modular (lots of smaller, faster-to-install projects, so easier to finance) and more flexible (better matched to daily demand patterns). By contrast, with a single large barrage you get large pulses of often unusable energy, and in terms of construction, it's all or nothing. You can't build half of one, and if it turns out to have problems (e.g. silting up), you may be stuck with expensive remedial measures, or even complete failure.

James Hansen on pacts with the devil

| | TrackBacks (0)

We are increasing the stakes of the climate Faustian bargain, believes James Hansen, who retires from NASA this week, through higher levels of fossil fuel particulate and nitrogen pollution, which mask greenhouse gas warming in the short term. The Faustian bargain sees aerosols reduce the net human-made climate forcing; they only maintain this level of reduction, however, if we allow air pollution to increase as emissions rise. Once people decide to reduce particulate air pollution for health reasons, the "devil's payment" will be extracted via increased global warming.

The original Faust, the story goes, entered a pact with the devil and received magic powers for 24 years before the devil claimed his soul, leaving Faust in eternal damnation.

"The more we allow the Faustian debt to build, the more unmanageable the eventual consequences will be," writes Hansen in a perspective article (PDF) in Environmental Research Letters (ERL), adding that plans to build more than 1000 coal-fired power plants and develop some of the dirtiest oil sources on the planet should be "vigorously resisted". According to Hansen, we are already in a deep hole and it is time to stop digging.

As a result of increased coal use, annual carbon emissions from fossil fuels have risen at about 3% per year over the last decade, double the rate of the thirty years before. But the airborne fraction of carbon dioxide - the annual increase of carbon dioxide in the air divided by the annual fossil fuel emissions - has declined since 2000, say Hansen and his colleagues, who believe this is due to an increase in carbon sink uptake linked to the rise in coal use. The nitrogen emitted by burning coal can fertilise the biosphere and boost carbon uptake by vegetation, and the aerosols released can also make sunlight more diffuse, aiding photosynthesis.

An addition of 5 Tg of nitrogen a year from fossil fuels, with an increase in net ecosystem productivity of 200 kg of carbon per kg of nitrogen, would give an annual carbon drawdown of 1 gigatonne, calculate the researchers. This would roughly explain the post-2000 anomaly in airborne carbon dioxide.
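The researchers' drawdown estimate follows directly from those two figures and the unit conversions (1 Tg = 10⁹ kg, 1 Gt = 10¹² kg):

```python
# Reproduce the nitrogen-fertilisation arithmetic: 5 Tg of nitrogen per
# year, at 200 kg of carbon taken up per kg of nitrogen, implies roughly
# 1 Gt of carbon drawn down annually.
nitrogen_tg_per_year = 5.0    # Tg N/yr emitted by fossil fuel burning
uptake_kg_c_per_kg_n = 200.0  # kg C fixed per kg N (net ecosystem productivity)

nitrogen_kg = nitrogen_tg_per_year * 1e9               # 1 Tg = 10^9 kg
carbon_gt = nitrogen_kg * uptake_kg_c_per_kg_n / 1e12  # 1 Gt = 10^12 kg
print(f"Carbon drawdown: {carbon_gt:.0f} Gt C per year")  # 1 Gt C per year
```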

Hansen says that if greenhouse gases were the only climate forcing, then his team's conclusion that actual greenhouse gas forcings are slightly smaller than IPCC scenarios, along with Rahmstorf's conclusion that actual climate change has exceeded IPCC projections, would tempt them to infer that climate sensitivity is on the high side of what's generally been assumed. "Although that may be a valid inference, the evidence is weakened by the fact that other climate forcings are not negligible in comparison to the greenhouse gases and must be accounted for," he writes, along with co-authors Pushker Kharecha and Makiko Sato from the NASA Goddard Institute for Space Studies and Columbia Earth Institute.

Human-made aerosols are the key culprits, as their effect is not easy to pin down. That said, under Hansen's "Faustian bargain", it looks like aerosols are reducing the net climate forcing of the past century by about half.

At first glance the increase in fertilisation of the biosphere and additional aerosol cooling from the Far East seem to be good news, say the researchers. "Both effects work to limit global warming and thus help explain why the rate of global warming seems to be less this decade than it has been during the prior quarter century," they write. But increased carbon dioxide uptake doesn't necessarily mean that the biosphere is healthier or that the higher uptake will continue indefinitely. "Fertilisation of the biosphere affects the distribution of the fossil fuel carbon among these reservoirs [atmosphere, ocean, soil, biosphere], at least on the short run, but it does not alter the fact that the fossil carbon will remain in these reservoirs for millennia."

Since deleterious effects of warming are apparent even though man has only burned a small portion of total fossil fuel reserves, and only about half of the warming due to gases now in the air has appeared because of inertia in the climate system, it seems difficult to avoid passing the "guardrail" of no more than 2°C of warming agreed in the Copenhagen Accord, say the researchers. "What is clear is that most of the remaining fossil fuels must be left in the ground if we are to avoid dangerous human-made interference with climate," they write.