

January 2011 Archives

During the annual State of the Union address on January 25, 2011, US President Barack Obama spoke briefly about energy policy and a future energy transition. I will focus on a short excerpt of the speech here:


State of the Union: "We need to get behind this innovation. And to help pay for it, I'm asking Congress to eliminate the billions in taxpayer dollars we currently give to oil companies. (Applause.) I don't know if -- I don't know if you've noticed, but they're doing just fine on their own. (Laughter.) So instead of subsidizing yesterday's energy, let's invest in tomorrow's.

Now, clean energy breakthroughs will only translate into clean energy jobs if businesses know there will be a market for what they're selling. So tonight, I challenge you to join me in setting a new goal: By 2035, 80 percent of America's electricity will come from clean energy sources. (Applause.)

Some folks want wind and solar. Others want nuclear, clean coal and natural gas. To meet this goal, we will need them all -- and I urge Democrats and Republicans to work together to make it happen. (Applause.)"


First, the President called for the elimination of subsidies to oil companies. Some of these subsidies take the form of reduced royalties and depreciation rules that are not too dissimilar from those available to non-oil energy projects. The part of the excerpt I will focus on here is the President's challenge to generate 80% of US electricity from "clean energy" sources by 2035. The President then lists these clean energy sources, of which the only one not in commercial production is "clean coal", which we can assume refers to the capture and sequestration of carbon dioxide from coal-fired power plants.

[Image: 2009 US electricity generation by source, via Wikipedia]

If you add up the amount of electricity from the "clean energy sources" that President Obama lists, you see that in 2009 approximately 54% of electricity already came from these sources: 23% natural gas, 20% nuclear, 7% hydroelectric, and 4% renewables other than hydropower. So, in effect, the President was asking to capture and sequester carbon dioxide from approximately 60 percent of the 45% of electricity that comes from coal. In 2008 the percentages of 'clean energy' were as follows: 21% natural gas, 20% nuclear, 6% hydroelectric, and 3% renewables other than hydropower. US coal power accounted for 50% of total electricity.


Looking at the effect of the recession on electricity consumption, you can see that it was effective at decreasing the use of coal-fired electricity: coal power fell from 1,986 TWh (1 TWh is one billion kWh) in 2008 to 1,764 TWh in 2009. Total US electricity generation was 4,119 TWh and 3,953 TWh in 2008 and 2009, respectively. Increasing the percentage of alternatives can come from decreased, or conserved, total generation as well as from switching technologies. Of course, the decrease in electricity generation and consumption is broadly associated with little or no economic growth, something most people don't see as a solution.
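For readers who like to check the arithmetic, here is a minimal Python sketch using only the rounded percentages and TWh totals quoted above (no other data assumed):

# Back-of-the-envelope check using only the rounded figures quoted above.
clean_share_2009 = 0.23 + 0.20 + 0.07 + 0.04   # gas + nuclear + hydro + other renewables = 54%
coal_share_2009 = 0.45
target_clean_share = 0.80

# Extra clean generation needed to hit the target, as a fraction of current coal output
extra_needed = target_clean_share - clean_share_2009        # ~0.26 of total generation
fraction_of_coal = extra_needed / coal_share_2009           # ~0.58, i.e. roughly 60% of coal

# Effect of the recession on coal, 2008 -> 2009 (TWh figures quoted above)
coal_2008, coal_2009 = 1986.0, 1764.0
total_2008, total_2009 = 4119.0, 3953.0
coal_drop = (coal_2008 - coal_2009) / coal_2008             # ~11% fall in coal generation
total_drop = (total_2008 - total_2009) / total_2008         # ~4% fall in total generation

print(f"clean share in 2009: {clean_share_2009:.0%}")
print(f"share of coal generation needing CCS or replacement: {fraction_of_coal:.0%}")
print(f"coal fell {coal_drop:.1%}; total generation fell {total_drop:.1%}")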

Thus, the President's goal for clean energy (at least as discussed in the State of the Union) was relatively narrow in that it focused on electricity rather than on energy sources in general. He did mention biofuels and reducing imported oil but gave no new specifics in the speech. His statements on clean electricity can broadly be interpreted as promoting carbon capture and sequestration technologies. This is not a large departure from what was already happening through funding and research projects supported by the US Department of Energy. Targeting 2035 provides plenty of time: the technology can be developed over the next 10-15 years, with existing coal-fired power plants retrofitted, and new ones built, to capture carbon dioxide from 2020 to 2035.

So the President's speech was inspiring for clean energy advocates and coal purveyors. It was not so inspiring for oil companies, but it perhaps gives them direction to focus more on natural gas than oil in the US. This is not too hard given the growing shale gas plays that the large companies are already moving on (e.g. ExxonMobil's acquisition of XTO, a company with assets in shale natural gas). In all, the discussion of a move to 'clean energy' largely restated what is already being done.



For the moment the EU leads the offshore wind field, with nearly 3GW in place, 1.3GW of it off the UK coast. The world's largest offshore wind farm is currently the UK's 300MW array at Thanet, in the North Sea. But the 500MW Greater Gabbard project, 23km off Suffolk, is coming along, and the 1GW London Array in the outer Thames estuary will top that. Many more are planned off the UK and elsewhere; France, for example, is now looking for 30 sites for 3GW of offshore wind and wants 6GW by 2020. Overall, nearly 20GW has now been consented in EU waters. The longer-term EU potential is put at maybe 150GW, or much more given newly emerging deep-water technology, which allows location much further out in the North Sea and elsewhere, for example off the coasts of France, Spain and Portugal.

Although some projects are now emerging off New England, the US has been slow to develop offshore wind projects, in part since, unlike the UK and some other EU countries, it doesn't have shallow water off its (east) coast. But deep-water technology solves that problem and, as in the EU, opens up a very large resource.

Deepwater Wind, a company based in Providence, Rhode Island, has drawn up plans for what could be the largest wind farm in U.S. waters – a 1GW Deepwater Wind Energy Center with 200 turbines off New England, which would cost $4–5bn. It would use 5MW turbines mounted on four-legged platforms 18 to 27 miles off the Rhode Island coast at a depth of 52 meters: more than twice that of conventional steel 'monopole' wind turbine platforms.

As water depth increases, the diameter of monopoles must increase exponentially, making them uneconomical in water deeper than about 20 meters. By using a four-legged design, the company says it will be able to work in depths that were previously prohibitively expensive. And being far out to sea there should be fewer problems with objections over visual intrusion – a major issue so far in New England. More: www.dwwind.com

The basic deep-water technology, involving structures with multiple legs, was originally developed for oil and gas platforms, and a version has already been used for some deep-water offshore wind turbines in the EU, e.g. the Beatrice 2 x 5MW turbine project off Scotland. So-called 'tension leg' technology, typically with four semi-submersible piles tethered to the sea-bed, has also been used. The 2.3 MW Dutch Blue H turbine, tested off Italy, used this concept. And in Portugal, EDP is developing a 2 MW WindFloat with three legs, while in France Nénuphar, Technip and EDF are developing a novel Vertiwind device, with a vertical-axis wind turbine mounted on three legs. After work on a prototype, they hope to test a full-scale 2MW device at sea in 2013.

If you want to go even further out and into deeper water, then you need fully floating systems, like Norway's Sway and HyWind, which, it's claimed, can operate in depths of between 120 and 700 metres. See my earlier blog.

Sway is about to start testing a scale prototype of its proposed 5MW floating turbine off the coast of Norway, at a site near Hordaland. The Sway turbine can tilt by 5–8 degrees from the vertical.

Although a 2.3 MW version of HyWind has already been tested, 10km off Norway, in general fully floating device technology is relatively undeveloped. But hopefully not for long. In addition to the work mentioned above, 19 partners from 8 EU countries, under the direction of the Fraunhofer IWES, have entered the conception phase of HiPRwind, the largest publicly funded research project for the development of enabling technology for deep-water offshore wind, with €11m of the €20m five-year project contributed by the European Commission.

That's in parallel with the EU's £3m DeepWind vertical-axis Darrieus ('egg beater') type floating 10MW wind turbine project led by RISO in Denmark, Spain's 50MW floating wind project 30km offshore in depths of up to 100m, and the UK's novel V-shaped vertical-axis 10MW Aerogenator X, developed out of the ETI's Nova project.

China is also developing offshore wind technology. Its first offshore wind farm, a 102 MW array near Shanghai, is being followed by others. Construction of a 1GW offshore wind farm in Bohai Bay, around three hours from Beijing, is expected to be complete by 2020. The government has invested €1.6bn in the project, which is being managed by the state-owned China National Offshore Oil Corporation.

However, they are not currently looking far out to sea. As I reported in a previous Blog, they are looking at what they see as less risky and more commercially viable near-shore options: http://environmentalresearchweb.org/blog/2010/12/offshore-wind-costs.html.

In particular, with a potential 100 to 200GW available on extensive tidal flats, the emphasis in China in the short term is on intertidal projects. There should be at least 10GW installed by 2020 in Jiangsu province. A 30MW pilot project is under construction there and a 300MW project may follow, using 3.6MW turbines and a novel five-pile support structure to cope with the tidal flats' muddy seafloors and shifting sandbars.

www.technologyreview.com/energy/24978/?mod=related

It's hard to know if deep-sea wind will prosper. If they can be developed successfully, floating wind turbines can be towed out and positioned on station, avoiding some of the large deployment costs associated with drilling piles into the sea bed, and they can also be brought back to harbour for maintenance, avoiding the difficulty and cost of getting access at sea. But there are significant extra costs for longer undersea grid links. However, if and when a full North Sea supergrid network is established, some of these costs would be shared with other projects, and the system as a whole would earn more income by being able to transfer power between the UK and the other EU countries involved with the supergrid: see http://environmentalresearchweb.org/blog/2010/12/the-north-sea-supergrid.html.

The US economy is challenged by the Great Recession, induced by the financial and real-estate bubble, and it is struggling to escape the carbon lock-in created by physical and mental dependencies on cars and coal. This carbon lock-in is distinctively characterized by its temporal dynamics: changing infrastructure, technologies and behavior is costly and pays off only in the mid to long term. If nothing happens, however, US commuters will pay the bill, left passively exposed to volatile and rising oil prices.

Obama, in his State of the Union Address, related both challenges by suggesting moving subsidies from oil companies to clean energies – not without jokingly pointing out that the oil companies are doing fine on their own (see minutes 20–22 in this video). The redistribution addresses a concern of fiscal hawks by not spending new money. This move will also count twice for the big energy transition. First, obviously, clean energies are supported. Second, dirty technologies lose support. In fact, the latter move may be more important, and more appropriate, than the first one. Consider that, for reasons of economic efficiency, you want to provide a level playing field across environmentally friendly technologies. Or, more bluntly, you don't want to waste all your money on a semi-plausible, highly expensive technology. The problem is that you don't know in advance – at least, you cannot be sure. Is hydrogen part of the solution or a dead end? Will biofuels at some point contribute to decarbonization in a sustainable manner, or will they continue to exacerbate our environmental troubles?

Subsidies always have to pick technologies based on limited knowledge of future developments. However, if you know that some technologies are harmful, you can make them more expensive, e.g. by cutting wasteful subsidies. With such a move you can be sure to do something right – and to support the treasury.

Update: Andrew Revkin rightly points out that targeting only oil is quite a narrow focus. Indeed, the US corn industry (including biofuels) also lives on generous subsidies while producing corn ethanol with high life-cycle GHG emissions. Here is what Andrew says:

"Obama clearly picked up on bipartisan interest in eliminating distorting energy subsidies, but sadly targeted only oil subsidies in seeking the billions he wants for research and innovation.

A bias toward punishing the oil industry, leaving out the huge bonbons handed out to big coal and biofuels, is bound to stir up a fight rather than resolve one. That's one reason that some "green" subsidies would need to go, as well."

By James Dacey

At the end of November last year, the presidency of the UK's Royal Society passed from cosmologist Martin Rees into the hands of the Nobel-prize-winning geneticist Sir Paul Nurse. Heading the world’s oldest scientific academy brings a responsibility to uphold the organization’s grand aim “to expand the frontiers of knowledge by championing the development and use of science, mathematics, engineering and medicine for the benefit of humanity and the good of the planet.”

And Nurse, it seems, is wasting no time in grabbing his presidency by the reins. Last night he appeared on UK television presenting an episode of the long-standing documentary series Horizon, entitled “Science under attack”. The hour-long show explored the public’s relationship with science, as influenced by the media, and it focused primarily on climate science and the rise of public scepticism.

Towards the beginning of the show, Nurse cited a recent poll that found nearly half of people in the US, and more than a third of Britons, believe that manmade climate change is being exaggerated. “It’s this gap between scientists and the public that I want to understand,” proclaimed Nurse, teeing up the show.

For the next 50 minutes or so, Nurse then visited a selection of players on either side of the debate. It was framed within the narrative of a personal journey: an eminently reasonable scientist who knows lots about the process of science but not the specifics of climate science. And to his credit, Nurse played his part exceptionally well, showing that science involves personalities and conflicts just like any other human activity.

Naturally, the show came to focus on “Climategate”, the controversy that erupted in November 2009 when internal e-mails between members of the Climatic Research Unit (CRU) at the University of East Anglia, UK, were leaked to the public. The main controversy blew up around an e-mail sent by the then CRU director Phil Jones to a colleague in which he referred to “Mike’s Nature trick”, describing the splicing of temperature data from direct and indirect sources.

“The [World Meteorological Organization] wanted a relatively simple diagram for their particular audience,” Jones explained to Nurse. When asked why he thought there had been such a huge reaction to the leaks, Jones is obviously still perplexed. “A number of the climate change sceptics or doubters or deniers, whatever you want to call them, just wanted to use these e-mails for their own purposes, to cast doubt on the basic science.”

Following his visit to UEA, Nurse then paid a visit to a person firmly on the other side of the debate: James Delingpole, the online journalist who broke the “Climategate” story on his Telegraph blog. This led to the most captivating scene of the documentary, when Nurse puts it to Delingpole that denying climate change is like ignoring the consensus medical view when choosing how to treat cancer.

Asking Nurse to change the topic, Delingpole retorts, “I think it’s very easy to caricature the position of climate change sceptics as the sort of people who don’t look left and right when crossing the road,” adding that he “slightly resented” the way the analogy had been brought in.

UK viewers can watch the documentary at this link.

Source: www.physicsworld.com

In an earlier blog post ("The Algebra of Algae…to Biodiesel") I discussed how, if the US were to reduce its CO2 emissions to 17% of those in 2005 (mimicking the 'popular' climate legislation of 2009), the US could produce 50 billion gallons of biodiesel from an algae feedstock. Aside from later being told that titling the blog "Algaebra" would have been much better (which I agreed with at the time), I have now discovered that the web is littered with discussions of brassieres made of algae. I'm glad I used my previous title!

But I digress. The caveat in my previous blog on algae biodiesel was that, to meet the CO2 emissions limits, there could be no source of CO2 emissions other than the power plants that would be capturing CO2 and piping it to the algae farms. There is also the possibility of using CO2 directly from the atmosphere to grow algae, but most algae-facility designs assume a source of concentrated CO2 for growing the algae feedstock. Clearly we need to understand the limitations of using ambient air, and the CO2 inherent in it, versus supplemental CO2 from anthropogenic sources.

Over the last year a student (Colin Beal) at the University of Texas at Austin has been characterizing the experimental set-up at the Center for Electromechanics for testing an algae-to-bio-oil process. The process stops short of converting the bio-oil into biodiesel, and he presented the results at a recent conference: Beal, Colin M., Hebner, Robert E., Webber, Michael E., Ruoff, Rodney S., and Seibert, A. Frank, "The Energy Return on Investment for Algal Biocrude: Results for a Research Production Facility", Proceedings of the ASME 2010 International Mechanical Engineering Congress & Exposition (IMECE2010), November 12–18, 2010, Vancouver, British Columbia, Canada, IMECE2010-38244.

Colin counted the direct energy inputs (primarily electricity) and indirect energy inputs (nutrients, water, CO2, etc) to the process, along with the energy content of two outputs: the biomass of the algae itself and the bio-oil extracted from it. He did not count the energy embodied in any capital infrastructure. What he found for this experimental, very much batch, process was that its EROI was approximately 0.001.
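For readers unfamiliar with the metric, the energy return on investment is simply the ratio of energy delivered to energy invested. Below is a minimal Python sketch of how such a tally is assembled; the input and output values are hypothetical placeholders for illustration, not Colin's measured data.

# EROI = energy out / energy in. The numbers below are hypothetical placeholders,
# NOT the measurements from the UT Austin experiment - they only illustrate the bookkeeping.

direct_inputs_MJ = {"electricity": 1000.0}                        # pumps, mixing, centrifuging, drying
indirect_inputs_MJ = {"nutrients": 150.0, "CO2": 80.0, "water": 20.0}
outputs_MJ = {"algal biomass": 1.0, "bio-oil": 0.2}               # energy content of the two products

energy_in = sum(direct_inputs_MJ.values()) + sum(indirect_inputs_MJ.values())
energy_out = sum(outputs_MJ.values())

eroi = energy_out / energy_in
print(f"EROI = {eroi:.4f}")   # a value far below 1 means far more energy consumed than produced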

This experimental EROI value for energy from algae must be kept in the perspective of the stage of development of the whole technology and of the process of inventing new energy sources and pathways. It is important that we understand how to translate findings "from the lab" into real-world or industrial-scale processes. To anticipate the future EROI of an algae-to-biofuel process, Colin performed two further analyses of what might be possible if anticipated advances in technology and processing occur: a Reduced Case and a Literature Model calculation.

The Reduced Case presents speculated energy consumption values for the operation of a similar production pathway at commercial scale. Many energy inputs are simply not needed or would be much smaller in a continuous flow process. The Literature Model provides an estimate for the EROI of algal biocrude based on data reported in the literature. In this way the Reduced Case is grounded on one side by the sub-optimal experimental data and on the other by the Literature Model, which is largely composed of theoretical data (particularly for biomass and lipid production from optimal algae).

What Colin discovered was that the EROI values of the Reduced Case and the Literature Model were 0.13 and 0.57, respectively. This shows that we have much to learn about the potential for making viable liquid fuels this way. Additionally, Colin's calculations for the experimental set-up (and the Reduced Case analysis) show that 97% of the energy output resides in the biomass, not the bio-oil. For his idealized Literature Model, 82% of the energy output was in the biomass.
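One implication worth drawing out – my own back-of-the-envelope inference from the numbers above, not a calculation reported in the paper – is what that biomass-heavy split means if you credit only the bio-oil:

# If most of the output energy sits in the residual biomass, an EROI that counts both
# products overstates the return on the bio-oil alone. Illustrative arithmetic only.
for label, eroi_total, biomass_fraction in [
    ("Reduced Case", 0.13, 0.97),
    ("Literature Model", 0.57, 0.82),
]:
    eroi_oil_only = eroi_total * (1.0 - biomass_fraction)
    print(f"{label}: total EROI {eroi_total}, bio-oil-only EROI ~{eroi_oil_only:.3f}")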

While these results seem discouraging, we have little ability to put them into the context of the rate of development of other alternative technologies and biofuels. How long did it take to get photovoltaic panels with EROI > 1 from the first working prototype in a lab? We have some idea that it took a decade or two for the Brazilians to get a reasonable EROI > 1 from using sugar cane for biomass and biofuel production (Brazilian sugar cane grown and processed in Sao Paulo is estimated at an EROI near 8).

I believe we need to strive to quantify EROI for new technologies even when they are still at the laboratory stage. Perhaps some very early technologies and processes are too early for estimating or measuring EROI, but algae biofuels are clearly in the mainstream of research given the $500m investment by ExxonMobil in genomics firms searching for the ideal strains of algae. These ideal strains might simply excrete hydrogen, ethanol or lipids, such that much of the capital infrastructure and direct energy requirements assumed for collecting algae and extracting the lipids, even in Colin's Literature Model, could become largely unnecessary. Let's hope others join in trying to assess the EROI of their experimental and anticipated commercial processes for alternative energy technologies.

District heating networks – using gas, waste heat from power stations or heat from biomass combustion to heat houses and other buildings collectively – are common across much of continental Europe, especially in the north. There are also some large solar-fed heat grids and many heat stores. There are even some inter-seasonal heat stores, which help to deal with variable supplies over the year and variable demand for heat, e.g. during winter evenings. See my earlier blog.

More district heating projects are proposed. For example, 'Heat Plan Denmark', a study financed by the Danish District Heating Association, argues that district heating is the key technology for implementing a CO2-neutral Danish heating sector in a cost-effective way. It claims that the Danish heating sector can be CO2 neutral by 2030 by upgrading and expanding the existing system, with, for example, heat pumps being used to upgrade the heat energy currently supplied and more heat stores being added. At present much of the system still uses gas as the main energy input, but the study looks to the use of more renewables and more efficient waste-to-energy combined heat and power (CHP) plants with flue-gas condensation. So the emphasis will shift increasingly to large-scale solar heating, biomass/biogas CHP, geothermal energy and excess wind energy – and more heat storage.

Overall, they see district heating moving up from 46% to 70% of the market, and suggest that the remaining heat demand can be covered by domestic-scale heat pumps and wood-pellet boilers in combination with individual solar heating. However, they claim that district heating combined with CHP plants and larger-scale renewable energy is more cost-effective than domestic-scale solutions based on greater investment in the building envelope and/or in individual renewable energy systems. More at www.danskfjernvarme.dk.

A similar conclusion emerged from a study of district heating in Copenhagen, which has the world's largest heat network, currently fed mostly by 10 CHP plants with a total of 2 GW of heat capacity. About 45% of the fuel is from renewable sources (biomass/wastes), and that proportion is planned to expand. In addition to using geothermal heat, they are testing a demonstration solar plant to deliver solar heat to the district heating system, with a heat pump being used to raise the temperature of the water from the solar panels, or a linked heat storage tank, before the heat is delivered to the district heating network.

By contrast, we have a long way to go in the UK. Heat accounts for about 44% of UK energy consumption, mostly for heating homes and providing hot water using individual domestic boilers – 84% of UK homes are heated by gas. This may change as and when the Renewable Heat Incentive (RHI) and the Zero Carbon Homes programmes kick in and domestic-scale solar, biomass, micro-CHP and so on are taken up. But what about the larger scale, and all of the waste heat from power stations?

The UK's total demand for heat is about 800 TWh p.a., about the same as the amount released as waste by all power generation and industrial processes. So far, with gas relatively cheap, combined heat and power, which can reclaim much of this waste, has not taken off very significantly in the UK, a few biomass-fired plants aside. Neither has district heating or heat storage. But with gas prices rising and concerns about emissions growing, heat reclamation and storage ideas are now being explored, borrowing from what's happening elsewhere in the EU. For example, the Energy Technologies Institute is looking at waste-heat collection and storage on a large scale. As they note: 'It is technically possible to store very large quantities of heat energy below ground in geological structures such as saline aquifers or disused mines. The heat could even be accumulated through the summer to be used during the winter. Many of the potential heat sources and storage areas are close to centres of population and could be used to support large-scale district heating schemes.'

And the recent DECC Microgeneration Strategy consultation also looks at energy storage, and in particular at Underground Thermal Energy Storage (UTES). It includes a mini case-study of a UTES installation in Sweden, which paid back the additional installation cost (compared with an oil-fired system) in under four years and continues to save money, energy and carbon year on year. It reported that, as well as tapping heat from power stations, this approach "can be particularly effective to create energy clusters where excess heat from buildings with a net cooling load can be utilised as a source of heat for others nearby with a net heating load to save carbon and reduce energy consumption".

However, DECC's focus seems to be on the smaller scale. It added: "Work is on-going between Sweden and the UK to use similar underground thermal energy storage techniques on a domestic scale." It commented that in the UK: "In the domestic sector there is scope to store hot water generated by renewable energy through the wider deployment of hot water cylinders." But, it said: "74% of the circa 1.5 million boilers fitted annually are 'combination' boilers, so the opportunity to future-proof homes for renewable-heating technologies, through the provision of hot water cylinders, is limited." And it noted that: "There are no plans to set mandatory requirements for the provision of hot water cylinders – there is a trade-off between the benefits of large water volumes needed to bank/smooth renewable-energy supplies and the higher standing losses from large volumes at elevated temperatures. There are also design issues to consider. Larger cylinders weigh more and take up more space and there is a greater risk of stratification. There may also be an increased threat of legionella from water storage facilities, unless the appropriate elimination steps are taken."

So, as seems to be the norm, we are taking it very slowly and cautiously, and focusing mainly (the ETI project apart) on the domestic scale. Ideas like large-scale solar district heating are still evidently heretical. Someone had better tell the citizens of Graz, a city in southeast Austria, which has a district heating network with 6.5 MW of solar thermal input.

[Image: Cyclists, via Wikipedia]

Transport research is often fixated on the automobile. Car transportation provides user benefits, costs money, and produces social costs, making it an attractive object for transportation economists. At the same time, pedestrians and cyclists receive less attention – even though they constitute truly environmental (and often enjoyable) modes that matter for many travellers. This gap in thorough research may be caused by the relative insignificance of these modes in purely economic terms.

Eva Heinen, Kees Maat and Bert van Wee from Delft University of Technology have now published an article that not only focuses on cycling but also tries to establish a framework that goes beyond pure socio-economic utility considerations, analyzing the role of attitudes and norms in cycling decisions. The study asked Dutch cyclists about their attitudes to cycling, covering both utility and normative aspects, such as status and environmental concerns.

The study concludes: "[...] individuals base their mode choice decision on the direct benefits in terms of time, comfort and flexibility. Individuals who commute over longer distances have, on average, a more positive attitude towards cycling than those who cycle shorter distances. [This] support[s] the idea that individuals have a more positive attitude as the bicycle commute lengthens."

Safety plays a significant but minor role. However, the authors suggest that this is because cycling in the Netherlands is relatively safe, and that safety is a much more important factor in other countries. Indeed, in Beijing, for example, repeated comments from residents indicate that they would prefer to cycle but are scared off by dangerous car traffic and shift to public transit or cars.

This study, and others, need to be complemented by both quantitative and conceptual research to strengthen the evidence base on modes that may be less important in terms of monetized utility but provide wider social benefits in terms of physical health, environmental quality and quality of life.


Reference

Eva Heinen, Kees Maat and Bert van Wee. "The role of attitudes toward characteristics of bicycle commuting on the choice to cycle to work over various distances". Transportation Research Part D: Transport and Environment, Volume 16, Issue 2, March 2011, Pages 102-109


The coalition government has talked a lot about decentralisation and supporting small-scale local projects. The preface to DECC's Microgeneration Strategy consultation says: 'There will be a role for small-scale electricity producers in homes, schools, offices and factories around the country to complement the substantial new investments needed in large-scale Carbon Capture and Storage, nuclear and renewable electricity such as offshore wind; a new supply of locally-produced power that spreads the risk and can help make us all more self-reliant. And there will be a step-change in the use of renewable micro-technologies such as heat pumps, as we tackle the single biggest cause of greenhouse gas emissions, the heating of our homes.'

The deployment of micro-generation options like micro-wind and solar PV has been left mainly to the small existing 'Clean Energy Cashback' Feed-In Tariff scheme and, for housing, to the Zero Carbon Hub, an agency the government has commissioned to deal with the issue.

In theory, micro-gen will be picked up as and when building and construction companies get to grips with the government's requirement that all new-build houses must be 'zero net carbon' by 2016.

However, there is still some uncertainty as to what that actually means. Under pressure for clarification, the Labour administration had indicated that, although the aim was for houses to generate all their own power, some power could still be imported, and the upshot seemed to be that new-build houses were only expected to cut emissions by 70%. But now an even more flexible approach seems to have emerged, with only 56% being required. Evidently, 70% was seen as 'particularly challenging and may not be achievable in all cases', according to the Zero Carbon Hub. What that means in terms of how much of their net energy houses must get from on-site renewables remains a little unclear. The building regulations are also being 'rationalized', so it's all still rather muddy. See: www.guardian.co.uk/environment/georgemonbiot/2010/nov/26/zero-carbon-homes.

The same goes for the proposed Renewable Heat Incentive, now delayed until June. But, when it comes into force, it could stimulate solar heating, and the use of biomass and biogas in micro CHP units, plus heat pumps and even fuel cells, at the domestic level.

Although it's mainly aimed at remedial domestic energy-efficiency projects, some micro-gen options may also eventually get some support from the Green Deal, which aims 'to radically overhaul the energy efficiency of homes and small businesses' by making energy efficiency affordable for all, whether people own or rent their property. The details emerged in the Energy Bill in December. The scheme is expected to start in late 2012.

The Green Deal is basically a short-term credit system – money is loaned by participating commercial organisations for upgrades, and paid back over time through a charge on the property's energy bills. As DECC explains: 'The upfront finance will be attached to the building's energy meter. People can pay back over time with the repayments less than the savings on bills, meaning many benefit from day one. It will help save carbon, energy and money off fuel bills.'

DECC added that there are an estimated 14 million insulation measures – loft, cavity and solid wall – still to be carried out, and that the most energy-inefficient homes could save, on average, around £550 p.a. by installing insulation measures under the deal. When the occupier moves on, not only will a more efficient property be passed on to the next occupier, but the repayment charge will pass on with it.

According to Chris Huhne, the scheme could support 250,000 jobs over the next 20 years, from the projected £7 bn of Green Deal private-sector investment per year, assuming all 26 million households take up the Green Deal.
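As a rough per-household sanity check on those headline numbers – my own arithmetic, not DECC's:

# Rough check of the Green Deal headline figures quoted above (my arithmetic, not DECC's).
annual_investment_gbp = 7e9      # projected private-sector Green Deal investment per year
households = 26e6                # assumes all 26 million UK households take up the deal
quoted_saving_gbp = 550          # quoted average saving for the least efficient homes

spend_per_household = annual_investment_gbp / households        # roughly £270 per household per year
print(f"implied spend per household: about £{spend_per_household:.0f} per year")
print(f"quoted saving for the least efficient homes: £{quoted_saving_gbp} per year")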

All the above schemes focus mainly on domestic-scale projects and upgrades. Larger community-scale energy supply projects are the focus of other schemes like the Community Sustainable Energy Programme (CSEP) – an £8m open grants programme run by the Building Research Establishment as an award partner of the BIG Lottery Fund. The Feed-In Tariff also provides support for local community projects of up to 5 MW. And DECC has a 'Community Energy Online' information service.

However, overall, the government does not seem to see community-scale projects as very significant. The new revised National Policy Statement on Energy says: 'The government does not believe that decentralized and community energy systems are likely to lead to significant replacement of larger-scale infrastructure.'

This despite the fact that an earlier report by the Energy Saving Trust ('Power in Numbers') suggested that there were significant economies of scale: 'Little to no benefit is observed in progressing from individual action, i.e. single household, to the five household level, even after the implementation of additional policy support. It is only when action occurs at scales above 50 households, and ideally at or above the 500 household level, that significant carbon savings become available.'

Community-scale projects could, it says, economically meet 4.3% of total UK energy demand if householders were to act collectively. That's 13% of total annual UK household energy demand.
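Those two percentages hang together only if household demand is roughly a third of total UK energy demand, as a quick check shows:

# Quick consistency check on the two percentages quoted above.
share_of_total = 0.043       # community-scale potential as a share of total UK energy demand
share_of_household = 0.13    # the same potential expressed as a share of household energy demand

implied_household_fraction = share_of_total / share_of_household
print(f"implied household share of total UK demand: {implied_household_fraction:.0%}")   # ~33%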

Community projects might get some support from the changes proposed in the government's new Localism Bill, which gives councils, communities and individuals a greater say in local planning decisions. But equally it might lead to opposition to the adoption of new technologies, large and small – some have seen it as a NIMBY's charter. The bill introduces new rules allowing for local referendums, where local people, councillors and councils can instigate a vote on any local issue, including planning proposals, and new powers to allow communities to give planning approval to chosen sites on local land.

It makes sense to use local energy sources to meet local energy needs wherever possible, and there is a lot of enthusiasm for a more decentralised, community-based approach, as well as for domestic-scale projects. The latter may have its limits, but done properly it can make a useful contribution, not least in alerting householders to their energy use and the associated lifestyle and eco-issues. Community-scale projects, by contrast, should be able to make a much more significant contribution on the supply side, if given proper support. So far, though, that support seems rather patchy. However, Housing Minister Grant Shapps announced recently that the government would look into developing a community energy fund to help ensure that new homes built after 2016 are carbon neutral.

Neil Crumpton has outlined an ambitious bio-energy future, based on solar driven bio (algae) oil production, and the use of Carbon Capture and Storage (CCS) to achieve carbon-negative energy generation.

While he sees Concentrated Solar Power and wind power as major global energy solutions, he believes that combining desert-based algae production with CCS could become a global scale carbon-negative climate solution.

He sees Seawater Greenhouses or similar desert-based low-tech structures as potentially ideal for algae production in photo bio-reactors.

The 'Seawater Greenhouse' is a solar-driven technology which uses adapted greenhouses (low-cost polytunnels) to desalinate seawater in arid regions to provide suitable growing conditions for food – or energy crops like algae.

Seawater is evaporated by drawing the hot desert air through a wetted cardboard wall in one side of the greenhouse. The cooled, humid air passes over the crops, and fresh water is then condensed out to irrigate them. The humid air expelled from the greenhouse can also be used to improve the growing conditions for nearby outdoor plants: www.seawatergreenhouse.com. Sunny locations near the sea would obviously help, but the cost of piping seawater long distances inland may not be significant.

The 'Sahara Forest project' is a supercharged variant of this concept, which would link huge greenhouses, potentially for growing algae, with concentrated solar power (CSP), which uses mirrors to focus the Sun's rays and generate heat and electricity. The combination of these desert technologies would provide more energy for evaporation, pumping and algae production – and desalinated water for mirror cleaning, CSP cooling and algae production.

Further potential synergies could lead to higher bio-oil yields, says Crumpton. The thermo-chemical liquefaction and trans-esterification of the algae 'soup' to produce biodiesel could be achieved by heating some algae types to 300 °C under pressure for 30 minutes – just the job for CSP technology: www.futurity.org/earth-environment/pressure-cook-algae-to-make-better-biofuel/. Also, reject CSP heat could be used to power direct CO2 air-capture devices, and the CO2 could be bubbled through the algae soup to enhance production.

Crumpton says that by 2040 bio-oil from such desert-based bio-energy systems, if proven, could be shipped to gas turbine or fuel cell/CCS gasification schemes in countries with more variable renewable energy resources (e.g. Europe), to provide reliable and carbon-negative daily grid back-up and strategic energy reserves.

By 2040, bio-oil-importing countries could have extensive CCS infrastructure, deployed initially to abate gas and coal power stations and industry in the 2020s. A recent study of North Sea CCS deployment and storage potential estimated that there might be injection capability of about 450 Mt of CO2 per year by 2050. Crumpton estimates that this could equate to a carbon-negative potential of up to 2 tonnes per UK resident per year. He sees coal and gas CO2 sequestration as paving the way for, and potentially being replaced by, CO2 sequestration from bio-energy gasification, including imported algae.
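A quick order-of-magnitude check on those two figures – my own arithmetic, assuming a UK population of roughly 62 million, which is not a number given in the text:

# Order-of-magnitude check (UK population of ~62 million is my assumption, not from the text).
injection_capacity_t_per_yr = 450e6    # estimated North Sea CO2 injection capability by 2050, t/yr
uk_population = 62e6
per_person_t_per_yr = 2.0              # Crumpton's carbon-negative estimate, t CO2 per resident per year

uk_total_t_per_yr = uk_population * per_person_t_per_yr              # ~124 Mt CO2/yr
share_of_capacity = uk_total_t_per_yr / injection_capacity_t_per_yr  # ~28% of the estimated capability
print(f"2 t/person/yr for the UK is ~{uk_total_t_per_yr/1e6:.0f} Mt CO2/yr,")
print(f"about {share_of_capacity:.0%} of the estimated 450 Mt/yr North Sea injection capability")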

It sounds pretty ambitious, but many of the components are in place or under development. CSP technology is moving ahead rapidly. New CSP technology, such as the 'Mulk' curved aluminium sheet mirror system, may achieve significantly lower costs compared to conventional glass troughs, by reductions in system weight and other design and construction benefits: www.mulkre.com.

Several experimental Seawater Greenhouse projects have been tried and tested. Soon the world's first commercial Seawater greenhouse will be completed in Australia: www.seawatergreenhouse.com/australia.html.

Algae production is also picking up: for example, the Argentine company Oilfox has opened the country's first plant to make biodiesel from algae, which it claims can be grown using seawater. There have been reports that a Texan company, PetroSun, has developed an algae-to-biofuels facility using a series of saltwater ponds spanning 1,100 acres.

CCS in geological strata remains unproven at large scale, and is sometimes seen as undesirable if it simply facilitates unchecked fossil fuel use, but the carbon negative bio-oil application proposed by Crumpton might give CCS a new renewable direction. Trials are also underway to enhance algae production by bubbling captured carbon dioxide through the algae 'soup'. The Carbon Trust has launched a major algae R&D project funded by DECC.

Using deserts, algae and seawater would certainly avoid most of the land-use and biodiversity conflicts that have bedevilled biofuels so far, although there could still be conflicts with food growing that might otherwise be done in the seawater greenhouses.

Neil Crumpton has worked for Friends of the Earth and more recently the Bellona Foundation, which is a partner in the Sahara Forest project. He is currently a consultant to B9 Coal, which is a fuel cell/gasification CCS power-station project development company.

You can find some surprising things at the bed of the glacier. Normally it is inaccessible to direct observation, but these days most glaciers are retreating. If you don't mind waiting a bit — and glacial geomorphologists don't really have the option — then keeping a close eye on what is emerging can be very informative.

In a paper published recently in Geology, Mark Johnson and co-authors present another surprise: nice fresh drumlins. Múlajökull is an outlet glacier, draining one of the ice caps in Iceland. Like almost every other glacier, it has been retreating. Like only a small proportion of other glaciers, it is a surging glacier — which is going to set the cat among the pigeons when we have had time to think it over and decide whether the surging is relevant. For the retreat of Múlajökull has exposed a field of drumlins.

Johnson and his co-authors were able to show that the drumlins consist of multiple layers of till, sediment carried by the glacier and deposited by a mixture of lodgement — expulsion from the moving ice — and deformation of the sediment over which the ice was flowing. The evidence suggests that each of the till layers represents a surge of the glacier. What is more, at least one of the boundaries between till layers is an erosion surface. That is, the lower layer has been truncated before the upper layer was draped over it.

This is yet another confirmation that the old question about drumlins, "Are they formed by erosion or by deposition?", was the wrong question to ask. The answer is "Sometimes one and sometimes the other, and often (as at Múlajökull) a bit of both, with some deformation of what was there already mixed in".

The resemblance of drumlin fields to baskets of eggs has been remarked on before. Lowland Britain is covered with them — tens of thousands of eggs. What is most interesting about the Múlajökull drumlins is that they are new-laid eggs, and the hens are still busy in the coop.

Nobody believes that the drumlins we see today in places like Great Britain and central North America have changed much since the retreating ice margins exposed them to view thousands of years ago. All the same, drumlins that are henhouse-fresh exert a powerful pull on the geomorphological and geological imagination. This is because of actualism, the ingrained principle that the present is the key to understanding the past. The likelihood is that there are lots more drumlins still forming behind the present-day retreating margin of Múlajökull, and as the authors point out we know as yet of no other drumlins that are in process of formation.

One thing that bothers me about the Múlajökull drumlins is that I have trouble seeing the multiple till layers in the photograph that is supposed to illustrate them. But among the reasons why I am not a sedimentologist is that dirt is not very photogenic, and I am prepared to go along with the authors' interpretation of what they saw in the field. Let us take it that these drumlins are indeed layered, and let us go one step further and accept their evidence that the layers have probably formed during the successive surges of the glacier. (They come along every 15 to 20 years, short-lived advances of a couple of hundred metres, punctuating a retreat that has been going on for about 200 years.)

Does this mean that there is something special about drumlins that are shaped by surging glaciers? Surging glaciers are sufficiently uncommon, and drumlins sufficiently widespread, that it is not likely that surging behaviour is a necessity for drumlinization. It is, however, interesting, and maybe significant, that the deposition probably accompanies the surges and not the longer intervals of retreat, during which there was either erosion or at least non-deposition.

Is there, instead, significance in one or both of two observations made in the Johnson paper: that the drumlins appear to have formed very close to the ice margin, within a kilometre; and that they appear to have formed beneath crevasses that run parallel to the flow direction of the ice? The authors offer only a sketch of an argument for why these associations might be a source of insight. But drumlins have been a puzzle for more than a hundred years. More facts can only help, even if all they do is to make us confused in a deeper and richer way — but especially if they are new-laid facts.


Offshore renewables have been doing well – notably offshore wind (see my blogs earlier this year), now nearing 1.5GW off the UK coast, but also tidal current turbine systems. MCT's 1.2MW SeaGen has been earning ROCs in Strangford Narrows, Northern Ireland. And dozens of other tidal current devices have been under test, some at full scale (1MW and above), including Neptune's ducted vertical-axis Proteus rotor system (in the Humber) and Open Hydro's Open Centre turbine (in the Bay of Fundy). Meanwhile, Pulse Tidal is looking to install a 1.2MW version of its novel twin-hydrofoil device off Skye in Scotland, and Hammerfest Strom UK has developed a 1MW version of its turbine, which is to be deployed in Scotland. The largest unit developed so far is Hydra Tidal's 1.5MW multi-rotor tidal device, being tested off the Norwegian coast.

Wave energy had a somewhat less good year. Although the 600kW in-shore Oyster wave-flap device has proved very successful, the leading wave device, the Pelamis segmented wave-snake – three units of which had been installed off Portugal in a 2.25MW wave farm – had technical and financial problems. In addition, the giant 2.5MW Oceanlinx oscillating water column prototype fell foul of heavy weather off the Australian coast at Port Kembla and was wrecked – much like the UK's 2MW Osprey, with which it shared some features, back in the 1990s.

Waves are clearly harder to tame than tidal flows, with even novel designs sometimes not doing so well – Trident's 80 tonne, 20kW linear-motor prototype sank off East Anglia back in September 2009. And even what you would think was a robust design, Finavera's prototype buoy system, sank off the Oregon coast just before its six-week test period ended.

However, lessons are being learnt, with wave developers pressing on. The Mk 2 750kW Pelamis P2 is being deployed at EMEC in Scotland with backing from E.ON, along with Oyster 2, an enlarged 2.5MW version of the Oyster wave flap, and the Wave Hub undersea power socket off North Cornwall is now open for business. Wavegen is also moving ahead with its 4MW in-shore Siadar project off the Western Isles. And, as with tidal current turbines, new ideas are emerging all the time, like the clever Wave Treader device, which is attached to the bottom of the tower of an offshore wind turbine and shares its grid link to shore. Elsewhere in the world there's Portugal's WEGA 'gravitational wave energy absorber' and, in the USA, Atmocean's WEST Wave Energy/Sequestration Technology, using buoys, and Florida Tech's 'wing wave' system, using a sea-bed-mounted flap.

We need all we can get of both wave and tidal of course, so there is no direct competition. Earlier this year The Crown Estate allocated Scottish sites for over 1.2 GW of wave and tidal current projects on an equal basis, and the UK government's new £22m Marine Renewables Proving Fund supports each type equally, as does Scotland's new £12m funding. Nevertheless, PIRC's Offshore Valuation claims that, contrary to earlier assessments, the practical tidal current resource (now put at 33GW) is actually larger than the wave resource (18GW), so maybe the former will dominate in the UK: www.offshorevaluation.org/

Elsewhere it may be different: wave projects are moving ahead around the world – OPT is doing well, and there is talk of a 10GW wave programme in China. But South Korea seems to be focusing on tidal, as does Canada, e.g. in the Bay of Fundy. Basically it comes down to the physical resource, and the tidal regime in both these places, as in the UK, is large. Canada and the USA are both supporting tidal projects on the north-east coast.

In Korea, although several tidal current projects are under way, the emphasis is on tidal range projects. Tidal barrages and lagoons also figured in some UK plans, notably for the Severn estuary, e.g. the 8.6GW Severn Tidal Barrage, despite it being opposed by all UK environmental groups. Smaller, less invasive barrage concepts have also re-emerged for the Mersey and the Solway Firth, although they too have attracted opposition. Moreover, the PIRC study put the UK tidal range resource at only 14GW – not insignificant, but smaller than wave and certainly less than tidal currents. Tidal range was also seen as the most costly option.

Looking at all the options long term, the Department of Energy and Climate Change produced a 2050 Pathways report which had wave and tidal stream capacity running neck and neck in its maximum 'Level 4' programme, delivering 70 TWh and 69 TWh p.a. by 2050 respectively, with 58GW installed in all. However, DECC made clear that it felt this was very optimistic – the most that could realistically be conceived. Its lower Level 3 programme had only 29GW of wave and tidal stream (68 TWh), and Level 2 only 11.5GW (25 TWh). For comparison, tidal range was put at 1.7GW (3.4 TWh) at Level 2, 13GW (6 TWh) at Level 3 and 20GW (40 TWh) at Level 4, by 2050 – all much lower than wave or tidal stream.

Perhaps then it wasn't surprising that when the government finally published its long-awaited review of tidal projects for the Severn estuary, tidal range was seen as unattractive, and not to be followed up at this point: http://environmentalresearchweb.org/blog/2010/11/barrage-sinks.html. With that out of the way, we might expect more progress on tidal current projects, although we can also expect problems and technical glitches. For example, Open Hydro's 1 MW test project in Nova Scotia, installed in 2008, has had to be extracted a year ahead of schedule after blade failure. And just before it was due to be tested at EMEC, faults were found in the blades of the 1MW Atlantis double-rotor tidal turbine system. But they have evidently been replaced, and space has now been allocated for a 400MW project in the Inner Sound area of the Pentland Firth, bringing the total planned wave and tidal deployment there to 1.6 GW by 2020.

There are many other new ideas emerging. One of the most intriguing is the tidal kite being developed by the Swedish company Minesto. It's an aerofoil wing, with a rotor and generator mounted on it, which is tethered to the sea-bed but free to move in the tidal flow. It doesn't just stay in one place, however, but moves rapidly in a figure-of-eight glide pattern under the influence of the tether, a rudder and the lift forces created by the tidal flow. That means the rotor turns faster than if it were simply held in the tidal flow – in fact, it's claimed, up to 10 times faster. Given that, unlike other tidal devices, it doesn't need expensive foundations or towers, it ought to be cheaper and less invasive, and there should be many locations where it could extract power from relatively low tidal flows – thus, in effect, expanding the potential tidal resource. A prototype is to be tested off Northern Ireland: www.minesto.com
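The reason the kite's speed matters so much is that the power available to a rotor scales with the cube of the flow speed it sees. A rough sketch of the implied gain is below; it is idealized, ignoring drag and tether losses and the fact that the kite is not at peak relative speed around the whole figure-of-eight, and the 1.5 m/s ambient flow is an assumed illustrative value.

# Power available to a rotor of swept area A in a flow of speed v:  P = 0.5 * rho * A * v**3
# Idealized comparison only - real gains are much smaller once losses are accounted for.

rho = 1025.0           # sea-water density, kg/m^3
area = 1.0             # per square metre of rotor swept area
ambient_flow = 1.5     # assumed modest tidal current, m/s (illustrative)
speed_multiple = 10    # claimed relative speed of the kite-mounted rotor through the water

p_fixed = 0.5 * rho * area * ambient_flow ** 3
p_kite = 0.5 * rho * area * (speed_multiple * ambient_flow) ** 3

print(f"fixed rotor: {p_fixed/1e3:.2f} kW per m^2 of swept area")
print(f"kite rotor at {speed_multiple}x the flow speed: {p_kite/1e3:.0f} kW per m^2 (x{p_kite/p_fixed:.0f})")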

Perhaps even more exotic, one of the new concepts supported under the government's Severn Embryonic Technologies Scheme, and then backed as a long-term possibility in the DECC review of Severn tidal options, is the Spectral Marine Energy Converter (SMEC), developed by VerdErg. It's based on the venturi effect: vanes mounted in the tidal flow create a low-pressure region, which can be used to drive a higher flow rate in a secondary flow that turns a turbine. This concept could be used in cross-estuary tidal fences and may offer a way to extract energy from tidal flows without major environmental impacts. So a Severn barrage of sorts, or a project elsewhere (e.g. on the Solway Firth), may still be on the cards, although the idea can also be used at smaller scale, e.g. in shallow river locations: www.verderg.com.

The above is based in part on the end of year Annual Supplement to Renew: www.natta-renew.org