December 2011 Archives
2011 had some terrible examples of what some might see as media bias and distortion in the energy field. We had a Panorama TV programme seeming to join in the media narrative (Times, Telegraph, Mail etc.) that much of the increase in power bills faced by consumers (put by some at £200 p.a. per household) was due to support for wind power - whereas OFGEM said that support cost only around £10 p.a., less than 1% of the average household fuel bill. For more along these lines see: www.carbonbrief.org/blog/2011/09/gwpf-mail-pcc
Similarly we were treated to scare stories about 'green energy projects blotting out the countryside'. The Sunday Times and others tried to recycle the numbers from Prof. David MacKay's 'Sustainable Energy - without the hot air' to prove that if we covered 10% of the UK's land area with wind turbines we would only get a sixth of the energy we need, and that 'for solar power to dominate' we'd need one third of the land. But they seem to have got the numbers wrong. PowerUP calculated that wind on 10% of the UK would deliver a third of total energy (not just electricity), and 10% of land covered in PV would deliver over 100% of it. So with a mix, and remembering that much of the wind would be offshore and most of the PV on rooftops, the land-take is much smaller. http://energyrace.wordpress.com/
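The disagreement boils down to a few multiplications, so it is easy to check for yourself. Here is a back-of-envelope sketch; the power densities (~2 W/m² for wind farms, ~20 W/m² for solar farms) are MacKay's own ballpark figures, while the UK land area and average energy demand are round-number assumptions of mine, and the answer is very sensitive to whether 'the energy we need' means primary or final energy:

```python
# Back-of-envelope check of the land-area claims. All inputs are rough
# assumptions for illustration, not figures taken from the articles.
UK_LAND_M2 = 2.44e11        # UK land area, ~244,000 km^2
WIND_W_PER_M2 = 2.0         # MacKay's ballpark power density for wind farms
PV_W_PER_M2 = 20.0          # MacKay's ballpark power density for solar farms
DEMAND_W = 3.1e11           # ~310 GW average: ~125 kWh/day/person x ~60m people

def share_of_demand(fraction_of_land, w_per_m2):
    """Fraction of total energy demand met by covering that share of land."""
    return fraction_of_land * UK_LAND_M2 * w_per_m2 / DEMAND_W

print(f"wind on 10% of land: {share_of_demand(0.10, WIND_W_PER_M2):.0%}")
print(f"PV on 10% of land:   {share_of_demand(0.10, PV_W_PER_M2):.0%}")
```

With these inputs, wind on 10% of the land covers about a sixth of demand (the Sunday Times figure), so PowerUP's 'third' presumably rests on a lower demand figure (final rather than primary energy) or a higher assumed power density. Either way, PV on 10% of the land comfortably clears 100%.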
So sometimes effective rebuttals do emerge. Here is another. While the Times talked of the countryside 'disappearing beneath solar panels', the Guardian ran a nice piece which said that protests against what had been described by one objector as 'blanket desecration of the countryside' by solar farms were undermined by their tiny footprint. It said 200 acres of solar farms now exist, which is about 0.0003% of the UK's area. There were 500 crazy golf courses in England alone. Assuming they covered roughly an acre each, they would be over twice the area covered by solar farms!
Sadly, though, our understanding of the significance of the Fukushima nuclear accident in Japan was often not helped by the media, with the BBC's Horizon documentary and its popular-science show 'Bang Goes the Theory' both presenting essentially pro-nuclear views on nuclear risks (see my earlier blog).
New Scientist also seemed happy to quote the view from Prof. Gerry Thomas at Imperial College that 'not an awful lot got out of the plant - it was not Chernobyl.' By contrast, Nature reported on claims from the Norwegian Institute for Air Research that the Fukushima accident released more total radioactive material than did Chernobyl, although some was in the form of xenon, which is not harmful. Even so it claimed that caesium emissions, which are longer lived and dangerous, were in total about half that from Chernobyl. www.nature.com/news/2011/111025/full/478435a.html
It is understandable that the media wants to simplify issues, but it's more usual for it to delight in controversy. So it is perhaps odd that over the last year we have had a relentless barrage of pro-nuclear and anti-wind articles and programmes, rather than what you might think would have been (if nothing else) a more entertaining exploration of conflicting views. But even the BBC seems to want to avoid this. In relation to a formal complaint by Nuclear Consult about the alleged lack of balance in the 'Bang Goes the Theory' programme, the BBC commented 'the divergent views and debate relating to nuclear incidents at Chernobyl and Fukushima relate primarily to long term and indirect health and environmental impacts. In making the decision to present none of these divergent views within this programme, yet consider them all in drawing our own conclusions, we sought to avoid making unfair representation of any one view. Rather we presented only statistics which have been officially reported with firmly substantiated evidence'.
It is perhaps not surprising then that those wanting a fuller picture look to the internet, where of course you get just about every view possible expressed, with varying degrees of reliability and bias. And of course, biases, as well as wishful thinking, can be just as apparent in anti-nuclear and pro-renewables coverage. We all like to stay within our comfort zones! Perhaps it is hopeless to assume that 'the truth is out there', somewhere, as some do, but a better, more balanced, media might help get nearer to it. Or do we just get the media we deserve?
* For an attempt to balance the Panorama programme, see the YouTube video from Tony Juniper, former Executive Director of Friends of the Earth and now Chair of Action for Renewables.
It's Christmas, so here's a light-hearted contribution looking back over the year... although maybe not quite so light-hearted, given the troubled year we've just been through, what with the quake and tsunami in Japan - and Fukushima.
The UK did send Japan an early Christmas present, but they may not have wanted it: a shipment of high-level radioactive waste. This derived from spent fuel from Japanese nuclear plants, which had been reprocessed at Sellafield's Thorp facility to extract plutonium and uranium, some of it maybe later ending up in mixed oxide 'MOX' fuel for use in Japan - which now they probably don't want either!
Another perhaps less serious problem they had was when the seawater intakes of a nuclear power plant in Japan got clogged with jellyfish. The same thing happened in Scotland at Torness. Maybe they were trying to tell us something? So maybe was the three-eyed fish that a group of fishermen reportedly caught in a lake near a nuclear power plant in Argentina.
Nature does seem to remind us who is boss occasionally, with the tsunami in Japan being the most obvious recent example. It has, it seems, also happened in the past in the UK. What seems to have been a tsunami hit the English and Welsh coasts of the Severn estuary in 1607, the flood recorded with a metal peg set into the wall of a church (built on a low rise) at adult chest height, indicating a flood height of 4 metres above the surrounding ground. It caused some 2,000 deaths among the people of the villages and much loss of livestock. I do hope the designers of the new nuclear plants proposed for sea-level sites on the estuary at Hinkley and Oldbury remember that.
However, let's move on from nuclear. Renewables had their share of craziness and problems this year too, with, for example, the 1/6th-scale prototype of Norway's Sway floating wind turbine sinking in heavy seas in November, and a 2MW wind turbine catching light in fierce gales in Scotland in December. The global recession also took its toll, with US company Clipper Wind abandoning development work on the 10MW Britannia offshore wind turbine concept. But, technical and economic hiccups like this apart, the main problem facing renewables seems to have been heavy-handed government intervention. Thus PV solar got hit twice, with a 72% and then a 50% cut in UK Feed-In Tariffs, the argument being that PV solar had boomed too fast. Rapid growth was an issue that also seemed to hit onshore wind power - there were problems with excess wind generation in Scotland, leading to curtailment of valuable output and controversial compensation payments to the generators. The main reason seems to be that the grid system is not ready for it. But at least we have now agreed on a new design for grid pylons - a 'T'-shaped tower won the national competition.
What we haven't quite agreed on is what to do with the electricity - use it as normal, or also for charging up electric cars and running heat pumps? That would help balance out night-time excess power. But we could also store it as hydrogen. I was much taken by a quote I came across from a talk given in Cambridge in 1923 by J.B.S. Haldane, who predicted: "The country will be covered with rows of metallic windmills working electric motors which in their turn supply current at a very high voltage to great electric mains. At suitable distances, there will be great power stations where during windy weather the surplus power will be used for the electrolytic decomposition of water into oxygen and hydrogen".
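Haldane's scheme does come at a price in efficiency. A toy calculation (the two conversion efficiencies are illustrative assumptions, not measured figures for any particular electrolyser or fuel cell):

```python
# Toy round-trip calculation for storing surplus wind power as hydrogen.
# Both efficiency figures are illustrative assumptions, not vendor data.
ELECTROLYSIS_EFF = 0.70   # electricity -> hydrogen (assumed)
FUEL_CELL_EFF = 0.50      # hydrogen -> electricity (assumed)

def recovered_kwh(surplus_kwh):
    """Electricity recovered after an electrolysis + fuel-cell round trip."""
    return surplus_kwh * ELECTROLYSIS_EFF * FUEL_CELL_EFF

print(f"1 MWh of surplus wind returns ~{recovered_kwh(1000):.0f} kWh")
# -> about 350 kWh, i.e. a ~35% round trip
```

Even a ~35% round trip can be worthwhile when the input is surplus wind power that would otherwise simply be curtailed - which is rather Haldane's point.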
Ah, no, some say we may not need exotic new supplies like this - since we will have lots of shale gas. A bit surprisingly perhaps, Chris Huhne undercut some of the hype about that, pointing out that it has 'not yet lit a single room nor cooked a single roast dinner in the UK'. The collapse of the Longannet Carbon Capture and Storage project also put CCS on a somewhat longer time frame, and that's worrying if we are to rely on gas into the future.
Which, for now, leaves us with nuclear and renewables slogging it out for a place in the sun, or rather for their share of the 'Contracts for Difference', whenever they finally get going. It's not quite clear to me how that is going to work. With no direct obligation on anyone to take specific types of power, just overall government indicative supply targets, I assume it will be up to the market - which will presumably veer towards whichever option can offer the best deal in the short term. That is far from clear. Onshore wind looks likely to be the cheapest of the main non-fossil options at present, depending on how you do the sums and what other subsidies are on offer. But you get the feeling that the government sees nuclear as special: it gave an early Christmas present to Sheffield Forgemasters in the form of a loan of up to £36 million "to continue its drive into civil nuclear and steelworks plant production." But that's not a subsidy, honest! And even if it was, then, they would no doubt say, similar offers have been, or will be, made to renewable projects. But a new £3bn MOX plant at Sellafield? Difficult to justify... especially when the last (£1bn+) one didn't work.
Personally, rather than leave it up to the market, or backstage funding deals by civil servants, I'd rather leave it up to the democratic process to decide on big issues like which major technology to invest in. But that's not our way any more, at least not until things go seriously wrong. Then you may get terrible results, like 94% of the public voting against nuclear, as happened earlier this year in Italy. It couldn't happen here!
And no, that doesn't mean including footage of people attending exercise classes. The S Factor under scrutiny in this blog is the S Factor Workshop on how to make successful science videos, held at the American Geophysical Union Fall Meeting in December. The event saw a panel of Hollywood professionals critique ten entries, picked from a total of 42 submissions by hopeful researchers.
On the panel were marine biologist-turned film-maker Randy Olson, author of Don't Be Such a Scientist: Talking Substance in an Age of Style, and his former film-school classmates Sean Hood, now a screenwriter with credits such as horror movie Halloween: Resurrection and Conan the Barbarian to his name, and Jason Ensler, co-producer and director of Franklin & Bash, and director of episodes of Gossip Girl, Chuck and Psych.
The trio were cheerfully disparaging of scientists' storytelling skills, saying that many of the videos took the approach "here's our lab, here's our kit, come see us some day". But story is key - "think of it as making a trailer for science".
One exception was San Jose State University's Green Ninja. The panel felt this video showed good storytelling, with a character who clearly has a problem - his oversized and ever-growing feet - that he needs to solve.
A useful technique, as detailed by Nicholas Kristof, is to follow the story of one individual and, ideally, to reach an uplifting conclusion. According to Olson, Kristof argues that an article on death is depressing but an article on people fighting a disease engages. In the same way, a story about coral deterioration could be depressing or dull, but a story about a man interested in coral can catch people's attention.
Since film is good for conveying emotion and humour but not for transmitting information, it can be useful to break your complex content down to a simple story. According to Ensler, it takes time to develop stories but they can be overdeveloped and lose some of their original spark. Hood stressed the need "to keep hold of that nugget of awe", and that scientists should "inspire the eleven-year-old in all of us".
It's also worth considering changing the order of events from a "that happened, then that happened, then this happened" type of narrative. Replacing "ands" in the storyline with "buts" and "therefores" can change the direction of the story and add tension, the film experts explained. For example, in Volcano from Space, the storyline could have been "We monitor volcanoes but they're hard to see so we need new techniques." Arguing two sides of an issue can also create a good story.
Ensler recommended that researchers set up cameras whenever they are in the field so that they have plenty of interesting footage to use in their videos.
But interesting is not enough; if somebody says "interesting" after Hood's latest film pitch, he knows "I've failed, because I haven't grabbed them emotionally". People are most engaged by people talking, not things, he said, so it's useful to show a person alongside a piece of scientific kit. Because watching a person speak in real life is different to seeing them onscreen, if you're filming a talking head you need multiple cameras and different angles, as per the TED talks, to stop it from being boring.
That said, many of the films submitted began with somebody speaking to camera - the panel felt there was no need for this. According to Olson, it's good to arouse and fulfil - grab the audience's attention, make them want, then fulfil their need. For example the Mata Eruption video from JISAO (Joint Institute for the Study of the Atmosphere and Ocean) could have put its amazing video footage of an undersea volcanic eruption right at the start of the film before answering the questions the footage raises. Alternatively, Ensler said the team could have made the audience want by promising them they were going to see some great footage but first explaining why it's hard to obtain.
As film is a visual medium it can be helpful to see if you can get the gist of a short film without listening to the soundtrack, the professionals explained. Indeed, one of the most well-received videos - Perspective, which used animated graphics to indicate the relative energy release of large earthquakes throughout history - contained no sound at all, and was praised for its Hitchcockian withholding of information from the audience.
In summary? Every picture (should) tell a story...
Genetically engineered food has not exactly been popular in the UK, with many people being worried about the risks. Quite apart from the dangers of cross-species gene transfer, some are concerned that the underlying aim is to enable suppliers to lock farmers into dependence on them. More generally, some see it as part of a wider claim that 'technology can fix everything, don't worry about impacts'.
Views like this are likely to shape reactions to the latest idea - genetically engineered energy crops. The back-story is that the first generation of biofuel crops has been widely criticised for being low yield, land hungry and undermining food production. The second generation of non-food crops, it is claimed, will be better. But genetically modified crops could, it's said, be better still - with much higher yields, and more resistance to drought, pests and diseases.
In fact in a new report on 'Next generation biofuels and synthetic biology', the Foundation for International Environmental Law and Development (FIELD) says that the aim is to go beyond simple genetic modification 'by splicing a few genes from one organism into another' and on to designing 'entirely new life forms with pre-selected functions, like the microbes which will digest trees and grasses and ferment them into biofuels, or the algae which will harvest solar energy to produce oil'.
Well actually that sounds interesting. So why not? FIELD offers some compelling arguments.
Minor genetic adjustments may not sound too horrific - it's what nature does slowly and we do a bit faster via selective breeding. But FIELD quotes the Royal Society's explanation that 'the synthetic biologist seeks to build a bespoke system (such as an organism) by re-designing an existing system or constructing one from scratch using parts taken from nature or specially designed. This approach can lead to organisms...with properties not found in nature.'
FIELD report that some synthetic biologists are designing 'a biological shell which will express synthetic DNA as flexibly as a computer runs programmes. The shell is created by disabling the genes of an existing organism one at a time and removing those that can be removed without killing the organism'. Others seek to catalogue and assemble biological parts like Lego bricks. FIELD says 'BioBricks, a leading effort of this type, is a registry of DNA sequences that each reliably perform a specific function. Each "brick" is designed to be compatible with the others.' Still others aim to construct synthetic life forms entirely from scratch using DNA synthesisers, 'the biological equivalent of word processors'.
FIELD notes that the world's first self-replicating synthetic genome, announced by the J. Craig Venter Institute in May 2010, was constructed in this way. Venter described it as 'the first self-replicating species we've had on the planet whose parent is a computer.' That certainly sounds worrying.
FIELD says 'It is extremely difficult to anticipate the risks and harms of a new science like synthetic biology, and therefore of next generation [GM] biofuels. Traditionally, the risks of new genetically engineered organisms are assessed by comparison with their known relatives. Containment rules and risk mitigation strategies are then set based on the rules for the known relative. But synthetic biologists are capable of designing organisms with no relatives in nature'.
It accepts that 'building "terminator genes" into synthetic organisms, or making them dependent on artificial substances, may decrease the likelihood of uncontrolled proliferation', but asserts that 'uncontrolled proliferation may occur despite best efforts at containment. Synthetic micro-organisms released into the environment, accidentally or intentionally, could share genes with other micro-organisms through horizontal gene transfer or evolve beyond their functionality. One hypothetical, worst-case scenario is a newly engineered type of high-yielding blue-green algae cultivated for biofuel production unintentionally leaking from outdoor ponds and out-competing native algal growth. A durable synthetic biology-derived organism might then spread to natural waterways, where it may thrive, displace other species, and rob the ecosystem of vital nutrients, with negative consequences for the environment.'
It goes on 'Synthetic biology also presents new bio-security threats. DNA sequences and design software are available online and synthesised DNA is available by mail order. In 2002, a team of researchers at the State University of New York demonstrated the potential threat by recreating the polio virus from sequences of DNA ordered by mail.'
It then outlines the current state of play on regulation, but warns that 'there is little clarity on how synthetic biology is currently regulated under domestic and international law, and no clarity on how regulation should proceed'.
There are vast amounts of money potentially to be made from synthetic biology, and, given the rapidly developing field, those seeking to devise regulatory controls also face, in effect, a moving target. So perhaps it's hardly surprising that regulation is problematic.
Worried? FIELD clearly is. So too are Friends of the Earth and Greenpeace. Some may see all this as just scare-mongering by those who are basically anti-scientific progress, but there would seem to be valid cause for some concern. One way or another, we seem likely to be in for another round of the GM debate.
FIELD report: www.field.org.uk/files/syntheticbiologybiofuelsbriefingpaper.pdf
One of the crucial questions of our time is whether international environmental regimes can and will manage to save the climate and other global environmental commons, such as biodiversity. Oran Young from the University of California, Santa Barbara, provides crucial insights in his meta-analysis, just published in PNAS. Rather than trying to summarize his observations, I aim to highlight a few points.
Interestingly, political scientists judge international regimes systematically more positively than economists do, probably reflecting different assumptions and mindsets. Nonetheless, analysts agree there is a full spectrum of international regimes, ranging from the rather successful (such as the Montreal Protocol on ozone) to the rather unsuccessful (such as Kyoto), with many in between. According to a metastudy by Breitmeier et al., international regimes contributed significantly or very strongly in about half of the cases where environmental problems improved. Another observation is that regimes can be successful even in non-hierarchical, non-enforceable circumstances. It turns out that some "easy" problems are systematically messed up, whereas some "hard" problems are surprisingly successfully dealt with. Instead, regime design matters a lot: attention to detail can be more crucial than the path chosen (both incentive-based and command-and-control regulations can lead to success).
Perhaps motivating for the climate issue: a coalition of influential actors can drive a regime towards some sort of success, even if a single dominant actor remains passive. Fairness and legitimacy are preconditions for success, especially given the mostly non-enforceable character of international regimes. Indeed, Young suggests that we need an understanding of the conditions under which fairness and legitimacy can be productive forces.
Will there be a successful climate regime? We don't know. We can only try. Young suggests some innovative routes that scientists and policy makers should explore.
The UK government's new Carbon Plan, produced as required under the Climate Change Act, looks at a core strategy based on a mix of renewables (45GW), Carbon Capture and Storage (28GW) and nuclear (33GW) by 2050, but also includes three alternative possible scenarios. In one, if CCS does not take off (reaching just 2GW) and renewables are restricted to 22GW, up to 75GW of nuclear is built by 2050. In the second, with CCS moving up to 40GW, nuclear is at 20GW and renewables at 36GW. However, in the third, renewables move up to 106GW, with nuclear at 16GW and CCS at 13GW by 2050. All three future scenarios are at http://2050-calculator-tool.decc.gov.uk
Some might say having three main options spreads the risks. Certainly there are risks and problems with each and it could be argued that some of these are sufficiently serious that the options should be reconsidered.
We are used to hearing about the short-term economic, safety and security risks of nuclear, but there are also longer-term issues - beyond the usual one of waste disposal. In a report on 'Energy Balance of Nuclear Power Generation', the Austrian Institute of Ecology and the Austrian Energy Agency have had another look at the issue of the full lifecycle energy requirements for providing the fuel for nuclear power plants. They looked at all the previous studies and concluded that, assuming the low growth scenario of the World Nuclear Association (WNA) and the IAEA data on uranium resources from currently operated uranium mines, reserves will be sufficient until 2055. If mines which are currently being developed are also taken into account, the uranium reserves would last until around 2075 in the low WNA growth scenario. However, emissions from the increased use of lower grade uranium ore will rise, since uranium fuel production will get much more energy intensive.
With ore grades between 0.1% and 2%, the energy expenditure for generating one kWh of final energy is put at between 2% and 4%. With ore grades of 0.01% to 0.02%, the energy expenditure rises to 14-54% and the resulting CO2 emissions amount to 82-210 g/kWh. By contrast, CO2 emissions for renewables are put at 3-60 g/kWh.
The study notes that one third of currently operated uranium mines have an ore grade below 0.03%, but if we push ahead with more nuclear, then we reach the point where continuing becomes increasingly pointless in energy/carbon terms.
You might of course still continue with nuclear despite that, but below about 0.008 to 0.012 % ore grade, the report notes, 'the energy expenditure for the uranium mining is so high, that the overall energy balance turns negative... From this ore grade on, the operation of nuclear power plants does not generate any energy surplus.'
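The quoted figures imply that fuel-provision energy scales roughly with the inverse of the ore grade (lower grade means more rock moved per kg of uranium). As a toy illustration - the 1/grade scaling law and the calibration point are my own crude simplification of the numbers above, not the Austrian study's actual method:

```python
# Toy model: mining/milling energy per kWh of nuclear electricity scaling
# roughly as 1/grade, calibrated to the mid-range figure quoted above
# (~3% of output at 0.1% ore grade). Illustration only, not the study.
REF_GRADE = 0.001          # 0.1% ore grade
REF_EXPENDITURE = 0.03     # ~3% of the energy generated

def expenditure_fraction(grade):
    """Fraction of generated energy spent on fuel provision at a given grade."""
    return REF_EXPENDITURE * REF_GRADE / grade

for grade in (0.001, 0.0002, 0.0001):
    print(f"ore grade {grade:.2%}: ~{expenditure_fraction(grade):.0%} of output")
```

Pure 1/grade scaling puts the 100% break-even near 0.003% grade, rather lower than the report's 0.008-0.012%, which suggests that in practice energy costs rise even faster than 1/grade at very low grades.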
The only option then, if for some reason you wanted to continue to use nuclear, would be to use renewables to provide the energy for uranium mining and processing. It's just conceivable that uranium mines in Namibia might use solar PV power and those in Kazakhstan wind power, and that uranium ore processing plants will also use renewable sources, but surely it would not make sense to use renewables so wastefully. www.ecology.at/lca_nuklearindustrie.htm
CCS delayed or dead?
The demise of the proposed coal-fired Carbon Capture and Storage pilot project at Longannet in Scotland, due to the high investment cost, led some to say CCS was dead as an option in the UK. One key issue for CCS evidently is the need to cover the risk of accidental sudden large scale CO2 release at some future point. Hard to quantify! For a spirited demolition of CCS see Eurosolar president Prof Peter Droege's review: www.europeanenergyreview.eu/site/pagina.php?id=3251
He notes that the IEA roadmap envisions that by 2050 3,000 CCS projects will capture and store 10 billion tonnes of CO2 annually, about a third of current global carbon emissions. He says that's 'a tall order, in view of the fact that not a single utility-scale CCS plant is currently operating on the planet'. He reports that 'American Electric Power, cancelled plans to deploy CCS at one of its big facilities - even though the U.S. government offered to pick up half the tab.' At best he says 'most observers peg 2020 or 2025 as the earliest date by which enough large-scale CCS plants are on-line and returning evidence to prove technical viability' However 'renewables are set to achieve grid-parity over the same period. This means that there will be risk that CCS becomes economically obsolete just as the returns come in.'
He concludes 'Funds can be far better spent on stimulating demand reduction and energy efficiency, improving renewable energy storage and two-way energy grids to balance intermittent generation, and - last, not least - to bank on 'carbon storage' that works: namely the active bio-sequestration of greenhouse gases in wetlands, moors, humus rich agricultural soil and in growing new forests.'
Nevertheless, CCS enthusiasts argue that it could be competitive with renewables and avoid their grid balancing issues. Some small pilot projects exist around the world and the UK government is still keen to press ahead with its £1bn CCS competition, if it can find a new candidate. In addition, there are, it seems, still 6 industrial consortia keen to compete for maybe 4 UK 'slots' in the EU-subsidised (NER-300) CCS demo programme. The UK's proposed new CfD support system should also offer support for CCS, cheaper gas-fired plants included.
The Longannet coal project was to involve post-combustion capture and access to offshore storage via a 170-mile-long pipeline. Some say a better first option would be gas-fired pre-combustion capture schemes, possibly even using bio-methane in existing CCGTs. Many environmentalists are unhappy with CCS, not least since they say it will deflect support from renewables. But biomass-fed CCS would be carbon negative, assuming the biomass is fully replaced, so some see fossil-fed CCS as just a preliminary stage and a bridge to a much more sustainable approach.
For an overview of EU CCS prospects: www.europeanenergyreview.eu/site/pagina.php?id_mailing=223&toegang=115f89503138416a242f40fb7d7f338e&id=3361
Renewables certainly have their problems, not least, for some of them, intermittency, although that can be overstated. It's a relatively minor operational issue when the renewable input is below around 20%, and can be dealt with without leading to significant extra emissions using standard approaches, including the new breed of flexible, but high efficiency, combined cycle gas turbines, like the FlexEfficiency 50, developed by GE: www.ge-flexibility.com
As more renewables come on line, we may need more energy storage capacity, and there are some clever new ideas emerging in the hydrogen field. The electrolysis of water is sometimes seen as inefficient, especially with variable electricity inputs, but RE Hydrogen say that their novel materials electrolyser can handle intermittent electricity inputs, usually a bugbear for wind or PV powered hydrogen generation: www.rehydrogen.com/id1.html
More radically, there's a new idea for thermal dissociation of water at high efficiency using high temperatures and solid acid materials: www.sciencedirect.com/science/article/pii/S0360319911010007
Meanwhile, Airproducts has developed a cryogenic system for storing energy as liquid air. It claims that overall energy conversion efficiencies of 75-85% are possible with up to 100MW storage for 12 hours: www.airproducts.com/industries/Energy/Power/Power-Technologies/product-list.aspx?itemId=%7B7D677622-F274-40B1-8EC9-F6D33CC19C5E%7D
Innovations like this, and also upgrades to the basic renewable generation technologies, are moving ahead rapidly around the world, with costs falling rapidly. And if you want to spread risks, well there are dozens of different types of renewables- real diversity. I know where I'd put my money!
Ocean acidification is often overlooked as a problem in favour of its more famous parent, climate change. But it's receiving plenty of attention at the AGU Fall Meeting in San Francisco.
Whilst most information on the effects of acidification is based on modelling or lab experiments over limited time periods, Adina Paytan of the University of California Santa Cruz has been looking at whole ecosystems - the natural submarine springs, or "ojos", that occur along Caribbean coastlines. Formed when rainwater travels through limestone caves under land and discharges into the sea via faultlines, these springs have a low pH, making them a natural laboratory for studying the effects of acidification on ocean-dwelling species over long timescales.
To her surprise, Paytan found that three coral species were able to grow under the low pH conditions near the springs, despite a scarcity of the carbonate ions that corals normally need to form their calcite shells. Near the springs the pH was around 7.6 whilst further away, where nine coral species were discovered, the pH was 8.1.
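The pH scale is logarithmic, so that 0.5-unit difference is larger than it sounds. A quick check (standard chemistry, not specific to Paytan's data):

```python
# pH is -log10 of the hydrogen ion activity, so a 0.5-unit drop
# corresponds to roughly 10**0.5, i.e. ~3.2x, more hydrogen ions.
ph_springs, ph_ambient = 7.6, 8.1
ratio = 10 ** (ph_ambient - ph_springs)
print(f"water at pH {ph_springs} has ~{ratio:.1f}x the H+ of water at pH {ph_ambient}")
```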
"It's encouraging that certain corals can survive," Paytan told reporters, "but it's only a few species and they're not reef-building. They tend to grow slowly in patchy colonies."
Paytan speculates that the higher nutrient concentration of the springwater may aid growth of the coral's symbiotic photosynthesizing algae, providing the coral with more energy and the ability to grow under less optimal conditions. She is currently investigating this theory further. CT scans of samples drilled from the corals revealed that the organisms formed less dense skeletons when the surrounding water contained fewer aragonite ions (a form of carbonate). Under these conditions, the corals also suffered more boring by clams and worms. As a result, the corals may be less robust and more susceptible to damage by hurricanes.
Acidification researcher Nina Keul of the Alfred Wegener Institute in Bremerhaven, Germany, had also had a surprise - her lab tests showed that a species of foraminifera found in northern Germany actually grew faster in more acidic water. This is in contrast to previous studies, although Keul's specimens were at an earlier stage of their development than those used in other work. The mechanism for the increase is not yet clear; Keul speculated that a lower pH may make it easier for the forams to expel hydrogen ions formed during their shell-making process.
Robert Riding of the University of Tennessee Knoxville, meanwhile, has been looking to the past. His analysis of cores drilled from a reef in Tahiti in 2005 indicates that natural acidification associated with past climate changes weakened the bacterial crust that glues reefs together. At some periods, these calcifying bacteria formed layers up to 20 cm thick and could make up as much as 80% of the reef framework. But acidification led to a lower abundance of the bacteria, making the reefs less strong. Riding explained that this creates a double whammy - natural acidification has weakened reefs and manmade acidification will now weaken them again.
There are many theories as to why the ancient Central American civilisations of the Mayans, Aztecs and Toltecs died out. It could be that drought, perhaps due to solar forcing or random climate variability, was a factor. And in 2010 Robert Oglesby of the University of Nebraska, US, suggested that deforestation may have contributed to drought and the Mayan collapse. Now Ben Cook of the NASA Goddard Institute for Space Studies and Columbia University has confirmed that past deforestation in Central America may indeed have cut precipitation.
"Pre-Columbian Central America contained 19 million people in sedentary agricultural societies," Cook told reporters at the AGU Fall Meeting. The resulting landscape probably consisted of cropland interspersed with patches of rainforest. But after the Spanish conquest, the population crashed by 90%, enabling reforestation to occur.
Cook used land cover reconstruction data based on population numbers (obtained from colonial records) and a climate model to examine the effect of forest cover on climate before and after 1492. He found that pre-Columbian deforestation suppressed precipitation by roughly 10-20% in the region and led to half a degree of warming.
"Grass and croplands absorb slightly less energy from the sun than the rainforest because their surface tends to be more reflective," said Cook. "This means that there's less energy available for convection and precipitation."
The temperature rise is explained by the lower soil moisture - energy that isn't being spent evaporating water goes into warming the ground instead.
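The albedo argument can be illustrated with a toy surface energy balance. The irradiance and albedo values below are my own assumed, typical round figures, not numbers from Cook's model:

```python
# Toy illustration (assumed numbers, not from Cook's simulations) of the
# albedo mechanism: brighter cropland reflects more sunlight than dark
# rainforest, leaving less surface energy to drive convection and rainfall.
SOLAR_IN = 200.0        # W/m^2, mean incoming shortwave at the surface (assumed)
ALBEDO_FOREST = 0.12    # typical rainforest albedo (assumed)
ALBEDO_CROP = 0.18      # typical cropland/grassland albedo (assumed)

absorbed_forest = SOLAR_IN * (1 - ALBEDO_FOREST)   # energy the forest surface absorbs
absorbed_crop = SOLAR_IN * (1 - ALBEDO_CROP)       # energy the cropland surface absorbs
deficit = absorbed_forest - absorbed_crop          # energy no longer available for convection

print(f"forest absorbs ~{absorbed_forest:.0f} W/m2, cropland ~{absorbed_crop:.0f} W/m2")
print(f"energy deficit over cropland: ~{deficit:.0f} W/m2")
```

Even a modest albedo difference of a few percent removes on the order of ten watts per square metre from the surface energy budget, which is the kind of deficit that shows up as weaker convection and reduced rainfall in the model.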
Lake and cave records from the area indicate that there was a 14% decline in water balance - the difference between precipitation and evaporation. Cook says his simulations can account for about half this drying.
Today the region is more extensively deforested than in pre-Columbian times, except for the Yucatán Peninsula, which has more forest cover. Cook believes that future deforestation in the Yucatán could lead to similar drying.
The Dead Sea region is long on history but short on water. To cast a more detailed eye on both, researchers have drilled a nearly 500m-long core from the middle of the Dead Sea to reveal more about its fluctuating water levels over the last 200,000 years.
Presented at the AGU Fall Meeting, the initial findings indicate that around 120,000 years ago, the lake almost dried up as a result of natural variation in climate. That doesn't bode well for today's scenario, in which extraction of water from the Jordan River for irrigation has almost entirely stopped the flow of water into the lake, and climate is projected to become warmer and drier. According to the UN, water shortage has the potential to cause conflict in the region.
"The Dead Sea is already drying up because humans are using so much water," said Steven Goldstein of Columbia University. "The evidence it has actually gone away without any human intervention, under conditions that might return soon, is something people should think hard about."
It was the discovery of a layer of pebbles around 235 m deep that revealed the drying of the Sea - such pebbles are typically found on the shoreline so their presence in the centre of the lake shows that the water level was extremely low. Below the pebbles was a 45 m thick layer rich in salt, which also indicates extreme drying. In fact Goldstein said calculations show that producing a layer of salt this thick would require evaporation of virtually all the water that's in the lake today.
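As a rough plausibility check on that claim, here is a back-of-envelope calculation using my own assumed round numbers for the modern Dead Sea (volume, salinity, area and salt density are all my assumptions, not figures from the researchers):

```python
# Back-of-envelope check (assumed figures, not from the drilling team) of
# the claim that a ~45 m salt layer implies evaporating roughly the whole
# modern lake.
LAKE_VOLUME_M3 = 114e9    # ~114 km^3 of brine in today's Dead Sea (assumed)
SALT_KG_PER_M3 = 300.0    # ~300 g of dissolved salts per litre of brine (assumed)
HALITE_DENSITY = 2170.0   # kg/m^3, density of solid rock salt
LAKE_AREA_M2 = 600e6      # ~600 km^2 lake surface area (assumed)

salt_mass_kg = LAKE_VOLUME_M3 * SALT_KG_PER_M3    # total dissolved salt
salt_volume_m3 = salt_mass_kg / HALITE_DENSITY    # volume once precipitated as solid
layer_thickness_m = salt_volume_m3 / LAKE_AREA_M2 # spread evenly over the lake floor

print(f"salt layer from evaporating the whole lake: ~{layer_thickness_m:.0f} m")
```

This gives a layer of a few tens of metres - the same order as the 45 m observed, especially once the porosity of the deposited salt and the larger extent of the ancient lake are allowed for.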
The Dead Sea, which currently lies 425 m below sea level, is the lowest place on the Earth's land surface. But its water level has fluctuated massively as climate and other factors have changed. Around 25,000 years ago, for example, its surface was just 160 m below sea level, while 6000 years ago it was 370 m below. In 1997 the surface was 413 m below sea level. When it came to the start of the drilling project in November 2010, the lake level had dropped so much that the team had to build a new road to access the lake, Emi Ito of the University of Minnesota told assembled reporters.
As well as beach pebbles, the core contains layers of sediment, explained Goldstein. White layers were laid down in summer due to the precipitation of calcium carbonate whilst darker layers of mud and silt were deposited in winter by floods and sand storms. In warmer times salt was also precipitated due to shrinkage of the Sea but the team found that these layers were not visible during Ice Ages. In several regions of the core the layers are jumbled, indicating an earthquake.
So far the researchers have dated the core, which they only finished drilling in March, by comparing it with stalagmites found in caves in the region. Now they plan to use radioactive dating to provide a more accurate picture. Finding out how quickly the lake dried is a priority; Goldstein believes it could have taken anywhere from a few hundred years to several thousand.
The project brought together researchers from Israel, the US, Germany, Japan, Switzerland and Norway, using a rig from the International Continental Scientific Drilling Program.
As luck might have it, when he spoke at the American Geophysical Union Fall Meeting in San Francisco this morning on communicating climate change, Michael Mann of Penn State University, US, was able to reveal that the Wall Street Journal today published his letter contesting attacks on climate scientists. "As Nature put it in an editorial a couple of years ago, climate scientists are in a street fight," he said.
Mann stressed that climate change is a reality. "The recent study in Nature Geoscience [which found that at least three-quarters of the current warming is due to manmade factors] came to an even stronger conclusion than the IPCC report," he said. Mann would go even further - he believes that more than 100% of today's warming is caused by man, as natural factors should on average have led to cooling over the last few decades.
Mann detailed his current involvement with the Climate Literacy Zoo Education Network (CliZEN), a collaboration between nine US zoos and Polar Bears International. "Zoos provide a unique opportunity for trying to communicate the threats to the natural world," he said.
It seems that zoo visitors are more concerned about climate change than the general public. A survey of visitors to zoos and aquaria in the summer found that nearly two-thirds were alarmed or concerned about climate change; in contrast, only 39% of the public surveyed as part of the Six Americas project showed the same levels of concern.
"[Zoo visitors'] primary impediment in becoming more engaged is actually knowledge - they don't feel well enough informed on what they can do," said Mann. Given that nearly 50 million people in the US visit a zoo each year, he feels that's a tremendous opportunity to educate.
• Mann will be talking about his book, The Hockey Stick and the Climate Wars: Dispatches from the Front Lines, which is due out in January 2012, at 4 pm on Tuesday. If you're at the AGU meeting, head to Room 3001 in Moscone West to hear more.
Community-initiated and run renewable energy projects seem to be catching on around the EU. Wind co-ops have been very common in Denmark for many years - about 80% of the wind generation capacity is locally owned. This seems to be one reason why local opposition to wind is much lower than in the UK, where there are very few locally owned projects. As the Danes say, 'your own pigs don't smell'.
It's similar in Germany, where many projects are locally owned. A comparative study conducted in Germany by researchers from the University of Amsterdam concluded that the social acceptance of wind power is very high in general, and even higher when community members are directly involved. 62% of the residents near a community-owned wind farm expressed a positive or very positive opinion of the wind farm in their neighbourhood and only 1% had a negative or very negative attitude. In the case of the non-community-owned wind farm, 47% expressed a neutral opinion, while 26% were positive or very positive and 27% were negative or very negative.
Stefan Gsänger, World Wind Energy Association Secretary General said: 'If we want to reach a 100 % renewable energy supply worldwide with wind energy as a cornerstone, we have to make sure that the local communities actively support this endeavour and that they benefit from the wind farms in their vicinity. Community Power ownership models offer an excellent approach to achieving this objective.'
The local ownership idea has also spread to other technologies. As well as being a leader in wind, Denmark makes a lot of use of district heating, and it is now developing some solar-fed heat networks, with some of them run as community cooperatives.
For example, the Brædstrup District Heating co-op owns the network and heat meters and delivers district heating to almost 1,400 households, covering around 95% of the heat demand in the town. Supply temperatures range from 72°C in summer to 80°C in winter, and all heat meters are read remotely at year's end. A general assembly, which all members of the cooperative can attend, is held once a year, usually in March.
The 2006 general assembly decided to invest in a major solar heat collector installation to run alongside the existing gas-fired plant. Financial support was received from the national TSO (€480k), and installation took place in 2007. Solar heat production from the 5.6MW, 8,000 sq m solar array was 3,229 MWh in 2009.
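A quick sanity check on these figures (my own arithmetic, treating the reported annual output as 3,229 MWh - a 5.6 MW plant running flat out all year could physically produce at most about 49 GWh):

```python
# Sanity check (my arithmetic, not from the Brædstrup co-op) on the
# reported solar thermal output: 5.6 MW peak, 8,000 m^2, ~3,229 MWh/yr.
PEAK_MW = 5.6
AREA_M2 = 8000.0
ANNUAL_MWH = 3229.0
HOURS_PER_YEAR = 8760.0

yield_kwh_per_m2 = ANNUAL_MWH * 1000.0 / AREA_M2          # specific annual yield
capacity_factor = ANNUAL_MWH / (PEAK_MW * HOURS_PER_YEAR) # fraction of peak output

print(f"specific yield: ~{yield_kwh_per_m2:.0f} kWh/m2 per year")
print(f"capacity factor: ~{capacity_factor:.1%}")
```

A specific yield of roughly 400 kWh per square metre per year is in the normal range for Danish solar thermal collector fields, so the figures hang together once the units are read as MWh.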
Their next project is to expand the solar array to more than twice its existing size, and to develop a heat store based on 100 boreholes, each with a pipe loop, in which surplus solar heat can be stored and later extracted with the help of heat pumps. Financial support of €850,000 has also been applied for and received for this experimental project. If it goes well, further expansion is foreseen.
Many more community solar heating projects like this have emerged, with back up heat stores, including the 13MW array at Marstal, soon to be doubled: see www.solarmarstal.dk. For more see: www.solar-district-heating.eu
Biomass is also being used as a basis for local community projects around the EU. For example, Juehnde is the first bioenergy village in Germany, meaning that it produces its electricity and its energy for heating and cooling locally from renewable biomass resources. The project was started in 2000 and reached energy self-sufficiency in June 2006. In 2007, Juehnde produced around 5 million kWh of electricity, while the village's consumption, with 750 residents in 200 households, 75% of whom are connected up, is about 2 million kWh. The excess is sold to energy providers. The major feedstock for electricity generation is methane (biogas) produced from fermented liquid manure and locally grown energy crops. Heat is produced as a by-product of electricity generation in a 700kWe biogas-fired CHP plant and, in winter, by burning woodchips. The major motivation behind the use of biomass is climate and resource protection. It's run as a co-operative.
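A quick check (my arithmetic, not a figure from the project) shows how hard the CHP unit must be working to deliver those numbers:

```python
# Rough consistency check (my arithmetic, not from Juehnde's operators):
# a 700 kWe biogas CHP unit producing ~5 million kWh/yr against a village
# demand of ~2 million kWh/yr.
CHP_KWE = 700.0
ANNUAL_OUTPUT_KWH = 5e6
VILLAGE_DEMAND_KWH = 2e6
HOURS_PER_YEAR = 8760.0

capacity_factor = ANNUAL_OUTPUT_KWH / (CHP_KWE * HOURS_PER_YEAR)
surplus_kwh = ANNUAL_OUTPUT_KWH - VILLAGE_DEMAND_KWH  # exported to the grid

print(f"CHP capacity factor: ~{capacity_factor:.0%}")
print(f"surplus sold to the grid: ~{surplus_kwh/1e6:.0f} million kWh/yr")
```

A capacity factor above 80% implies near-continuous operation, which fits a baseload biogas plant fed year-round by manure and energy crops - and explains how the village can export more electricity than it uses.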
Local agriculture, with nine local farmers, is the backbone of the project: 25% of the 1,300ha of farmland and 10% of the annual wood growth from its 800ha of woodland are contracted for bioenergy production. There are also two PV solar arrays of 10 and 8.6kW peak.
The project received €3m in financial support from federal, regional and local government agencies. It proved so successful that a number of other bioenergy villages are being developed, even without the same level of government support.
The largest so far are Rai-Breitenbach, with 900 residents, 90% of whom are served by 3.5MW of biomass and 30kW of PV; Iden, with 1,000 residents and 250kW of biogas and 850kW of wood-fired generation, serving 75% of the population; and Randegg, with 1,300 people supplied by 2.7MW of wood-fired generation and some solar thermal, covering 50% of the population. And more are on the way.
More at www.bioenergiedorf.de
So how far have we got in the UK? The Baywind co-op in Cumbria was the first breakthrough, and several more wind co-ops have followed, including Westmill near Swindon, now also the site of a solar co-op: www.westmill.coop/westmill_home.asp
Scotland has been a leader in the field, with support from the Community and Renewable Energy Scheme, which has assisted 105 electricity-generating projects over the last two years - expected, it's claimed, to result in 53 MW of installed capacity, with more on the way. Overall, Community Energy Scotland has estimated that about 180 MW of community-owned renewables capacity is currently at various stages of development, and many more schemes are planned. The 2020 Routemap for Renewable Energy in Scotland included a commitment to expanding the contribution from community schemes, with a new target of 500 MW of community and locally-owned renewable energy by 2020.
One of the most recent projects is the community wind power scheme at Udny, Aberdeenshire, which is to be followed by Torrance Farm Community Wind Energy project at Harthill. A Community Trust Company has been formed to disburse the profits from the 800kW Udny scheme - £4 m over 20 years - which could go to fund projects such as a new community hall, a youth hut, a cinema or the expansion of a local paths network.
AAT has been trying to do the same thing in Wales, www.awelamantawe.org.uk/ and there are many new projects emerging across England, via groups like CoRE, the Community Renewables Co-op: www.corecoop.net; FREE, Fowey Renewable Energy Enterprise http://freefowey.co.uk ; and WREN, Wadebridge Renewable Energy Network www.wren.uk.com.
However, it's an uphill struggle, not least to raise finance. The Renewables Obligation is not much use for smaller schemes - it's designed for large-scale commercial projects. On the continent, the various Feed-In Tariffs have by contrast been much more useful, and there were hopes that the UK's small FiT could help, but its support for PV has now been drastically cut back. The new energyshare.com scheme, backed by British Gas, is promising, with hundreds of hopefuls signing up, but it seems we have some way to go before we can expect to see anything like what's happening elsewhere in the EU.
The UN climate negotiations in South Africa have been underway for four days now. There are reports that the EU is taking a hardline stance to the consternation of developing countries, while US President Obama has come under pressure at home for US negotiators to take a more flexible stance. BBC correspondent Richard Black, meanwhile, has compared negotiators to stags fighting.
For those who can't be there and who'd like to follow the tussles minute by minute, there's a whole raft of possibilities for logging in remotely.
The UN website has a page dedicated to virtual participation, with links to the negotiations' YouTube channel (for press-briefing highlights and video messages), a Facebook page (liked by nearly 10,000 people, where today's quiz question is 'who was the first COP president?'), Twitter feeds, and more. There's even a "UN climate negotiator" iPhone app. And the really committed can watch live and on-demand webcasts of press briefings.
Alternatively you can browse the blog’s category archives: