
July 2010 Archives


In 1997, 'Factor Four: Doubling Wealth, Halving Resource Use' by Ernst von Weizsäcker, Amory Lovins and L. Hunter Lovins argued that technical innovation could cut resource use in half while doubling wealth. A new book, upgraded to 'Factor Five', builds on this idea, looking at how and where factor four gains have been made in the intervening period, and at how we might achieve larger, factor five or 80%+, improvements in resource and energy productivity in future, in line with the IPCC's recommended target of 80% reductions in greenhouse gas emissions.
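For readers not used to the 'factor' shorthand, the arithmetic is simple: a factor-N gain in resource productivity means delivering the same service with 1/N of the input, i.e. a (1 - 1/N) reduction in resource use. A minimal sketch in Python (the numbers are just the definitional ones, not figures from the book):

# Illustrative only: how "factor N" productivity gains map to percentage
# reductions in resource use for the same level of service.
def factor_to_reduction(n: float) -> float:
    return 1.0 - 1.0 / n

for n in (2, 4, 5):
    print(f"Factor {n}: {factor_to_reduction(n):.0%} cut in resource use")
# Factor 2: 50%, Factor 4: 75%, Factor 5: 80% - hence the link drawn above to
# the IPCC's roughly 80% emissions-reduction target.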

It provides an overview of efficiency opportunities across a range of sectors. The approach is very positive and optimistic, making it hard to make any criticisms without appearing churlish, as with most work from what is now sometimes called the Amory Lovins school of analysis. But it ought to be asked: aren't there limits to what can be achieved by energy efficiency? Using fuels more efficiently, in terms of both their conversion and consumption, means lower emissions for the same amount of energy finally used. So obviously that's a priority. Investing in efficiency is also often seen as an economically attractive option. It is certainly true that, in the past, since fuels have often been relatively abundant and cheap, we have not paid much attention to their efficient use. So there is a lot of potential for cheap and easy energy-saving options which we can and should exploit. Initial Factor 5 gains do not seem an unreasonable expectation. However, once these 'low hanging fruit' options have been exhausted, there will surely be diminishing returns: it will, in many cases, become progressively harder and more expensive to make further savings.

This is where there is a key disagreement. The Lovins school view seems to be that the costs of energy saving will continue to fall as new technology emerges. It is certainly true that for most mass-produced items, prices do fall from the initially expensive first versions - think of DVD players. It's the classic learning-curve model. But it's not true of all products: cars, for example, are getting more efficient, but also more expensive.
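The learning-curve model mentioned here has a standard form: unit cost falls by a roughly constant fraction with each doubling of cumulative production. A small sketch, with an assumed 'progress ratio' of 0.8 (a commonly quoted illustrative value, not a number from the book):

# Experience-curve sketch: cost falls by a fixed fraction per doubling of
# cumulative production. The 0.8 progress ratio is an assumption for
# illustration only.
import math

def unit_cost(cumulative, initial_cost=100.0, progress_ratio=0.8):
    doublings = math.log2(cumulative)      # relative to cumulative = 1
    return initial_cost * progress_ratio ** doublings

for mult in (1, 2, 4, 8, 16):
    print(f"{mult:>2}x cumulative production -> cost index {unit_cost(mult):.0f}")
# Prints 100, 80, 64, 51, 41: the DVD-player pattern, but nothing guarantees
# that every technology follows it.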

Can Factor 5 type efficiency and resource savings continue to be made across the board, with costs continually falling? Especially given the rebound effect, with the cost savings from efficiency often being used to buy more energy services, so undermining the emission savings. The new Factor 5 book does recognise this issue (Lovins has tended to dismiss it in the past as 'small'), but suggests that the way out is to ramp up the prices of fuels and resources to squeeze out consumption and steer consumers towards less energy- and resource-intensive options, including renewables. Certainly, if the money saved from efficiency is invested in renewables, then you capture all of the emissions savings. And, like Factor 4, the new Factor 5 analysis does include the use of renewables - it's a key part of their programme. After all, quite apart from the rebound issue, that is where we might expect some major 'Factor 5' emission savings, i.e. from the widespread use of renewable energy supplies, although not necessarily with ever-diminishing costs. Clearly, whatever we do to improve efficiency, there will still be a need for new energy sources, as the availability of fossil fuels declines and as we seek to avoid using them for environmental reasons. But there are ecological and resource limits to how much we can get from renewables, and that means accepting the need to curtail consumption at some point.

Whereas Lovins sometimes seems to have implied that we can all have more of everything, the new book challenges the simple 'win-win-win' belief that we can expect reduced costs and emissions while continuing to expand material consumption: it talks of rising eco-taxes and a focus on qualitative 'well being' in a more equitable world - a much more palatable, if maybe a little utopian, version of win-win-win, which most of us would no doubt be happy to sign up to. Growth may be vital for some developing economies in the medium term, to allow populations to move beyond subsistence level, but it can't go on forever, everywhere. The result, after all, does not bear thinking about: endless material growth in an increasingly fractious and alienated global society, until it all shudders to a social and ecological halt. Much as the Dark Mountain group fears: for their rather different view on how we should respond, see www.dark-mountain.net

The above is based on my review in issue 186 of Renew: http://www.natta-renew.org

With the Tour de France all done for another year, it is time to look back and reflect on the latest instalment of the world's most famous cycle race. First up, congratulations to Alberto Contador of the Astana team, who claimed his third Tour win on Sunday and thereby rounded off a magical summer for Spanish sport, following on from national success at the World Cup and Wimbledon.

Surprisingly, no Tour de France cyclist tested positive for doping during this year's competition. From a welfare perspective doping does not really matter – to the first degree – because the costs of doping are internalized: it is the cyclist who suffers from an increased risk of dying early. To the second degree it becomes more interesting: the Tour de France may also incentivize non-professional cyclists to dope. However, this negative externality is probably outweighed by the positive externality of people being encouraged to cycle.

There are also other direct environmental benefits. Consider this study by Werner Scholz of the Landesumweltamt Baden-Württemberg. It reports the reduction in air pollution in Karlsruhe on 8 July 2005. On that day the Tour de France passed through Karlsruhe and major streets in the inner city were closed. As a result, PM levels fell significantly, to about a quarter below their usual value (NOx was reduced by as much as 55–70%). In avoided public-health costs this corresponds to a low five-digit number of euros. Similarly, some tonnes of CO2 were not emitted, noise costs and perhaps accidents were reduced, and people gained access to additional public space. A full cost–benefit analysis, of course, also needs to account for the reduced mobility in the centre of Karlsruhe. As a cyclist, however, I would not mind the Tour de France passing through Berlin next year.

The Dutch Environmental Assessment Agency, PBL, has released the results of a minutely detailed search for errors in part of the second volume of the Fourth Assessment of the Intergovernmental Panel on Climate Change. The search focussed on the eight chapters that assessed regional impacts.

It turned up quite a number of errors, mostly too trivial to waste time over, but two of them glaring. One was about how much of the Netherlands is below sea level. The IPCC gave a wrong figure, 55%. A quarter is much closer to the truth. But in the words of the PBL, "the error was made by a contributing author from the PBL[!], and the [IPCC Coordinating Lead Authors and Lead Authors] are not to blame for relying on Dutch information provided by a Dutch agency." Really? My atlas shows land below sea level in a tasteful strawberry shade, so I would have thought that a glance at the atlas would make it seem unlikely that more than half of the Netherlands is below sea level. On the other hand, the Dutch have had their finger in the dyke for centuries, and so are unlikely to be misled for long by anything the IPCC, or for that matter their own government, says on this point.

The other error was about the water tower of Asia, but I don't want to revisit that one just now other than to repeat that blunders happen.

This is all so predictable, and in a cosmic sense so trivial. There is, however, a way to view these blunders in proper perspective, even though I know I shouldn't use the word "paradigm" in an article for popular consumption.

All of us in the sciences know what a paradigm is, because we have either read or been told about Thomas Kuhn's 1962 book The Structure of Scientific Revolutions. A paradigm is a big, governing idea, one that makes sense of a lot of other ideas that would be disparate without it. Kuhn argued that when a scientific discipline undergoes a revolution, it is actually undergoing a paradigm shift, in which an old paradigm is replaced by a new one.

I am not sure about the old paradigm. Kuhn says that a paradigm is a set of one or more past achievements that some scientific community acknowledges for a time as supplying the foundation for its further practice. The achievements have to be unprecedented enough to attract adherents, and open-ended enough to leave all sorts of unsolved problems for them to work on. I detest the relativistic sociology of "acknowledges for a time", but this paraphrase is important as a key to understanding the recent fusses about climatic change and IPCC mistakes.

I don't think there ever was an old paradigm in the atmospheric and neighbouring sciences. First of all, there has been no revolution. Climatologists have been doing what Kuhn calls "normal science" for centuries. The foundations in dynamics and thermodynamics are as they have been since the 17th century, and in radiation as they have been since the late 19th century. But atmospheric and oceanic dynamics, and radiative physics, describe systems that, though they change continually, always stay the same unless you mess with them.

In fact, the big unifying idea didn't burst onto the scene. It evolved. Fifty years ago and more, there weren't many "adherents" in the study of global environmental change because there was little to adhere to other than a vague idea that another Ice Age ought to begin any millennium now. But there is certainly something to adhere to today. These days, the big unifying idea is the greenhouse effect, and in particular the anthropogenic greenhouse effect. It unifies because it is triumphant at explaining the facts while generating more questions than it answers.

Denialists are fond of criticizing climatological claims that "The science is settled", or equivalently that "The debate is over". If any climatologists have ever used those particular words, then what they meant to say was that the paradigm is doing fine. It did not originate in a revolution, but it has adherents who see it as unprecedentedly successful and find it so open-ended that the science of climatic change is still growing explosively.

If you can find a big enough concept, like Kuhn's paradigm, even mistakes about Asian water towers and Dutch polders fall into intellectual place. They are storms in a teacup. Debate about Himalayan glaciers and the risk of flooding in the Netherlands will go on indefinitely, enlivened by the occasional howler. If and when something more intellectually powerful comes along, we will replace the paradigm, but for now it is firmly in place and there is no sign of a replacement.

Wind scam


John Etherington's The Wind Farm Scam – an ecologist's evaluation (Stacey International, London, 2009) is evidently seen as a definitive text by anti-wind groups. You don't have to read much of it to see why. Here are a few quotes. He says that the wind power industry is determined "…to drive roadway after roadway through lonely places, to dump concrete in enormous quantity, to bulldoze acres of hillside into wind farms studded with gigantic, identically mass-produced steel and plastic monsters. This is akin to demolishing the great cathedrals for road stone or shredding the contents of the National Gallery to make wall insulation".

He says that "…as the developers have grabbed the remote lands of Britain, so their flailing blades perforce creep closer to habitations". He describes windfarm turbines as "wind monsters" spreading "environmental harm" and sees anti-wind campaigners as "the heroic defenders of the land". And, rather than "twitching crucifixions of landscape", he recommends, as an alternative, nuclear power, which he claims "could give secure supply of very large amounts of electricity". For good measure he's also a bit of a contrarian on climate change: he feels that: "It is not credible that the virtual-world output of the models can reliably be used to make policy decision."

So can this book be ignored as just a silly, opinionated diatribe? Unfortunately no, since John Etherington is an ecologist and academic of some standing, having been a reader in ecology at the University of Wales, Cardiff, and a former editor of The Ecologist. And the bulk of the book consists of a well-written and detailed account of wind power - how it works and what problems there might be - with much of this being respectably done, even if there are occasional lapses and errors. Some of the errors are technical - he is not an engineer and occasionally slips up on details, some of which are important.

Fortunately Prof. John Twidell has provided a detailed critique, pointing out the errors and misunderstandings, in a review in Wind Engineering, Vol. 34, Issue 3, pp. 335–350, 2010. For example, Twidell notes that, in his account of the impact of variable inputs from wind turbines on the grid, Etherington fails to mention that demand is constantly changing, and that supply has to be altered to match it. This omission is serious, since the impression is given that variations such as those from wind power are distinct and previously unknown, whereas the variations due to changes in load have always been similar and often more extreme. Thus a grid that copes with load/demand variation copes easily with the arrival of wind power. A similarly unforgivable error is the failure to mention that all forms of generation fail from time to time and hence need back-up strategies. Maintaining short-term operating reserve, with a range of mechanisms to balance supply and demand, has always been a central task for grid operators.
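Twidell's point about variability can be pictured with a back-of-envelope statistical sketch (my own illustration with invented numbers, not a calculation from his review): if short-term demand fluctuations and wind-output fluctuations are roughly independent, they combine in quadrature, so the extra reserve needed once wind arrives is far smaller than the wind variability taken on its own.

# Illustrative sketch only - invented numbers, not from Twidell's review.
# Roughly independent fluctuations add in quadrature, not linearly.
sigma_demand = 1000.0   # MW of assumed short-term demand variability
sigma_wind = 300.0      # MW of assumed wind-output variability

combined = (sigma_demand**2 + sigma_wind**2) ** 0.5
print(f"Demand alone:  {sigma_demand:.0f} MW to balance")
print(f"Demand + wind: {combined:.0f} MW - only about "
      f"{combined - sigma_demand:.0f} MW more, not {sigma_wind:.0f} MW more")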

There are many more such examples of omissions or errors, catalogued in detail in Twidell's 8,000-word assessment, along with some statements which can only be described as disingenuous. Etherington describes the various ways in which wind projects have been financed, with most (like the Renewables Obligation and Feed-in Tariffs) passing the costs on to consumers. Nothing new there - it's all well known. So how can he then complain about "huge and concealed benefits to the wind power developers and covert arrangements which prevent this from being common knowledge"?

Twidell's review deserves to be widely read – it's a carefully measured analysis, which strives to keep irritation at the anti-wind rhetoric in check. He even rather kindly offers a let-out at the end: "Visual impact and its psychological implications is probably the key to understanding the divisions exposed by this book; every other criticism from Etherington and his colleagues probably flows from this problem." Other readers may not be so kind.

Even so, despite its often hectoring tone, this book deserves to be read: some wind-power enthusiasts do overstate their case, there certainly can be problems with managing wind systems, and it is always useful to have beliefs and certainties challenged - especially since, under most plans, many countries around the world now expect to rely on wind for a large part of their energy input in the years ahead. In addition, the book may serve as a timely reminder to us all, whatever our beliefs, that "purple prose" and rhetoric do not sit well with, and can undermine, more careful analysis. And that is something that can apply to "greens" as well as to contrarians.

You can access Twidell's review at www.embracemyplanet.com/critique-wind-farm-scam.

For more on renewable-energy policy and issues, visit www.natta-renew.org.

The Zwally effect is an acceleration of the flow of marginal ice in the ice sheets due to lubrication of the bed by meltwater percolating from the surface. Up to a point, this phenomenon is not surprising. It is well documented on smaller, thinner valley glaciers. The surprise, first documented by Zwally and co-authors in 2002, is seeing the same phenomenon in ice as thick as 1,200 m.

The Zwally paper has stimulated a growing literature with two main threads. One thread tries to explain how meltwater can find its way through more than a kilometre of ice. The other tends to show that the Zwally effect is not the reason for dramatic increases in the speed of tidewater outlet glaciers, where the evidence favours, quite strongly, warm ocean water as the culprit. But that doesn't mean that seasonal acceleration is uninteresting.

Ian Bartholomew and co-authors report on more dramatic seasonal acceleration than has been measured hitherto. It still doesn't rival the speed-ups observed on some tidewater outlets, but the observations highlight the potential of GPS from a different angle, and suggest fascinating insights into how the surface meltwater does its subglacial work.

This new report relies on time series of positions obtained with four Global Positioning System receivers deployed along 35 km of a land-terminating flowline at 67.1° N in southwest Greenland. The data include not just horizontal but also vertical velocities, as well as near-surface air temperature. Averaged over the summer, the speed-up from winter background values was rather modest. But the fascinating bits are the details.

The further up-glacier, the later the onset of speed-up, by more than a month. The natural explanation is a later onset of melting at higher elevations. The highest site was at 1,063 m and the lowest at only 390 m above sea level.

More interesting is that the horizontal velocity correlates very nicely with the vertical acceleration, or in other words with the rate of uplift of the surface. The ice goes faster when the surface is uplifting rapidly. Or rather, rapid uplift seems to provoke speed-up. This is a subtle observation in more ways than one. For one thing, the amounts of uplift are a few decimetres at most. That we can detect such subtle vertical motions is a payoff for all the trouble it took to loft a couple of dozen GPS satellites into orbit.

More interesting still is the authors' subdivision of the summer into three phases. In phase 1, there is no particular surface uplift or speed-up: the meltwater, if any, has yet to reach the bed. In phase 2, the cumulative uplift increases towards a maximum, and so do the horizontal velocities, more or less. (You need the eye of faith to see these phases in the noisy data. But I buy them.) The concluding phase 3 sees repeated episodes of uplift and speed-up, but the course of the surface elevation is downward and so, more or less, is that of the horizontal velocity.

Phases 2 and 3 add up to another picture of an invisible world beneath the authors' feet. The meltwater, once it reaches the bed, pressurizes the ice and forces it upwards, filling and enlarging cavities and promoting basal sliding. But the enlargement proceeds at least in part by melting of roofs and walls, implying the creation of connections and, in short, of a network. The network grows steadily better at discharging the arriving meltwater. Phase 2 becomes phase 3 when the network becomes more than able, on average, to cope with the spate of water. Phase 3 ends when the supply of meltwater gives out, and the ice starts winning again, resuming its regular wintertime job of squeezing the summertime channels shut.

If you want real glaciological drama, visual or acoustic, you should probably go to tidewater terminuses, at which most of the ice leaves the ice sheet. But there is still plenty of land-terminating ice, and the main things about the Zwally effect, granting that it is real, are that it must be real everywhere; and that if the surface of the ice sheet gets warmer then the bed of the ice sheet is bound to get busier.


Cathy Kunkel & Felix Creutzig


Until recently, the CO2 intensity of fuels was regarded as something fixed. Gasoline and diesel dominated, and still dominate, the transport fuel market, and their relative CO2 emissions are mostly fixed. However, with the rise of alternative fuels, carbon intensity becomes an issue, and discussions of electric vehicles, hydrogen-powered cars and the life-cycle emissions of biofuels become more common. One ugly species is unconventional fossil fuels, produced from heavy oil, oil shale, coal or bitumen. The latter is a substance extracted from Canadian tar sands which is upgraded, by energy-intensive processing, into synthetic crude oil. Because of the energy needed for extraction and processing, petroleum from Canadian tar sands has higher life-cycle emissions than conventional fossil fuels - up to 25% more.

The EU and California regulate the carbon intensity of fuels through the so-called Fuel Quality Directive and the Low Carbon Fuel Standard respectively. These regulations incentivize the use of low-carbon fuels and penalize high-carbon fuels. Hence, there does not seem to be much place for bitumen oils or similar products in the California or EU markets. From a marginal-abatement point of view things look good. But what happens when one looks at "emission capacity" (i.e. the long-lived capital stock that emits, or enables the emission of, CO2)?

Recently, Chevron tried to expand its Richmond-based refinery in Northern California - an unsuccessful endeavor that was beaten back by local protests. However, some refineries in the US are scheduled to expand capacity by 2012, adding 500,000 barrels a day (EIA, 2010a). Several of these expansion projects in the Midwest are designed for processing heavier crude oil. In addition, a new refinery with a capacity of 400,000 barrels per day is being planned for South Dakota specifically to process tar sands. Hence, refineries are expanding their "emission capacity". At the same time a number of pipeline projects are being planned out of Alberta (home of the tar sands), significantly expanding Canadian exports of oil to the US (EIA, 2010b). For example, the Keystone and Keystone XL pipelines would bring tar sands oil from Alberta to the Midwest and would have a capacity of 1.1 million barrels per day (435,000 barrels per day of capacity has already been constructed) (Transcanada, 2010). For comparison, US crude oil imports in 2009 were 9 million barrels per day (EIA, 2009).

Investing in heavy-oil refineries and pipelines is probably reasonable business as conventional oils are running out. Perhaps heavy oils will replace only some fraction of conventional oils, while the rest is left to low-carbon fuels.

The problem here is that US infrastructure is once again heading into a carbon lock-in. Once costly refineries and pipelines are built, political pressure increases to avoid carbon regulation policies which would turn this "emission capacity" into a sunk investment.

While California has an LCFS, the carbon intensity of fuels is not regulated nationwide. One solution to avoid further carbon lock-in is the immediate implementation of a nationwide LCFS (or carbon tax, for that matter), sending a credible signal to investors that heavy-oil infrastructure carries a mid-term financial risk. However, if such a solution is not currently feasible, is there a second-best option? Perhaps the EPA could regulate oil infrastructure to require lower-than-average life-cycle emissions of its product (or some other plausible threshold). Such a maximum-emission requirement could then be adaptive, in the sense of the Japanese top-runner standard, which requires products like electronic equipment to become more efficient in line with the state-of-the-art equipment. Similarly, new "emission capacities" could be required always to have lower CO2 life-cycle emissions than the current average "emission capacity", thus inducing a race to the good.
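A hypothetical sketch of how such an adaptive rule might operate (the fuel categories, intensities and shares below are invented purely to illustrate the mechanism):

# Hypothetical sketch of an adaptive, top-runner-style rule: new "emission
# capacity" is approved only if its life-cycle CO2 intensity beats the
# average of the existing capacity stock. All numbers are invented.
existing_stock = [        # (life-cycle gCO2 per MJ of fuel, share of capacity)
    (92.0, 0.7),          # conventional crude (assumed)
    (107.0, 0.2),         # heavier crudes (assumed)
    (75.0, 0.1),          # lower-carbon fuels (assumed)
]

def stock_average(stock):
    return sum(intensity * share for intensity, share in stock)

def approve(new_intensity, stock):
    return new_intensity < stock_average(stock)

print(stock_average(existing_stock))     # about 93 gCO2/MJ
print(approve(110.0, existing_stock))    # tar-sands-like project: False
print(approve(85.0, existing_stock))     # cleaner than average: True

As the stock average falls over time, the bar for new capacity falls with it, which is the "race to the good" the authors have in mind.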


EIA (2010a) US Energy Information Administration. Annual Energy Outlook.

EIA (2010b) US Energy Information Administration. Country Analysis Briefs: Canada.

EIA (2009) US Energy Information Administration. US Imports by Country of Origin.

Transcanada (2010) Keystone Connection Canada, http://www.transcanada.com/docs/Key_Projects/keystone_connection_spring_2010.pdf.


"We advocate inverting and fragmenting the conventional approach: accepting that taming climate change will only be achieved successfully as a benefit contingent upon other goals that are politically attractive and relentlessly pragmatic. Without a fundamental re-framing of the issue, new mandates will not be granted for any fresh courses of action, even good ones".

That's the line adopted in the Hartwell report, by an international group of academics co-ordinated by the London School of Economics. They claim that 'it is now plain that it is not possible to have a climate policy that has emissions reductions as the all encompassing goal'. They note that 'in particular the ambitions for regional- let alone global - "Cap & Trade" regimes to regulate carbon by price, can be now seen to have been barren in their stated aims although profitable for some in unexpected and unwelcome ways.'

They say that in any case we shouldn't just be focussing on climate change: 'there are many other reasons why the decarbonisation of the global economy is highly desirable'. In the approach they would like to see 'decarbonisation is achieved as a by-product of pursuing more pragmatic and popular primary goals, including expanding energy access, energy security and, ultimately, making energy less expensive and more abundant.'

They also claim that we have focussed too much on carbon. So for example they want vigorous and early action on non-CO2 climate forcing agents like black carbon and tropospheric ozone. But their main overall concern is to rebuild public trust via successful improvements in energy efficiency and new energy innovation which clearly cut costs. So they want to develop 'non-carbon energy supplies at unsubsidised costs less than those using fossil fuels' and advocate funding this work by 'low' hypothecated (dedicated) carbon taxes.

However, they say that there is still a way to go: renewables are mostly expensive, 'except under the best of circumstances, i.e. when located at optimal sites; close to existing transmission lines; displacing peak generation rather than base load, and serving a constituency willing to pay higher prices'. In passing, they manage a dig at 'the chilling history of European and particularly British wind-power, recently', which has 'led to poorly-chosen wind facilities that have performed much less well than promised, with serious financial and social consequences because they also distort overall portfolio investment decisions in significant ways'.

Clearly they see carbon taxes as being better than government subsidies, which is evidently what they see as the basis of the EU approach. In fact, at present, RD&D apart, there are not many EU government subsidies - the UK's RO, the Feed-in Tariffs across the EU, and even the EU Emissions Trading System, are all in the end supported by extra charges levied by supply companies on consumers, not by taxpayers.

While the authors clearly don't like subsidies, really though their fundamental objection to current climate policy is much wider. They say 'energy policy and climate policy are not the same thing. Although they are intimately related, neither can satisfactorily be reduced to the other. Energy policy should focus on securing reliable and sustainable low-cost supply, and, as a matter of human dignity, attend directly to the development demands from the world's poorest people, especially their present lack of clean, reliable and affordable energy. One important reason that more than 1.5 billion people presently lack access to electricity is that energy simply costs too much'.

This sounds a little disingenuous - there will be precious little dignity if climate impacts turn out to be as bad as expected. Carbon and other emissions have to be dealt with - and fast. What the Hartwell report is claiming is that it is no good focussing on emission targets, in part because, as was stated in an earlier comment from the team quoted in the report: "It is a characteristic of open systems of high complexity and with many ill-understood feed-back effects, such as the global climate classically is, that there are no self-declaring indicators which tell the policy maker when enough knowledge has been accumulated to make it sensible to move into action. Nor, it might be argued, can a policy-maker ever possess the type of knowledge - distributed, fragmented, private; and certainly not in sufficient coherence or quantity - to make accurate 'top down' directions."

So do we just leave it to the market? The authors seem to think so, in terms of choosing which way to go: 'Driving cost reductions must be the explicit purpose and primary design of deployment policies. Achieving consistent reductions in the unsubsidised cost of clean energy technologies must be the measure that determines which technologies will fly and which will stall in the long term'. But it can't, they say, just be left to the free market: 'since much of the energy technology revolution will require…basic RDD&D investment, public funding on a long-term basis is essential; and that is why an hypothecated tax is so important'.

A bit confusing, since later they say 'innovation activities will of necessity be sponsored initially by the public sector', i.e. a short-term input. But what they are adamant about is that the carbon tax must not try 'to alter short-term consumption behaviour'. They were clearly chastened by the spectacular failure and withdrawal of the ambitious carbon tax proposed for France by President Sarkozy. Theirs would be lower and aimed purely at technology push. But then it sounds as if they dislike government diktats of any sort, and accept government intervention and support only grudgingly, since the private sector won't fund much R&D.

Overall, the carbon tax aside, their approach seems to have much in common with that adopted by the US and China at COP 15 - a free-market, technology-led approach, with no binding emissions targets or government edicts. That may not be too surprising when you look at some of the sponsors of the report, who include the Japan Iron and Steel Federation and the Japan Automobile Manufacturers Association.

Leaving conflicting ideological views on markets aside, of course more needs to be done across the board - the Kyoto approach was marginal at best. However, it's not clear the Hartwell approach is any better. A carbon tax would increase consumer energy bills, and most studies suggest that, although in time the transition to renewables will reduce prices, initially it may not.

Some have also seen the report's emphasis on non-CO2 emissions as odd. Dr Bill Hare, from the Potsdam Institute for Climate Impact Research, commented that 'The paper's focus away from CO2 is misguided, short-sighted and probably wrong'.

And underlying it all is a belief that technical fixes are the answer, while social and behavioural change is likely to be hard. The latter is clearly true, but that doesn't mean we shouldn't try to do both. Technical fixes may well work in the short to medium term, but is it really realistic to expect continually falling costs and, perhaps more importantly, continually expanding energy use in the long term? Energy efficiency and renewables can allow us to expand energy use up to a point, but there are limits. We also need to start thinking about sustainable consumption.

The Hartwell Paper: A new direction for climate policy after the crash of 2009

Last year I had occasion to take a 500-km trip by taxi. The taxi had a GPS unit — a talking GPS. I didn't pay much attention along the way, but I had to be impressed when the taxi pulled up on the main street at our destination and the GPS announced, smugly but correctly, "You have reached your destination."

The Global Positioning System has become part of our lives in the last decade or two, but it is much more than talking taxis. Some recent work illustrates dramatically the ability of accurate positioning devices to tell us things about how the world works.

The toothpaste in the Earth's mantle complicates attempts to measure glacier mass balance by the gravimetric method, and also by the geodetic method. (For the latter, you need two maps of surface elevation. Subtract the earlier map from the later, and divide by the time span. The result is nearly a map of the glacier's mass balance, the only missing ingredient being an estimate of the density of the mass gained or lost.)
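In numerical shorthand, the geodetic method in that parenthesis amounts to the following (a sketch under the stated simplifying assumptions, with an assumed density for the mass lost; the elevation grids are invented):

# Sketch of the geodetic method: two elevation maps, a time span and an
# assumed density give a crude mass-balance map. Illustrative values only.
import numpy as np

h_early = np.array([[1200.0, 1180.0], [1150.0, 1100.0]])   # m, earlier survey
h_late  = np.array([[1198.5, 1178.8], [1149.0, 1098.0]])   # m, later survey
dt_years = 5.0
rho_ice = 900.0    # kg/m^3, assumed density of the mass lost (firn would differ)

dh_dt = (h_late - h_early) / dt_years        # m/yr of surface-elevation change
balance = rho_ice * dh_dt / 1000.0           # metres water equivalent per year
print(balance)
# The assumption buried in this, as discussed next, is that the bed has not
# moved: every metre of surface lowering is read as a metre of lost ice.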

But what if the elevation of the glacier bed has changed, in conflict with our assumption that surface elevation change equals thickness change, or equivalently that all of the gravity signal is due to the glacier? It can and does happen, and the glacier itself is often to blame. Loading the underlying bedrock, it forces the soft mantle material, at depths below about 100–200 km, away from where the ice is building up. A glacier that is shedding mass constitutes a "negative load", and the mantle material flows back. (The negative load ends up as a positive load spread more thinly over the ocean.)

The trouble is that the mantle deforms viscoelastically. The elastic deformation is instantaneous and reversible, just like that of an elastic band. Very crudely, it amounts to about a third of the equilibrium response to the load. The remaining viscous part of the deformation is what we think of as flow. To model it, though, we need an accurate model of the variation of viscosity (stiffness) throughout the 2,800 km thickness of the mantle. That is a formidable challenge.

The mantle flows so slowly that it is still responding today to the loss of ice at the end of the Ice Age, roughly 10,000 to 15,000 years ago. The toothpaste is pushing the bed of the glacier upward slowly, and before we can interpret a change in its surface elevation as a change in its mass we have to remove the bed-elevation component of the change.

But now Yan Jiang and co-authors offer an ingenious twist on the monitoring of elevation change with GPS. They have collected five or more years' worth of GPS readings of surface elevation from several fixed sites around the North Atlantic. The sites are all on bedrock, not on glaciers. (They wouldn't bear on this particular problem if they were on the ice.)

The surface's vertical velocity varies from place to place. The ingenious twist is to focus on the vertical acceleration of the surface, which turns out to be systematically greater near to large ice masses (in Greenland, Iceland and Svalbard). The authors argue persuasively that, while the vertical velocity will reflect delayed viscous adjustment, the acceleration is a signal of the Earth's elastic response to recent increases in the rate of glacier mass loss.
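One way to picture the twist is as fitting a quadratic to each site's uplift record: the linear term is the vertical velocity, dominated by slow viscous rebound, while the quadratic term, the acceleration, isolates the recent elastic part of the signal. A minimal sketch with synthetic data (this is not the authors' processing chain):

# Sketch: recover vertical velocity and acceleration from a bedrock GPS
# uplift series by fitting h(t) = h0 + v*t + 0.5*a*t**2. Synthetic data only.
import numpy as np

t = np.linspace(0.0, 6.0, 120)            # years of (assumed) GPS record
true_v, true_a = 0.004, 0.0015            # m/yr and m/yr^2, invented values
h = 0.01 + true_v * t + 0.5 * true_a * t**2
h += np.random.default_rng(0).normal(0.0, 0.002, t.size)   # a few mm of noise

c2, c1, c0 = np.polyfit(t, h, 2)          # highest power first
print(f"velocity ~ {c1*1000:.1f} mm/yr, acceleration ~ {2*c2*1000:.2f} mm/yr^2")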

There are some rough edges: sites with large accelerations and not much glacier ice nearby, and one site not too far from the ice but with relatively low vertical acceleration. But I can't think of a mechanism to explain these observations other than elastic response of the solid earth to recent removal of glacier ice. The viscous response to this unloading has barely begun, and the viscous response to deglaciation cannot possibly change by so much over a period as short as a few years. The authors even have a go with an elastic-response model at estimating the mass balance that would account for the acceleration in west Greenland, and get plausible answers.

The talking GPS on my taxi ride demonstrated the power of positioning accuracy at the few-metre level. For measurements of glacier mass balance we would like millimetre-level (vertical) accuracy, but there are technical and conceptual problems to be ironed out before that becomes reality. For now, Yan Jiang and co-authors have shown that decimetre-level accuracy will do nicely to be going on with.

Technology guru Bill Gates of Microsoft fame gave a sparkling presentation on energy back in February, under the title 'Innovating to Zero' (i.e. zero emissions).

Oddly, he repeated the old saw about renewables being expensive and needing a lot of backup. Strange, given that in California wind power is the cheapest energy source on the grid, and the main issue in the US is not so much the intermittency of wind as there often being too much wind-generated electricity for the grid to handle (see my earlier blog on curtailment issues).

And even more oddly, he didn't mention the smart supergrid idea, which could balance and manage local variations in supply and demand. You might think that would be right up his street, as someone who pioneered internet information grid systems and applications.

However, his main thrust was on innovations in nuclear. He said: 'innovation really stopped in this industry quite some time ago, so the idea that there's some good ideas laying around is not all that surprising'. He backed the so-called 'Terrapower' idea, in which a mix of fresh uranium and depleted uranium is formed into a log-like tube, buried deep in the ground and 'burnt' progressively, with the fission reaction running through it from one end to the other, like a candle. So it's sometimes called a 'travelling wave reactor'. It's envisaged that it would take 60 years to burn through from end to end, and that the waste products could simply be left where they were, underground. Using depleted uranium/spent fuel to breed more plutonium and run reactors essentially from some of the wastes of conventional nuclear plants is hardly a new idea, but the travelling-wave idea is new and untried.

What Gates now wants to see is a lot of supercomputer modelling to test whether it will work. He is obviously keen: if 'Instead of burning a part of uranium, the one percent, which is the U235, we decided, let's burn the 99 percent, the U238', we could "power the U.S. for hundreds of years". And there's more: "simply by filtering sea water in an inexpensive process, you'd have enough fuel for the entire lifetime of the rest of the planet". For more on the Terrapower idea see its UCB originators' website.

This says that you might need to add in some plutonium and/or thorium, basically to avoid the reaction fizzling out. It sounds a little crude and messy, leaving a wide range of wastes for future generations to deal with, and also quite hard to control once started up. A rival approach to this mix of solids, mentioned briefly by Gates, uses liquids - specifically a molten thorium fluoride salt. For more on the Liquid Fluoride Thorium Reactor, see http://thoriumenergy.blogspot.com.

Gates evidently likes the solid idea, but the US government seems to be focussing mostly on (slightly) more conventional concepts in its 'Next Generation Nuclear Plant' R&D programme. The main emphasis is on an advanced High Temperature reactor with co-generation (CHP) capability, to be built at Idaho National Lab. WNN noted that 'This was originally meant to actually operate in 2010, but its priority has fluctuated'. NGNP is part of the 'Reactor Concepts RD&D' programme, which will also begin working on small modular reactor concepts with a total budget of $195m.

Although the Terrapower idea may be a little off the beaten track, it's interesting that nuclear plant designers are looking to new concepts, for example co-generation reactor systems which can be used to provide heat as well as power, possibly for industrial process heating. In addition, there is renewed interest in developing systems which can be used to generate hydrogen gas, either directly (by high-temperature dissociation of water) or indirectly (by electrolysis), for use as a vehicle fuel. It could be that they think the long-term future for nuclear is not in electricity supply but in other, perhaps more lucrative and less contested, markets. With wind power and some other renewables already looking increasingly competitive as electricity suppliers, perhaps that's not surprising.

Even in current cost terms, the conservative Nuclear Energy Agency (NEA) and the International Energy Agency (IEA) have commented, in their latest joint study of Projected Costs of Generating Electricity, that "nuclear, coal, gas and, where local conditions are favourable, hydro and wind, are now fairly competitive generation technologies for baseload power generation".

They add that "there is no technology that has a clear overall advantage globally or even regionally. Each one of these technologies has potentially decisive strengths and weaknesses".

They conclude that "the future is likely to see healthy competition between these different technologies, competition that will be decided according to national preferences and local comparative advantages".

It will be interesting to see how it all pans out in the long term and, among other things, whether Gates got it right. Personally, I don't fancy a Terrapower unit in my backyard! I'd much prefer the renewable energy technologies being backed by Google in pursuit of its $4.4 trillion "Clean Energy 2030" plan, which calls for the replacement of all coal- and oil-fired electricity generation with natural gas and renewable electricity globally, including 380 GW of wind power, 250 GW of solar power and 80 GW of geothermal power. See http://knol.google.com/k/clean-energy-2030#.

For more on renewable energy developments and policies, see www.natta-renew.org.

In a remarkable paper about Pine Island Glacier just published in Nature Geoscience, Adrian Jenkins and co-authors describe the latest step in maturation of an emerging glaciological technology: autonomous underwater vehicles — AUVs — or in other words unmanned submarines, for exploring the undersides of ice shelves.

The United States Navy has been sending manned submarines beneath the ice pack of the Arctic Ocean for more than 60 years. But pack ice, a few metres thick, is small beer by comparison with the ice shelves that fringe much of the Antarctic Ice Sheet. These are typically a few hundred metres thick, and there is no question of surfacing by punching your way through the ice if you run into trouble. Indeed, the cavities beneath ice shelves, between the base of the shelf and the sea floor, must be among the most inaccessible of all the theoretically accessible places near the Earth's surface. Being so hard to reach — until now — the sub-shelf cavity is also one of the least observed, but not necessarily least understood, parts of the climate system.

Climate, you say? Well, the sub-shelf cavity is where the ocean meets the ice sheet. Assuming (safely) that thermodynamic equilibrium prevails, the contact between shelf ice and seawater must be at the freezing point of the seawater, which depends on the pressure of the overlying shelf and the saltiness of the water. But the water and ice at some distance from the contact will not be at the freezing point, so there must be a heat source or sink, and therefore melting or freezing, at the base of the shelf.
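For orientation, a commonly used linear approximation to the in-situ freezing point of seawater looks like the following (the coefficients are typical values from the ice-shelf literature, quoted approximately from memory rather than taken from this paper):

# Rough linearization of the seawater freezing point; coefficients are
# approximate, typical literature values, not those used by Jenkins et al.
def freezing_point_c(salinity_psu, depth_m):
    return 0.09 - 0.057 * salinity_psu - 7.6e-4 * depth_m

print(freezing_point_c(34.5, 0.0))       # near the surface: about -1.9 C
print(freezing_point_c(34.5, 1000.0))    # under ~1000 m of ice: about -2.6 C
# So ocean water that would count as cold at the surface can be "warm" deep
# in the cavity, relative to the pressure-depressed freezing point there.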

If you add warm water to the sub-shelf cavity, you should expect additional melting. In 2002, Rignot and Jacobs inferred, from temperature profiles in the ocean offshore, astonishingly high rates of basal melting near several grounding lines: more than 50 m of ice per year. But this does not mean 50 m/yr of thinning of the shelf. A central part of their analysis was measurement of ice flow across the grounding line by radar interferometry. The inference of rapid melting was required to explain why the floating shelf was "pulling" ice so aggressively out of the grounded ice sheet.

So the contents of the sub-shelf cavity, and what goes on within it, are just as much part of the puzzle of glacier response to climate as are the grounding line itself, the shape of the floor of the cavity, the water at the base of the grounded ice, and of course things that happen in the atmosphere and the wider ocean.

Jenkins and his co-authors have contributed the first large set of in-situ observations from the sub-shelf cavity. What strikes me most forcibly about this dataset is how triumphantly it confirms earlier theoretical analysis of the way things ought to be down there. According to theory, warm water should be flowing inward at depth, melting the shelf base aggressively near the grounding line and — having thus become cooler, fresher and more buoyant — flowing upward and outward along the base. This is exactly what the autonomous underwater vehicle observed, confirming that we did know a thing or two even before measurements became possible on this scale in the sub-shelf cavity.

There are other noteworthy points about this study. For example the AUV found a ridge in the sea floor beneath the floating ice, in just the right place to explain why the grounding line of Pine Island Glacier has been retreating inland since the first observations in the early 1970s, and to confirm that this is something we ought to be worried about.

But perhaps the most noteworthy point of all, looking ahead, is the AUV. Buried in the Methods section of the paper is this, describing an incident part-way through the field campaign: "... the AUV lost track of the rugged ice-shelf basal topography, ascended into a crevasse, collided with the ice and executed avoidance manoeuvres that prompted it to abort its program and take a direct route to the recovery waypoint. After minor repairs, ...". Putting it another way, the world is now a little bit smaller, but not less dramatic, than it used to be.

There is enough sunshine to grow grapes for wine in Cornwall, so why not harvest solar electricity as well? Independent renewable energy generator and supplier Ecotricity is planning dozens of large grid-linked photovoltaic 'solar farms' in the South West and there are plans for 'Sun farms' in Cornwall and the Scilly Isles.

Ecotricity aims to start with a 25-acre, 5 MW solar farm, possibly near its HQ in Stroud, but by 2020 it plans to have 500 MW of PV arrays, all over the south of the country. Meanwhile Benbole Energy Farm, working with the Penzance-based Renewable Energy Cooperative, is planning a 15-acre Sun Farm near St Kew/St Mabyn in North Cornwall. It's seen as part of an eventual £40m, 20 MW network of 10 Sun Farms in Cornwall and the Scilly Isles.

Ecotricity says that solar farms would not be a blight on the landscape, arguing that they would be less obtrusive than wind turbines or the rows of polytunnels used to grow fruit and vegetables. Dale Vince, the company's founder, told The Times (14/5/10): 'They won't stand more than 2 metres (6.5 ft) tall so you won't see them if you look across the landscape because they will be obscured by hedgerows. You would see them if you were standing on a hill but the visual impact is very minor compared with wind arrays.' But, he said, some sites might have solar panels and turbines in the same fields: 'Solar panels and wind turbines complement each other well because in summer the winds are lighter but there is more sunlight, with the opposite in winter.'

The farms will cost £15–20m each, but Ecotricity will receive index-linked income for 25 years from the feed-in tariff, which starts at 29p/kWh, and the farms should yield a return of at least 8% a year. That's the main reason why large-scale PV solar is now being seen as economically viable in the UK. But PV costs are also falling as new cells emerge and the market for them builds, so we are likely to see more in future.
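The rough arithmetic behind that return runs as follows (a back-of-envelope sketch with my own assumptions, notably the capacity factor and the cost mid-point, rather than Ecotricity's figures):

# Back-of-envelope check on the quoted return. The capacity factor and the
# use of the cost mid-point are assumptions, not Ecotricity's numbers.
capacity_mw = 5.0
capacity_factor = 0.10        # assumed for UK fixed-tilt PV
tariff_gbp_per_kwh = 0.29     # feed-in tariff quoted above
capital_cost_gbp = 17.5e6     # mid-point of the 15-20m range

annual_kwh = capacity_mw * 1000 * 8760 * capacity_factor
annual_revenue = annual_kwh * tariff_gbp_per_kwh
print(f"{annual_kwh/1e6:.1f} GWh/yr, GBP {annual_revenue/1e6:.2f}m/yr, "
      f"gross return {annual_revenue/capital_cost_gbp:.0%} of capital per year")
# About 4.4 GWh and GBP 1.3m a year, i.e. roughly 7% gross - broadly in line
# with the "at least 8%" once export income and cost variations are allowed for.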

The Campaign to Protect Rural England said that it would be better to place banks of solar panels on factory and warehouse roofs and above car parks. But it felt that some farms in the countryside could be acceptable, depending on the quality of the landscape.

PV solar is of course also widely touted as an option for domestic rooftops, and the 'Clean Energy Cashback' feed-in tariff should stimulate that significantly: npower has already reported an 80% rise in PV inquiries, and the consultants PricewaterhouseCoopers have suggested that the rate of UK installation of solar PV panels will increase five-fold this year because of the feed-in tariff.

However, solar water heating is a much cheaper and already very widespread option in the UK. Hopefully that will be supported strongly by the government's proposed new Renewable Heat Incentive (RHI), which should come into force next April. In its RHI consultation submission, YouGen, a social enterprise lobby group for self-generation, said solar heating should be given top priority. That would make a lot of sense.

The UK may not be the obvious place to develop solar, but we do get enough annual insolation to make a significant contribution to meeting heat and power needs.

For more on renewable-energy developments and policy, visit www.natta-renew.org.