
February 2013 Archives

Flowers and bees communicate using electric fields

By Hamish Johnston

Spring will soon be upon many of us - and for me, nothing evokes the spirit of the season more than a bee buzzing from flower to flower on a warm, sunny afternoon. But I never would have guessed that a bee takes a measure of a flower's electrical field before it alights.

That's the claim of biologists at the UK's University of Bristol, who have shown that bees and plants exchange "information" in the form of electrical charge.

According to Daniel Robert, Heather Whitney and colleagues, flowers tend to accumulate negative charge, whereas bees gain positive charge as they fly. When a bee lands on a flower, some of the charge is neutralized and the flower takes several minutes to charge up again - which the team discovered by placing electrodes on the stems of petunias.

The team also used a charged powder to modify the charge on the surface of several different types of flower and found that the bees were able to distinguish between flowers with different charges.

The biologists speculate that the charge of a flower could tell a bee how long it has been since the flower was last visited by an insect. "This novel communication channel reveals how flowers can potentially inform their pollinators about the honest status of their precious nectar and pollen reserves," explains Whitney.

Finally, the team devised a learning test for the bees, which had the insects distinguishing between different colours. The researchers found that the creatures learned faster when different colours were accompanied by different electrical fields.

The research is described in this paper in Science and also in this press release from the university.



The Climate Change Committee's report on bioenergy last year argued that, at best, the UK might only get 10% of its energy from bio-sources by 2050. The subsequent DECC/DEFRA/DfT Bioenergy Strategy was a lot more positive, as was the parallel DECC Heat Strategy. The Bioenergy Strategy claimed that biomass could supply up to 21% of the UK's energy by 2050: http://www.decc.gov.uk/assets/decc/11/meeting-energy-demand/bio-energy/5142-bioenergy-strategy-.pdf

Certainly it has attractions. Consultants Deloitte say: 'As the amount of intermittent generation technologies in the UK's energy mix increases, flexible fuel sources that can provide stable and predictable electricity will become increasingly more valuable. Sustainably-sourced biomass could provide this stability.'

The big issue is whether biomass can in fact be used sustainably. DECC have proposed a limit on life-cycle greenhouse gas emissions from biomass power of 200 to 285g CO2/kWh, but a report from the RSPB, FoE and Greenpeace says the standard is fundamentally flawed, since it doesn't count significant emissions to the atmosphere when electricity is generated from wood harvested from forests. It claims these emissions are actually much worse than those from coal combustion.

The report argues that the usual rationale - that biomass combustion is carbon neutral since emissions are balanced by the carbon that was absorbed while the biomass was growing - is faulty. It is usually accepted that the balance won't be quite 100%, since there will be emissions from the energy used in harvesting, transport and processing, and indeed that is included in DECC's emission standard, but the NGO report goes much further and opens up a fundamental issue. Focusing on trees, it claims that, even with replanting, there will in fact be significant excess net carbon dioxide added to the atmosphere, since it will be emitted rapidly when the trees are burnt but only absorbed slowly as the new plants start growing.

There is some truth in this - it's a dynamic system. It is clear that there will be a delay after mature trees have been harvested and burnt before new plantings start absorbing carbon dioxide. The report says the delay can be 'many decades'. It claims that over a 20-year period, emissions from power generation using wood from conifer plantations are 1879 g/kWh - 80% greater than coal power. It adds: 'Over a 40-year period emissions are lower because the trees have had longer to re-capture carbon, but even then biomass emissions would be 49% greater than coal power. Only after 100 years does electricity generation from conifer trees perform better than coal. And, regardless of the time period, it's never better than the current grid average and never meets DECC's proposed maximum emission limit for Biomass'.
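Those figures can be cross-checked with some back-of-envelope arithmetic. The short Python sketch below takes only the numbers quoted above; the coal baseline is not given directly in the report, so it is inferred here from the '80% greater than coal' claim, and the whole thing should be read as a consistency check rather than an independent estimate.

```python
# Back-of-envelope check of the NGO report's 20-year comparison, using only
# the figures quoted in the text above. The coal baseline is not given directly,
# so it is inferred here from the report's "80% greater than coal" claim.

biomass_20yr = 1879                    # g CO2/kWh, conifer wood, 20-year horizon
coal_baseline = biomass_20yr / 1.80    # implied coal figure, ~1044 g CO2/kWh

decc_limit_high = 285                  # upper end of DECC's proposed limit, g CO2/kWh

print(f"Implied coal baseline: {coal_baseline:.0f} g/kWh")
print(f"40-year biomass figure ('49% greater than coal'): {coal_baseline * 1.49:.0f} g/kWh")
print(f"20-year biomass vs DECC upper limit: {biomass_20yr / decc_limit_high:.1f}x over")
```

On those numbers the 20-year figure would be more than six times DECC's proposed upper limit, which is the point the report is making.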

This may be overstated. If you look at the complete carbon life cycle, there may be mitigating factors. Trees absorb carbon dioxide most rapidly when growing and slow down later in life - so although mature trees and their roots will still absorb some extra CO2, they are less active and are basically just carbon stores. Moreover they are not permanent carbon stores - they will eventually release the carbon (as CO2 or methane) when they die, rot or burn. So if carbon sequestration were the only issue, it might be best to grow biomass, chop it down regularly before it's fully grown and then grow more, while using it for fuel - so avoiding the use of fossil fuels and their emissions. An extreme version of this approach is represented by the development of Genetically Modified versions of fast-growing plants like eucalyptus, which, it is claimed, can grow 40% faster, so speeding up CO2 absorption and compensating for any short delay in absorption: gu.com/p/3bd4y/tw

However, this approach highlights some of the problems with using trees for energy. Trees play key complex roles in ecosystems, not just as carbon absorbers, so we need to understand what we might be doing if we use them for energy, not least in terms of water use. For example, fast growing eucalyptus sucks water out of the soil rapidly, which could be disastrous in some locations, as has already been seen in Asia: http://www.hindu.com/2011/04/28/stories/2011042865310400.htm and http://balwois.com/balwois/administration/full_paper/ffp-1296.pdf

Although not everyone agrees: http://etff.org/Articles/Eucalyptus.html

But GM enhanced biomass could also have many other negative environmental and climate impacts: http://www.globaljusticeecology.org/files/biofuels-ppt-web2.pdf

These issues, and issues relating to biodiversity and changes in land use, may turn out to be as important as the absorption-delay issue raised in the aforementioned NGO report. For example, the report notes that the use of trees for combustion can divert timber from other uses, which means that more will have to be imported. That does seem unwise. So, of course, does the destruction of forests anywhere. That's why many environmentalists would prefer to stick with just occasional wood fellings, forestry and farm wastes, along with domestic biowastes, as a source of biomass for energy use, and many would also prefer AD biogas production to direct combustion. Certainly, burning biomass instead of coal in old, modified, low-efficiency coal plant is not much of a step forward - we need to think about CHP and district heating.

There is no question that the use of some types of biomass for energy is likely to be a poor choice - some energy crops have very low calorific value, and mono-cultural plantations can be very bad for biodiversity, as well as requiring a lot of water and undermining local ecosystems. Working conditions in some biofuel plantations in Asia and elsewhere can be abysmal. And they can divert land from food production. But there may still be a role for some high-yield energy crops on marginal land, and for less invasive approaches, such as short-rotation coppicing. Forests, however - well, that's a different matter. It seems clear that we should avoid deforestation and unsustainable imports, and look to other less damaging approaches to biomass sourcing and use, but we need to understand a lot more about the carbon life cycle in varying climates and locations before we can pronounce finally on the net impact of managed wood use.

'Dirtier than coal? Why Government plans to subsidise burning trees are bad news for the planet' Friends of the Earth, Greenpeace and the RSPB, 2012, http://www.rspb.org.uk/Images/biomassreporttcm9-326672.pdf

• While some greens see it as the only sensible use for CCS, the idea of capturing CO2 from biomass combustion, so making it carbon negative, has also come under attack. A recent Biofuelwatch report says biomass energy with carbon capture (BECC), like CCS in general, will not be economic, or carbon neutral (due to delayed re-absorption), much less carbon negative, and it worries about 'underground land-grabs' for storage space. Given their concerns about the environmental implications of biofuel production, Biofuelwatch also worry about ethanol production involving fermentation, which results in a pure stream of CO2 that can be directly captured, and about biodiesel production using the Fischer-Tropsch method, which involves producing syngas as an intermediate step and so offers an opportunity for CO2 capture. Overall, even with high-yield biomass crops, they don't think there will be enough land to allow for significant sustainable biomass production, so BECC is unlikely to be much of an option anyway. www.biofuelwatch.org.uk/2012/beccs_report/

A factor of two


By Michael Banks, news editor of PhysicsWorld

"A factor of two is not a small thing, it is quite a challenge," says Robert McCory from the University of Rochester in New York.

McCrory was speaking about the latest in laser-based fusion research (known as inertial confinement fusion) at the 2013 AAAS conference.

The National Ignition Facility (NIF), which began full operation in 2009, is based at the Lawrence Livermore National Laboratory in the US and is currently the world's leading laser-based fusion device.

It uses a 1.8 MJ laser to create X-rays that rapidly heat and compress a capsule containing deuterium and tritium, causing the hydrogen isotopes to fuse. Yet so far the facility has failed to achieve "ignition" - the point at which the fusion reactions generate enough heat to become self-sustaining. Indeed, the best shot so far at NIF has fallen short, by a factor of two or three, of the pressure needed to compress the fuel enough to produce ignition.

This has led some to suggest that NIF needs to change course. The current method - indirect drive - uses X-rays rather than the laser light itself to heat the fuel capsule. But some say that direct drive should now be employed, where the laser light is shone directly onto the fuel capsule.

However, McCrory, who runs a direct-drive experiment at Rochester called OMEGA, has done simulations showing how results on his machine could scale to NIF conditions.

"If we take our best OMEGA shot and extrapolate it to NIF then we would still be a factor of two or three away in required pressure from ignition," says McCory.  "And there is still physics that does not scale well, so there are added uncertainties."

Researchers working on both direct and indirect methods are now looking at ways to get around the factor of two problem, which include trying to remove impurities in the plastic shell that holds the fusion fuel.

NIF scientists still remain confident, however, that they will overcome these barriers. Hopefully, policy-makers will have the same amount of patience.

A trip to MIT


By Michael Banks, news editor of PhysicsWorld

It may have been the prospect of free pizza that led me to hop on a bus heading to the Massachusetts Institute of Technology (MIT).

But apart from a free lunch, we were also promised a tour of MIT's fusion facilities, which are based at the institute's Plasma Science and Fusion Center (PSFC).

So after a few slices of pepperoni pizza, we donned the hard hats and moved on to the tour, which included a look at MIT's main experimental fusion facility - the Alcator C-Mod fusion tokamak.

Operating since 1991 and with a budget of around $25m per year, Alcator C-Mod is a magnetic-confinement fusion device. It heats a plasma of deuterium and tritium atoms to millions of kelvin, which causes the hydrogen isotopes to fuse and release energy.

However, Alcator C-Mod faces an uncertain future. Last year Congress slated the facility for closure after increasing the budget for the ITER fusion reactor in France. Given no increase in the Department of Energy's budget for fusion - standing at around $450m per year - the cut then had to come from the domestic fusion programme.

This year the facility has been given around $14m, which will keep it running. But PSFC director Miklos Porkolab likens this to a "warm shutdown", adding that this is just enough to keep the staff working on the facility.

Indeed, Porkolab says he has been spending most of his time thinking about the budget effect and trying to lobby for more funding to keep Alcator C-Mod running. He admits that the chance to take some people on a lab tour - even on the weekend - is a welcome pause from worrying about budgets.

While Alcator C-Mod is not the only fusion facility at the PSFC - indeed some researchers based there work at the National Ignition Facility in California - its shutdown would be devastating for the lab. "We really don't want to think about that at the moment," admits PSFC fusion researcher Amanda Hubbard.


Pugwash Scenarios



British Pugwash has produced a report looking at three possible 2050 UK energy scenarios - high nuclear, high renewables and an intermediate scenario. They were all run through the DECC Pathways analysis software to see how they stood up.

Pugwash has been working as an international body since 1955 to diminish the role of nuclear arms in international politics and to work towards nuclear disarmament, but it seems to have no specific view on civil nuclear power, apart perhaps from a general background sympathy for the post-WWII 'Atoms for Peace' position, given that many of its members were of that generation. But some of the younger members, including those in the UK, may have anti-civil-nuclear views.

The UK Pugwash group enlisted support from some external experts to help produce the report. I was asked to produce the 'maximum renewables' section. It was a challenge. But, working with Dr David Finney, who helped with the modeling, I found that, assuming a sensible commitment to reducing energy waste (an eventual 40% reduction), it does seem possible to match supply and (reduced) demand with almost 100% renewables by 2050, with offshore wind, wave and tidal power playing major roles, along with PV solar, for electricity supply; some of this electricity would be used, together with biomass and solar, for heating, and, along with biofuels, for transport.

With 76GW of offshore wind and 30GW of on-land wind, the High Renewables scenario supplies electricity for direct use, and for heating and transport, and still leaves plenty for export. Indeed, exports of excess wind-derived electricity would earn around £15 billion per annum by 2050. There is also heat from biomass and solar, but no biofuel or biomass imports. Hydro, geothermal, biomass CHP, some stored electricity and some stored green gas, along with interconnectors (up to 15GW) and demand management, balance the variable renewables.
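To get a feel for the scale of that export figure, here is a rough back-of-envelope sketch in Python. The load factors, the share of output exported and the 2050 export price are purely my own illustrative assumptions, not numbers from the Pugwash report, so treat this as a plausibility check rather than a reproduction of the scenario's own accounting.

```python
# Back-of-envelope check of the scale of the export figure quoted above.
# Load factors, export fraction and price are illustrative assumptions only.

offshore_gw, onshore_gw = 76, 30              # capacities from the High Renewables scenario
offshore_lf, onshore_lf = 0.45, 0.30          # assumed load factors
hours_per_year = 8760

annual_twh = (offshore_gw * offshore_lf + onshore_gw * onshore_lf) * hours_per_year / 1000

export_fraction = 0.35                        # assumed share of wind output exported
export_price = 100                            # assumed £ per MWh in 2050

revenue_bn = annual_twh * export_fraction * export_price * 1e6 / 1e9   # TWh -> MWh, £ -> £bn
print(f"Annual wind output: {annual_twh:.0f} TWh")
print(f"Illustrative export revenue: £{revenue_bn:.0f} bn per year")
```

With those assumptions the wind fleet produces a little under 400 TWh a year, and exporting roughly a third of it at £100/MWh gives an income of order £13bn - broadly the right ballpark for the £15bn figure.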

The DECC calculator indicated that, with the balancing provisions, by 2050 there would be no need for extra fossil backup capacity. Even when energy demand was high and the input from wind and the other variable renewables was low, the scenario met demand and indeed, most of the time, oversupplied, leading to a significant potential for net power exports. It also easily achieved the DECC emission reduction target.

To get to a full 100% renewables while still being able to balance the system when wind was low, we wanted to use some of the excess wind-derived electricity to make hydrogen and store it, ready to meet demand and replace the residual fossil input. The DECC software would not allow for that, or for all the fossil-fuel transport input to be removed, so the scenario we tested did not reach 100%, but we were confident that this would be possible.
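The arithmetic behind that wind-to-hydrogen balancing idea is simple enough to sketch. In the snippet below the annual surplus and the conversion efficiencies are generic assumptions of mine, not values from the DECC model or from the report.

```python
# Minimal sketch of the wind-to-hydrogen balancing idea described above.
# The surplus figure and efficiencies are generic assumptions, not DECC values.

surplus_twh = 50          # assumed annual surplus of wind electricity, TWh
electrolyser_eff = 0.70   # electricity -> hydrogen, assumed
reconversion_eff = 0.50   # hydrogen -> electricity (fuel cell or turbine), assumed

hydrogen_twh = surplus_twh * electrolyser_eff
electricity_back_twh = hydrogen_twh * reconversion_eff

print(f"Hydrogen energy stored: {hydrogen_twh:.0f} TWh")
print(f"Electricity recovered:  {electricity_back_twh:.0f} TWh "
      f"(round trip ~{electrolyser_eff * reconversion_eff:.0%})")
```

The point is that the round-trip efficiency is modest (roughly a third of the surplus comes back as electricity on these assumptions), but since the input is otherwise-curtailed excess wind, that loss is the price of firm, storable backup capacity.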

Looking ahead to 2050 is hard, so as a guide I used existing scenarios where I could. For the electricity side, I used the contributions outlined in the 'Max' 2050 electricity scenario produced by Poyry in 2011. That actually now looks quite conservative, given, for example, the capacity gains PV solar has made recently (DECC now says PV might supply 22GW by 2020) and the 9.5GW of geothermal power that REA/SKM say is possible. The only adventurous element on the electricity supply side was the assumption that floating offshore wind systems would be available for deep-sea use. But to be cautious I cut the Poyry offshore wind allocation in half.

On the heat side, however, I was a bit more adventurous. In addition to the idea of producing green gas using excess wind-derived electricity, I made use of some of the emerging ideas about solar- and biomass-fed district heating and heat stores, including possibly interseasonal heat stores. But DECC now seems to have recognised these as possibilities. The land-use implications of the biomass requirement might be more problematic (10% of UK land), but if that proved to be a major issue then some biomass could be imported, e.g. wood chips from Canada.

Given the novel boundary-crossing options, it was not easy to run the scenario on the DECC software - as noted above, it won't let you replace all the fossil oil and gas with syngas from wind! But nuclear was easy to remove. It was just not needed. The Pugwash 'maximum nuclear' scenario, by contrast, saw it as essential, supplying over 74% of electricity by 2050. I will leave it to you to decide whether that is realistic or needed.

I have avoided scenario writing in the past, since I felt that all it did was allow you to project unrealistic normative and technical assumptions into the future and give a spurious sense of legitimacy to them. However, maybe it is useful to test the technical viability of specific policies and have assumptions played out for all to see! Even so I still have my doubts. Prof. David MacKay has sensibly made much of the need for quantification and for 'the sums to add up', but unfortunately, at this stage, given the open-ended nature of the emerging system, it is hard to come up with reliable data to feed into the DECC model, and the model itself has problems. So perhaps, for now, we still have to fall back on what MacKay depicts as 'Hot Air', that is, more general strategic assessments.

Although I did include some detailed technical rationales, perhaps inevitably I also found it necessary to make some policy points. For example I argued that 'There are longer term advantages from focusing on renewables, not least the fact that the energy sources will never be exhausted and the conversion technology is likely to continue to become more economic. It is sometimes argued that, while that may be true, the UK should wait until the technology has developed (presumably mostly elsewhere) before deploying it widely. But that argument ignores 'first mover' commercial and technological advantages. It might be wise for the UK to focus on its strengths, which currently are in the marine renewables field, but more generally, as Barack Obama put it eloquently, 'the country that harnesses the power of clean, renewable energy will lead the 21st century'.

The UK may not be able to lead in all areas, but it can be a major player, whereas it stands little chance of leading in nuclear or CCS. Given that the UK probably has the world's best renewable resources, as well as established technological expertise, particularly in offshore engineering and marine technology, there is a strong strategic case for focusing on renewables.' Certainly, if it is diversity you want, then renewables offer a range of options at various scales.

My venture into modeling confirmed that renewables really could help us attain a sustainable energy future by 2050, if we so wished. Several studies have suggested that, while some of the initial investment costs might be higher than for some conventional technologies, in the longer term the overall cost would be similar if not less - since we would no longer have to import expensive fossil fuel or deal with the extra cost of nuclear power. And, interestingly, on the basis of DECC's Calculator, the Pugwash High Renewables Pathway did turn out to be slightly cheaper than the nuclear/CCS-based pathways also looked at in the report.

The Pugwash report is at: www.britishpugwash.org/

A lot new under the sun



Solar energy continues to develop as new ideas emerge. There is over 70GW of solar PV now installed around the world and new, cheaper cell technologies are emerging, such as dye-sensitised cells: www.idtechex.com/research/reports/dye-sensitized-solar-cells-technologies-markets-and-players-2012-2023-000328.asp

Longer term, other novel options are also on the horizon. For example, researchers at Stanford University have developed a carbon-based thin-film cell, with a nanotube cathode and a graphene anode sandwiching an active layer made of nanotubes and buckyballs, all made by printing or evaporating from inks. When fully developed, this technology holds out the promise of robust new applications - a tough spray-on PV surface for use in extreme conditions. pubs.acs.org/doi/full/10.1021/nn304410w

However, although important, PV cell development may not be the key issue. Solar power is inevitably limited by the fact that it gets dark every night, so energy storage is all-important. Although batteries can be used in some situations, in general storing electricity is hard, but it can be converted into hydrogen gas by electrolysis - and that can be stored, ready for use for electricity generation when needed, or for other fuel uses. So there is some interest in solar PV arrays linked to electrolysis, possibly using focusing solar mirrors, since mirrors are much cheaper than solar cells. Large-scale concentrating PV arrays ('CPV') in deserts are one option.

Another approach entirely is to use focused solar heat to make hydrogen directly by thermal dissociation of water molecules - but you need very high temperatures and the process is not very efficient. A more indirect approach, as in the Solar Gas project in Australia, is to use focused solar heat to convert a mixture of methane (natural gas), water and/or carbon dioxide into new, higher-value synfuels. CSIRO claim that you can get an extra 25% energy gain - a solar upgrade. http://csirosolarblog.com/tag/solargas/

Meanwhile, Rice University's Nanophotonics Lab in the US has developed a solar thermal system using nanoparticle absorbers, which heat up in sunlight and flash off steam when immersed in water. They have tested a parabolic focusing device. The system is claimed to have an overall energy conversion efficiency of 24%. Focused solar is already widely used to boil water and produce hot gases for feeding into a combined-cycle gas turbine. See www.solugas.com/

But of course that won't work at night. However, it is relatively easy to store heat. Molten salt heat stores are one option. Typically they use a mixture of 60% sodium nitrate and 40% potassium nitrate to store some of the heat from large focused-solar Concentrating Solar Power (CSP) plants, for use for continued steam raising at night, making 24-hour power generation possible.
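As a rough feel for the scale involved, the Python sketch below sizes such a store using typical published figures for 'solar salt' (a specific heat of about 1.5 kJ/kg K and a hot/cold tank temperature swing of roughly 100°C for trough plants); the salt mass, turbine rating and steam-cycle efficiency are my own illustrative assumptions rather than the parameters of any particular plant.

```python
# Rough sizing sketch for a two-tank molten-salt heat store of the kind described above.
# Salt properties are typical published figures; mass, turbine size and efficiency
# are illustrative assumptions only.

specific_heat = 1.5e3        # J per kg per K, approx. for a 60/40 NaNO3/KNO3 mix
delta_t = 100                # K, typical hot/cold tank temperature difference (trough plant)
salt_mass = 28e6             # kg of salt, i.e. tens of thousands of tonnes

stored_heat_j = salt_mass * specific_heat * delta_t
stored_heat_mwh = stored_heat_j / 3.6e9          # joules -> MWh (thermal)

turbine_mwe, cycle_eff = 50, 0.40                # assumed turbine size and steam-cycle efficiency
hours_of_storage = stored_heat_mwh * cycle_eff / turbine_mwe

print(f"Stored heat: ~{stored_heat_mwh:.0f} MWh(th), "
      f"~{hours_of_storage:.0f} h of full output at {turbine_mwe} MWe")
```

On those assumptions, a store of a few tens of thousands of tonnes of salt can keep a 50 MWe turbine running right through the night, which is why several CSP plants have been built with stores of this general size.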

Several CSP plants have been built in desert areas around the world, some with molten salt heat stores, and much has been written about the prospects for desert solar in North Africa and the Middle East generating power for local use (including for desalination) and also for export via HVDC supergrids to the EU, as envisaged by Desertec: www.desertec.org

CSP is still expensive and, as noted above, some interest has been shown in large-scale PV using focusing mirrors ('CPV'), which may prove to be cheaper. However, Siemens recently pulled out of the Desertec project, saying they wanted to concentrate on what they felt were more appropriate technologies for them - wind and hydro. Even so, key players like RWE, E.ON, Deutsche Bank, ABB and the German reinsurer Munich RE, along with 50 other groups, are still involved, and the idea of using desert solar seems unlikely to go away, with local CSP projects emerging independently. Egypt already has a CSP plant just outside Cairo, Morocco is planning one, Algeria's 25MW Hassi R'Mel unit started up last year, backed by a 130MW gas-fired plant, and another solar plant is planned there, while Tunisia's 'TuNur' CSP project should ultimately have 2GW of electricity generating capacity. In parallel, as I have mentioned before, Desertec and others have been looking at the potential for CSP in the Gobi desert, as part of a pan-Asian green supergrid power network. And the Sahara Forest Group is looking to combine CSP, desalination and biomass production in solar greenhouses - their project in Qatar is now running: http://saharaforestproject.com/

There is thus no shortage of big ideas for the future. Moving things on even more, in a paper entitled 'Solar-Based Man-Made Carbon Cycle and the Carbon Dioxide Economy', Detlev Möller outlines a visionary plan to link solar electricity production, such as Desertec CSP, with CO2 utilization via (chemical) air capture (i.e. from the atmosphere) as well as conventional CCS. The CO2 would then be reacted with electrolytically produced hydrogen to produce fuels for direct use or for electricity production when needed. The 'SONNE' approach, as he calls it, would thus seek to build a man-made carbon (CO2) cycle, like the natural assimilation/respiration carbon cycle, by which CO2 is recycled and changed from a waste (emissions) into a resource, with the process energy supplied by solar energy. AMBIO 2012, 41:413-419 (DOI 10.1007/s13280-011-0197-6).
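To illustrate what 'reacting CO2 with electrolytic hydrogen' involves in practice, here is a small worked example using one possible route, Sabatier methanation (CO2 + 4H2 -> CH4 + 2H2O). Möller's paper considers solar-driven fuel synthesis more generally, so this particular reaction, and the assumed 50 kWh/kg electrolysis energy, are just my own illustrative choices.

```python
# Illustrative stoichiometry for one possible CO2-to-fuel route (Sabatier methanation:
# CO2 + 4 H2 -> CH4 + 2 H2O). The reaction choice and the electrolysis energy figure
# are assumptions used only to give a sense of scale.

M_CO2, M_H2, M_CH4 = 44.0, 2.016, 16.04   # molar masses, g/mol

tonnes_co2 = 1.0
mol_co2 = tonnes_co2 * 1e6 / M_CO2        # moles of CO2 in one tonne
h2_kg = 4 * mol_co2 * M_H2 / 1e3          # hydrogen required, kg
ch4_kg = mol_co2 * M_CH4 / 1e3            # methane produced, kg

electrolysis_kwh = h2_kg * 50             # assumed ~50 kWh of electricity per kg of H2

print(f"Per tonne of CO2: ~{h2_kg:.0f} kg H2 needed, ~{ch4_kg:.0f} kg CH4 produced,")
print(f"and ~{electrolysis_kwh / 1000:.1f} MWh of (solar) electricity for the hydrogen")
```

The sums show why the scheme needs very cheap, abundant solar electricity: turning each tonne of captured CO2 into fuel consumes several MWh of power just for the hydrogen.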

As I have noted before, there have been similar ideas circulating for linking hydrogen production from other variable renewables like wind with air captured CO2, to produce green syngases and fuels, or to upgrade biogas produced from biomass. See for example http://www.iset.uni-kassel.de/abt/FB-I/publication/2010-088_Towards-renewables.pdf

Wind power may be the cheapest major renewable available at present, but the global solar resource is very much larger (wind, after all, is just an indirect form of solar energy), and the technology is developing rapidly, so in the longer term solar, in all its varieties, seems likely to become a dominant option. There is already over 245GW of solar thermal capacity in use around the world, rivalling wind power: see my earlier blog: http://environmentalresearchweb.org/blog/2012/09/solar-power--245gw-so-far.html

However, as I have indicated above, that could just be the start - even in the cloudy UK, where nearly 2GW of PV has so far been installed, with more to follow, including a 32MW solar farm near Leicester.

Renewable Performance



Some renewable energy sources are variable, so over a year the actual energy output from a wind turbine etc will be much less than the theoretical maximum it would produce if it ran continuously at its full rated power. The ratio of the actual output (or the effective average capacity the device offers) to the maximum possible from its nameplate installed capacity is sometimes called the capacity factor or, more usually in the UK, the load factor.
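As a minimal worked example of that definition (with made-up but plausible numbers):

```python
# Worked example of the load (capacity) factor definition above, with illustrative numbers.

rated_power_mw = 2.0         # nameplate capacity of a single wind turbine, MW
annual_output_mwh = 5256     # metered output over the year, MWh (illustrative)
hours_per_year = 8760

max_possible_mwh = rated_power_mw * hours_per_year   # output if it ran flat out all year
load_factor = annual_output_mwh / max_possible_mwh

print(f"Load factor = {load_factor:.0%}")   # -> 30%, typical of a good onshore site
```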

In the UK, on-land wind load factors vary from below 20% to above 40% depending on location. 25% is often taken as an average, but that has been moving up towards 30% as the technology improves. Offshore wind is sometimes quoted as 35-40%, but DECC uses 45% for new projects. The best so far achieved, at Horns Rev II off Denmark, is 47.7%.

PV solar is quoted in the range 10-20% depending on location - e.g. it's 19% in Arizona; the UK range is maybe 10-12%. Hydro is site-specific: the global average is 44%, but it can be very much higher or lower depending on location and year, e.g. down to 22% in poor rainfall years in the UK. Biomass plants can have quite high load factors, since the fuel is storable, so they don't need backup/balancing - indeed they can be used for that. For some examples of LFs for current UK renewable energy projects see: www.ref.org.uk/energy-data.

And for DECC's views on future systems, see the table below.

DECC's load factors from the UK 2050 Pathways spreadsheet

Offshore wind 45%

Onshore wind 30%

Wave 23%

Tidal stream 36%

PV solar 10%

Hydro 38%

Biomass - electricity from CHP 90%

Geothermal - electricity from CHP 80%

Fuels from biomass 90%

Solar hot water 50%

Bioenergy, energy from waste 80%

Tidal range 23%

http://www.decc.gov.uk/en/content/cms/whatwedo/lc_uk/2050/2050.aspx [DECC have moved web home, but you will be redirected!]

For comparison, the 2011 DUKES quotes UK nuclear load factors as: 69.3% (2006), 59.6% (2007), 49.4% (2008), 65.6% (2009) and 59.4% (2010). That averages out at around 60%. But, optimistically, it uses 80-90% for future projects. The NEI quotes average US nuclear load factors (1971-2009) as 70%, but again future projects are assumed to reach much higher figures.
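A quick check of that average, using the DUKES figures quoted above:

```python
# Quick check of the five-year average quoted above (DUKES UK nuclear load factors).
uk_lf = {2006: 69.3, 2007: 59.6, 2008: 49.4, 2009: 65.6, 2010: 59.4}
print(f"Mean: {sum(uk_lf.values()) / len(uk_lf):.1f}%")   # ~60.7%, i.e. roughly 60%
```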

Degradation over time

The anti-wind lobby has recently argued that, in practice, wind load factors will reduce by around 1-2% p.a. as the machines get older, due to mechanical wear and tear, blade damage (e.g. under the impact of bugs, birds and debris) and increased maintenance outages. See the data at: http://windfarmrealities.org/?p=1284.
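Taken at face value, and reading '1-2% p.a.' as a relative loss each year, compounding shows what that would imply over a typical project lifetime; the starting load factor and the 20-year period in the sketch below are my own illustrative assumptions.

```python
# What the claimed 1-2% per-year degradation would imply if taken at face value.
# Starting load factor and lifetime are illustrative assumptions.

initial_lf = 0.30     # assumed starting load factor for an onshore site
years = 20

for annual_loss in (0.01, 0.02):                 # 1% and 2% relative loss per year
    final_lf = initial_lf * (1 - annual_loss) ** years
    print(f"{annual_loss:.0%}/yr: load factor {initial_lf:.0%} -> {final_lf:.1%} after {years} years")
```

Even the 2% figure would leave a 30% site at about 20% after two decades - a significant loss, but, as the next paragraph notes, it is far from clear that the underlying data actually support degradation rates of that size.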

However, it's hard to separate out variations in wind availability over the years from any operational degradation and technical/operational improvements. Danish offshore load factors from 1980 to 2011 mostly seem to indicate continual improvement. http://energynumbers.info/capacity-factors-at-danish-offshore-wind-farms

All mechanical devices suffer from wear and consequent reduced performance over time, but that is likely to be much more pronounced and significant for high-temperature, close-tolerance steam and gas turbines, i.e. for conventional power plants. While there is no question that, being exposed to the elements, wind turbine blades, just like aircraft wings, will need cleaning regularly, there are some clever ideas emerging for that, which avoid the need for turbines to be shut down to give access to cleaning crews, with all the associated risks of working at height. Opinno have a self-powered mechanical system that crawls along the tower and spreads water, cleaning each blade as it passes through the spray. That sort of thing will be vital for offshore devices, where salt encrustation will be the key issue: www.opinno.com/blade-cleaning/

PV cell performance can reduce over time (especially with thin-film cells), and PV arrays also need regular cleaning to avoid loss of output; otherwise, depending on location, you can get maybe 10-20% or more degradation over a year from bird droppings, leaf mould, road grit, dust and the like. With the rapid expansion of domestic PV, there is a growing trade in cleaning services as an extension of window cleaning. Using detergent has negative eco-impacts, but some self-cleaning surfaces have been developed, e.g. see http://www.solarguide.co.uk/new-self-cleaning-glass-could-be-used-in-solar-panels and http://www.solarpowerportal.co.uk/productcatalogue/self-cleaningcoatingforsolarpvglasscouldincreaseefficiencybyupto

In addition, electrostatic dust-repelling systems are being developed for use with CPV and CSP in desert areas, based on technology developed for Mars missions: http://www.ecoseed.org/technology/13801-mars-inspired-technology-makes-pv-panels-self-cleaning

The longevity of renewable energy technologies also varies. Wind turbines are expected to last a couple of decades before needing replacement, although increasingly projects are re-bladed sooner to take advantage of improved, uprated designs. PV cells also have lifetimes measured in decades, but can perhaps sometimes be beneficially replaced earlier by newer, more efficient versions. These systems, along with modular wave and tidal stream devices, are very different from large, very capital-intensive, very long lead-time projects like tidal barrages, where, once built, the civil engineering infrastructure will last for hundreds of years, with just the turbine generators being replaced every 50 years or so.

Nuclear plants are somewhat similar - they have very long construction lead times, but it's claimed that once built they can run for 40 years or more, with occasional refits and extensions, although that often results in down-rating. In addition, big inflexible systems like this face the problem that, over the long periods for which they can be run, rival technologies may emerge which will make them economically and technologically obsolescent. And there are large and often unknown back-end costs - for waste disposal and final decommissioning.

All of the factors mentioned above are reflected, to varying degrees, in the economics of the various technologies. At present, on-land wind is usually seen as the most economically viable of the new renewables, established hydro and biogas waste plants aside, and as being competitive with new nuclear, with PV seen as likely to come up fast behind, followed by offshore wind and then, hopefully, tidal stream and perhaps wave. Opponents of wind/PV etc say the cost estimates do not take into account the grid balancing/backup intermittency costs, while opponents of nuclear say they don't take the back-end and insurance costs fully on board - or the cost of providing extra reserve capacity to cope with the risk of nuclear plant outages. For the moment, in reality, the difference may be too close to call, in which case decisions should really be based on wider strategic issues - such as long-term energy security and fuel costs and availability. On that basis, although as I've indicated there are some practical problems, overall renewables seem to win outright - for most, the fuel is free and will be available indefinitely into the future, while fossil and nuclear fuels will inevitably become more constrained and expensive.