Is it right to drive cost reductions in renewable technologies by use of direct production subsidies that are adding increasing amounts to domestic bills?

Or should we be spending more, much more, on fundamental research and development?

The argument is this. Broadly speaking, we can achieve cost improvements in any technology either by accumulating production experience (usually called 'the learning curve') or by targeting improvements in technology.

It is often difficult to disentangle the two phenomena but I still think the distinction is useful. Put another way, should we be trying to cut prices by 'learning by doing' or by 'learning by research'?

Feed-in Tariffs emerged as the popular solution

Governments around the world have backed away from energy research. In the 1970s, administrations that had been frightened by the OPEC oil embargo put big sums into R&D, particularly into nuclear but also into wind.

Outside France, that investment largely failed, and failed catastrophically. Energy R&D then plummeted around the world. A decade ago, UK energy research was costing just a few tens of millions a year. (It has gone up somewhat since).

Instead of research, governments decided to back 'learning by doing'. They offered production subsidies (now often called Feed-in Tariffs) to get investors to put capital into wind, solar and a few other technologies.

This, they correctly foresaw, would allow manufacturers and installers to cut costs. The learning curve (which I pedantically call the 'experience curve') swung into action as it almost always does (except in nuclear).

As the accumulated volumes of wind turbines that had been built doubled, costs fell by about 14%. The rate of learning for PV looks greater, at about 20%.
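The mechanics of the experience curve can be sketched in a few lines (a minimal illustration, assuming a constant percentage cost reduction at each doubling; the 14% and 20% rates are the article's own estimates):

```python
def experience_curve_cost(initial_cost, learning_rate, doublings):
    """Unit cost after `doublings` doublings of cumulative production,
    assuming the same fractional cost reduction at each doubling."""
    return initial_cost * (1 - learning_rate) ** doublings

# Wind: roughly 14% per doubling; PV: roughly 20% per doubling
print(round(experience_curve_cost(100.0, 0.14, 1), 1))  # one doubling: 86.0
print(round(experience_curve_cost(100.0, 0.20, 3), 1))  # three doublings: 51.2
```

On these assumptions, three doublings at the PV rate already cut costs by nearly half, which is why the early part of the curve looks so cheap to ride.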

When I talk about the experience curve, I don't just mean the cost improvements arising from larger turbines, or bigger factories. For some almost magical reason, costs fall in a reasonably consistent and predictable way just because companies get better at making the turbine.

So it's obvious why governments like Feed-in Tariffs. Prices do go down without any obvious reason.

Research is a very different game

Compare this with 'learning by research'. Put $100m into some crazy new idea for making solar panels and you are 95% likely to fail.

Faced with media always eager to locate apparent stupidity, or even corruption, no government minister or senior official will want to back the latest idea coming out of the Oxford Science Park or an automotive supplier in Swindon, knowing that she is fairly certain to look foolish within a year.

As a result of bureaucratic risk aversion, direct subsidies are going to cost the UK consumer £3bn this year while government energy research languishes at perhaps 6% of this figure.

Though to be honest, this estimate is little more than a guess: nowhere can I find an authoritative figure for the total government energy R&D budget for 2014/15. The DECC documents I have seen are extraordinarily confusing and obfuscatory.

There's an even more important problem as well. At present the UK government doles out its R&D budget in tiny spoonfuls. It gives £1m to this nascent technology, a few hundred thousand to another, and a generous £3m to a particular favourite. In my view this is not just pointless, it is actively counter-productive. Little dollops of cash have a truly awful effect.

We need transformational, not incremental change

I'll try to explain why this is. Engineers leaving universities or companies with a brilliant idea need money. And government will often provide this, even when venture capital does not. Bodies like the grandiosely named Technology Strategy Board will drip small amounts of cash into many ideas-based companies.

It won't actually be enough to pay for real innovation or commercialisation but it will be just about enough to keep the business alive.

Why is this bad? It means that the talented engineer will stay beavering in his lab night after night hoping to make marginal improvements that can justify the next request for government rations. He works for the government, not for the marketplace. Actually, it would be far better if he failed, went broke and returned to the labour market where he could exercise his (undoubtedly real) skills on another project.

Spreading a hundred million pounds or so a year over perhaps two hundred potential innovators in the UK energy market is a mistake. It would be far better to gamble (and, to be absolutely clear, this is gambling) tens of millions on the technologies that might really make a difference.

This is the way it would happen in the States, but even there the disastrous experience of backing the PV venture Solyndra has chilled the willingness to back winners.

FITs - the law of diminishing returns

But back winners we must, however unfashionable this task seems to be. Without large punts, progress on cost reduction in renewable technologies will slow.

Let's look at one example of this. What will the iron law of the experience curve do to the cost of wind turbines over the next few years? More precisely, if we do decide to continue to back wind globally, but only by means of production subsidy, we're reliant on the expected magical cost reduction of 14% for every doubling of accumulated (not yearly) production.

Let's say we want to cut the cost of wind in half to make it competitive with fossil fuels across the world. If the 14% experience curve continues to operate, we'll need about 4.6 doublings of accumulated production (since 0.86^4.6 ≈ 0.5), which means expanding it roughly twenty-four fold.

I guess that the world now has about 250 gigawatts of accumulated turbine production experience. That would mean we'd need around 6,000 gigawatts of wind turbines to get down to 50% of the current cost. And this amount is roughly equivalent to the entire world generating capacity today.

It's also going to be very expensive indeed. To cut costs by 14% when the world had made 1 gigawatt of wind turbines required another 1 gigawatt to be manufactured. To do so now requires 250 gigawatts to come off the production line.

And although the subsidy needed has fallen considerably since the days of 1 gigawatt accumulated production, it has probably only declined by five or six fold per unit of capacity. So overall subsidy costs might be 50 times as much.

Of course this is a very generalised argument. In some windy places, including much of the central USA, wind is probably already almost competitive with new gas-fired plants. In other countries, wind will never be a real choice.

The problem - the renewables industries have a lot to lose

I just want to make the point that relying on the experience or learning curve to drive down costs inevitably becomes more and more expensive for governments. By contrast, sponsoring R&D doesn't cost more as technology advances. It probably costs less.

So my case is simply that whether it be wind, PV, anaerobic digestion, heat pumps, geothermal energy, tidal lagoons or micro-hydro, genuine background R&D must make more and more sense. Intelligently directed in large amounts per idea, it may create large improvements in costs.

I know of three technologies (in wind, PV and AD) that may have the potential to reduce the underlying price of energy by at least 50%. None will come to market with the aid of Feed-in Tariffs.

All of them need tens of millions of pounds, which may well not be available from commercial sources. A small fraction of the billions now used to subsidise existing technologies needs to be diverted into directly backing companies like these.

But, of course, this is unlikely to happen. The renewable industries, which are so ready to criticise fossil fuel subsidies, are now addicted to their own guaranteed cash streams from government and have growing lobbying power. The genuine innovation that we need is in danger of never happening.

Chris Goodall is an expert on energy, environment and climate change. He blogs at Carbon Commentary

This article is an extract from a longer blog post on Carbon Commentary.