IPCC Scenarios and Spontaneous Decarbonization

May 25th, 2008

Posted by: Roger Pielke, Jr.

Joe Romm has helpfully posted his full reply to Nature on PWG (PDF), and we are happy to link to it as promised. After reading Joe’s original letter and his comments, the source of his complaint — and confusion — is now clear. This post explains how Joe has confused the differences between the IPCC SRES scenarios with the spontaneous decarbonization assumed within each individual scenario.


[Figure 1: cumulative emissions to 2100 for the six SRES scenario families]

The figure above is from our Nature paper. It shows cumulative emissions to 2100 for the six SRES scenario families (A1B, A1FI, A1T, A2, B1, B2). For now let’s ignore the light blue part of each bar (which represents the spontaneous or automatic decarbonization that we discuss in the paper, and to which I return below).

Joe Romm points out in his critique:

The Special Report on Emission Scenarios (SRES), which the Commentary cites, makes clear that while the SRES scenarios don’t technically have climate policies, they can and do have energy efficiency and decarbonization policies, which are the same thing. That’s clear from examining the B1 scenario, which includes aggressive policies that help limit total global warming to about 2°C

He is correct in this assertion. The effect of these policies in the B1 scenario can be seen in the difference between the height of the green plus red (G+R) parts of the B1 bar and the same G+R portion of the bars for the other scenarios. Clearly, the B1 G+R is closer to the dotted line than any of the others (though A1T is also close). The “energy efficiency and decarbonization policies” that Joe Romm refers to are those that account for the difference in height between the G+R parts of the bars across scenarios in our graph — which is completely different from the assumptions of automatic decarbonization within each scenario, which are reflected in the light blue parts of the bars.

Automatic decarbonization occurs in the IPCC scenarios not because of specific policies that the report discusses, but because of assumptions it makes within individual scenarios (specifically, assumptions of decreasing carbon and energy intensities). Whatever policies are associated with these assumptions are not discussed by the IPCC. The decarbonization of the global economy reflected by the light blue portions of the bars in the figure above is indeed accurately characterized as “automatic” or “spontaneous.”
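
To make the distinction concrete, here is a minimal illustrative sketch (the growth rate and intensity declines below are invented placeholders, not values from our paper or from the SRES database) of how assumed declines in energy and carbon intensity shrink cumulative emissions relative to a frozen-technology baseline:

# Illustrative only: placeholder rates, not SRES values.
# Kaya-style decomposition: emissions = GDP x (energy/GDP) x (CO2/energy)

def cumulative_emissions(years, gdp_growth, ei_trend, ci_trend):
    """Sum annual emissions while GDP grows and the intensities drift."""
    total = 0.0
    gdp, ei, ci = 1.0, 1.0, 1.0   # arbitrary starting index values
    for _ in range(years):
        total += gdp * ei * ci
        gdp *= 1 + gdp_growth
        ei *= 1 + ei_trend        # energy intensity of GDP
        ci *= 1 + ci_trend        # carbon intensity of energy
    return total

frozen = cumulative_emissions(100, gdp_growth=0.025, ei_trend=0.0, ci_trend=0.0)
assumed = cumulative_emissions(100, gdp_growth=0.025, ei_trend=-0.010, ci_trend=-0.003)
print(f"share of baseline emissions removed before any climate policy: {1 - assumed / frozen:.0%}")

The gap between the frozen and assumed totals is what the light blue portion of each bar represents: reductions built into the baseline before any explicit climate policy is applied.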

In its editorial discussing our paper, Nature clearly understood this. Joe Romm apparently does not. He has confused the differences between aggregate emissions across scenarios with assumptions of automatic decarbonization within scenarios.

Now that Joe has released his original letter to Nature, it is clear why they asked him to correct his error of interpretation. It is also clear why his claim that we have made an error in our analysis is incorrect.

18 Responses to “IPCC Scenarios and Spontaneous Decarbonization”

  1. jromm Says:

    Roger –

    We are getting to the heart of the matter just as everybody I know, me included, has lost interest.

    Since you don’t dispute the more important of my primary critiques of your paper, I assume you agree with it. Namely, you claim the IPCC is “diverting attention from policies that could directly stimulate technological innovation,” and conclude that “Enormous advances in energy technology will be needed to stabilize atmospheric carbon dioxide concentrations at acceptable levels.”

    That is most certainly not proven by your paper — and as I point out in the extended comment on your earlier post on letters plus my posts on the subject — it is not actually true.

    As I explain here
    http://climateprogress.org/2008/04/14/the-decarbonization-story-and-why-a-carbon-price-beats-tech-breakthroughs/
    The decarbonization data makes clear that if you want to beat 450 ppm and avoid catastrophic climate impacts, a significant price for carbon (plus aggressive technology deployment) is much more important than technology breakthroughs.

    Now back to the other point. The IPCC does not assume magical “built-in” reductions or spontaneous technological gains, but rather explains that a variety of strategies and policies are needed to bring about reductions and gains.

    Tol describes the “problem” on your earlier post: “The SRES scenarios assume very substantial technological change. A number of people in the field, particularly engineers and politicians, do not know that — and they present the difference between ‘frozen technology’ and ‘business as usual’ as ‘climate policy.’ This has led people to believe that climate policy is effortless and costless.”

    But I have never met anybody who holds that belief based on the IPCC reports, nor seen a major media outlet cover the IPCC story that way. Indeed, the Working Group 3 report explicitly lays out the policies that are needed to move from frozen technology to high gains in energy efficiency and decarbonization.

    Look at Table 3.8 on page 216, which “summarizes scenarios with more than 40% CO2 reductions (2000–2050) in several developed countries.” Figure 3.3.1 on the next page clearly shows that a high price for CO2 is needed for such cuts — perhaps up to $150/ton of CO2 (a price I personally doubt would be needed if you had the right tech deployment policies). That certainly debunks any notion that the IPCC believes this will be effortless or costless or occur spontaneously.

    Now, for me, the fact that your conclusions attacking the IPCC were wrong did, I confess, lead me to focus more on them than on the analysis. I suppose what I found most odd about the Nature piece was that what the article claims as its big analytical finding, that “there is a significant technological change and diffusion of new and advanced technologies already assumed in the baselines,” is in fact stated directly in the WG3 report — indeed, the quote I just gave is from that report, page 219.

    Now you write, “Whatever policies are associated with these assumptions are not discussed by the IPCC.” But that is simply not true. The IPCC WG3 discusses the necessary policies at great length. They just don’t MODEL those policies. All you can say is the trivial statement that they never make a 1-to-1 correspondence between all the policies described in WG3 and what those policies would actually achieve if they were adopted in different scenarios.

    The Nature paper’s supposedly original conclusions fall into two categories — those that the IPCC acknowledges explosively itself (and hence are not original) or those that your analysis does not actually support (and hence are not true).

    The fact that Tol uses your paper to defend a perspective that is not in fact widely held — and I repeat my request for specific examples of either influential people or the media who have such a misguided view explicitly derived from the IPCC — is again evidence your paper’s specific critique of the IPCC is not justified.

  2. jromm Says:

    “The Nature paper’s supposedly original conclusions fall into two categories — those that the IPCC acknowledges EXPLICITLY itself (and hence are not original) or those that your analysis does not actually support (and hence are not true).”

    Ah, the joys of using voice-dictation software.

  3. Roger Pielke, Jr. Says:

    Joe-

    You can dance around the issue as much as you want, and change the substance of your critique now that it has been shown to be incorrect.

    You misinterpreted the paper, and based on that misinterpretation you have attacked me, Tom Wigley, and Chris Green. It is of course fine for you to take issue with our interpretation of the policy significance of the analysis — as I have noted, based on the letters in response to our paper, there is a wide range of views on this among experts.

    But Nature was right to ask you to change your misinterpretation, rather than publish information it knew was incorrect (but you apparently did not). In fairness to me, my co-authors, and Nature, you should thus correct your statements to the contrary on your blog.

    At a minimum you should stop the personal attacks, but an apology for the misinterpretation would be more dignified.

    PS. Voice dictation explains the logorrhoea ;-)

  4. JamesG Says:

    Do the IPCC still use a rather pessimistic value of 120 years for the CO2 lifetime?

    And an unrealistically high value for the amount of recoverable fossil fuels?
    http://gristmill.grist.org/story/2008/5/6/10545/14025

    The Greenpeace plan seems much better thought out to me because they consider only the grim realities of what we can do and put real numbers on it, rather than guessing about CO2 levels using guessed lifetimes and guessed emissions scenarios. Perhaps a bit optimistic about nuclear, but then…

    Anyway, maybe Joe is right even if for the wrong reasons.

  5. jromm Says:

    Roger:

    You can continue to ignore my main critiques of your paper, as you have for months, but that doesn’t suddenly mean your paper’s main conclusions are correct. Fundamentally, the title of your Nature paper was never justified — and it continues to this day to be unjustified.

    There is NOTHING “Dangerous” about any of the IPCC’s assumptions, as I will explain below. Indeed, had they not made those assumptions, you would no doubt have published an article ridiculing them for it!

    The only people who deserve an apology at this point are the IPCC, for your unjustified attack on them.

    Let’s be clear here — although I did not misinterpret the paper (see below), the debunking of the paper’s main conclusions does not rest on this specific narrow technical issue (any more than it rests on the specific narrow technical issue of your incorrect definition of the word “decarbonization”). If it did, I can’t imagine why you waited until now to bring up this issue, which I blogged on over a month ago.

    Since you have never once rebutted any of my critiques of the central conclusion of your paper — since your specific critiques of the IPCC failings are either not original or are simply wrong — why would I correct any of my statements?

    I moved on from this a long, long time ago — but then you decided to rehash this all by posting your misinterpretation of my letter (and again, you have never bothered to rebut the most important critique of that letter).

    You have never once conceded the myriad mistakes you have made, including your unjustified attack on the IPCC for supposedly distracting people from the necessary strategies and your bizarre definition of the word “decarbonization” which doesn’t even agree with the definition used in your own sources.

    The strangest thing is that if you actually believed the conclusions of your Nature article, then either you would stop claiming you endorse the goal of 450-500 ppm (since you keep saying in the paper and elsewhere that it is all but impossible to achieve) — or you would actually propose a specific strategy for getting to that goal, as I have.

    At the minimum, you should stop rehashing these unjustified attacks on the IPCC.

    As for the “misinterpretation,” I moved on from explaining the tedious details of this over a month ago. The issue is why the changes in energy efficiency and decarbonization occur in the IPCC scenarios beyond “frozen technology.” As the IPCC explains, “The range labelled ‘frozen technology’ refers to hypothetical futures without improvement in energy and carbon intensities in the scenarios.”

    The key point here is that frozen technology includes existing technology that has not yet been deployed, as well as improvements to existing technology, as well as new technology. Is it “Dangerous” for the IPCC to assume that people will actually keep deploying existing technology that turns out to be economical, or that they will continue improving existing technology as they have for centuries, or that they will develop new technologies?

    Of course not! Especially if the world introduces a high carbon price (as the IPCC recommends) and if there are new policies aimed at promoting technology deployment and removing barriers to things like energy-efficient technologies that are already cost-effective (as the IPCC recommends).

    So then the only question is, as your paper states and as people like Tol read your paper to state, has the IPCC misled anybody by making those assumptions? Has the IPCC created the impression in anybody’s mind that such further improvements in energy efficiency and carbon intensity could possibly be accomplished without very strong policies?

    Of course not! They in fact devote an entire report — Working Group 3 and its summary — to detailing all the things you would need to do for energy efficiency and decarbonization.

    So there are no “dangerous assumptions” in the IPCC report — at least as regards their assumption that historical trends in energy efficiency and decarbonization would continue. [As I have blogged many times, the IPCC does underplay the likely speed and severity of future climate impacts, and it underplays the likely impact of amplifying carbon-cycle feedbacks.]

    In fact, as I argued in my very first post debunking your paper — which you never rebutted — it would have been completely absurd for the IPCC to model frozen technology in all their scenarios.

    Yes, 2000 to 2005 saw a reversal of longer-term trends in energy efficiency and decarbonization. But as I wrote in early April, this raises the question: are the last few years anomalous, or have they become the new norm? The authors believe they know the answer, but, of course, they can’t prove it. They spend a couple of paragraphs arguing that this is a fundamental shift, because of rapidly developing countries like China. But, in fact, right now much if not most of the recarbonization is China’s abandonment of its two-decade-long marriage to energy efficiency for a torrid love affair with coal, an affair that is literally breathtaking.

    For two decades prior to 2000, China had an aggressive energy efficiency strategy and worked hard to avoid inefficient coal use, as I discussed here. Then they stopped. They are trying, admittedly not bloody hard, to go back to efficiency. But they could if they wanted to, and it would save them lots of money irrespective of climate concerns. So I’m just not sure you can put up a graph of the last few years and say that means all the IPCC models are potentially “unachievable.” This analysis suffers because it doesn’t know what the actual baseline for future energy and emissions growth is (and, of course, that’s why the IPCC has 35 models, to cover lots of different future scenarios).

    You can certainly conclude the IPCC models will seriously underestimate emissions growth this decade. Many of us have been saying that for a while. The natural reaction to that would be to argue for more aggressive deployment of energy efficient and low-carbon technology starting immediately. After all, technology advances didn’t stop in 2000 (if anything, they accelerated) — a few countries just stopped adopting them at the normal pace in a frenzy of inefficient and polluting growth.

    Well, that would be the natural reaction — if you actually believed the rest of the IPCC report (as Pielke claims to), and especially if you believed the rest of the IPCC report similarly understated the climate threat we face, as I and others have argued. But to come to that reaction you’d have to understand and/or explain why we must stabilize below 450 ppm and stay far, far away from 800 ppm to 1000 ppm. You’d have to say what you believe is an appropriate target and how we get there.

    One other key point: for years, conservatives and, let’s kindly call them, “skeptics” have been arguing that the IPCC’s emissions models were too pessimistic. That’s right, the skeptics have been saying that the IPCC was scaring people into unnecessary action by assuming emissions growth was higher than it in fact was.

    Yes, I know, if you actually read the Pielke et al piece, that seems hard to believe. They never bother pointing this out. But after a mere 10 seconds on Google, I found a classic example, an essay from the conservative American Enterprise Institute (AEI) titled … wait for it … “New Doubts about the Dominant Climate Change Models.” Oh it gets better. The April 2003 analysis finds (italics in original):

    “Meanwhile, an Australian statistician and a British economist have blown a huge hole in the methodology by which the Intergovernmental Panel on Climate Change has made its long-term estimates of man-made carbon dioxide emissions for the twenty-first century. If this critique is correct, the IPCC has vastly overestimated the amount of man-made CO2 emissions and will need to remake its climate change models.”

    Yes. You read it right. A statistician and an economist have debunked the IPCC models, proving that they “vastly overestimated” future CO2 emissions. Just to be crystal clear, the paper found:

    “Castles and Henderson argue that the IPCC economic forecasts are based on fundamentally flawed economic assumptions that generate huge overestimates of future CO2 emissions.… The mean IPCC projection for the 1990s was that worldwide CO2 emissions would increase by about 15 percent. In fact, worldwide CO2 emissions grew by only about 6 percent according to the U.S. Department of Energy. Even the lowest of the IPCC’s emissions projections is probably too high, which means that the projections of global warming may be too high as well.”

    Amazing. To sum up:

    Five years ago the American Enterprise Institute “proved” that the lowest IPCC emissions projection is too high, and they backed up their conclusion with actual 1990s data, whereas Pielke, Wigley, and Green have “proven” that the highest IPCC emissions projection is too low, and they backed up their conclusion with actual data from this decade.

    Poor IPCC, they can’t win. Hmm. Maybe they should publish a bunch of scenarios with different assumptions, so they can’t keep getting attacked. No, that would be Dangerous!

    The AEI analysis did at least draw a consistent conclusion. If the IPCC were overstating future GHG emissions, then obviously they were overstating future GHG concentrations, and thus obviously overstating future temperature rise, and therefore overstating future impacts, and finally, overstating the urgent need for action now. (And by action, I don’t mean research and development.)

    So tell me how Pielke et al can utterly disprove this analysis (sort of) and come to the same exact conclusion that the IPCC has overstated the urgent need for action now?

    So, no, Roger — I don’t owe you an apology. And I don’t want one from you. But you most certainly owe an apology to the IPCC for falsely accusing them of “dangerous assumptions.”

  6. Len Ornstein Says:

    Joe and Roger:

    Reality is somewhere between your two positions.

    It appears that climatologists, as represented by the IPCC, have had the following problem.

    Through the expertise that comes from familiarity with the physics and observations of climate, they are sufficiently concerned about the probable consequences of AGW to want to alert the public and policy makers. The importance of making changes in 1) ‘business as usual’ and 2) the technologies we presently use is their main focus. But such technologies are not THEIR expertise. Therefore their pronouncements on TECHNOLOGY are less detailed and less confident than those on climate.

    PWG judge that “enormous advances in energy technology” (in contrast to what they claim IPCC ‘conveys’ as “effortless and costless” climate policy) will be required.

    Of course, “enormous” is as much of an exaggeration as “effortless” and “costless”.

    CCS is neither effortless nor costless – nor “enormous”. But it’s not ‘off the shelf technology’ either.

    Solar isn’t yet cheap enough. Compact-enough energy storage (e.g., batteries) hasn’t yet been invented – despite long and strong economic motivation. Controlled fusion has not yet (after half a century) lived up to expectations.

    So Roger has a point. But he pushes a little too hard.

    And Joe is also right – the IPCC has not been as Pollyanna-ish as PWG seem to claim.

    Len Ornstein

  7. Roger Pielke, Jr. Says:

    Joe-

    It is clear that you still don’t understand the difference between a stabilization scenario and a baseline scenario. If you cannot pause for a second to try to understand this distinction, then there is little hope that you will appreciate the significance of our analysis.

    But what is remarkable about your views is that you grant our central point (!!) when you write — “You can certainly conclude the IPCC models will seriously underestimate emissions growth this decade. Many of us have been saying that for a while.”

    If the IPCC has indeed potentially underestimated emissions growth, then it is only logical to conclude, as we have, that the mitigation challenge is larger than the IPCC has concluded. Why? Because underestimating emissions growth would make the challenge of mitigation that much more difficult. (Seems pretty obvious.)

    Would you argue the opposite? That if the IPCC has underestimated future emissions growth that the mitigation challenge would be ***the same or smaller*** than the IPCC has concluded? Surely not.

    So one moment you are arguing that the IPCC has not underestimated the challenge, and the next minute you are arguing the opposite. No amount of words can obscure this confusion, which results from your fundamental misunderstanding of the analysis in our paper.

  8. jromm Says:

    Roger –

    I am glad you do not dispute my major criticism of your paper.

    Please do explain this “difference between a stabilization scenario and a baseline scenario” and how it matters at all to my central criticism of your conclusion — that the IPCC did NOT make “dangerous” assumptions and that it is NOT somehow “diverting attention from policies that could directly stimulate technological innovation.”

    I keep asking you and Tol to show me where the IPCC diverted attention, but I have never been given a single link to a single example — and yet it is the central accusation of your paper.

    IPCC head Pachauri said in November, “If there’s no action before 2012, that’s too late. What we do in the next two to three years will determine our future. This is the defining moment.”

    Doesn’t sound like someone trying to divert attention to me.

    The only one in the back-and-forth between the two of us who keeps “diverting attention” from the urgent need for action now is you — with your joining up with the Breakthrough Institute to try to convince people that we need gobs and gobs of technology breakthroughs to solve this problem. Guess what — technology breakthroughs do not occur on command; oftentimes they never occur.

    I don’t grant the central point of your previous comment. Yes, we’ve had a few years where the IPCC underestimated emissions, but as I noted above, we had a whole decade where they overestimated emissions. Maybe next decade they will turn out to have overestimated again. As I explained above, you can certainly point out their mistake, but it is unfair and misleading to call it a dangerous assumption.

    And what you simply refuse to acknowledge is that the scenarios are quite different than the IPCC’s analysis of the mitigation challenge. As you know, WG3 spends page after page detailing all of the things that have to occur to achieve deep emissions cuts.

    As I noted, they even model $150/ton of CO2 (see page 217)!!! That would add over 16 cents a kilowatt-hour to the price of coal power! That is considerably higher than the price modeled by anybody else.
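
    For a rough sense of where a number like that comes from, here is a back-of-the-envelope sketch (it assumes a coal plant emitting roughly one tonne of CO2 per MWh, a typical but not universal emission factor):

    # Back-of-the-envelope pass-through of a carbon price to coal power.
    # Assumes ~1.05 tonnes of CO2 per MWh, typical for existing coal plants.
    carbon_price = 150.0                  # dollars per tonne of CO2
    emission_factor = 1.05                # tonnes CO2 per MWh of coal generation
    added_dollars_per_mwh = carbon_price * emission_factor
    added_cents_per_kwh = added_dollars_per_mwh / 1000.0 * 100.0
    print(f"added cost: ~{added_cents_per_kwh:.0f} cents per kWh")   # roughly 16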

    I personally consider $150/ton as far in excess of what will be needed — but how can anybody claim that dangerously UNDERestimates what needs to be done — and done now!

    Are you arguing that because the IPCC has underestimated emissions for a few years we can delay further deployment of all those technologies that you froze? Of course not; we need to deploy them even faster, to make up for what happened this decade.

    Everything I read from you here or in the media implies we are not in a desperately urgent hurry to start deploying every last bit of energy-efficient and low-carbon technology we have today as fast as is humanly possible.

    You fill up page after page with completely irrelevant stuff about whether or not the IPCC can precisely predict the temperature profile over the next decade or two. Who cares? If we don’t listen to Pachauri and act NOW with strong government policies to reverse the energy efficiency and decarbonization trends of the last few years, it will be game over by the time we get your meaningless improvements to the IPCC’s short-term regional forecasting — the chance for 450/500 will be lost, and we will be desperately struggling to avoid 1000 ppm.

    The vast majority of the stuff on your blog may be of academic interest to people who believe we have time to dawdle. But how could it possibly be a valuable use of your time if you really believe what the Nature article shows?

    My bottom line is that you don’t act as if you believe the obvious primary conclusion of your own paper: Time’s up, humanity. Act now, or risk 1000 ppm.

  9. Roger Pielke, Jr. Says:

    Joe-

    The funny thing is that you are a perfect example of someone holding dangerous assumptions.

    You _assume_ that no more than 14-18 “wedges” worth of emissions reductions by 2050 are needed to stabilize at 450. (And note that when you say possibly 14-18 wedges, rather than the 7 of Socolow/Pacala you have implicitly accepted our technical analysis). You make this assumption because you cannot come up with more wedges than that without technological breakthroughs. You apparently cannot admit that technological breakthroughs would be worth pursuing because you have painted yourself into a corner by stridently and unequivocally labeling anyone who calls for efforts to rapidly advance technology “delayers” and in cahoots with GW Bush.

    Our paper shows that it is quite possible that we might need more — many more — than these 14-18 wedges, perhaps twice that amount. And many people believe that the policies needed even to get your 14-18 are simply not realistic (no matter how loudly you call for them).

    Now you may be so absolutely certain of your view that there is a 0% chance that we will need more than 18 wedges, and you may also be 100% certain that each of your 18 can be implemented with 100% success.

    I don’t share your certainties. Our _analysis_ clearly shows that across the IPCC scenarios, if the assumptions of declining energy intensity (EI) and carbon intensity (CI) do not bear out, then the emissions reductions challenge will be much larger than advertised (i.e., larger than the equivalent of your 14-18 wedges). That is why these assumptions are dangerous: they put all of our climate mitigation eggs in one basket, and you know what they say about eggs in one basket.

    The problem is that the climate policies presented by the IPCC and Stern are not scaled to meet the larger challenge, just as your own policies are not scaled to meet the larger challenge.

    Rather than playing the “delayer 1000” game — as fun as it apparently is for you — you might simply say,

    “Gee, it is perhaps possible that we might need more than 14-18 wedges, and while we are deploying existing technology, which I prioritize, we should also be looking just over the technological horizon with aggressive R&D policies, as an insurance policy in case things we think are true today turn out to be incorrect in the future.”

    You might find that instead of demonizing and demagoguing, it is possible to find areas of agreement and areas where we agree to disagree.

    Now you may be so certain of these things that you see everyone with a different view as being an idiot or corrupt. I just don’t see the world that way. And ultimately, we may just have to agree to disagree on this perspective as well.

  10. jromm Says:

    Roger:

    You have never written anything with more misstatements.

    1) If you really believed we needed 18 to 36 wedges, AND that we need to stabilize at 450 ppm,
    then you would be begging the world to start deploying wedges immediately rather than the irrelevant-to-humanity’s-future stuff you usually blog on.

    2) I don’t “assume that no more than 14-18 “wedges” worth of emissions reductions by 2050 are needed to stabilize at 450.” I have demonstrated that is the case analytically — and you were not able to refute that analysis. Indeed, you were unaware of the key assumption that Socolow and Pacala made that explains why 14-16 is a robust number.

    3) Since the wedges have NOTHING to do with the IPCC, it is absurd to say, “when you say possibly 14-18 wedges, rather than the 7 of Socolow/Pacala you have implicitly accepted our technical analysis.” I have explicitly rejected your analysis. But we still need 14 or so wedges — that is because of how the IPCC has revised down the acceptable amount of carbon emissions for 450, as I have repeatedly said.

    5) I can come up with lots more wedges without breakthroughs. I tried to list the wedges I thought were most reasonable, not the only ones we could do if we were as desperate as you keep saying we are.

    6) I have been advocating (and helping bring about) greater spending in energy R&D for longer than you have been in the climate arena. I’d love to see a breakthrough. But I have explained at great length why breakthroughs are almost certainly not going to help us much by 2050.

    http://climateprogress.org/2008/04/30/is-450-ppm-or-less-politically-possible-part-3-the-breakthrough-technology-illusion/

    The question is not whether breakthroughs are worth pursuing. The question is: what are the chances that multiple (4 to 8+) carbon-free technologies that do not exist today can each deliver the equivalent of 350 gigawatts of baseload power (~2.8 billion megawatt-hours a year) and/or 160 billion gallons of gasoline cost-effectively by 2050? [Note — that is about half of a stabilization wedge.] For the record, the U.S. consumed about 3.7 billion MW-hrs in 2005 and about 140 billion gallons of motor gasoline.
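
    For reference, here is the rough arithmetic behind that benchmark (a sketch assuming a ~90% capacity factor for baseload plants; the wedge and gasoline figures are simply taken as stated above):

    # Sanity check on the "350 GW baseload ~ 2.8 billion MWh/yr" figure above.
    # Assumes a ~90% capacity factor; actual baseload plants vary.
    capacity_gw = 350
    hours_per_year = 8760
    capacity_factor = 0.9
    annual_mwh = capacity_gw * 1000 * hours_per_year * capacity_factor
    print(f"~{annual_mwh / 1e9:.1f} billion MWh per year")   # ~2.8 billion
    # For scale, the U.S. consumed roughly 3.7 billion MWh in 2005, so one such
    # technology would supply on the order of three quarters of that total.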

    Put that way, the answer to the question is painfully obvious: “two chances — slim and none.” Indeed, I have repeatedly challenged readers and listeners over the years to name even a single technology breakthrough with such an impact in the past three decades, after the huge surge in energy funding that followed the energy shocks of the 1970s. Nobody has ever named a single one that has even come close.

    Yet somehow the government is not just going to invent one TILT (Terrific Imaginary Low-carbon Technology) in the next few years; we are going to invent several TILTs. Seriously. Hot fusion? No. Cold fusion? As if. Space solar power? Come on, how could that ever compete with CSP? Hydrogen? It ain’t even an energy source, and after billions of dollars of public and private research in the past 15 years — including several years running of being the single biggest focus of the DOE office on climate solutions I once ran — it still actually has no chance whatsoever of delivering a major cost-effective climate solution by midcentury.

    Roger, since you refuse to read my posts, I’ll just reprint it here:

    I don’t know why the breakthrough crowd can’t see the obvious — so I will elaborate here. I will also discuss a major study that explains why deployment programs are so much more important than R&D at this point. Let’s keep this simple:

    * To stabilize at 450 ppm, we need to deploy by 2050 at least 14 stabilization wedges (each delivering 1 billion tons of avoided carbon) covering both efficient energy use and carbon-free supply.
    * Myriad energy-efficient technologies are already cost-effective today — breaking down the barriers to their deployment now is much, much more important than developing new “breakthrough” efficient TILTs, since those would simply fail in the marketplace because of the same barriers. Cogeneration is perhaps the clearest example of this.
    * On the supply side, deployment programs (coupled with a price for carbon) will always be much, much more important than R&D programs because new technologies take an incredibly long time to achieve mass-market commercial success. New supply TILTs would not simply emerge at a low cost. They need volume, volume, volume — steady and large increases in demand over time to bring the cost down, as I discuss at length below.
    * No existing or breakthrough technology is going to beat the price of power from a coal plant that has already been built — the only way to deal with those plants is a high price for carbon or a mandate to shut them down. Indeed, that’s why we must act immediately not to build those plants in the first place.
    * If a new supply technology can’t deliver half a wedge, it won’t be a big player in achieving 450 ppm.

    For better or worse, we are stuck through 2050 with the technologies that are commercial today (like solar thermal electric) or that are very nearly commercial (like plug-in hybrids).

    I have discussed most of this at length in previous posts (listed below), so I won’t repeat all the arguments here. Let me just focus on a few key points. A critical historical fact was explained by Royal Dutch/Shell, in their 2001 scenarios for how energy use is likely to evolve over the next five decades (even with a carbon constraint):

    “Typically it has taken 25 years after commercial introduction for a primary energy form to obtain a 1 percent share of the global market.”

    Note that this tiny toe-hold comes 25 years after commercial introduction. The first transition from scientific breakthrough to commercial introduction may itself take decades. We still haven’t seen commercial introduction of a hydrogen fuel cell car and have barely seen any commercial fuel cells — over 160 years after they were first invented.

    This tells you two important things. First, new breakthrough energy technologies simply don’t enter the market fast enough to have a big impact in the time frame we care about. We are trying to get 5% to 10% shares — or more — of the global market for energy, which means massive deployment by 2050 (if not sooner).

    Second, if you are in the kind of hurry we are all in, then you are going to have to take unusual measures to deploy technologies far more aggressively than has ever occurred historically. That is, speeding up the deployment side is much more important than generating new technologies. Why? Virtually every supply technology in history has a steadily declining cost curve, whereby greater volume leads to lower cost in a predictable fashion because of economies of scale and the manufacturing learning curve.
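
    A common way to express that relationship is an experience (learning) curve, in which cost falls by a fixed fraction with each doubling of cumulative deployment. Here is a minimal sketch (the 20% learning rate and the starting cost index are illustrative assumptions, not numbers from the IEA report quoted below):

    # Experience-curve sketch: cost falls by a fixed fraction (the "learning
    # rate") with each doubling of cumulative production. Values are illustrative.
    import math

    def unit_cost(cumulative_units, first_unit_cost=100.0, learning_rate=0.20):
        """Cost index after 'cumulative_units', relative to the first unit."""
        b = -math.log2(1.0 - learning_rate)   # experience-curve exponent
        return first_unit_cost * cumulative_units ** (-b)

    for doublings in range(6):
        volume = 2 ** doublings
        print(f"{doublings} doublings -> cost index {unit_cost(volume):.0f}")
    # 100, 80, 64, 51, 41, 33: faster deployment means more doublings sooner,
    # which is why pushing volume pulls the break-even date forward.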

    WHY DEPLOYMENT NOW COMPLETELY TRUMPS RESEARCH

    A major 2000 report by the International Energy Agency, Experience Curves for Energy Technology Policy, has a whole bunch of experience curves for various energy technologies. Let me quote some key passages:

    “Wind power is an example of a technology which relies on technical components that have reached maturity in other technological fields…. Experience curves for the total process of producing electricity from wind are considerably steeper than for wind turbines. Such experience curves reflect the learning in choosing sites for wind power, tailoring the turbines to the site, maintenance, power management, etc, which all are new activities.”

    Or consider PV:

    “Existing data show that experience curves provide a rational and systematic methodology to describe the historical development and performance of technologies….

    “The experience curve shows the investment necessary to make a technology, such as PV, competitive, but it does not forecast when the technology will break-even. The time of break-even depends on deployment rates, which the decision-maker can influence through policy. With historical annual growth rates of 15%, photovoltaic modules will reach break-even point around the year 2025. Doubling the rate of growth will move the break-even point 10 years ahead to 2015.”

    “Investments will be needed for the ride down the experience curve, that is for the learning efforts which will bring prices to the break-even point. An indicator for the resources required for learning is the difference between actual price and break-even price, i.e., the additional costs for the technology compared with the cost of the same service from technologies which the market presently considers cost-efficient. We will refer to these additional costs as learning investments, which means that they are investments in learning to make the technology cost-efficient, after which they will be recovered as the technology continues to improve.”

    Here is a key conclusion:

    “… for major technologies such as photovoltaics, wind power, biomass, or heat pumps, resources provided through the market dominate the learning investments. Government deployment programmes may still be needed to stimulate these investments. The government expenditures for these programmes will be included in the learning investments.”

    Obviously government R&D, and especially first-of-a-kind demonstration programs, are critical before the technology can be introduced to the marketplace on a large scale. But, we “expect learning investments to become the dominant resource for later stages in technology development, where the objectives are to overcome cost barriers and make the technology commercial.”

    We are really in a race to get technologies into the learning curve phase: “The experience effect leads to a competition between technologies to take advantage of opportunities for learning provided by the market. To exploit the opportunity, the emerging and still too expensive technology also has to compete for learning investments.”

    In short, you need to get from first demonstration to commercial introduction as quickly as possible to be able to then take advantage of the learning curve before your competition does. Again, that’s why if you want mass deployment of the technology by 2050, we are mostly stuck with what we have today or very soon will have. Some breakthrough TILT in the year 2025 will find it exceedingly difficult to compete with technologies like CSP or wind that have had decades of such learning.

    And that is why the analogy of a massive government Apollo program or Manhattan project is so flawed. Those programs were to create unique non-commercial products for a specialized customer with an unlimited budget. Throwing money at the problem was an obvious approach. To save a livable climate we need to create mass-market commercial products for lots of different customers who have limited budgets. That requires a completely different strategy.

    Finally, it should be obvious (here), but it apparently isn’t, so I’ll repeat:

    “The risk of climate change, however, poses an externality which might be very substantial and costly to internalise through price alone. Intervening in the market to support a climate-friendly technology that may otherwise risk lock-out may be a legitimate way for the policymaker to manage the externality; the experience effect thus expands his policy options. For example, carbon taxes in different sectors of the economy can activate the learning for climate-friendly technologies by raising the break-even price.”

    So, yes, a price for carbon is exceedingly important — more important, as I have argued, than funding the search for TILTs.

    THE BREAKTHROUGH BUNCH

    Michael Shellenberger says that he (and, separately, NYT’s Revkin, here) interviewed a whole bunch of people who think we need “massive public investments” and breakthroughs. Revkin writes: “Most of these experts also say existing energy alternatives and improvements in energy efficiency are simply not enough.”

    The devil is always in the details of the quotes — especially since everybody I know wants more federal investment in low-carbon technologies. And, of course, some of the folks Revkin quotes are long-time delayers, like W. David Montgomery of Charles River Associates, who has testified many times that taking strong action on climate change would harm the economy. He says stabilizing temperatures by the end of the century “will be an economic impossibility without a major R.& D. investment.” Well, of course he would. In any case, we don’t have until the end of the century — yes, it would certainly be useful to have new technologies in the second half of this century, but the next couple of decades are really going to determine our fate.

    Both quote my friend Jae Edmonds. Revkin quotes him as saying we need to find “energy technologies that don’t have a name yet.” Shellenberger quotes him saying:

    Fundamental changes in the world’s expanding energy system are required to stabilize concentrations of greenhouse gases in the atmosphere. Incremental improvements in technology will help, but will not by themselves lead to stabilization.

    Jae and I have long disagreed on this, and he is wrong. His economic models have tended to assume a few major breakthroughs in a few decades and that’s how he solves the climate problem. Again, I see no evidence that that is a plausible solution nor that we have the time to wait and see.

    I would estimate that the actual federal budget today that goes toward R&D breakthroughs that could plausibly deliver a half wedge or more by 2050 (i.e. not fusion, not hydrogen) is probably a few hundred million dollars at most. I wouldn’t mind raising that to a billion dollars a year. But I wouldn’t spend more, especially as long as the money was controlled by a Congress with its counterproductive earmarks. I could probably usefully spend 10 times that on deployment (not counting tax policy), again as long as the money was not controlled by Congress. Since that may be difficult if not impossible to arrange, we have to think hard about what the size of a new federal program might be. I’ll discuss that further in the Part 6 discussion on policy.

    Roger Pielke, Jr., has said (here) that my proposed 14 wedges requires betting the future on “some fantastically delusional expectations of the possibilities of policy implementation” and that my allegedly “fuzzy math explains exactly why innovation must be at the core of any approach to mitigation that has a chance of succeeding.” Well, we’ve seen my math wasn’t fuzzy (here).

    But you tell me, what is more delusional — 1) that we take a bunch of commercial or very-near-commercial technologies and rapidly accelerate their deployment to wedge scale over the next four decades, or 2) that in the same exact time frame, we invent a bunch of completely new technologies “that don’t have a name yet,” commercialize them, and then rapidly accelerate them into the marketplace so they achieve wedge scale?

    And so I assert again, the vast majority — if not all — of the wedge-sized solutions for 2050 will come from technologies that are now commercial or very soon will be. And federal policy must be designed with that understanding in mind. So it seems appropriate to end this post with an excerpt from the Conclusion of the IEA report:

    “A general message to policymakers comes from the basic philosophy of the experience curve. Learning requires continuous action, and future opportunities are therefore strongly coupled to present activities. If we want cost-efficient, CO2-mitigation technologies available during the first decades of the new century, these technologies must be given the opportunity to learn in the current marketplace. Deferring decisions on deployment will risk lock-out of these technologies, i.e., lack of opportunities to learn will foreclose these options making them unavailable to the energy system.…

    “… the low-cost path to CO2-stabilisation requires large investments in technology learning over the next decades. The learning investments are provided through market deployment of technologies not yet commercial, in order to reduce the cost of these technologies and make them competitive with conventional fossil-fuel technologies. Governments can use several policy instruments to ensure that market actors make the large-scale learning investments in environment-friendly technologies. Measures to encourage niche markets for new technologies are one of the most efficient ways for governments to provide learning opportunities. The learning investments are recovered as the new technologies mature, illustrating the long-range financing component of cost-efficient policies to reduce CO2 emissions. The time horizon for learning stretches over several decades, which require long-term, stable policies for energy technology.”

  11. Roger Pielke, Jr. Says:

    Joe- I’ll take that response to mean . . .

    “Yes, I’d prefer to continue demonizing and demagoguing those with whom I disagree”

    But at least through the exchange we’ve got the differences on record.

  12. Len Ornstein Says:

    Joe:

    You seem to be too inflexible about the possibility that ‘invention’ can play a big role in a quick and sustainable flattening of the Keeling curve.

    Sustainable Eco-Neutral Conservation Harvest (SENCH) of old-growth tropical forest (harvest of fallen trees only) with current technology could yield CO2 draw-down equal to about 1 GtC/yr (1 wedge unit) rather quickly and at low cost, and it provides added economic motivation to halt deforestation.

    Irrigated afforestation of the Sahara and Australian Outback (utilizing mature RO desalination technology) could yield about 10 GtC/yr of CO2 draw-down, at higher cost and with a longer lead-time for building vast, efficient irrigation systems.

    Using wood in place of coal is old technology that can be managed with existing plants at MUCH lower cost than carbon capture and sequestration (CCS). Wood in place of oil and gas will take longer – but we may buy time for that.

    Both of these are covered in submissions currently under review at Climatic Change.

    This isn’t “new technology”, but ‘new’ strategies for use of existing technologies.

    You shouldn’t close your mind to the potential role of such ‘inventions’, that require much shorter learning curves to attain massive deployment! These two were not stimulated by massive R&D investment. But who’s to say that some others will not be motivated by more generous governmental support of mitigation R&D?

    Len Ornstein

  13. jromm Says:

    Roger — As I thought, you chose not to rebut my detailed argument on why you are wrong about technology breakthroughs.

    It is hard to see how presenting straightforward analysis or quoting major studies is “demonizing and demagoguing” — but if that’s your best defense against a reasoned argument, I’m happy to let your readers decide who has made the best argument.

    Len: I helped run the office in charge of most federal R&D into Advanced Technology for climate mitigation. I was in charge of technology and market analysis for the office for three years. I have been one of the country’s strongest advocates for energy R&D for almost two decades now.

    If you had read my blog, you would know that using existing technologies in new ways is something I strongly endorse. It is not “invention” as that term is classically used.

    Also, I include about two forestry wedges in my solution set already. We will need every plausible forestry strategy imaginable.

    That said, I suspect you would have a hard time selling selective harvesting of old-growth forests to the sustainable forestry community.

    I don’t really see mass irrigation of deserts to grow trees as a major climate solution. First off, the tropics and subtropics are currently expanding because of climate change, so the arid areas are going to get more arid. Second, desalination is a huge energy user — it would all have to be zero-carbon electricity, of course.

    Third, by mid-century we are going to have 3 billion more people, and we are going to need every last drop of water and arable land for food. Whatever water and usable land we have beyond that I suspect will go towards cellulosic biofuels of one sort or another. I did throw in a wedge of that, though I suspect we’ll need to find biofuels that don’t use arable land or require potable water. Maybe algae.

    In any case, I have a wedge of biofuels. So that is three bio-wedges. Some people, including Dr. Pielke, consider my entire wedges analysis wildly optimistic and “delusional.”

    Though apparently it is more realistic and less delusional to believe that multiple technologies that don’t even exist today could be developed, commercialized, marketed, and then mass-marketed to the point of being multiple wedges. And apparently even questioning this possibility is “demonizing” and “demagoguing.” We live and learn.

  14. Len Ornstein Says:

    Joe:

    Both you and Roger are too glib in your treatments of “technology breakthroughs”. I’m one of those fossils who’s lived through, and contributed to, the post-WWII technology breakthroughs and the euphoria they stimulated.

    The future of breakthroughs is more promising than you allow, but less predictable than Roger might wish.

    Your differences are smaller than either of you pretend – and seem to come from a bit too much one-upmanship – like your: “That said, I suspect you would have a hard time selling selective harvesting of old-growth forests to the sustainable forestry community.” Selling just about any mitigating technology will be a hard sell!

    Len

  15. Roger Pielke, Jr. Says:

    Len-

    Thanks, I appreciate your moderating voice. I certainly don’t think that breakthroughs are predictable or certain. I am sure that their probability is increased when we focus research attention (i.e., $$$) on them.

    The exact same sort of debate takes place with regard to public health where some argue that we can dramatically improve public health with existing technologies, whereas others emphasize the search for new vaccines etc.

    I am in the “yes, both” school of thought. On energy, the world has seen a decades-long trend of decreasing investment in energy R&D (unlike health, where I’d probably have a different emphasis).

    So I think we need to dramatically up that investment (while at the same time using existing technologies). Joe has waged a vitriolic public campaign of character assassination because I hold this view. I have no problem with the things he advocates (more deployment of existing technologies), I just don’t think that is going to do the trick.

  16. Lupo Says:

    Or do we perhaps just fail to see the possibly unfounded assumption that “to beat 450 ppm” will help to “avoid catastrophic climate impacts”?

    It appears the very nature of this subject is proven by the way the discussion in this post is going! That we are spending time ‘not getting things done’ instead of developing solutions?

  17. steven mosher Says:

    I’m sorry, I missed the step where somebody demonstrated that the Models had sufficient skill to predict that 450 ppm was a danger level.

    Only 7 years of data have been collected; we need to wait at least 20 years to assess the models and determine if 450 is the right number, or 550, or 650.

  18. docpine Says:

    As a card-carrying member of the sustainable forestry community, I have to point out that the proposal referred to, “Sustainable Eco-Neutral Conservation Harvest (SENCH) of old-growth tropical forest (harvest of fallen trees only),” specifically says “harvest of fallen trees only.”

    Joe’s reply was: “That said, I suspect you would have a hard time selling selective harvesting of old-growth forests to the sustainable forestry community.” In our community, “selective harvesting” generally means cutting some live trees and leaving other live trees. That term does not mean removing dead trees (presumably dead, if they are fallen).

    I am simply going by the statements made in this blog and have no other separate knowledge of the SENCH concepts. But I think it’s important to choose words that communicate appropriately to the people in our community.