Archive for September, 2008

The Most Attractive Opportunity

September 7th, 2008

Posted by: Roger Pielke, Jr.

In the Guardian today, Rajendra Pachauri, head of the IPCC, is offering policy advice on climate change. He suggests that the “most attractive opportunity” for emissions reductions is to start by giving up meat for a day a week.

‘In terms of immediacy of action and the feasibility of bringing about reductions in a short period of time, it clearly is the most attractive opportunity,’ said Pachauri. ‘Give up meat for one day [a week] initially, and decrease it from there,’ said the Indian economist, who is a vegetarian.

However, he also stressed other changes in lifestyle would help to combat climate change. ‘That’s what I want to emphasise: we really have to bring about reductions in every sector of the economy.’

I’m not sure where this advice comes from. Giving up meat one day a week (or entirely) didn’t make the IPCC’s list (p. 64 in this PDF) of top mitigation opportunities in agriculture:

The most prominent mitigation options in agriculture (with potentials shown in Mt CO2eq/yr for carbon prices up to 100 US$/tCO2-eq by 2030) are (see also Figure TS.20):

• restoration of cultivated organic soils (1260)

• improved cropland management (including agronomy, nutrient management, tillage/residue management and water management (including irrigation and drainage), and set-aside / agro-forestry) (1110)

• improved grazing land management (including grazing intensity, increased productivity, nutrient management, fire management and species introduction) (810)

• restoration of degraded lands (using erosion control, organic amendments and nutrient amendments) (690).

Lower, but still substantial mitigation potential is provided by:

• rice management (210)

• livestock management (including improved feeding practices, dietary additives, breeding and other structural changes, and improved manure management (improved storage and handling and anaerobic digestion)) (260) (medium agreement, limited evidence).

In addition, 770 MtCO2-eq/yr could be provided by 2030 by improved energy efficiency in agriculture. This amount is, however, for a large part included in the mitigation potential of buildings and transport [8.1; 8.4].

There are plenty of good reasons to limit consumption of meat, and I am all for the advice. But for the head of the IPCC to suggest that it represents the “most attractive opportunity” for emissions reductions is not supported by the IPCC. So where did this advice come from? Such freelancing of political advice opens the door to charges that the IPCC is using its authority to pursue an agenda of lifestyle changes favored by its leaders but not addressed in its reports, which are claimed to be “policy neutral.”

Technocrats vs. Democrats

September 6th, 2008

Posted by: Roger Pielke, Jr.

In today’s Financial Times Christopher Caldwell has an incisive analysis of the politics of “intelligent design”:

The point of intelligent design is to take science down a peg. To warn enthusiasts that they risk “discrediting science itself” is a bit dense. For them, evolution is a potent symbol of the way “scientific materialism” leaves people feeling demeaned, disenfranchised, stripped of prerogatives and less free. This feeling is not groundless. Dostoyevsky and Marx said similar things. The scientific world-view poses challenges to religion only in the course of posing challenges to a whole lot besides. To take one obvious example: fewer offices permit smoking today, but it is a stretch to call this a choice. In the US, at least, there was little democratic participation in the decision. There was scientific research and then there were mandates from health boards and courts. Maybe these mandates were “all to the good”. That does not make them democratic.

The anti-evolution activists in America’s small towns are wrong on the science – but wrong in a way that is of absolutely no consequence to them unless they choose a career in horse-breeding or molecular biochemistry. Their feelings of disenfranchisement, on the other hand, are real and consequential. Experts control an ever larger share of decisions about where roads can be built, what people can ingest, what can be taught and whether the decisions of democratic bodies pass constitutional muster. Like so much else in US public life, the battle over evolution is a class conflict disguised as a religious or moral conflict. It is comforting to look at the fight over evolution as one that pits the educated against the ignorant. It is that. But it is also a fight that pits technocrats against democrats.

Blaming the Media, Wrongly in this Case

September 5th, 2008

Posted by: Roger Pielke, Jr.

Earlier this week I claimed that scientists should take some responsibility for media misinterpretations of their work. Here is an example where scientists are wrongly blaming the media.

Today Science magazine published a paper on sea level rise by Tad Pfeffer (of the University of Colorado) and colleagues. The paper claims to have ruled out a sea level rise of more than 2 meters by 2100. The guys over at Real Climate have taken issue with the suggestion that 2 meters of sea level rise represents a step back from claims of higher potential increases. They write (emphasis added):

. . .we’ve just seen the initial media coverage where this result is being spun as a downgrading of predictions! (exemplified by this Reuters piece, drawing mainly from the U. Colorado press release). This is completely backwards. We stress that no-one (and we mean no-one) has published an informed estimate of more than 2 meters of sea level rise by 2100. Tellingly, the statement in the paper that suggests otherwise has no reference.

There have certainly been incorrect assertions and headlines implying that 20 ft of sea level by 2100 was expected, but they are mostly based on a confusion of a transient rise with the eventual sea level rise which might take hundreds to thousands of years. And before someone gets up to say Al Gore, we’ll point out preemptively that he made no prediction for 2100 or any other timescale.

Real Climate helpfully points to a paper by Jim Hansen that they claim says nothing about a sea level rise greater than 2 meters:

The nearest thing I can find is Jim Hansen who states that “it [is] almost inconceivable that BAU climate change would not yield a sea level change of the order of meters on the century timescale”. But that is neither a specific prediction for 2100, nor necessarily one that is out of line with the Pfeffer et al’s bounds.

Real Climate concludes with an admonition of those ignorant reporters:

Headlines like that in the Reuters piece (or National Geographic) are therefore doing a fundamental disservice to the public understanding of the problem.

Tsk. Tsk. Bad media.

However, a close look at the Hansen article linked by Real Climate suggests that they aren’t really telling the whole story. Here is what Hansen writes (emphasis added):

As a quantitative example, let us say that the ice sheet contribution is 1 cm for the decade 2005–15 and that it doubles each decade until the West Antarctic ice sheet is largely depleted. That time constant yields a sea level rise of the order of 5 m this century. Of course I cannot prove that my choice of a ten-year doubling time for nonlinear response is accurate, but I am confident that it provides a far better estimate than a linear response for the ice sheet component of sea level rise under BAU forcing.
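Hansen’s back-of-the-envelope is easy to check. A minimal sketch of his stated assumptions (my illustration, not code from his paper):

```python
# Hansen's stated assumption: the ice sheet contribution is 1 cm for the
# decade 2005-2015 and doubles each decade thereafter.
contribution_cm = 1.0  # first decade's contribution
total_cm = 0.0
for decade_start in range(2005, 2095, 10):  # nine decades, through 2085-2095
    total_cm += contribution_cm
    contribution_cm *= 2.0

print(total_cm)  # 511.0 cm, i.e. "of the order of 5 m this century"
```

Nine doublings of a 1 cm start sum to 511 cm, which is where the “order of 5 m” figure comes from; the result is driven entirely by the assumed ten-year doubling time, which Hansen himself concedes he cannot prove.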

This paper led Joe Romm to write the following at Grist:

Sea level rise of 5 meters in one century? Even if most scientists will not say so publicly, that catastrophe is a real possibility, according to the director of NASA’s Goddard Institute Of Space Studies.

So at least Joe Romm interpreted Hansen’s paper to mean that sea level could conceivably rise by 5 meters by 2100. Real Climate’s claim that “no-one (and we mean no-one) has published an informed estimate of more than 2 meters of sea level rise by 2100” is not just incorrect, but extremely misleading.

Not only is the head guy at Real Climate employed by Jim Hansen, one of the most prominent climate scientists around, but Real Climate goes so far as to point in very misleading fashion to the very paper in which Hansen suggests that a prediction of 5 meters by 2100 is far more accurate than predictions of less than about a meter and a half. So while it is indeed frustrating that some in the media get things wrong, the best in the media usually get things right, and on sea level rise it looks like they have. One thing members of the media won’t like much is being treated like fools.

PIK on “Dangerous Assumptions”

September 5th, 2008

Posted by: Roger Pielke, Jr.

[UPDATE: A colleague just pointed out that Prof. Edenhofer is the newly elected co-chair of the IPCC WGIII.]

Last spring Tom Wigley, Chris Green and I published an article in Nature (PDF) suggesting that the IPCC (and by extension, similar assessments) assumes in scenarios of future emissions trajectories that a large amount of future emissions reductions would occur automatically, without any specific climate policies. You can see the amount of this assumed emissions reduction in the following figure indicated by the light blue portion of the stacked bars (the green portion shows an allowable amount of emissions consistent with stabilization at about 500 ppm, leaving the red amount needed to be addressed by climate policy).

A group of scholars at the Potsdam Institute for Climate Impact Research (PIK) has been kind enough to prepare a response to our piece. The PIK team’s (Ottmar Edenhofer, Brigitte Knopf, Jan Steckel, Gunnar Luderer) article can be found here in PDF. They open with a few strong statements:

In their recent Nature article Pielke et al. (2008) convey as their key message that the IPCC dramatically underestimates the socio-economic effort it would take to transform the 21st century non-policy – high emission path into a policy-induced low-emission path. Thus, the core statement of working group 3 in the Fourth Assessment Report of the IPCC seems to begin to totter – that an ambitious climate policy can be pursued with the already existing technologies. What the article of Pielke et al. does not explicitly say but suggests implicitly is that the cost estimates of the IPCC would then also be just as unrealistic as the estimation of the technical and economic challenge. Is the optimistic message that the IPCC announced in 2007 to the whole world finally unfounded? Is the challenge greater than assumed, is climate policy now more expensive than alleged? Did the IPCC finally mislead policy makers? Is the IPCC eventually not the “honest broker” between science and politics and pursues own politics with the authority of science?

With their comment “Dangerous Assumptions”, the authors further fostered the suspicion that the IPCC did not provide a down-to-earth analysis of the options and limits of climate policy. The bomb that the authors seemingly drop can however be easily deactivated.

I like their analogy of a “bomb”, for the following reason. They don’t imply that our analysis is wrong (i.e., there is no bomb), simply that it needs to be “deactivated”. In fact, a close look at the PIK analysis shows far more agreement with our analysis than disagreement. Where there is disagreement, there is apparent misunderstanding. The “bomb” is still ticking.

The PIK team agrees with much of the core of our analysis:

. . . we can agree with Pielke et al. that there are definitely alarming developments in China that suggest that the autonomous technical progress may be slower in the future than the historical trend . . .

. . . In the light of Pielke et al. it needs to be discussed if there will be a break in the development of the worldwide energy system compared to the historical trend. . .

. . . The development of carbon intensity is therefore indeed worrying. . .

. . . It is therefore absolutely unquestionable that the assumptions about the autonomous technical progress considerably influence the mitigation costs.

The PIK team unfortunately builds and proceeds to knock down a straw man when they assert:

Pielke et al. therefore exaggerate immoderately if they assume that the hypothetic scenario of the IPCC, in which the carbon and energy intensity is frozen on today’s level, is the most likely future scenario. . . For the estimation of the future trend, it is however absurd to construct a business-as-usual-scenario that does not produce any technical progress

They write “if they assume” presumably because we said nothing of the sort in our paper. We suggested using a “frozen technology” baseline simply because in an analysis it forces all assumptions of future technological innovation out into the open. The “frozen technology” baseline is an analytical tool, not a prediction about the future, and certainly is not a “business as usual” scenario. One of the best indications that critics accept our technical analysis is the misplaced allegation that our “frozen technology” baseline is a projection. This is just a misreading of our paper. Here is how we describe the “frozen technology” baseline in the original paper (emphasis added):

We also use the emissions-scenario building blocks in our analysis, but adopt a frozen-technology baseline to reveal the full challenge of decarbonization. Using this baseline also reveals the huge amount of emissions-reducing technological change built into the SRES and similar scenarios.
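The analytical role of a frozen-technology baseline can be sketched with the Kaya identity (emissions = population × GDP per capita × energy/GDP × CO2/energy). The numbers below are illustrative assumptions of mine, not figures from the Nature paper:

```python
# Kaya-identity sketch of a frozen-technology baseline (illustrative numbers,
# not the paper's data).
current_emissions = 30.0            # GtCO2/yr, rough present-day total (assumed)
population_multiplier = 1.4         # population growth to mid-century (assumed)
income_multiplier = 2.5             # GDP-per-capita growth (assumed)
energy_intensity_multiplier = 1.0   # frozen: no change in energy/GDP
carbon_intensity_multiplier = 1.0   # frozen: no change in CO2/energy

frozen_baseline = (current_emissions * population_multiplier * income_multiplier
                   * energy_intensity_multiplier * carbon_intensity_multiplier)
print(frozen_baseline)  # 105.0 GtCO2/yr under these illustrative assumptions
```

With intensity improvements frozen, every tonne between this baseline and a stabilization target must be accounted for explicitly, either as assumed “automatic” technological change or as the result of climate policy. Nothing is hidden in the baseline, which is the analytical point of the device.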

Interestingly the PIK paper concludes with a perspective entirely consistent with our own:

The real question for scientists is: What are the implications for climate policy when the world economy is now facing a renaissance of coal. This debate is urgent and timely.

We said the following:

Because of these dramatic changes in the global economy it is likely that we have only just begun to experience the surge in global energy use associated with ongoing rapid development.

Virtually all scenarios of future emissions fail to reflect the possibility of such changes, which is why it is important to move into the future with a full appreciation of the uncertainties and possibilities — even if this leads to the uncomfortable knowledge that the mitigation challenge could be larger than that presented by the IPCC and others. While the PIK critique of our paper is welcome, they have done little to defuse the bomb, but rather, made it a bit harder to hear it ticking.

EU Costs of Responding to Climate Change

September 5th, 2008

Posted by: Roger Pielke, Jr.

The Center for European Policy Studies (CEPS) has released a report looking at various cost estimates associated with an EU response to climate change, concluding

“the EU may be required to contribute at least €60 billion annually to global climate change mitigation and adaptation efforts.”

The CEPS study relies on cost estimates from the IPCC, Stern, UNDP, World Bank, and Oxfam (on adaptation). As we have argued here previously, all such studies of the costs of mitigation are highly dependent upon assumptions of future emissions:

All assessments of the costs of stabilizing concentrations of carbon dioxide start with a baseline trajectory of future emissions. The costs of mitigation are calculated with respect to reductions from this baseline. In the Pielke, Wigley, and Green commentary in Nature (PDF) we argued that such baselines typically assume very large, spontaneous decreases in energy intensity (energy per unit GDP). The effect of these assumptions is to decrease the trajectory of the baseline, making the challenge of mitigation much smaller than it would be with assumptions of smaller decreases in energy intensity (and a higher baseline trajectory). Obviously, the smaller the gap between the baseline scenario and the mitigation scenario, the smaller the projected costs of mitigation.
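This sensitivity is easy to demonstrate with a toy calculation. Everything below is invented for illustration; the growth rates, target, and cost scaling are placeholders, not estimates from any of the studies reviewed by CEPS:

```python
# Toy sensitivity of the projected mitigation challenge to the assumed
# spontaneous decline in energy intensity (all numbers invented).
current_emissions = 50.0   # GtCO2/yr today (placeholder)
growth_multiplier = 2.0    # economic growth to mid-century (placeholder)
target = 30.0              # GtCO2/yr allowed under stabilization (placeholder)
years = 45

for intensity_decline in (0.005, 0.010, 0.015):  # assumed fractional decline/yr
    # baseline emissions after decades of assumed "automatic" improvement
    baseline = current_emissions * growth_multiplier * (1 - intensity_decline) ** years
    gap = max(baseline - target, 0.0)  # abatement left for explicit policy
    print(f"{intensity_decline:.1%}/yr decline -> abatement gap {gap:.1f} GtCO2/yr")
```

Moving the assumed decline from 0.5% to 1.5% per year, a spread well within what published scenarios bake in, cuts the abatement gap (and any cost estimate proportional to it) by more than half, which is why a single headline figure like €60 billion is misleadingly precise.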

The assumptions in such studies are often difficult to see. They are all but impossible to see when taken in review fashion as done by the CEPS. The bottom line is that the €60 billion annual cost reported by CEPS is misleadingly precise and only a single point in a wide distribution of possible costs. This distribution plausibly includes values an order of magnitude larger. Policy making is likely to suffer if uncertainties in economics are not treated seriously. Here is how I concluded our earlier discussion of costs estimates of mitigation:

What does this exercise tell us about costs estimates of mitigation?

1. They are highly sensitive to assumptions.

2. Depending on assumptions, cost estimates could vary by more than an order of magnitude.

3. We won’t know the actual costs of mitigation until action is taken and costs are observed. Arguments about assumptions are unresolvable.

Meantime, it will be easy to cherrypick a cost for mitigation — low or high — that suits the argument that you’d like to make.

Anyone telling you that they have certainty about the future costs of mitigation — whether that certainty is about high costs or low costs — is not reflecting the actual uncertainty. Action on mitigation will have to take place before such certainty is achieved, and modified based on what we learn.

The Divergence Problem

September 4th, 2008

Posted by: Roger Pielke, Jr.

Over at Nature’s Climate Feedback Anna Barnett has created an interesting graph showing the goals for emissions reductions expressed in a range of settings, as well as actual global emissions. The graph shows clearly the gap between rhetoric and reality.

Nonsense, Incoherence, and Inconsistency at the Hurricane Science and Media Interface

September 3rd, 2008

Posted by: Roger Pielke, Jr.

A new paper has been published today in Nature by Elsner et al. which documents trends in the strongest tropical cyclones in a satellite record. The analysis overcomes some of the challenges of other studies by relying on a single dataset covering 1981-2006, avoiding the inhomogeneities that arise when datasets are combined, and so adds some useful knowledge to a growing literature. The paper and its reception also indicate how dysfunctional the hurricane science-media interface is, particularly this time of year.

Consider the following examples of complete misinterpretations of the paper:

From the UK Telegraph:

Global warming will drive the number of major hurricanes to cause devastation up by one third in the coming century, says a new study that claims to have found the first conclusive evidence of the effect.

Actually, the paper offers absolutely no predictions about devastation in the coming century, and in fact, no predictions of the future at all.

From Live Science:

Strong hurricanes are getting stronger, likely thanks to global warming, a new study finds.

Actually, the paper says absolutely nothing about attribution, a point emphasized to me in an email from second author Jim Kossin.

From BBC:

Writing in the journal Nature, they say the number of weaker storms has not noticeably altered.

Maybe changes only matter if they go in one direction. If the total number of storms has not changed, and the number of strong storms has increased, then surely the number of weak storms must have decreased proportionately.

However, on the last point, the BBC’s confusion might be understandable. See if you can make sense of this passage from the paper:

Upward trends can be interpreted as an increase in the number of cyclones exceeding a threshold quantile. For example, at the 80th percentile, on average 17 cyclones globally exceed 49 m/s. With a 1 °C rise in SST, the 80th percentile increases to 51 m/s. At this threshold level, on average 13 cyclones per year are observed. So the increase in SST of 1 °C results in an increase in the global frequency of strong cyclones from 13 to 17 cyclones (31%) per year.

Either there is a transposition in there, or I just don’t get it. I’ve asked Jim Kossin about it but haven’t yet heard back. Any guesses as to how to interpret this passage are welcomed in the comments.
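For what it’s worth, here is one guess at the intended reading, sketched with the paper’s headline numbers (the interpretation is mine; the authors may mean something else): the 80th-percentile threshold always has the top fifth of storms above it, and the 1 °C rise shifts the whole distribution upward, so the count above a fixed speed grows.

```python
# One guess at the Elsner et al. arithmetic (my interpretation, using the
# paper's headline numbers; the paper may intend something different).
cyclones_per_year = 85  # roughly 85 tropical cyclones globally per year
above_80th_pct = round(0.2 * cyclones_per_year)  # top 20% = ~17 storms/yr

# Today: the 80th percentile sits at 49 m/s, so ~17 storms/yr exceed 49 m/s,
# but only 13 storms/yr exceed the higher bar of 51 m/s.
exceed_51_today = 13

# After a 1 C SST rise the distribution shifts so that 51 m/s becomes the new
# 80th percentile, i.e. ~17 storms/yr now exceed the fixed 51 m/s threshold.
exceed_51_warmed = above_80th_pct

increase = (exceed_51_warmed - exceed_51_today) / exceed_51_today
print(f"{exceed_51_today} -> {exceed_51_warmed} per year ({increase:.0%} increase)")
```

On this reading there is no transposition: 13 is the current count of storms above 51 m/s, 17 is the count after the shift, and (17 - 13)/13 gives the paper’s 31%. But I stress this is a guess, pending a reply from the authors.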

Finally, scientists commenting on the paper show their tribal affiliations all too readily. Climate scientist Judy Curry commented on the new trend analysis:

“It’ll be pretty hard now for anyone to claim that cyclone activity has not increased,” says Judith Curry, an atmospheric researcher at the Georgia Institute of Technology in Atlanta.

The only problem is that when another paper (PDF) came out a few years ago with opposite conclusions, Curry criticized it as having too short a data record:

35 years is marginally short to identify a statistically significant trend (people who criticized our study because the length of the data record is too short raised a legitimate point). 20 years is definitely too short. The reason for this is that both the atlantic and pacific have large multidecadal modes. if you pick a period that is too short, what you are seeing is one piece of the mode.

Perhaps detection of some trends requires a shorter time period than others ;-). Actually, the Elsner et al. dataset shows no significant trends in the most intense storms for the period 1986-2006, according to a response to a query I made to Jim Kossin.

Given that Elsner et al. make no mention of what the IPCC calls “internal” or “natural” variability (PDF), the analysis will be little more than a Rorschach Test for its readers. As the IPCC notes (PDF, p. 667 ff.):

An identified change is ‘detected’ in observations if its likelihood of occurrence by chance due to internal variability alone is determined to be small.

Elsner et al. say nothing about how the observed trend might be interpreted in the context of variability. So, setting aside concerns about the data or methods and accepting their findings as reported, there is no basis for assessing whether they have found one part of a lower frequency variation or have “detected” (in IPCC parlance) a trend.

What does all this nonsense, incoherence, and inconsistency say about the debate over hurricanes and climate change? The answers are not going to be clear or settled anytime soon. However, if you’ve already made up your mind, or prefer a certain outcome, there is plenty of available science out there to buttress those views.

The very public theater in which the hurricane-climate debate is playing out sure is entertaining, to paraphrase Richard Tol, but it has done very little for the practice of science.

RMS vs. AIR on Gustav Damages

September 3rd, 2008

Posted by: Roger Pielke, Jr.

The two leading catastrophe modeling firms, RMS and AIR, have published very different cost estimates for the insured losses associated with Hurricane Gustav, with the RMS estimates at twice that of AIR.

From AIR:

Catastrophe risk modeling firm AIR Worldwide Corporation estimates that insured losses to onshore properties in the U.S. are between $2 billion and $4.5 billion, with an expected loss of $3 billion.

From RMS:

Insured losses from Hurricane Gustav could range between $4 billion and $10 billion, according to initial estimates based on the latest information from catastrophe risk experts, Risk Management Solutions (RMS). . . Insured losses for onshore damage to residential and commercial properties, as well as business interruption, are estimated to be between $3 billion and $7 billion.

Neither estimate includes losses associated with flooding due to levee damage or rainfall, which are typically not insured (by the private sector).

The fact that the two leading catastrophe modeling firms could differ by [UPDATE: as much as] 100% in their estimates, where each has access to the exact same storm data and exposure information, should serve as a reminder of the fundamental imprecision of this art. I’ll make a note to see where the final insured claims come in, and see whose estimate fared better. The historical analogues from our normalization dataset would suggest AIR, but we shall see.

ScienceDebate’s Success – Finally Joining the 20th Century

September 2nd, 2008

Posted by: admin

Given my criticism of ScienceDebate 2008 both here and on ScienceBlogs, it seems appropriate to note that they managed to get something in response from one of the campaigns.  The Obama campaign has responded to the 14 questions that ScienceDebate has posed to both campaigns.  The McCain campaign responses are expected soon.

I’m not interested in comparing the responses until both campaigns’ submissions are in.  What I think is worth pointing out is that this big breakthrough is a big breakthrough only for the science advocacy community not connected to biomedical science.  Most other advocacy groups trying to bend the ear of candidates have been producing voter guides (different from the ones from your friendly state elections organization) and similar documents for years as a means to mobilize their voters.  Here are some examples of what I mean.  Even the technology community is way ahead of the physical scientists.

How different are the responses to these questions from a voter guide put out by another issue group?  For whom is this really new and different?  This is why I think that, through this one small change, this advocacy community has finally joined the 20th century.  Before candidates really pay attention to these issues, there’s a very long way to go.

ScienceDebate 2008 was (and is) thinking small.  They can do better.  I know it’s almost foreign to mention technology in this particular crowd, but when political campaigns are running circles around you with social networking technology, you should be embarrassed.  The tools are there.  The question should be why they aren’t being used, not why candidates aren’t paying attention.

Yohe and Lomborg

September 2nd, 2008

Posted by: Roger Pielke, Jr.

Last week I had a post up titled “Yohe vs. Lomborg” focusing on an exchange in the Guardian between Gary Yohe and Bjorn Lomborg. Now, are you sitting down? If not, the following might knock you over.

Gary and Bjorn have decided to discuss their differences and write a joint op-ed for the Guardian that has just been published. Can you imagine? Two people who disagree on climate change policies actually working together in the public eye. Shocking behavior. But admirable and exemplary. Here is how they start:

After a very public debate on the Guardian website over the past few weeks, we have learned some lessons. Here are a few that come to mind. First, even carefully-crafted prose can be misunderstood. Throwaway lines, even those inserted parenthetically, can carry as much weight as key sentences. Second, attributing motive is dangerous, distracting, and frequently wrong; it should be avoided, not only because it has consequences for us as individuals, but also because it easily distracts attention from the value of our analytical work.

To the extent that we have both been guilty of imprecision and attribution of motive in the heat of this debate, we are happy to report that cooler heads have prevailed. We recognise that despite our differences in view, we respect each other’s commitment to robust public debate informed by different perspectives. To that end we both have agreed to forswear recent comments, and to wipe the slate clean. With this essay we’d like to show how people who disagree on policy options can still agree to collaborate productively, even under the hot glare of the very public, and very political, debate over climate change.

Good for both Gary and Bjorn, and also for the rest of us. While some folks clearly want to see a brawl based on tribal loyalties, the view we hold here is that policy discussions are so much more useful as discussions. Please read the full text of their joint piece, and feel free to come back here and discuss.

[UPDATE: Richard Tol in the comments has this observation:

. . . climate change is not a policy issue. It’s entertainment for the degreed masses. Bjorn and Gary shook hands in public, and lost their entertainment value. They’ll be replaced by new champions for the cause in no time.]