Ogmius Newsletter

The Cherry Pick

By Roger Pielke, Jr.

Lawyers get paid considerable sums to do it.  Journalists do it on a daily basis.   According to their critics, President George Bush and his Administration do it routinely.  I’m sure that I do it, and no doubt you do it as well.  Everyone does it, so it must be OK, right?

“It” refers to the cherry pick -- the careful selection of information to buttress a particular predetermined perspective while ignoring information that does not support it.  In other words, take the best and leave the rest.  An obvious example of the cherry pick is the allegation that the Bush Administration emphasized those pieces of intelligence that supported its desire to invade Iraq.  But if you look around, you’ll see it everywhere, embraced across ideological perspectives.  And you’ll see it a lot in debate over highly politicized issues related to science and technology.

For instance, the Union of Concerned Scientists recently released a report signed by numerous scientists, including a number of former officials in the Clinton Administration’s Office of Science and Technology Policy, asserting that the Bush Administration cherry picks science to support its ideological agenda.  In response, John H. Marburger III, science adviser to President Bush, suggested that the report itself selectively chose its evidence and did not accurately portray the Bush Administration’s use of science.

The cherry pick is appealing when one’s commitment to a particular course of action is based on considerations other than information, such as one’s political or religious ideology.  I see this in my courses on policy analysis when a student comes to me and asks something like the following: “I want to argue that the moon is made of green cheese, can you point me toward peer-reviewed articles that support this claim?”  When I ask the student, “What information would change your perspective on this topic?”, often the answer is “None!”

Consider, for example, President Bush’s reply when Diane Sawyer asked him to distinguish between Iraq possessing WMDs and Iraq merely having the potential to develop them in the future: “So what's the difference?”  The presence (or absence) of WMDs in Iraq in spring 2003 apparently was not a central factor in the President’s decision to go to war, even though their presence was central to the Administration’s public campaign in support of the war.  This illustrates how information is used selectively to market policy options to others in the hope that the information will affect their decision making.  Consider also Colin Powell, who apparently spoke out of step with the rest of the Administration (he was, in his own words, “a little forward on his skis”) when, referring to the decision to go to war, he suggested that the “absence of a [WMD] stockpile changes the political calculus; it changes the answer you get.”

How the media reports the latest findings -- scientific or otherwise -- facilitates the cherry pick.  The constant drip-drip-drip of studies and reports -- frequently embargoed by leading journals and agencies to enhance the appearance of newsworthiness -- is routinely followed by advocates of this or that perspective scrambling to issue press releases highlighting how the new finding vindicates their perspective and demolishes that of their opponents.

Consequently, the general public may be confused when reading this week that coffee causes cancer, because last week the media reported that coffee prevents cancer.  But this is how science, or intelligence gathering more generally, actually works.  The most recent study adds only a bit of information to a vast sea of previous research and knowledge, and consequently is rarely definitive.  Smoking guns are rare.  Often the most accurate appraisal of information is “we simply don’t know for sure.”  But decisions -- Drink coffee?  Invade Iraq?  Regulate emissions? -- have to be made anyway.

When faced with uncertainty we frequently base our commitments to particular actions on factors other than facts.  Familiar examples of strategies that we use include:  innocent until proven guilty, the precautionary principle, and the doctrine of preemption.  In such cases, “facts” are simply resources for marketing our preferences to others.  As Ivo Daalder and James Lindsay observe in their recent book on United States foreign policy, “Journalists and intellectuals often assume that beliefs are built on a foundation of facts.  This assumption is usually wrong… people generally come to their beliefs about how the world works long before they encounter facts.”

The cherry pick is facilitated by the world’s fundamental complexity and uncertainty and the corresponding diversity of information at our disposal.  A former aide to Vice President Dick Cheney explained in an interview with Seymour Hersh how the cherry pick works in practice: “There’s so much intelligence out there that it’s easy to pick and choose your case.  It opens things up to cherry-picking.”  This is a familiar situation to those who study the role of scientific information in decision making.  As Daniel Sarewitz observes, “Rather than resolving political debate, science often becomes ammunition in partisan squabbling, mobilized selectively by contending sides to bolster their positions.”

So what to do?  You can ask yourself the following question in any policy context to see if information actually matters to you:  “What information would lead me to change my commitment to a particular course of action?”  If the answer is “no information,” then watch out!  When you make arguments claiming that the “facts” dictate a certain course of action, you might be engaging in the cherry pick.

For policy making more broadly, if we want to avoid a debate in which competing political positions each rely on cherry-picked information, then we need to encourage leaders and institutions to support “honest information brokers” as a counterweight to special or partisan interests.  But such “honest brokers” need to do more than just focus on “facts;” they also need to place facts into policy context.  Rather than using information to narrow the scope of choice, as is the province of issue advocates, such brokers might contribute to the expansion of choice.  These types of honest brokers might play two important roles.

First, they could facilitate effective decision making by developing policy alternatives that are robust to information uncertainty.  Everyone who has a retirement account is familiar with the investment strategy of “diversification.”  Diversification seeks to accommodate uncertainties in future market performance by developing a portfolio that will perform well under a wide range of conditions.  Policies that are robust to information uncertainty might perform well no matter whose “facts” are eventually borne out.

Second, honest brokers foster democratic accountability.  There is a reason why intelligence organizations and scientists are expected to be independent of political pressures.  Yet, we also ask those same groups to provide information that is directly relevant to the needs of policy makers.  Simultaneously meeting the challenges of policy relevance and political independence is not easy.  Even though information often can’t dictate a single course of action, it can inform the scope of choice, and in some cases, point to paths unseen and alternatives untried.

Unlike any time before, our era is characterized by easy access to complex information.  The challenge before us is to develop the skills and tools necessary to use such information effectively in policy and politics.  Otherwise, information will play little or no role in decision making because everyone has access to information and experts that bolster their particular perspective.  Today, we need honest brokers more than ever.

Roger Pielke, Jr.
pielke@cires.colorado.edu