Does the Truth Matter?

February 1st, 2007

Posted by: Roger Pielke, Jr.

Here are seven paragraphs from the conclusion to Alan Mazur’s excellent book True Warnings and False Alarms: Evaluating Fears about the Health Risks of Technology, 1948-1971 (Resources for the Future, 2004, pp. 107-109; buy a copy here). The concluding subsection is titled “Does the Truth Matter?”

Mazur distinguishes between a “knowledge model” and a “politics model” for understanding public debates involving science. These distinctions are somewhat (but not entirely) related to the concepts of the “linear model” and “interest group pluralism” that I discuss in my forthcoming book, which is really about how to reconcile the fact that there are elements of both models in the reality of decision making. Neither of Mazur’s models alone accurately describes how the world works; we need both. Some of the more useful debates and discussions following my testimony this week reflected a paradigm clash between those who view the world through the lens of the “knowledge model” and those – like me – who accept that the “politics model” also reflects some fundamental realities. Here is the excerpt:

In a democracy, the people or their representatives are free to spend public money as they see fit. Interest groups compete to channel funding to their favorite causes. If U.S. society chooses to allot far more money to cleaning up toxic waste sites, which harm few people, than to preventing teenagers from smoking, which creates an enormous health burden, that is our privilege as a nation.

Still, many risk analysts are disturbed when we fail to maximize the number of lives saved per dollar of risk remediation. They point out that actions taken by government to avoid the consequences of an alleged hazard are often unrelated to the severity or scientific validity of the hazard (EPA 1982; Breyer 1993; Graham and Wiener 1995; Mazur 1998). The inference is that policy should be better aligned with science, and that irrational or inefficient elements of policymaking should be eliminated (but see Mazur 1995 and Driesen 2001 for limitations on this position).

Yet public policy does not always flow directly from scientific knowledge. A value-laden subjective decision is always involved, one that requires weighing pros and cons, costs and benefits, winners and losers. A wise policy choice for one party with certain interests may not be the wisest choice for a party with different interests. These considerations raise a question: does scientific evaluation of a warning matter at all?

Essentially two models show how science is applied to public policy. The first – call it the “knowledge model” – assumes that scientists can obtain approximately true answers to their research questions with methods that are fairly objective. This knowledge is used to inform public policy. For example, scientists can determine the health risks from exposure to fluoride at levels adequate to prevent cavities. Policy makers then use this finding as one factor in deciding whether to add fluoride to community drinking water. Such decisions cannot follow from facts alone, but facts ought to influence outcomes. If health risk is high, that should help shift the decision against fluoridation; if low, that should encourage fluoridation. The model makes no sense to anyone who denies that science can find correct answers.

The second model – the “politics model” – can be applied whatever one’s view concerning the objectivity of science. Here partisans use scientific findings as political capital to sway policy in the direction they prefer. If such partisans favor fluoridation, they will claim there is little health risk; if they oppose it, they claim a high health risk. It makes no difference if the findings are correct, objective, or honest as long as they are persuasive. The actors bury findings that work against their position, or attack them as invalid or inapplicable. In the politics model, scientific claims are used polemically, just like any other kind of political argumentation (Mazur 1998; Brown 1991).

The politics model has many proponents. Partisans in a particular controversy often see their goal as sufficiently important to justify any interpretation of scientific data that is favorable to their cause. During breaks from writing this final chapter, I am reading John McPhee’s (1971) laudable biography of David Brower, a major environmentalist of the postwar period. McPhee repeatedly describes Brower’s habit of making up “facts” to support his arguments against industrialists and developers. The biographer seems to regard this as an endearing tactic of the “archdruid” in his advocacy for wilderness preservation. Like McPhee, we sympathize with those who fight the good fight, accepting their argumentation when in other contexts it would be vexing.

But the politics model loses its appeal if applied to the entire array of technical controversies affecting policy. Science that is sufficiently malleable to serve any position in one controversy can serve any position in all controversies, and in that event science does not matter at all. The famous parable of “the tragedy of the commons” tells how each shepherd maximized his own herd’s grazing on the village green until no grass remained for anyone (Hardin 1968). In the same way, if each technical expert interprets data for his or her own convenience, with no attempt at objectivity, there will be no experts left with unimpeachable credibility, and we will all suffer for it. [emphasis added]

3 Responses to “Does the Truth Matter?”

  1. Richard Belzer Says:

    Both the “knowledge” model and the “politics” model are unsatisfying. Knowledge by itself either conveys nothing about values, upon which both personal and public decision making depend, or it hides its values behind curtains of expertise. The “politics” model permits — no, invites — the cultivation and use of falsehood for instrumental purposes. That path is morally corrupt.

    During my stint in government my job was to ensure that White House decision makers had the best available policy-neutral information, if they wanted it. It was not my job to make decisions; rather, it was to inform decisions that were made.

    Using science to inform decisions had a number of possible outcomes. First is the case where informing decision makers caused them to make decisions that were consistent with my values. Obviously this is the most satisfying outcome; it’s the one that hundreds (thousands?) of global climate change scientists hope for. But as scientists we tend to be blind to the values we bring to our research and analysis. So when decision makers choose what we want them to choose, it ratifies both our competence as scientists and the worth of our values.

    The second scenario arose when decision makers, once as fully informed as I could make them, made decisions that conflicted with my values.
    When this happens, it assaults one’s sense of scientific competence and questions the legitimacy of one’s values. Because we are incapable of sustaining challenges on both our scientific and moral integrity, we quickly project our anger and disgust upon the decision maker. It is HE (not me) who is scientifically inept, lazy or foolish. It is SHE (not me) whose values are corrupt, venal and damnable. The mere possibility, never mind high likelihood, that the decision maker’s values are as legitimate as our own — and quite possibly more so for having stood for election and won — simply does not occur to us.

    However difficult it is to adapt to life in this second scenario, it is nevertheless something we must do when we interact in the policy world. For if we do not, we end up with the third scenario.

    Sometimes decision makers have no interest in being informed; they may spend considerable effort to become misinformed, and work overtime to misinform others. This is the “politics” model in its ugliest glory. Lies become useful tools for prevailing in political battle.

    It is remarkably easy for scientists to succumb to the temptation to engage in the “politics” model at this base level. It satisfies a deep need for the vindication of values that decision makers have disrespected. But the price of values vindication is the sacrifice of scientific integrity.

    If decision makers want to make what we consider foolish or improper choices, but they have the legitimate authority to decide, it is our duty to provide accurate scientific information and correct scientific error but then back away. To maintain our credibility as scientists we must resist the temptation to color science with our policy preferences. In this way we can attenuate the worst features of the “politics” model.

    If we want to be decision makers we need to endure what decision makers must: stand for election, or seek Executive appointments that confer the authority to decide. But let’s be clear: that is not a scientific job; it’s a political one.

  2. Cortlandt Says:

    Roger,

    Just from the section quoted, it seems that in the definition of the politics model Mazur includes everything from a certain degree of “cherry picking” to outright lying, without distinction. Does Mazur consider a pathological and a non-pathological version of the politics model? Or is the knowledge model the sole non-pathological option?

  3. Roger Pielke, Jr. Says:

    Cortlandt- Thanks. Unfortunately, Mazur does not pursue this argument further. My book picks up the question from here . . .