Prediction in Science and Policy

February 20th, 2007

Posted by: Roger Pielke, Jr.

In the New York Times today Cornelia Dean has an article about a new book by Orrin Pilkey and Linda Pilkey-Jarvis on the role of predictions in decision making. The book is titled Useless Arithmetic: Why Environmental Scientists Can’t Predict the Future.

Here is an excerpt from the book’s description at Columbia University Press:

Writing for the general, nonmathematician reader and using examples from throughout the environmental sciences, Pilkey and Pilkey-Jarvis show how unquestioned faith in mathematical models can blind us to the hard data and sound judgment of experienced scientific fieldwork. They begin with a riveting account of the extinction of the North Atlantic cod on the Grand Banks of Canada. Next they engage in a general discussion of the limitations of many models across a broad array of crucial environmental subjects.

The book offers fascinating case studies depicting how the seductiveness of quantitative models has led to unmanageable nuclear waste disposal practices, poisoned mining sites, unjustifiable faith in predicted sea level rise rates, bad predictions of future shoreline erosion rates, overoptimistic cost estimates of artificial beaches, and a host of other thorny problems. The authors demonstrate how many modelers have been reckless, employing fudge factors to assure “correct” answers and caring little if their models actually worked.

A timely and urgent book written in an engaging style, Useless Arithmetic evaluates the assumptions behind models, the nature of the field data, and the dialogue between modelers and their “customers.”

Naomi Oreskes offers the following praise quote:

Orrin H. Pilkey and Linda Pilkey-Jarvis argue that many models are worse than useless, providing a false sense of security and an unwarranted confidence in our scientific expertise. Regardless of how one responds to their views, they can’t be ignored. A must-read for anyone seriously interested in the role of models in contemporary science and policy.

In an interview the authors comment:

The problem is not the math itself, but the blind acceptance and even idolatry we have applied to the quantitative models. These predictive models leave citizens befuddled and unable to defend or criticize model-based decisions. We argue that we should accept the fact that we live in a qualitative world when it comes to natural processes. We must rely on qualitative models that predict only direction, trends, or magnitudes of natural phenomena, and accept the possibility of being imprecise or wrong to some degree. We should demand that when models are used, the assumptions and model simplifications are clearly stated. A better method in many cases will be adaptive management, where a flexible approach is used, where we admit there are uncertainties down the road and we watch and adapt as nature rolls on.

I have not yet read the book, but I will.

Orrin participated in our project on Prediction in the Earth Sciences in the late 1990s, contributing a chapter on beach nourishment. The project resulted in this book:

Sarewitz, D., R.A. Pielke, Jr., and R. Byerly, Jr., (eds.) 2000: Prediction: Science, decision making and the future of nature, Island Press, Washington, DC.

Our last chapter can be found here in PDF.

5 Responses to “Prediction in Science and Policy”

  1. margo Says:

    “We should demand that when models are used, the assumptions and model simplifications are clearly stated.”

    What an idea?! You mean it’s not fine to just say: order all 50 references and figure out which equations in each of those references are used in our code?

    I have to admit, with GCMs, all the models and assumptions don’t always fit within the page limits of peer-reviewed articles. So, I’d be satisfied if articles in peer-reviewed journals discussing GCMs simply cited a Ph.D. thesis or government laboratory report — neither of which has page limits — that described the assumptions and model simplifications used during the computations of the results presented.

    This is done in other areas.

  2. Jim Clarke Says:

    “A better method in many cases will be adaptive management, where a flexible approach is used, where we admit there are uncertainties down the road and we watch and adapt as nature rolls on.”

    This is precisely why some of us have much more faith in the ability of a free market (under the rule of law) to lead to productive and peaceful solutions to problems than in government legislation and regulation prompted by computer models. A free market is constantly watching and adapting in a way that no limited group or individual ever could, no matter how ‘enlightened’ they believe themselves to be.

    The success of any society is found largely in the freedom of its people to make their own decisions and suffer their own consequences.

    See:

    http://www.townhall.com/columnists/WalterEWilliams/2007/02/07/world_poverty

    Certainly there is a role for government, (as outlined in the preamble to the US Constitution), but it should not be viewed as the ultimate source of solutions. When we place that burden on our elected officials, we force them to turn to things like computer models and ‘experts’ for guidance, ultimately leading to very poor decisions, improper regulations and a gradual decline of the society.

    I am ordering this book today, but it sounds like it will only reinforce my world view and support the conclusions I have already reached.


  3. Jim Says:

    I think “predict vs. adapt” debates tend to follow a similar storyline, and they tend, in their own limited way, to become politicized.

    Let me give you an analogy to the climate modeling debate. In the 1980s and 1990s there was a huge movement in US manufacturing away from forecast-based production planning, using a modeling approach called MRP / DRP, toward what was viewed as a more “Japanese” approach of Just-in-Time (JIT) manufacturing, in which no attempt to forecast was made and the factory simply responded to orders as they arrived. At least this was the most theological version of the approach. Each side had partisans, and both dug into their positions.

    Eventually common sense prevailed: the hubris of the MRP / DRP models in the face of massive uncertainty was demonstrated through careful analysis by (reasonably) objective analysts, and the usefulness of at least simple predictive models was also demonstrated for specific contexts. Hybrid models, often with unique modifications for different applications, were developed for most plants.

    What always turns out to be the required discipline to resolve these disputes is empirical tracking of predictive accuracy as directly linked to economics.

    That is what seems to me to be missing from the discussion of climate models. Given that the feedback effects (i.e., the contribution of the GCMs) are something like 60–70% of the total IPCC ‘consensus’ CO2 sensitivity estimate, they are pretty central to the AGW debate. I think that the climate modeling community in this analogy is basically the MRP / DRP crowd, and the Pilkeys are analogs of the early JIT proponents. The Pilkeys clearly have the high ground in the debate right now, because predictive modelers not subject to rigorous external audit always overestimate their own predictive capabilities. There surely are tons of systematic errors to be discovered. That said, the Pilkeys’ “all models suck and are evidence of mental defect” hyperbole will eventually be seen as over-correction.

  4. Dan Says:

    I thought that environmental scientists stopped predicting years ago, at least the ones who graduated after about 1990. We learned to make projections instead of predictions.

    This book sounds more like it laments engineers believing their models and selling them to the client; and the authors themselves seem to share this view, given their ‘adaptive management’ point. It’s certainly what I learned in grad school and what my peers and I do today.

  5. Dan Hughes Says:

    Professor Oreskes has previously published some of her views on modeling here:

    http://www.sciencemag.org/cgi/content/abstract/263/5147/641