Prediction and Yucca Mountain
Global Climate Change and Society
Summary
As the United States moves into the 21st century, it continues to struggle to safely dispose of the hazardous byproducts of the nuclear age: spent nuclear fuel (SNF) from commercial, defense, and research nuclear reactors and high-level waste (HLW) from the construction of nuclear weapons. In response to this pressing problem, Congress enacted the Nuclear Waste Policy Act (NWPA) of 1982, which directed the Department of Energy (DOE) to characterize possible sites for a permanent, deep geologic repository. Amendments to the NWPA in 1987 restricted further study to the Yucca Mountain site. In addition, the NWPA mandated that the Environmental Protection Agency (EPA) draft health and safety standards for any permanent nuclear waste repository and made the Nuclear Regulatory Commission (USNRC) responsible for approving a construction license for future repositories that adhere to EPA standards.
In this regulatory climate, the DOE has commissioned numerous scientific studies by national laboratories and private contractors to evaluate the viability of a permanent repository at Yucca Mountain. In addition to the great deal of empirical work performed at the site, much recent scientific study has relied on complex computer simulations. These simulations seek to predict the performance of both the natural barrier, which consists of the volcanic tuff surrounding the repository, and the engineered barrier, composed of a steel case and nickel-alloy casket, over the next 10,000 years. Specifically, these models must demonstrate that the repository will safely store waste according to EPA guidelines, which set specific limits on radiation doses due to exposure to contaminated groundwater and other sources of radionuclides.
After 20 years of intense scientific and political debate over these and other independent studies, it remains to be seen whether Yucca Mountain will ever house its apportioned 77,000 tons of highly radioactive material. Despite President George W. Bush’s approval of the site in February 2002 and an act of Congress that overrode the disapproval of the State of Nevada, the USNRC has up to four years to decide whether Yucca Mountain will receive a construction license. Much of the future fight over the issue will be political and, as such, will focus on the scientific work performed by the DOE and other federal and private organizations. For politicians to make balanced and well-considered decisions, they must understand the nature and limits of the relevant scientific work. Even more importantly, policy makers must understand how scientific predictions are currently used to assess repository performance and what implications this methodology has for future decisions.
The current methodology uses predictive science to the near exclusion of other approaches to evaluating repository performance, and this leaves any evaluation the DOE performs open to scrutiny on a number of fronts. Primarily, DOE performance assessments can be criticized for the level of uncertainty their predictions exhibit, whether that uncertainty is the product of an incomplete scientific understanding of the processes involved in waste transport or of a limited ability to measure or estimate certain statistical parameters. The uncertainty related to empirical measurements and statistical parameters in the computer simulations, known as aleatory uncertainty, can be characterized within a reasonable degree of accuracy. On the other hand, uncertainty due to incomplete scientific understanding, often termed epistemic uncertainty, is nearly impossible to estimate. Consequently, DOE computer simulations are vulnerable to the questions, “Do we really understand what’s going on here?” and “Do we really know how uncertain we are?” Even more distressing is the fact that continued empirical and theoretical research might not reduce the total level of uncertainty. A certain degree of aleatory uncertainty is unavoidable, and because no computer simulation predicting events 10,000 years in the future can be rigorously validated, DOE simulations could continue to stir scientific and political debate indefinitely.
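The distinction between the two kinds of uncertainty can be illustrated with a toy Monte Carlo calculation of the general kind performance assessments employ. This is a hypothetical sketch, not the DOE's actual model: the dose formula, parameter names, and probability distributions below are invented for illustration. Sampling the uncertain inputs propagates aleatory uncertainty into a spread of predicted doses; the choice of the model equation itself, which no amount of sampling can test, is where the epistemic uncertainty lives.

```python
import random
import statistics

def dose_model(permeability, retardation, inventory=1.0):
    """Toy dose estimate: higher rock permeability raises the predicted
    dose, higher chemical retardation lowers it. Purely illustrative;
    any real model would involve far more physics. Trusting this
    equation at all is an epistemic commitment."""
    return inventory * permeability / retardation

def monte_carlo_dose(n_samples=10_000, seed=42):
    """Sample the uncertain parameters from assumed distributions
    (the aleatory component) and collect the spread of predictions."""
    rng = random.Random(seed)
    doses = []
    for _ in range(n_samples):
        permeability = rng.lognormvariate(mu=0.0, sigma=0.5)
        retardation = rng.uniform(5.0, 15.0)
        doses.append(dose_model(permeability, retardation))
    return doses

doses = sorted(monte_carlo_dose())
mean_dose = statistics.fmean(doses)
p95 = doses[int(0.95 * len(doses))]
print(f"mean dose: {mean_dose:.4f}, 95th percentile: {p95:.4f}")
```

A regulator comparing the 95th-percentile dose against a standard is implicitly trusting that the sampled distributions, and the model itself, capture reality; the spread of outputs quantifies only the aleatory part.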
Moreover, the emphasis on predictive science as an effective indicator of repository performance reflects a trend in US radioactive waste management that assumes any problem in waste disposal can be solved through technological fixes and predictive science. This presumption has played a part in the poor performance history of US radioactive waste management and has also contributed to a pervasive lack of trust among the nuclear power industry, environmental advocates, state governments, and the DOE. [1] Over the years, the US government and the DOE have made major improvements in the way SNF and HLW disposal is managed. Yet the question remains whether current DOE methodology, which still relies on rigid technological solutions and predictive certainty, can achieve the goal of disposing of SNF and HLW in a socially acceptable manner. Is there an alternative methodology that can use science in such a way that repository performance and the perception of repository safety are not contingent upon predictive certainty? By contrasting the US SNF and HLW disposal program with foreign programs and analyzing the US methodology to date, organizations such as the National Research Council (part of the National Academy of Sciences) and the Nuclear Energy Agency have identified an alternative waste disposal approach that makes greater use of qualitative scientific understanding and has had greater success in cultivating trust between national governments and populations near proposed repositories.
Termed “adaptive staging,” this methodology breaks the process of siting and constructing a repository into distinct stages. [2] At the end of each stage, scientific knowledge is digested and evaluated, the safety of the emerging repository is assessed, and a decision on how to proceed is made based on the results of previous work. This approach rejects the strategy of completely specifying the technical requirements of a repository at the outset and instead advocates letting the design develop gradually through “systematic incremental learning.” [3] While a more rigid design process can produce a less expensive repository on a shorter timeline, adaptive staging offers more potential for stakeholder input, for early diagnosis of technical issues, and for building trust among the DOE, state governments, and concerned populations. One of the key distinctions between this approach and the one used by the DOE and its predecessors over the past 40 years is the way in which predictive scientific information is used.
While the previous DOE methodology relies on predictive science in the design and evaluation of waste repositories, adaptive staging acknowledges the inherent uncertainties of any prediction and focuses the design process on exploration rather than prediction. In addition, adaptive staging is more open to other types of scientific knowledge in the site evaluation process. For example, natural uranium deposits located in geological conditions similar to those of a proposed repository site can provide scientists with information on the long-term transport of radionuclides through geological strata and groundwater. This kind of information can complement numerical predictions of site performance and enhance public confidence in repository safety. Finally, adaptive staging focuses more on maximizing repository performance and cultivating redundancies. Thus, a failure of computer simulations to correctly predict the behavior of any particular component of the system does not mean the repository cannot minimize the concentration of radionuclides in the surrounding environment.
As some scientists have emphasized, the recommendation of Yucca Mountain as the future disposal site for SNF and HLW and the application for a construction license do not settle whether Yucca Mountain will ever house nuclear waste or what form any repository there might take. [4] Over the next four years, the DOE and USNRC will have an opportunity to take further steps toward permanently disposing of the glut of SNF and HLW. These organizations must choose between two strategies for accomplishing their goals. They can continue to apply a linear, rigid framework that specifies a repository design, timeline, and performance criteria in advance, or they can move toward an adaptive staging approach, in which repository design and performance criteria evolve through incremental learning. The former approach allows the DOE to claim that it will build a compliant and affordable repository in a timely manner, a claim that is necessary for a number of political reasons. The latter methodology provides fewer assurances at the outset but may generate more trust between the DOE and a variety of stakeholders, which in the end might result in the construction of a more socially accepted SNF and HLW disposal option.
[1] Managing the Nation’s Commercial High-Level Radioactive Waste. (Washington, DC: U.S. Congress, Office of Technology Assessment, OTA-O-171, March 1985).
[2] Principles and Operational Strategies for Staged Repository Systems: Progress Report. (Washington, DC: National Academy of Sciences, Board on Radioactive Waste Management of the National Research Council, March 2002).
[3] Ibid.
[4] Apted, M.J., et al. 2002. “Yucca Mountain: Should We Delay?” Science 296: 659-660.