Ogmius Newsletter

Ogmius Exchange: Part II

Thoughts on Catastrophic Terrorism in America

By Eugene Skolnikoff

It does not take much imagination to see the many ways in which science and technology can be used by terrorists, nor to realize their relevance to measures to protect against terrorism. That relationship has received much attention in the government, in the public, and in the scientific and engineering communities in the year since Sept. 11. Many fearful scenarios have been suggested, along with sober studies of the dangers we face. Much attention has been given to possible nuclear, biological, or chemical threats, with parallel discussion of the need for technologies to detect such attacks, protect against them, or mitigate their effects if they occur. The agenda is long and has resulted in science and technology being given a reasonably prominent place in the new Department of Homeland Security (DHS).

We must recognize, however, that a modern technological society is inevitably a vulnerable society. We have come to depend on large-scale technological systems (energy, communications, transportation, financial networks, and water and food distribution, among others) whose interruption would have highly disruptive effects. The vulnerability of these systems can be lessened, but not eliminated if the systems are to operate and serve their function. Moreover, that vulnerability can be attacked not only with sophisticated science or technology, but also with everyday technologies, as we saw so tragically a year ago.

That ability to use any technology as a weapon reflects one of the “bedrock” attributes of science and technology: all scientific and technological knowledge can be used for beneficial or malign purposes; all knowledge is “dual-use.”

There are other bedrock attributes also relevant to the terrorist threat. One is that knowledge inevitably spreads. Barriers can delay, but not prevent, the transfer of knowledge to those who seek it and have the competence and resources to assimilate and use it. Transfer may not be easy, and often it is quite difficult, but it cannot be permanently prevented.

Another attribute is that the most significant applications of a new technology may be far from the original intended purpose of its development. This is especially so when synergies among scientific disciplines and technologies give rise to applications not foreseen within the individual fields. The best example is the ubiquitous silicon-based chip, so essential to high-performance computers, which depends on materials science, lithography, computer science, and other disciplines for its design and manufacture.

All of these and related factors ought to put a different face on attempts to limit the knowledge available to “rogue” states or potential terrorists, and on the agenda for research and development. Disquieting signals have been appearing from Washington indicating an intention to discourage or prevent the publication of research results that are seen as having possible application to terrorist weapons, and to limit the fields in which foreign students may be engaged. There are legitimate questions about whether any restrictions ought to be applied to publication (e.g., the detailed design of nuclear weapons), but the recent difficult history, predating Sept. 11, of the implementation of rules for controlling unclassified information under the International Traffic in Arms Regulations (ITAR) makes it clear that it would be unwise to leave such matters to government decision alone and to implementation by national security agencies. Under the ITAR, onerous impediments to research in the space sciences have been imposed in the name of non-proliferation. The issues surrounding non-proliferation are real and important, but the regulations have been implemented with little apparent understanding of their effects on the research community or of the consequent implications for national security. Though the effects have so far been felt primarily in the space sciences, the ITAR lists other disciplines to which it could be applied.

Blanket restrictions on the openness of information and on the participation of qualified foreign students will not prevent information from reaching undesirable hands. The research community does, however, have an obligation to understand what controls, if any, should be instituted, and how they might be implemented. The recent policy adopted by the American Society for Microbiology to monitor questionable papers in the editorial review process may provide a viable model. But broader restrictions on information or people would only serve to reduce the vitality of the research and development enterprise in the US, a vitality that will be essential to meet the broadened agenda posed by the terrorist threat.

That agenda now includes a range of goals, from frontier molecular biology on ways to detect and counter new disease pathogens to less sophisticated research on how the nation’s vulnerabilities might be reduced or the effects of disruption minimized. The “simple” goals (to improve the nation’s public health system, to monitor the massive container traffic in the nation’s ports, or to build redundancy at reasonable cost into communications networks) may prove to be as important as the fundamental research in the laboratory. Whether what is required is esoteric research at the frontier or down-to-earth improvement and implementation of well-understood technologies, the quality of the R&D enterprise is essential. Over the years, we have learned what is needed to maintain that quality; we must not cripple it now through measures that may superficially appear appropriate but in fact damage the resource we are trying to protect, and that ultimately will lessen rather than strengthen national security.

Eugene B. Skolnikoff
Professor of Political Science Emeritus
Massachusetts Institute of Technology
ebskol@mit.edu