As I write, the well beneath the sunken Deepwater Horizon rig is once again gushing unchecked into the Gulf as BP tries to install a new cap that could finally stop the flow. A recurrent theme in the discussion of this massive spill is that we shouldn’t trust “fail-safe” technologies or the experts who reassure us that catastrophes cannot happen. Naomi Klein wrote in the Guardian that “This Gulf coast crisis is about many things – corruption, deregulation, the addiction to fossil fuels. But underneath it all, it’s about this: our culture’s excruciatingly dangerous claim to have such complete understanding and command over nature that we can radically manipulate and re-engineer it with minimal risk to the natural systems that sustain us. But as the BP disaster has revealed, nature is always more unpredictable than the most sophisticated mathematical and geological models imagine.” Klein quotes Carolyn Merchant, a professor at the University of California, Berkeley and a noted proponent of deep ecology, as saying: “The problem as BP has tragically and belatedly discovered is that nature as an active force cannot be so confined. Unpredictable, chaotic events [are] usual [in ecological systems].”

Roger Witherspoon writes about the “Myth of Technological Infallibility” underlying the arrogance and hubris that led President Obama, on April 2, 2010, to offer his tragically ill-timed assurance that opening up offshore oil exploration was safe: “It turns out, by the way, that oil rigs today generally don’t cause spills. They are technologically very advanced.” EPA Administrator Lisa Jackson explained in a May 24 press conference that there was no federal oversight of emergency plans because “we were told over and over by the industry that it could not happen. So we have few tools out there.”

Witherspoon, like many others, linked the oil spill to the unknown dangers of rushing headlong into a new era of nuclear energy in an effort to curb carbon emissions. He argues that the U.S. Nuclear Regulatory Commission shares the same mindset: it underestimates risks and is too close to the industry it regulates. The NRC has, for example, belatedly recognized terrorism as a threat, but decreed that commercial nuclear operators need not plan for such an event, because the risks cannot be assessed and preventing terrorism is a federal responsibility.

Oil rigs and nuclear power plants are highly complex technical systems in which the failure of one component can cascade into a larger-scale disaster. As oil rigs push into ever deeper waters to tap high-pressure deposits, it becomes harder to assess the risks and build in adequate margins of safety. But these are not just engineering challenges: the oil and nuclear industries are woven into organizational, economic, and political systems; their technologies and production practices are shaped by market forces, bureaucratic operating procedures, and regulatory agencies. They are complex dynamic systems whose behavior becomes unpredictable when certain thresholds are crossed, just like the climate and the economy (as I discussed in A Tale of Two Meltdowns).
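To make the notion of threshold-crossing cascades concrete, here is a toy simulation in Python. It is purely illustrative – the network, thresholds, and parameters are my own invention, not a model of any actual rig or reactor. Each component is coupled to a few others and fails once the fraction of its failed neighbors reaches a threshold; how tight that coupling is determines whether a single random failure stays local or sweeps the whole system.

```python
import random

def simulate_cascade(n=200, avg_degree=4, threshold=0.3, seed=0):
    """Toy threshold-cascade model on a random network: a component
    fails once the fraction of its failed neighbors reaches `threshold`.
    Returns the total number of failed components."""
    rng = random.Random(seed)
    # Build a sparse random graph with roughly avg_degree links per node.
    neighbors = {i: set() for i in range(n)}
    for _ in range(n * avg_degree // 2):
        a, b = rng.randrange(n), rng.randrange(n)
        if a != b:
            neighbors[a].add(b)
            neighbors[b].add(a)
    failed = {rng.randrange(n)}  # a single initial random failure
    changed = True
    while changed:  # propagate failures until the system settles
        changed = False
        for i in range(n):
            if i in failed or not neighbors[i]:
                continue
            if len(neighbors[i] & failed) / len(neighbors[i]) >= threshold:
                failed.add(i)
                changed = True
    return len(failed)

# Robust components (high threshold) absorb the shock; make each one
# only slightly more sensitive to its neighbors and the same single
# failure routinely takes down most of the system.
for threshold in (0.5, 0.3, 0.2):
    sizes = [simulate_cascade(threshold=threshold, seed=s) for s in range(20)]
    print(f"threshold={threshold}: mean failures {sum(sizes)/len(sizes):.0f} of 200")
```

The specific numbers mean nothing; the qualitative point is that tight coupling alone can turn a routine local failure into a system-wide one, without any single component being unusually fragile.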

Detailed case studies of disasters by organizational sociologists reveal a common pattern in how complex technologies interact with organizational processes and routines, hierarchical power structures, pressures to cut costs, and lax oversight. Together, these can lead to inertia, distorted cognition, the neglect of warning signals, and poor decisions. In engineering-intensive organizations, there is often a hyper-masculine culture in which expressions of concern about risk are treated with scorn. Normal Accidents, Charles Perrow’s classic study of the nuclear accident at Three Mile Island, concluded that catastrophic accidents are “normal” in highly complex socio-technical systems: even the most carefully designed safety systems cannot always prevent the interaction of human and technological failures from cascading into major calamities. Perrow describes in vivid detail the managerial pressures to ignore risks in order to stay on schedule and keep costs under control. He found that the information available to decision makers was inadequate, delayed, sometimes inaccurate, and often misinterpreted under crisis conditions. And when people do intervene, their actions frequently have unanticipated effects that make matters worse. Diane Vaughan’s analysis in The Challenger Launch Decision revealed very similar dynamics.

Perrow concluded that the unpredictability of complex systems makes the risks of nuclear power fundamentally unmanageable, and there are voices expressing the same attitude toward deep-sea drilling. But do we have to embrace the deep ecology position that nature “cannot be so confined”? Nature is reliably confined and controlled in the combustion chambers powering cars, planes, and electric power generation. The economy cannot be precisely controlled, but it can be steered. Of course, relying on historical experience to guide decisions about low-probability, high-impact events can lead us to underestimate risks, especially when technologies are pushing new frontiers.

But there are no absolutes here: the question is always how reliable the systems are, and what the consequences of catastrophic failure would be, in time and geographic reach. Failures cannot be eliminated from complex systems, but they can be managed down to tolerable levels. Lean production systems employ statistical process control and input from workers to improve quality and reliability, from the component level to the whole production process. In my doctoral thesis work, I studied how this approach could stabilize international supply chains that had been subject to chaotic disruptions. The Federal Aviation Administration examines airplane safety records and mandates technical as well as procedural changes.
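For readers unfamiliar with statistical process control, here is a minimal sketch of the core idea behind a Shewhart control chart (the data and the 3-sigma rule of thumb are illustrative assumptions, not drawn from any particular production line): estimate the process mean and spread from a known-good baseline, then treat any later measurement outside roughly three standard deviations as a signal to stop and investigate, not a fluctuation to ignore.

```python
import statistics

def control_limits(baseline):
    """Estimate Shewhart 3-sigma control limits from in-control baseline data."""
    mean = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return mean - 3 * sigma, mean + 3 * sigma

def out_of_control(measurements, lcl, ucl):
    """Return the indices of measurements outside the control limits."""
    return [i for i, x in enumerate(measurements) if not lcl <= x <= ucl]

# Hypothetical example: drilling-mud density readings (arbitrary units).
baseline = [14.1, 14.0, 14.2, 13.9, 14.1, 14.0, 14.2, 14.1, 13.8, 14.0]
lcl, ucl = control_limits(baseline)
new_readings = [14.1, 14.0, 13.9, 15.2, 14.1]  # the 15.2 should trip the alarm
print(f"limits: [{lcl:.2f}, {ucl:.2f}], alerts at indices: {out_of_control(new_readings, lcl, ucl)}")
```

The same logic scales from a single gauge to a whole supply chain: define what “in control” looks like, and treat excursions as information rather than noise.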

In general, too little attention has been paid to the non-technical aspects of risk management: the economic and organizational pressures and the wider governance systems. Yet the overall safety record of risky technologies is not bad. More than 4,000 offshore oil platforms operate routinely in the Gulf of Mexico alone, and it had been over 30 years since the previous major offshore blowout. France has operated 59 nuclear reactors for decades without a major catastrophe. Roughly 50,000 commercial flights operate each day around the world.

If BP succeeds in installing the new cap and staunching the oil flow in the next week or two, this will count as a major regional disaster, but not necessarily one that should prevent all offshore drilling in the future. Within a couple of years, bacteria will have digested most of the oil, and life will return to the coastal regions. There are technological, political, and economic lessons to be learned, and with a bit of luck, we could go another fifty years until the next big blowout.

The risks associated with oil are modest compared with those of nuclear power and weapons production. Even if power plants can be operated safely, the waste disposal problem remains stubborn. The New York Times recently reported that “the amount of plutonium buried [at] Hanford Nuclear Reservation in Washington State is nearly three times what the federal government previously reported.” Plutonium production stopped at the 560-square-mile site in the 1980s, and cleanup has barely begun, because nobody seems to know exactly what was dumped where, or how to deal with the contaminated soil. Plutonium is highly toxic and can slowly seep into groundwater and the Columbia River. With a half-life of 24,000 years, it needs to be contained for eons, during which civilizations, languages, and the climate will all undergo profound shifts.
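A back-of-the-envelope calculation using the standard exponential decay law shows what that half-life implies (the one-thousandth target is an arbitrary illustration, not a regulatory standard):

```latex
% Quantity remaining after time t, given half-life T_{1/2}:
\[
  N(t) = N_0 \, 2^{-t/T_{1/2}}, \qquad T_{1/2} \approx 24{,}000\ \text{years}
\]
% Time for the inventory to decay to one-thousandth of its initial amount:
\[
  t = T_{1/2} \log_2 1000 \approx 24{,}000 \times 9.97 \approx 240{,}000\ \text{years}
\]
```

For comparison, written language itself is only about 5,000 years old.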

Rumors are circulating on the web that the BP blowout could trigger a massive release of methane, unleashing tsunamis and toxic gas clouds that would devastate the region. A frightening scenario indeed, but the most credible report I can find does not see this as a serious threat. The real risk is that we get back to the business of safely pumping and burning oil and gas as usual, pushing the climate through critical thresholds and triggering global, irreversible changes.

The deep ecology position itself carries hidden dangers. It reflects the same deep populist distrust of scientific expertise that has animated climate deniers (and which this week’s report clearing the University of East Anglia scientists of major wrongdoing will do little to allay). And the claim that the existing order is “natural and sacred” has traditionally been used by elites to justify the status quo. Progressive politics demands that we “denaturalize” our systems of production and governance, our assumptions about hierarchy, and both our faith in and our fears of science and technology. It requires that we stay alert to the ways in which science and technology are embedded in social, economic, and political systems, and that we actively manage those systems through the transition to a more socially, economically, and environmentally sustainable order.