Monday, October 24, 2016

SC137-6

http://www.resilience.org/stories/2016-10-16/deepwater-horizon-and-our-emerging-normal-catastrophes

Deepwater Horizon and our emerging 'normal' catastrophes

While watching the recently released film "Deepwater Horizon" about the catastrophic well blowout in the Gulf of Mexico that caused the largest oil spill in U.S. history, I remembered the term "fail-dangerous," which I first encountered in correspondence with a risk consultant for the oil and gas industry.

We've all heard the term "fail-safe" before. Fail-safe systems are designed to shut down benignly in case of failure. Fail-dangerous systems include airliners, which don't merely halt in place benignly when their engines fail, but can crash to the ground in a ball of fire.

For fail-dangerous systems, we believe either that failure is unlikely or that the redundancy we've built into the system will be sufficient to avert failure or at least minimize the damage. Hence, the large amount of money spent on airline safety. This all seems very rational.

But in a highly complex technical society made up of highly complex subsystems such as the Deepwater Horizon offshore rig, we should not be so sanguine about our ability to judge risk. On the day the offshore rig blew up, executives from both oil giant BP and Transocean (which owned and operated the rig on behalf of BP) were aboard to celebrate seven years without a lost time incident, an exemplary record. They assumed that this record was the product of vigilance rather than luck.

And, contrary to what the film portrays, the Deepwater Horizon disaster was years in the making as BP and Transocean created a culture that normalized behaviors and decision-making which brought about not an unavoidable tragedy, but rather what is now termed a "normal accident"--a product of normal decisions by people who were following accepted procedures and routines.

Today, we live in a society full of "normal accidents" waiting to happen that will be far more catastrophic than the Deepwater Horizon tragedy. One of those "accidents" is already in progress, and it's called climate change.

People in societies around the globe are doing what they are supposed to be doing, what they routinely do, to stay alive, produce and enjoy what they produce. They do not think of themselves as doing something which is bringing about the biggest "accident" of our time, climate change. No one set out to change the climate. And yet, this is the result of our normalized behavior.

Climate change still appears to many to be building slowly. This summer was hotter than last summer and the one before that. But we've coped. We stay inside in air-conditioning on especially hot days--ironically so, since the fossil fuels burned to generate the electricity for the air-conditioner add to the warming itself.

It is as if we are all on the Deepwater Horizon just doing our jobs. We notice there are a few things wrong. But, we've dealt with them before, and we can deal with them again. The failures and the breakdowns are accepted as just part of how we do business. And we've managed to avoid anything truly bad up to now. So, we conclude, we must be doing things safely.

Part of the normalization of our response to climate change is the spread of renewable power sources. I have long supported the rapid deployment of renewable power, suggesting that we need the equivalent of a warlike footing to deploy enough to bring about serious declines in fossil fuel use. And, while renewable energy is growing by leaps and bounds, it is not growing nearly fast enough to meet the challenges of climate change.

And yet, society at large has relaxed into the idea--promoted by the industry--that renewable energy is well on its way to powering our entire society, despite the fact that more than 80 percent of our energy still comes from fossil fuels. We have normalized this response as adequate in the public mind. There remains no generalized alarm about climate change.

Certainly, there are scientists, activists and others who are genuinely alarmed and believe we are not moving nearly fast enough. But this alarm has not translated into aggressive policy responses.

The argument that things have worked out just fine in the past so there is no reason to believe they won't work out in the future is a well-worn one. And, it seems to be valid because so many people say it is. (Stephen Colbert might even say that this assertion has a certain "truthiness" to it.)

But there is a reason that financial prospectuses say that past performance is no guarantee of future results. Likewise, an accident-free past is no guarantee of an accident-free future. It is in the structure of how we behave that the risks build. The tipping point finally reveals that we have been doing risky things all along.

If you play Russian roulette with a gun having 100 chambers, you won't think that skill had anything to do with the fact that you aren't dead after five pulls. But if you don't know you are playing Russian roulette (hidden dangers with hidden connections), then the fact that you aren't dead after 50 pulls (50 repetitions of the hidden dangerous conduct) won't seem like luck, but simply the result of sound procedure.
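To put rough numbers on that analogy, here is a minimal sketch in Python--my own illustration, not anything from the film or the article--assuming a 100-chamber cylinder that is re-spun before every pull, so each pull carries an independent 1-in-100 chance of disaster:

def survival_probability(pulls, chambers=100):
    """Chance of surviving `pulls` independent trigger pulls."""
    # Each pull has (chambers - 1) safe outcomes out of chambers total.
    per_pull_survival = 1 - 1 / chambers
    return per_pull_survival ** pulls

for pulls in (5, 50):
    print(f"Chance of surviving {pulls} pulls: {survival_probability(pulls):.1%}")

# Prints roughly:
# Chance of surviving 5 pulls: 95.1%
# Chance of surviving 50 pulls: 60.5%

In other words, under those assumptions surviving even 50 pulls is the most likely single outcome with no skill and no sound procedure at all--which is precisely why an unblemished record tells us so little about the risks we are actually running.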

Climate change, of course, isn't the only place where we have normalized procedures which appear to be reducing risk, when, in fact, we are increasing it. Our monocrop farms and the small variety of major crops grown on them using modern industrial farming methods are supposed to reduce the risk of major crop losses and thus of famine. In fact, these methods are depleting the soil and undermining its fertility in ways that will ultimately lower farm productivity. And monocrop farming is an invitation to widespread crop loss. Polyculture tends to prevent the spread of devastating plant diseases while monoculture tends to promote that spread.

We can talk about the normalization of industrial fishing as well. It is designed to increase our harvest of food to feed growing human populations thereby reducing our risk of food shortages and giving us another source of nutrition. In fact, industrial fishing practices are threatening the viability of practically every fishery around the world.

In addition, temporarily cheap oil and natural gas are lulling us into complacency about our energy supplies. Energy depletion, which high prices seemed to signal just two years ago, is rarely discussed now. We are projecting the current moment into the future and believing that the rising energy price trend of the last 15 years is meaningless.

Practically everything we do to reduce risks to human populations now creates broader, longer-term risks that could turn catastrophic. The Slate article linked above references the "high-reliability organization." Such organizations, which seek to avoid catastrophic failures, share certain common characteristics:

1) Preoccupation with failure: To avoid failure we must look for it and be sensitive to early signs of failure.

2) Reluctance to simplify: Labels and clichés can stop one from looking further into the events.

3) Sensitivity to operations: Systems are not static and linear, but rather dynamic and nonlinear in nature. As a result, it becomes difficult to know how one part of the organization's operations will behave compared to another.

For our global system as a whole to act like a high-reliability organization, we would have to turn away from technopian narratives that tell us we will always come up with a new technology that will solve our problems including climate change--while forcing us to change our lives very little.

Instead, we would anticipate and scan for possible failure, no matter how small, to give us warning about perils to our survival. There are plenty of signs flashing warnings to us, but we have not fully comprehended their gravity.

When it comes to energy supplies, we are often faced with simplifying assertions like those mentioned above, assertions designed to keep us from examining the topic further. People in the oil industry like to say that the "resource is huge." They don't tell you that "resource" simply refers to what is thought--on sketchy evidence--to be in the ground. What is actually available to us at today's prices and with today's level of technology is a tiny fraction of that resource.

The recent bankruptcy of one of the world's largest ocean freight companies has given us a window into the outsized effects of a failure in just a small portion of our complex system of worldwide logistics.

If we had run our society as a high-reliability organization, we would have heeded warnings made decades ago. I like to tell people that the American public first learned that oil was a finite resource when Clark Gable told them so near the end of the 1940 film "Boom Town"--a speech remarkable for its time.

American leadership found out that we would have to make a transition to a non-fossil fuel economy way back in 1954 in Harrison Brown's widely read The Challenge of Man's Future--and, that such a transition would be fraught with peril if not begun early enough.

Other warnings included The Limits to Growth in 1972, a book widely misunderstood as predicting rather than modeling our predicament. More recently, there was Jared Diamond's Collapse.

In general, what we as a society have chosen to do is to create narratives of invincibility, rather than heed these warnings. We are, in effect, normalizing highly risky behavior.

Perhaps our biggest failure is noted in item three above. We think of the world we live in as static and linear rather than dynamic and nonlinear. That has given us a false sense that things move gradually and predictably in our world, the same false sense that led to the Deepwater Horizon disaster.
