“…your scientists were so preoccupied with whether or not they could, they didn't stop to think if they should.”
–Dr. Ian Malcolm, Jurassic Park
Large, complex organizations like BP would do well to heed the Law of Unintended Consequences. This is the notion, attributed to sociologist Robert K. Merton, that any complex system will produce outcomes that cannot be foreseen by the actors in that system. It is not, as BP's risk managers would claim, that there is a tolerably low probability that something undesirable (e.g., an explosion on a deepwater oil rig) will occur; rather, the complexity of the system means that something undesirable is certain to occur.
David Brooks, in a New York Times op-ed column about the Deepwater Horizon explosion, writes:
…the real issue has to do with risk assessment. It has to do with the bloody crossroads where complex technical systems meet human psychology.

Over the past decades, we’ve come to depend on an ever-expanding array of intricate high-tech systems. These hardware and software systems are the guts of financial markets, energy exploration, space exploration, air travel, defense programs and modern production plants.

These systems, which allow us to live as well as we do, are too complex for any single person to understand. Yet every day, individuals are asked to monitor the health of these networks, weigh the risks of a system failure and take appropriate measures to reduce those risks.
One explanation for why unintended consequences occur is that people fail to see every aspect of a problem. Stephen Haines, in an article in the March/April issue of Training Magazine, likens this to trying to solve a Rubik’s Cube by arranging all of one color on one side and then moving on to the next color. The puzzle can’t be solved this way. Yet that is how we tend to approach complex situations: we deal with one element at a time rather than looking at the system as a whole. And, as Brooks suggests, no single person is capable of understanding the total system.
In examining the lessons learned from another disaster, the explosion of the space shuttle Challenger, Malcolm Gladwell concludes that if we can’t handle the unintended consequences, maybe we shouldn’t be in that business. He writes:
What accidents like the Challenger should teach us is that we have constructed a world in which the potential for high-tech catastrophe is embedded in the fabric of day-to-day life. At some point in the future, for the most mundane of reasons and with the very best of intentions, a NASA spacecraft will again go down in flames. We should at least admit this to ourselves now. And if we cannot, if the possibility is too much to bear, then our only option is to start thinking about getting rid of things like space shuttles altogether.
However, the Law of Unintended Consequences isn’t only about the interaction between humans and technology. Any complex social organization is subject to it. Dubner and Levitt, the authors of Freakonomics, point out that the system for doctors treating deaf patients, the system of debt relief every seventh (sabbatical) year under Jewish law, and the system for protecting endangered species have all produced effects opposite to those intended. Any large company can count on the fact that its policies and procedures will have effects that cannot be anticipated. For example, lines of communication and authority, while important in managing productivity, can also be barriers that keep the right information from reaching the right people at the right time, which can have disastrous consequences.