A review by mburnamfink
Normal Accidents: Living with High Risk Technologies - Updated Edition by Charles Perrow

3.0

Normal Accidents is a monument in the field of research into sociotechnical systems, but while eminently readable and enjoyable, it is dangerously under-theorized. Perrow introduces the 'Normal Accident': the idea that within complex and tightly coupled sociotechnical systems, catastrophe is inevitable. The addition of more oversight and more safety devices merely adds new failure modes and encourages the operation of existing systems with thinner margins for error. Perrow provides numerous examples from areas of technology like nuclear power, maritime transport, chemical processing, spaceflight, and mining. What he does not do is adequately explain why some systems are to be regarded as inherently unsafe (nuclear power) while others have achieved dramatic increases in safety (air travel).

Perrow defines complexity as the ability of a single component in a system to affect many other components, and tight coupling as the property of having close and rapid associations between changes in one part of the system and changes in another. The basic idea is that errors in a single component cascade to other parts of the system faster than operators can detect and correct them, leading to disaster. In some cases, this is incontrovertible: a nuclear reactor has millisecond relationships between pressure, temperature, and activity in the core, all controlled by a plumber's nightmare of coolant pipes, and there is little operators can do in an emergency that doesn't potentially vent radioactive material to the environment. However, it seems to me that complexity and tight coupling are a matter of analytic frames rather than facts: both can be increased or reduced by zooming in or out, and the choice of where the boundaries of a system lie can always be debated. My STS reading group is looking at alternative axes for analyzing systems, but I'd note that the systems that seem particularly accident-prone are distinctly high energy (usually thermal or kinetic, or the potential forms of either). When something that is heated to several hundred degrees, could catch fire and explode, is moving at hundreds of miles per hour, or is the size of a city block does something unexpected, it's no surprise that the results are disastrous. And whatever Perrow might recommend, there is no industrial civilization without high energy systems.

One major problem is that Perrow's predictions of recurring nuclear disasters simply haven't come true. Three Mile Island aside, there hasn't been another major American nuclear disaster. Chernobyl could fairly be described as management error: while the reactor was an inherently unsafe design, it was pushed beyond its limits by an untrained crew as part of an ill-planned experiment. Fukushima was hardly 'normal', in that it took a major earthquake, a tsunami, and a series of hydrogen explosions to destroy the plant. The 1999 afterword on Y2K is mostly hilarious in retrospect.

Perrow rightly rails against 'operator error', the most frequently cited cause of accidents. Blaming operators shields the more powerful and wealthy owners and designers of technological systems from responsibility, while the operators themselves are often conveniently dead and unable to defend themselves in court. The problem is that his alternative is the normal accident: a paralyzing realization that we must simply live with an incredible amount of danger and risk all around us. Normal Accidents offers some useful, if frequently impractical, advice for creating systems that are not dangerous, but more often it tends to encourage apathy and complacency.

After all, if accidents are "normal", we should get used to glowing in the dark.