Reviews

Normal Accidents: Living with High-Risk Technologies by Charles Perrow

mattbnz's review

3.0

I read this as part of a reading club at work, since it's commonly claimed that our complex computer systems exhibit many of the same sorts of failure modes and system incidents described in this book.

I found it an interesting read, and there are certainly some core truths and useful concepts presented in the book, primarily in the first half, but the latter portions, particularly the chapters on recombinant DNA, didn't really resonate with me at all.

The age of the book is clearly evident in the examples and language. It doesn't necessarily get in the way of the message, but an updated edition wouldn't hurt either!

venturecrapitalism's review against another edition

5.0

Thought-provoking! I work in behavioral health crisis prevention and was pleasantly surprised at how applicable the contents of this book were to my work. In the hands of another author, this material could easily turn into a dense specialist-only tome, but I found it remarkably readable.

jimcaserta's review against another edition

5.0

This is an excellent read and extremely relevant today, given the pandemic we're experiencing. I was pointed to this book by Zeynep Tufekci as a way to think about 'systems', how things are related, and how failures can cascade. The original book is from 1984, and as I was reading I thought, 'what about Bhopal and Chernobyl?' Then I got to the afterword, and there they are! This is definitely a book for which retrospect helps provide context. The Y2K treatment in the afterword is good; it is ironic that the dot-com crash, not Y2K, would be the event that caused a recession.

The book is somewhat dense, with lots of acronyms. I read it on Hoopla, and it was not easy to reference the acronym helper at the end of the book. I'd love to see this book revisited again, bringing in the 737 Max failure as well as the overall coronavirus response. I would have liked to have kept better notes as I read, but I was reading and discussing with a friend. The book can be terrifying at times, and if you have fears of nuclear war, airplane crashes, or other disasters, I would not read this book. I thoroughly enjoyed this and learned a lot from it.

ktimmers's review against another edition

3.0

Amazingly accessible for what it is. Boiled down, the idea is that we are now creating systems so complicated that they will interact with themselves or their surroundings in ways we can't predict or quickly solve. Interesting premise… but it was eerie reading it almost 40 years later!

I enjoyed reading about some of the crazy sea accidents especially. But this is a bit long for what I got out of it.

ewelshie's review against another edition

challenging informative slow-paced

4.25

Chapter 3 in particular is a little slow to get through, since it's mostly defining terms. But Perrow manages to mesh together a variety of real-world examples into a convincing argument.

mburnamfink's review against another edition

3.0

Normal Accidents is a monument in the field of research into sociotechnical systems, but while eminently readable and enjoyable, it is dangerously under-theorized. Perrow introduces the 'Normal Accident': the idea that within complex and tightly coupled sociotechnical systems, catastrophe is inevitable. The addition of more oversight and more safety devices merely adds new failure modes and encourages the operation of existing systems with thinner margins for error. Perrow provides numerous examples from areas of technology like nuclear power, maritime transport, chemical processing, spaceflight and mining. What he does not adequately explain is why some systems are to be regarded as inherently unsafe (nuclear power) while others have achieved such dramatic increases in safety (air travel).

Perrow defines complexity as the ability of a single component in a system to affect many other components, and tight coupling as a characteristic of having close and rapid associations between changes in one part of the system and changes in another part. The basic idea is that errors in a single component cascade to other parts of the system faster than operators can detect and correct them, leading to disaster. In some cases, this is incontrovertible: a nuclear reactor has millisecond relationships between pressure, temperature, and activity in the core, all controlled by a plumber's nightmare of coolant pipes, and there's little operators can do in an emergency that doesn't potentially vent radioactive material to the environment. However, it seems to me that complexity and tight coupling are a matter of analytic frames rather than facts: complexity and coupling can be increased or reduced by zooming in or out, and the choice of where the boundaries of a system lie can always be debated. My STS reading group is looking at alternative axes for analyzing systems, but I'd note that the systems that seem particularly accident-prone are distinctly high energy (usually thermal or kinetic, or the potential forms of either). When something that's heated to several hundred degrees, could catch fire and explode, is moving at hundreds of miles per hour, or is the size of a city block does something unexpected, it's no surprise that the results are disastrous. And whatever Perrow might recommend, there is no industrial civilization without high-energy systems.

One major problem is that Perrow's predictions of many nuclear disasters simply haven't come true. Three Mile Island aside, there hasn't been another major American nuclear disaster. Chernobyl could be fairly described as management error: while an inherently unsafe design, the reactor was pushed beyond its limits by an untrained crew as part of an ill-planned experiment before the disaster. Fukushima was hardly 'normal', in that it took a major earthquake, tsunami, and a series of hydrogen explosions to destroy the plant. The 1999 afterword on Y2K is mostly hilarious in retrospect.

Perrow rightly rails against 'operator error' as the default explanation for accidents. Blaming operators shields the more powerful and wealthy owners and designers of technological systems from responsibility, while operators are often conveniently dead and unable to defend themselves in court. The problem is that his alternative is the normal accident: a paralyzing realization that we must simply live with an incredible amount of danger and risk all around us. Normal Accidents offers some useful, if frequently impractical, advice for creating less dangerous systems, but more often it tends to encourage apathy and complacency.

After all, if accidents are "normal", we should get used to glowing in the dark.

garyboland's review

5.0

Absolutely essential reading for anyone who works in technology. A brilliant insight into how tightly coupled complex systems have accidents built into them: all that has to happen is that enough time passes, and they become inevitable (an accident waiting to happen). Highly recommend.

ericwelch's review

5.0

8/14/2011: I keep recommending this book, and with the BP disaster it continues to be very, very timely. One of the points made by Perrow is that when complex technology "meets large corporate and government hierarchies, lack of accountability will lead inexorably to destructive failures of systems that might have operated safely." (From a review of [b:A Sea in Flames: The Deepwater Horizon Oil Blowout|9678872|A Sea in Flames The Deepwater Horizon Oil Blowout|Carl Safina|http://ecx.images-amazon.com/images/I/51fFGw05k1L._SL75_.jpg|14566774] by Gregg Easterbrook in the NY Times, April 23, 2011.)

Note added 3/2/09: Perrow's discussion of the problems inherent in tightly coupled systems is certainly timely given the intricacies of the recent financial disaster. It is certainly an example of a tightly coupled system in which the failure of only one component can cause the entire system to collapse.
**
This is a totally mesmerizing book. Perrow explains how human reliance on technology and over-design will inevitably lead to failure, precisely because of the safety features designed into these systems. Good companion book for those who enjoy Henry Petroski.

Some quotes: "Above all, I will argue, sensible living with risky systems means keeping the controversies alive, listening to the public, and recognizing the essentially political nature of risk assessment. Unfortunately, the issue is not risk, but power; the power to impose risks on the many for the benefit of the few" (p. 306); and further on, "Risks from risky technologies are not borne equally by the different social classes [and I would add, countries]; risk assessments ignore the social class distribution of risk" (p. 310); and "The risks that made our country great were not industrial risks such as unsafe coal mines or chemical pollution, but social and political risks associated with democratic institutions, decentralized political structures, religious freedom and plurality, and universal suffrage" (p. 311).