This is such an unusual post, which I ran into at SmallDeadAnimals. It is about more than just engineering; it reminded me all the way through of the classical theory of the cycle. Its title is How Complex Systems Fail, and while it is a lesson for engineers, economists would do well to pay attention as well. I will highlight some of the 18 listed points, but do look at it all. And if you are an economist, replace the words “catastrophe” and “accident” with the words “recession” and “downturn” and the point should spring from the page.
11) Actions at the sharp end resolve all ambiguity.
Organizations are ambiguous, often intentionally, about the relationship between production targets, efficient use of resources, economy and costs of operations, and acceptable risks of low and high consequence accidents. All ambiguity is resolved by actions of practitioners at the sharp end of the system. After an accident, practitioner actions may be regarded as ‘errors’ or ‘violations’ but these evaluations are heavily biased by hindsight and ignore the other driving forces, especially production pressure. . . .

3) Catastrophe requires multiple failures – single point failures are not enough.
The array of defenses works. System operations are generally successful. Overt catastrophic failure occurs when small, apparently innocuous failures join to create opportunity for a systemic accident. Each of these small failures is necessary to cause catastrophe but only the combination is sufficient to permit failure. Put another way, there are many more failure opportunities than overt system accidents. Most initial failure trajectories are blocked by designed system safety components. Trajectories that reach the operational level are mostly blocked, usually by practitioners. . . .

6) Catastrophe is always just around the corner.
Complex systems possess potential for catastrophic failure. Human practitioners are nearly always in close physical and temporal proximity to these potential failures – disaster can occur at any time and in nearly any place. The potential for catastrophic outcome is a hallmark of complex systems. It is impossible to eliminate the potential for such catastrophic failure; the potential for such failure is always present by the system’s own nature. . . .

8) Hindsight biases post-accident assessments of human performance.
Knowledge of the outcome makes it seem that events leading to the outcome should have appeared more salient to practitioners at the time than was actually the case. This means that ex post facto accident analysis of human performance is inaccurate. The outcome knowledge poisons the ability of after-accident observers to recreate the view of practitioners before the accident of those same factors. It seems that practitioners “should have known” that the factors would “inevitably” lead to an accident. Hindsight bias remains the primary obstacle to accident investigation, especially when expert human performance is involved.