The post is about a JetBlue plane that, due to bureaucratic risk-minimization, was not able to take the most sensible course of action when a mechanical malfunction happened -- instead, it put its crew and passengers at more risk, because the people in positions of authority on the ground didn't want to make the decision that would leave them holding responsibility for whatever calamity could have (but didn't) happen during landing. Britt relates a few other points on bureaucracy and its limitations as a mechanism for efficient collective action, but the most interesting part of the post for me was this quote:
"it's rarely the initial problem that bites an aviator in the ass. It's the second problem, combined with the first, that then spins off harmonics of woe and wisps of chaos."

Even leaving aside the wonderfully poetic turn of phrase there -- harmonics of woe indeed! -- that's a beautifully true statement. Most systems are tolerant of disruption along one axis at a time; after all, when engineers build something, they typically think through the common individual failure modes and account for each in the design. Natural systems exhibit the same characteristic: because selective pressures tend to arrive one at a time, constellations of resilience -- traits that guard against several simultaneous failures -- usually take longer to evolve.
When two things go wrong at once, that's when the shit really hits the fan.