Failure in Systems Thinking: 10 Quotes
“A complex system can fail in an infinite number of ways.” (John Gall, “General Systemantics: How Systems Work, and Especially How They Fail”, 1975)
“A complex system that works is invariably found to have evolved from a simple system that worked. The inverse proposition also appears to be true: A complex system designed from scratch never works and cannot be made to work. You have to start over, beginning with a working simple system.” (John Gall, “General Systemantics: How Systems Work, and Especially How They Fail”, 1975)
“The system always kicks back. — Systems get in the way — or, in slightly more elegant language: Systems tend to oppose their own proper functions. Systems tend to malfunction conspicuously just after their greatest triumph.” (John Gall, “Systemantics: The Underground Text of Systems Lore”, 1986)
“Physical systems are subject to the force of entropy, which increases until eventually the entire system fails. The tendency toward maximum entropy is a movement to disorder, complete lack of resource transformation, and death.” (Stephen G Haines, “The Manager’s Pocket Guide to Systems Thinking & Learning”, 1998)
“It is no longer sufficient for engineers merely to design boxes such as computers with the expectation that they would become components of larger, more complex systems. That is wasteful because frequently the box component is a bad fit in the system and has to be redesigned or worse, can lead to system failure. We must learn how to design large-scale, complex systems from the top down so that the specification for each component is derivable from the requirements for the overall system. We must also take a much larger view of systems. We must design the man-machine interfaces and even the system-society interfaces. Systems engineers must be trained for the design of large-scale, complex, man-machine-social systems.” (A Wayne Wymore, “Systems Movement: Autobiographical Retrospectives”, 2004)
“[…] in cybernetics, control is seen not as a function of one agent over something else, but as residing within circular causal networks, maintaining stabilities in a system. Circularities have no beginning, no end and no asymmetries. The control metaphor of communication, by contrast, punctuates this circularity unevenly. It privileges the conceptions and actions of a designated controller by distinguishing between messages sent in order to cause desired effects and feedback that informs the controller of successes or failures.” (Klaus Krippendorff, “On Communicating: Otherness, Meaning, and Information”, 2009)
“Pragmatically, it is generally easier to aim at changing one or a few things at a time and then work out the unexpected effects, than to go to the opposite extreme. Attempting to correct everything in one grand design is appropriately designated as Grandiosity. […] A little Grandiosity goes a long way. […] The diagnosis of Grandiosity is quite elegantly and strictly made on a purely quantitative basis: How many features of the present System, and at what level, are to be corrected at once? If more than three, the plan is grandiose and will fail.” (John Gall, “The Systems Bible: The Beginner’s Guide to Systems Large and Small” [Systemantics, 3rd Ed.], 2011)
“Complex systems seem to have this property, with large periods of apparent stasis marked by sudden and catastrophic failures. These processes may not literally be random, but they are so irreducibly complex (right down to the last grain of sand) that it just won’t be possible to predict them beyond a certain level. […] And yet complex processes produce order and beauty when you zoom out and look at them from enough distance.” (Nate Silver, “The Signal and the Noise: Why So Many Predictions Fail-but Some Don’t”, 2012)
“If an emerging system is born complex, there is neither leeway to abandon it when it fails, nor the means to join another, successful one. Such a system would be caught in an immovable grip, congested at the top, and prevented, by a set of confusing but locked-in precepts, from changing.” (Lawrence K Samuels, “Defense of Chaos: The Chaology of Politics, Economics and Human Action”, 2013)
“Although cascading failures may appear random and unpredictable, they follow reproducible laws that can be quantified and even predicted using the tools of network science. First, to avoid damaging cascades, we must understand the structure of the network on which the cascade propagates. Second, we must be able to model the dynamical processes taking place on these networks, like the flow of electricity. Finally, we need to uncover how the interplay between the network structure and dynamics affects the robustness of the whole system.” (Albert-László Barabási, “Network Science”, 2016)
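The three ingredients Barabási names (the network's structure, the dynamics running on it, and their interplay) can be illustrated with a toy threshold-cascade model. This sketch is our own illustration, not code from the book: each failed node sheds one unit of load onto each live neighbor, and a node fails once its accumulated load exceeds its capacity.

```python
def cascade(adj, capacity, initial_failure):
    """Propagate a failure cascade on a graph given as an adjacency dict.

    Each failed node pushes one unit of load onto every live neighbor;
    a node fails when its accumulated load exceeds its capacity.
    Returns the set of failed nodes.
    """
    load = {n: 0 for n in adj}
    failed = {initial_failure}
    frontier = [initial_failure]
    while frontier:
        nxt = []
        for node in frontier:
            for nbr in adj[node]:
                if nbr in failed:
                    continue
                load[nbr] += 1
                if load[nbr] > capacity[nbr]:
                    failed.add(nbr)
                    nxt.append(nbr)
        frontier = nxt
    return failed

# A small hub-and-spoke structure: node 0 is connected to everyone.
adj = {0: [1, 2, 3, 4], 1: [0, 2], 2: [0, 1], 3: [0, 4], 4: [0, 3]}

fragile = {n: 0 for n in adj}  # every node fails at the first unit of load
robust = {n: 2 for n in adj}   # every node tolerates two units of load

print(len(cascade(adj, fragile, 0)))  # 5: the whole network collapses
print(len(cascade(adj, robust, 0)))   # 1: the failure stays local
```

Identical structure, identical initial failure: only the dynamical parameter (node capacity) differs, yet one run collapses the whole network while the other stays contained, which is exactly the structure-dynamics interplay the quote points to.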
More quotes on “Failure” at the-web-of-knowledge.blogspot.com.