Sowing the Seeds of Resilience

“All of the interesting systems (e.g. transportation, healthcare, power generation) are inherently and unavoidably hazardous by their own nature. The frequency of hazard exposure can sometimes be changed but the processes involved in the system are themselves intrinsically and irreducibly hazardous. It is the presence of these hazards that drives the creation of defenses against hazard that characterize these systems.” How Complex Systems Fail: Being a Short Treatise on the Nature of Failure; How Failure is Evaluated; How Failure is Attributed to Proximate Cause; and the Resulting New Understanding of Patient Safety. Richard I. Cook, MD (2000).

“Inside of Utopia, all the seeds of ambition, of faction, are rooted out with all the other vices…. The union of the citizens being thus highly consolidated within, the excellence and energy of its institutions defend the republic against the dangers from without.” Utopia, Sir Thomas More (1516).

In the August blog of this series, Agile Type 1 and Agile Type 2 innovation ecosystems were postulated. Agile Type 1 ecosystems were imagined as ideal: ecosystems in which a rapid flow (or ‘diffusion’, to use a more traditional term) of ideas, solutions, knowledge, and so forth occurs through a system and its networks. Agile Type 1 ecosystems would be capable of rapid self-organization, highly responsive to changes in the system environment, and able to respond efficiently to errors and external shocks. It was also suggested that Agile Type 2 innovation ecosystems are more vulnerable than the ideal Agile Type 1, but much closer to reality.

Dr. Richard Cook is a physician at the University of Chicago’s Cognitive Technologies Laboratory who has analyzed and written extensively about the failure of complex systems. Let’s look into Dr. Cook’s research from How Complex Systems Fail, cited on the Cognitive Technologies Laboratory website, to see what it tells us about Agile Type 2 innovation ecosystems and about how we should build innovation ecosystems that will withstand all the ills that complex adaptive systems are heir to.

First, we have to accept that complex systems are intrinsically hazardous systems and, as noted in the quote above, “It is the presence of these hazards that drives the creation of defenses against hazard that characterize these systems” – that is, defenses against failure emerge within complex systems. In innovation ecosystems such defenses might be culture, knowledge, trust, diversity and openness, and various forms of physical and intellectual resources and capacity.

This brings to mind Ashby’s Law, also known as the Law of Requisite Variety, which states that “the variety in the (network) control system must be equal to or larger than the variety of the perturbations in the system in order to achieve control.” In other words, if you are being attacked, having many options is an effective strategy for managing the threat – as US President Kennedy proposed in his 1961 ‘flexible response’ defense policy. Conversely, tightly controlled (not so agile) systems designed to operate efficiently under prevailing conditions, with too many strong links and too few weak ones, reduce communications and become unresponsive to external shocks, leading to instability or even collapse. This again shows the value of perturbations. However, it’s worth remembering that it is also a feature of complex systems that small changes may give rise to disproportionately large consequences.
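To make the arithmetic behind Ashby’s Law concrete, here is a minimal Python sketch (our own illustration, with numbers chosen purely for demonstration; it is not taken from Ashby or Cook). A regulator that can choose among R distinct responses, facing D distinct disturbances, can at best collapse those disturbances into ceil(D / R) distinct outcomes; only when its response repertoire matches the disturbance variety can it force a single, controlled outcome.

```python
import math

# Minimal sketch of the counting argument behind the Law of Requisite
# Variety: each of a regulator's responses can steer at most one
# disturbance into any given outcome class, so outcome variety can
# never fall below ceil(disturbances / responses).

def best_case_outcome_variety(num_disturbances: int, num_responses: int) -> int:
    return math.ceil(num_disturbances / num_responses)

for responses in (1, 2, 5, 10):
    print(f"{responses:>2} responses vs 10 disturbances -> "
          f"at least {best_case_outcome_variety(10, responses)} outcomes")
# Prints 10, 5, 2, 1: full control (a single outcome) is achievable
# only when the regulator's variety matches the perturbations' variety.
```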

In fact, perturbations are necessary for ecosystem networks to survive. We may think of this as an innovation ecosystem needing a constant flow of energy throughout its networks. Networks with many weak links allow perturbations to be dissipated, so the system remains intact, as the toy simulation below illustrates. Incidentally, our September Blog cited investigations indicating that the speed at which an innovation moves through a network increases when there is a “greater number of errors, experimentation, or unobserved payoff shocks in the system” (also called noise or variability). More about this next month.
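The stabilizing role of weak links can be shown with a toy cascade model (a hypothetical sketch of ours, not from the Blog series; the node count, link strengths, and 0.5 failure threshold are arbitrary). A shock at one node spreads along its links in proportion to link strength, and any node whose accumulated load crosses the threshold fails and propagates in turn. Splitting the same total coupling across many weak links lets the perturbation dissipate, while a few strong links let it cascade:

```python
import random

def cascade_size(n, links_per_node, strength, threshold=0.5, seed=1):
    """Return how many nodes fail after a unit shock at node 0."""
    random.seed(seed)
    # Random directed graph: each node sends `links_per_node` links,
    # all carrying the same `strength`.
    neighbours = {
        i: random.sample([j for j in range(n) if j != i], links_per_node)
        for i in range(n)
    }
    load = {i: 0.0 for i in range(n)}
    failed, frontier = set(), [0]      # unit perturbation at node 0
    load[0] = 1.0
    while frontier:
        node = frontier.pop()
        if node in failed:
            continue
        failed.add(node)
        for nb in neighbours[node]:    # a failed node passes load onward
            load[nb] += strength
            if load[nb] >= threshold and nb not in failed:
                frontier.append(nb)
    return len(failed)

# Same total coupling per node (2 x 0.6 = 12 x 0.1 = 1.2), split differently:
print("few strong links :", cascade_size(200, links_per_node=2,  strength=0.6))
print("many weak links  :", cascade_size(200, links_per_node=12, strength=0.1))
```

With a few strong links each failure immediately tips its neighbours, so the shock sweeps through much of the network; with many weak links no single failure is enough to tip a neighbour, and the perturbation is absorbed at its source.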

Dr. Cook also suggests that “Human practitioners are the adaptable element of complex systems”, optimizing the system’s productive capacity and reducing its vulnerability to failure. We know that adaptability is a feature of complex systems. Adaptation may be catalyzed by early detection of changes in system performance and by the provision of new paths for recovering from perturbations and shocks; as we have seen, the presence of weak links helps here. Adaptation makes systems more resilient, that is, better able to bounce back from internal confusion or external disturbances, subject to the ever-present constraints of finite time and resources.

We will end this month with another finding from Dr. Cook’s investigations into accidents ranging from aircraft crashes to errors in hospital patient care, namely that “Hindsight biases post-accident assessments of human performance.” This means that once the outcome of an event, or more likely a series of events, leading to an accident (or, for ecosystems, a collapse due to shocks) is known, an after-the-event analysis is frequently inaccurate or misleading. Knowledge of the outcome reduces our ability to re-create stories from the viewpoint of those involved. For example, we might say of some event, “surely they should have known that such and such a policy would lead to problems.” Several of the Blogs in this series have promoted the learning benefits of extracting re-usable knowledge components from descriptive cases, i.e. stories. So how could hindsight bias, in constructing an ex post facto narrative, affect the learning value of these re-usable knowledge facets? I’m not sure. It’s worth thinking about, perhaps in the context of previous discussions of causality in these Blogs.

We can all think of many system examples of hazards and resilience ranging from the disintegration of Communism in Europe to companies which were ill prepared for technological change, such as Kodak’s slow response to digital photography. Cities and regions – clearly complex systems – have experienced the consequences of Ashby’s Law where a major local employer or even an entire industry has declined, reduced employment due to improved production technologies, or moved elsewhere. Even Thomas More’s Utopia might have eventually collapsed from a lack of weak links and consequent poor resiliency – if not from boredom.

Next time: Is noise good for us?

All blogs in this series can be found at
