Mining for Side-Effects in Complex Adaptive Systems
July 06, 2020
A complex adaptive system (CAS) is a system in which a perfect understanding of the individual parts does not automatically convey a perfect understanding of the whole system's behaviour. Most systems where humans play central roles are CAS, including teams and organizations.
In such a system we cannot confidently make statements such as "if we do A, then B will happen." We can only hypothesize about how doing A will impact the evolution of the larger system. This is because actions have unintended side-effects that go unmonitored. B may very well happen if we do A, but it will not be the only thing that happens; C will happen as well. C can reduce the efficacy of B, amplify it, or create other unforeseen effects.
We are wired with strong confirmation bias, and unless we acknowledge the wider impact of our actions we will only see B happen because B is what we're looking for (e.g., the invisible gorilla experiment, the radiologist experiment). Those experiments are universal in nature, and underplaying their findings by suggesting they don't apply to our context is ignorance bordering on negligence.
Awareness of this phenomenon is key in any change initiative because it is not intuitive. We are led to believe that if we follow certain practices and make decisions rooted in agreed-upon management principles, then we are able to control outcomes. This thinking is wide of the mark for two reasons: it does not explicitly account for the unintended implications of our actions, and it is amplified by our confirmation bias.
The concept of exaptation is related. Exaptation is a term from evolutionary biology describing a trait that has been co-opted for a use other than the one for which natural selection built it. If we introduce a change A to effect B, we may see A being used as a tool to facilitate C, even though it was never intended for that. This is an idea that can generate innovation, provided there is a culture of experimentation and freedom. Contrast this with adaptation, where a trait develops specifically to fill an evolutionary need.
This phenomenon is extremely common in software development. Five such examples:
- Team metrics used as performance indicators by management
- Remote work policies increasing working hours
- Timeboxes (e.g., sprints) used to create project planning milestones
- JIRA used as a personal kanban
- Reducing QA headcount resulting in increased quality
Combining these three ideas, we can acknowledge that:
- confirmation bias heavily affects our analysis
- there will be side-effects to our actions that will go unnoticed (because of our confirmation bias)
- our actions will result in exaptation
Then we can conclude that we need to dedicate just as much time to monitoring the emergence of unintended results as intended ones. Going further, it is in the unintended results that we are far more likely to see innovative practices emerge. Innovation is not a linear process and often seems to jump out of nowhere because it is born through exaptation, i.e., repurposing an idea for something that was not its original intent. There are plenty of examples of this phenomenon in evolutionary biology and software: dinosaurs flying, humans developing grammar, Microsoft dominating the PC market, cookies resulting in ad tracking, Gmail accounts being used to log in to non-Google websites...the list is endless.