The purpose of agriculture is to feed people. That is a good and worthy purpose. From its beginning, smart people who later became agricultural engineers thought about how agriculture could become better at feeding people – feeding more people. In time, while feeding people is still the proclaimed purpose, there has been a ‘purpose creep’ towards making more money for the agricultural-industrial complex. Initially, the focus was on increasing productivity; later it moved to creating higher income through increasing efficiency. So the purpose moved from feeding people to increasing the efficiency of agricultural production: larger fields, monocultures, more standardised crops, mechanisation, more chemicals. Indeed, engineers have come up with many amazing tricks: looking at today’s agricultural multinationals, one has to say that all these smart people have certainly achieved the purpose they set out to achieve. Yet look at what they have also achieved: land degradation, water pollution, loss of biodiversity (both in crop diversity and in all other species, either directly through the use of pesticides or by destroying their habitats or food sources), a massive contribution to climate change, poverty (by taking smallholder farmers’ work and/or markets away, or by exploiting workers and suppliers), reduced nutritional value of crops, and so on. What a mess! The question is how we got into this mess when we had such a good purpose: feed the people. The answer is: people are notoriously bad at implementing change in pursuit of a specific purpose. Understanding this is one of the most important shifts necessary now if we want to save humanity from the looming catastrophes – and I don’t think I’m unnecessarily dramatising here.
Another example is how health and safety is handled in many contexts. While each individual rule on how we need to behave to keep safe looks sensible (e.g. not climbing on a chair to change a bulb but taking a proper ladder), the rules are often driven to the extreme (just look at the signs with lots of small print next to children’s playgrounds). But what is actually much more serious is the systemic effect of having all these rules. Because playgrounds need to be built so that children cannot hurt themselves, kids don’t learn how to keep safe by judging danger and risk for themselves. And this perpetuates throughout society, leading to an attitude that if it is not written somewhere that we should not do something, then it must be safe to do. We turn off our brains and don’t assess risks ourselves – or we never learn to do that in the first place.
A third example comes from international development and poverty reduction. While fully agreeing that we need to eradicate poverty, the way we set about the task single-mindedly often leads to all sorts of problems around us, in many cases aggravating the very problem we set out to solve. We might be able to lift some people out of immediate poverty by giving them seed and agricultural equipment, but we do not pay enough attention to what this does to the economy around those people. Even the now more popular ‘systemic approaches’ still usually just generate a solution for a specific problem, applying the same logic on a larger scale. For example, they define their purpose as changing traders’ mindsets so they buy more from or sell more to poor small-scale entrepreneurs. While in itself that seems sensible, it still looks at only one particular, isolated aspect of the whole economy and society.
There are endless examples like these; you can certainly find some in your immediate context. I continuously keep making such mistakes in my own life, from how I handle my relationships to how I raise my daughter. But why is that? As part of my exploration of Warm Data, I have come across a convincing explanation for this phenomenon. Gregory Bateson wrote some interesting articles on what he called ‘conscious purpose’. Let me try to write down his arguments in my own words:
- Humans, societies and the ecosystems we are part of are complex systems. While Bateson uses the language of cybernetics like loops, circularity and homeostasis, nowadays we would rather use the language of complex adaptive systems, like self-organisation, disposition, entanglement or constraints. But essentially the consequences are the same – there is an infinite number of interconnections and interdependencies among the many elements of a complex system, which we cannot see in their entirety. Yet there are general patterns in these systems (like inertia, self-correction, preservation of purpose/identity, exponential growth, power laws, etc.) that are similar on the different levels and across different types of systems – human individuals, societies and the wider ecosystem we are part of.
- These complex systems are evolving systems in a finely balanced, dynamic equilibrium: individual processes are generally constrained by other processes, so no individual process can grow exponentially and tip the system out of its equilibrium and into chaos. For example, the growth of cells is constrained so that human organs stop growing when they have reached the required size; the growth of individual species in an ecosystem like a forest is constrained so they don’t take over and disturb the balance of species, niches and nutrient cycles. Every now and then, of course, this equilibrium gets disturbed, the constraints fail to hold back the exponential growth of some part of the system, and the result is a cancer or a forest struck down by a pest.
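The difference between constrained and unconstrained growth can be made concrete with a toy model. This is only an illustrative sketch (it is not from Bateson, and the rate and capacity values are arbitrary assumptions): the same growth rule, with and without a constraining term, leads either to runaway exponential growth or to growth that levels off at a carrying capacity.

```python
# Constrained vs. unconstrained growth: a minimal logistic-model sketch.
# Parameter values (rate, capacity, 50 steps) are arbitrary illustrations.

def grow(x, rate, capacity=None):
    """One growth step; if `capacity` is set, growth is constrained."""
    if capacity is None:
        return x * (1 + rate)                 # unconstrained: exponential
    return x + rate * x * (1 - x / capacity)  # constrained: logistic

exp_x = log_x = 1.0
for _ in range(50):
    exp_x = grow(exp_x, 0.2)                  # no constraint
    log_x = grow(log_x, 0.2, capacity=100.0)  # constrained by other processes

print(f"unconstrained after 50 steps: {exp_x:.0f}")  # has exploded
print(f"constrained after 50 steps:   {log_x:.1f}")  # levelled off near 100
```

The point of the sketch is Bateson’s: it is the surrounding constraint, not the growing process itself, that keeps the system in equilibrium; remove it, and the same rule produces the cancer or the pest outbreak.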
- Our human consciousness has evolved in a way that does not perceive or reflect this whole complexity – neither the complexity within our bodies nor that outside them, in society or the ecosystem. In current literature, this is described in different ways. We know that before something perceived by our senses enters our conscious mind, it is first filtered and repackaged by our unconscious mind so we can make sense of it: we are not getting the raw stimuli from our retina but rather a repackaged image, largely influenced by our experience as well as social expectations of how something should look (we haven’t always known that the colour blue looks blue; we learned that). We also only see things we expect – a phenomenon called inattentional blindness. In one experiment, radiologists were shown a CT scan of a lung with the silhouette of a gorilla, 48 times larger than a lung nodule, in one corner of the image. An astonishing 83% of the radiologists failed to register the gorilla because they were trained to scan for nodules. What is true for vision is also true for emotions: the unconscious mind receives data from the sensory organs and repackages it into something we recognise as an emotion or a feeling. From an evolutionary perspective, that’s a good thing – it saves us lots of energy. Perceiving every little bit of data and having to consciously sort through it would make our lives pretty horrid. (Beyond the evolutionary argument, Bateson also makes a logical one: a part of the whole, i.e. the conscious mind as part of the whole mind, cannot, in principle, represent the whole.)
- An important point Bateson makes is that the selectiveness of consciousness is shaped by purpose, leading to a systematic (non-random) bias in the information we perceive. If we have a specific purpose in mind, the filtering highlights things that are directly related to that purpose, while other aspects of reality stay hidden. One instance in which we can see this happening is confirmation bias: if we have a hypothesis about how something works, information that disconfirms it is literally hidden by our unconscious mind, and we cannot perceive it consciously.
So the consequence of all this is that when we give ourselves a purpose – feeding humanity, making lots of money, keeping every child safe on the playground, eradicating poverty – this conscious purpose influences how we perceive reality and, hence, how we decide what to do to achieve the purpose. In particular, we tend not to see the intricate nature of the complex systems we are dealing with, but only direct causal links between what we want to achieve and what we need to do to get there. That everything we do has consequences beyond this linear chain, due to the interconnectedness of complex systems, remains hidden from our conscious mind. Many of the problems we are facing in our age are direct consequences of this solutionist and linear way of thinking that conscious purpose produces: climate change, soil degradation, poverty, inequality, water pollution, social and political unrest, polarisation, etc. Remember the adage “today’s problems are yesterday’s solutions”? That is exactly it.
Bateson formulated this in the following way:
Purposive consciousness pulls out, from the total mind, sequences which do not have the loop structure which is characteristic of the whole systemic structure.[1:440]
Or in other words, we only consciously perceive seemingly linear chains of causation biased by our purpose while reality is much more complex.
In another article, Bateson puts it like this:
But, if the total mind and the outer world do not, in general, have this lineal structure, then by forcing this structure upon them, we become blind to the cybernetic circularities of the self and the external world. Our conscious sampling of data will not disclose whole circuits but only arcs of circuits, cut off from their matrix by our selective attention. Specifically, the attempt to achieve a change in a given variable, located either in self or environment, is likely to be undertaken without comprehension of the homeostatic network surrounding that variable.[2:451]
If you allow purpose to organise that which comes under your conscious inspection, what you will get is a bag of tricks.[1:439]
Tricks like monocultures that make it easier to target pesticides against weeds, massive fields and mechanisation to increase efficiency (which again requires monocultures, because a specific harvesting machine can only handle one crop), rubber floors to soften children’s falls on the playground, or subsidised hybrid seed for poor farmers in the developing world. All of these are tricks; some are very sophisticated and have had massive benefits for the human species. Yet they all ignore the intricacies of the overall complex living system and how implementing them has unintended consequences in the short and long term.
Consciousness … is organised in terms of purpose. It is a short-cut device to enable you to get quickly at what you want; not to act with maximum wisdom in order to live, but to follow the shortest logical or causal path to get what you next want, …[1:439]
Consciousness being a shortcut to get us quickly to the next thing we want is a great evolutionary advantage. It helps us get food quickly, reproduce, and keep safe. Yet it has become dangerous now that the human species has developed means and technologies to influence the whole planet. Conscious purpose no longer shapes only our small, local lives – where nature can give us quick feedback, like a drop in the number of rabbits we can find to hunt, which leaves us hungry – but acts globally, with strongly delayed feedback, as the consequences of our actions take much longer to show and reach far beyond the local (see climate change as an example). So while our consciousness thinks of growing food, thanks to technological advances we do so at a scale that changes the whole ecosystem.
Or, again, in Bateson’s words:
But what worries me is the addition of modern technology to the old system. Today the purposes of consciousness are implemented by more and more effective machinery, transportation systems, airplanes, weaponry, medicine, pesticides, and so forth. Conscious purpose is now empowered to upset the balances of the body, of society, and of the biological world around us. A pathology—a loss of balance—is threatened.[1:440]
The consequence for me from all this is that instead of running after the next solution to a problem, we need to change the way we think about change. I am not saying we should try to understand the whole system and how it works before intervening (as is attempted by people who promote system mapping) – that is not possible, in principle, as Bateson showed. What we can do, though, is better understand the nature of complex systems and their characteristics, which helps us understand how to act in them. The Cynefin framework, and the field of anthro-complexity that Dave Snowden built on its back, give us some clues:
- don’t try to get an understanding of all the interrelationships in a system, rather try to understand dispositions;
- start probing the system with small portfolios of safe-to-fail probes;
- observe closely what happens by looking out for weak signals;
- amplify patterns that are positive and dampen negative patterns.
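The four clues above amount to an iterative probe–sense–respond loop. As a purely schematic sketch (all names here – `Probe`, `sense`, the signal strings – are hypothetical illustrations, not an API from Cynefin or any Snowden tooling), it might look like this:

```python
# Schematic probe-sense-respond loop: run a small portfolio of safe-to-fail
# probes, watch for weak signals, amplify positive patterns, dampen negative
# ones. Entirely illustrative; no real sensing or intervention happens here.
from dataclasses import dataclass
import random

random.seed(3)  # fixed seed so the illustrative run is repeatable

@dataclass
class Probe:
    name: str
    active: bool = True
    scale: float = 1.0  # resources committed; kept small so failure is safe

def sense(probe: Probe) -> str:
    """Stand-in for observation of weak signals around a probe."""
    return random.choice(["positive", "negative", "none"])

# A portfolio of parallel probes, not one big intervention.
portfolio = [Probe(f"probe-{i}") for i in range(5)]

for cycle in range(3):
    for probe in portfolio:
        if not probe.active:
            continue
        signal = sense(probe)
        if signal == "positive":
            probe.scale *= 1.5       # amplify an emerging positive pattern
        elif signal == "negative":
            probe.scale *= 0.5       # dampen it
            if probe.scale < 0.3:
                probe.active = False # safe-to-fail: wind the probe down

for p in portfolio:
    print(p.name, "active" if p.active else "stopped", round(p.scale, 2))
```

The design point is that no single probe carries the whole purpose: each is cheap enough to abandon, and direction emerges from amplifying and dampening observed patterns rather than from a prior map of the system.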
Dave Snowden wrote in a recent Tweet:
… there is a massive difference between understanding that everything is entangled, small things can catalyse significant change and unintended consequences on the one hand and deluding yourself that you can or should see the system as a whole. – Dave Snowden via Twitter
Warm Data also gives us some clues: create relationships that create relationships that create relationships, and bring as many different contexts into the discussion as possible. There is no need to define a conscious purpose, as building relationships and opening up the contexts will shift people’s perceptions, which will inevitably lead to adaptations in behaviour.
[1] Bateson, Gregory (1968). Conscious Purpose versus Nature. In: Bateson, Gregory (1972). Steps to an Ecology of Mind. The University of Chicago Press: Chicago and London.
[2] Bateson, Gregory (1968). Effects of Conscious Purpose on Human Adaptation. In: Bateson, Gregory (1972). Steps to an Ecology of Mind. The University of Chicago Press: Chicago and London.