Originally published October 2003, "Current Implications" section added by Heidi Burgess in April 2017.
Why are conflicts sometimes so challenging to understand? Why does the work of designing and deploying interventions seem more like art than science? There are real difficulties that arise from the history of the cultures and individuals involved, along with many other important factors that contribute to intractable conflict. But what is the nature of the whole system to which all these complicating factors contribute?
Those of us embedded in European or North American cultures face a significant challenge in understanding and describing complex social systems such as complex intractable conflicts. Epistemological assumptions are so deeply embedded in these cultures' education and worldview that one is generally not even aware of them. Yet these assumptions make it difficult to understand or deal with complexity.
These assumptions include:
- Every observed effect has an observable cause.
- Even very complicated phenomena can be understood through analysis. That is, the whole can be understood by taking it apart and studying the pieces.
- Sufficient analysis of past events can create the capacity to predict future events.
These assumptions have proven marvelously potent in developing our understanding of the physical world. They have served less well, however, in illuminating how communities of humans interact and behave. It is not for lack of intense effort that these assumptions have fallen short; the social sciences, in the second half of the 20th century, were dedicated to applying these principles to human phenomena.
The limitations of this Post-Enlightenment or Modern analytical approach had already become obvious from studying physical phenomena. During the last several decades of the 20th century, new tools were developed to help understand complex natural phenomena such as weather. Such systems are predictable in their general behavior but are notoriously impossible to predict in detail. This area of study was originally termed "chaos," and it has now matured into a collection of topics known as "complexity science."
This essay provides an overview of the applications of complexity science to human systems. We can say very little about human conflict directly in this short introductory piece. Instead, we will provide references to the currently slim literature on the topic and ask you to question your own assumptions about human communities in conflict. Those of us immersed in Western Culture have much to learn. We must grow to appreciate that not all systems are "complicated determined" systems, but rather, many are "complex adaptive systems." Just what the distinctions are will consume the remainder of this essay.
The first challenge is to define what a system is. A system is an assembly of elements hooked together to produce a whole in which the attributes of the elements contribute to a behavior of the whole. The human body is a very complex system, made up of millions of cells with different functions. Galaxies are systems, as are cities, ecosystems, and complex machines such as airplanes or computers.
As you can see by these examples, this definition is very inclusive. What isn't a system? A collection of items does not necessarily constitute a system; for example, a bag of marbles would not generally be considered a system because the elements are not "hooked together" so that the behavior of each marble contributes to a behavior of the whole. Yet, even though all of these examples (save the marbles) fit the general definition of "system," they are very different from each other.
Adaptive v. Determined Systems
One of the distinctions that can be made among systems is the extent to which the system response is determined. By this, we mean to what extent the inputs and outputs are exactly and reproducibly connected.
Let us first look at the case of the Boeing 747 aircraft. Without knowing the distinctions we are making here, everyone agrees that each aircraft should be a fully determined system. Every time the pilot pulls the control yoke back, the airplane should climb. Every time the pilot pushes the yoke forward, the airplane should descend. The pilot assumes this will be true; we as passengers assume it will be true as well. To produce this collective system response, numerous components and elements of the airplane must also work in fully determined ways.
The other feature of determined systems is that the relationship between the inputs and the outputs is linear. In its simplest form, this means that small inputs will create small outputs, and that large inputs are required for large outputs. When the pilot pushes on the yoke a little bit, the plane descends slowly. It would be very disconcerting to both pilot and passengers if this were not predictable.
As useful as linear and determined systems are, there is a whole other class of systems that behave very strangely indeed. These are adaptive systems. Ant colonies, beehives, and the human immune system are examples among many thousands. Living systems have been chosen here, but there are numerous nonliving systems that are also adaptive.
What distinguishes these systems? First of all, the elements of these systems are usually called "agents"; each agent is connected to its neighbors and follows simple rules that characterize its response to changes in its environment. Yet even when an agent's communication with its neighbors is good and its responses always follow the rules, the system as a whole (the hive, the flock, the colony, etc.) does not behave in determined ways.
How can this be? This is where the limitations of modern Western thinking hamper our imaginative capacity; we believe that if the agents behave in simple and predictable ways, the system will also be predictable.
One of the simplest adaptive systems is a flock of birds. We have all watched in amazement the graceful and coordinated movements of a flock of birds. Yet there is no bird-in-chief directing the action. There is no script distributed to each bird prescribing the actions of the flock. However, this collective behavior can be modeled very nicely. In these models, individual birds have a degree of decision-making capacity, but all the flight decisions must follow the simple rules. Each must:
- avoid hitting neighbors or obstacles,
- align flight to match the neighbors, and
- fly an average distance from the neighbors.
From these simple rules, very complex flocking behavior proceeds. You can watch such a simulation on the Web at the Boids site developed by Craig Reynolds. What you see is not a film or animation; it does not repeat itself and is not predictable.
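The three rules can be sketched in a few dozen lines of code. The following Python sketch is in the spirit of Reynolds's Boids; the radii and weights here are illustrative assumptions, not parameters from his model. Each bird follows only the three rules, yet the flock as a whole moves in coordinated, unscripted ways.

```python
import math
import random

VIEW_RADIUS = 10.0   # how far a bird can "see" its neighbors
CROWD_RADIUS = 2.0   # closer than this counts as too close
ALIGN_W, COHESION_W, SEPARATE_W = 0.05, 0.01, 0.05  # illustrative weights

def step(birds):
    """Advance the flock one tick. Each bird is a tuple (x, y, vx, vy)."""
    updated = []
    for i, (x, y, vx, vy) in enumerate(birds):
        near = [b for j, b in enumerate(birds)
                if j != i and math.hypot(b[0] - x, b[1] - y) < VIEW_RADIUS]
        if near:
            n = len(near)
            cx = sum(b[0] for b in near) / n  # neighbors' mean position
            cy = sum(b[1] for b in near) / n
            # Rule 1: avoid hitting neighbors -- steer away from any bird
            # that is too close.
            for b in near:
                if math.hypot(b[0] - x, b[1] - y) < CROWD_RADIUS:
                    vx += (x - b[0]) * SEPARATE_W
                    vy += (y - b[1]) * SEPARATE_W
            # Rule 2: align flight with the neighbors' mean velocity.
            vx += (sum(b[2] for b in near) / n - vx) * ALIGN_W
            vy += (sum(b[3] for b in near) / n - vy) * ALIGN_W
            # Rule 3: keep an average distance -- drift toward the
            # neighbors' center.
            vx += (cx - x) * COHESION_W
            vy += (cy - y) * COHESION_W
        updated.append((x + vx, y + vy, vx, vy))
    return updated

random.seed(1)
flock = [(random.uniform(0, 30), random.uniform(0, 30),
          random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(20)]
for _ in range(100):
    flock = step(flock)
```

Note that nothing in `step` refers to the flock's overall shape or heading; those are properties of the whole that no individual rule mentions.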
To review, adaptive systems are constituted of agents that are connected to their neighbors and have a degree of freedom in responding to changes, but must respond within simple rules. This system structure produces system responses that are not determined and can be highly nonlinear.
Complex vs. Complicated
A second distinction is between system responses that are complicated, and those that are truly complex. This distinction is critical. What is surprising is the number of systems that manifest complex characteristics.
As is shown in the figure below, in complicated systems, the elements and their connections are equally important. In a 747, the yoke, the engine, the flaps, and the connections between them are all critical for the proper operation of the airplane. Second, simple algorithms (rules) produce simple and predictable responses. Every time the pilot pulls the yoke back, the airplane climbs. The response of each component, and of the whole system, is fully determined.
In complex systems, the connections are critical, but individual agents are not. So the connections between the birds are critical, but if one bird gets injured and falls behind, it does not affect the rest of the flock. Simple rules result in complex and adaptive responses -- they are not predictable. Each of the agents has a choice of responses within the confines of the rules. So their individual behavior is not determined exactly, as it is in complicated determined systems.
Complicated linear and determined systems produce controllable and predictable outcomes. Complex adaptive systems can produce novel, creative, and emergent outcomes.
The notion of emergence is also important in the field of complexity. A phenomenon is said to be emergent if the application of traditional analytical tools cannot explain the system's behavior. In such cases, the whole is behaving in ways that cannot be explained through study of the parts (or agents). There are system properties and system characteristics that cannot be explained by a combination of component behaviors.
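Emergence can be made concrete with a toy example not mentioned in this essay: Conway's Game of Life. Each cell obeys one simple local rule, yet a "glider" (a small pattern that travels across the grid) emerges at the level of the whole, even though the rule says nothing about motion.

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life.
    live is a set of (x, y) coordinates of live cells."""
    # Count how many live neighbors each cell has.
    counts = Counter((x + dx, y + dy)
                     for (x, y) in live
                     for dx in (-1, 0, 1) for dy in (-1, 0, 1)
                     if (dx, dy) != (0, 0))
    # A cell is alive next tick if it has exactly 3 live neighbors,
    # or 2 and is already alive. That local rule is the entire system.
    return {c for c, n in counts.items() if n == 3 or (n == 2 and c in live)}

glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
cells = glider
for _ in range(4):
    cells = life_step(cells)
# After four generations the same shape reappears, shifted diagonally
# by (1, 1): the glider has "flown" -- a behavior of the whole that no
# study of a single cell's rule would reveal.
assert cells == {(x + 1, y + 1) for (x, y) in glider}
```

The glider's motion is a system property: it cannot be found in the rule governing any one cell, only in the interaction of many.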
Much work is being conducted to explore complex adaptive systems; some of the most interesting is listed in the Additional Resources section at the end of this essay.
The examples we have used so far of complex adaptive systems have been biological, but we have not explicitly discussed humans and groups of humans.
Each of us is a fully complex adaptive system all by ourselves. The human brain is the most complex system known to us in the universe, with one hundred billion (10^11) neurons and ten thousand trillion (10^16) connections (synapses) among those neurons. At each of these synapses, complex interactions occur among electrical charges and over 100 chemicals. Much work is currently under way to examine aspects of the emergent property of the brain that we know of as consciousness. From the earlier discussion, one can see that an individual's actions might be generally predictable, but those actions can never be precisely predictable. In addition, our human self-awareness (an emergent characteristic) generally allows us to choose how we interact with one another or a group. The ants, the bees, and the birds do not choose the characteristics of their interaction; humans have a much greater freedom of choice.
Here are examples of two very different kinds of collective activities in which humans participate: a marching band and a jazz ensemble.
First, we will consider the marching band. This is a human system that behaves very much like a linear determined system. In joining this system, individual players voluntarily surrender their right to local freedom of action. For the band, there is a single leader who dictates all activity. The range of allowable local improvisation is extremely limited. As viewers and listeners, we marvel at the machine-like precision of these bands. In fact, a part of our fascination comes from the almost unnatural look and feel of such groups.
A very different musical group is the jazz ensemble. In many such groups, there is no hierarchical direction and no mechanical loyalty to a set of prescribed actions. Instead, members agree to subscribe to only general rules and are free to improvise widely. Similar to the flock of birds, the general characteristics of the music can be anticipated, but each rendering will be different and often surprising. Such results are not created or directed by individuals, but are emergent responses of the whole system.
Any group or collection of people can thus be seen to constitute a complex adaptive system. This type of theoretical treatment of human systems is only in its infancy. In fact, this short summary highlights how daunting is the task of understanding and anticipating how people relate and respond to each other.
What would it mean to look at intractable conflict through the lens of complex system theory? The first, and humbling, observation is that our present and future study of conflict can only provide us possibilities that might lead to resolution. Our work will never produce a deterministic set of formulae to create resolution. At the same time, our discoveries regarding complex systems validate the intuition of many dispute-resolution workers about humans in conflict. Among the notions that are validated are:
- There are no "neutral observers." Anyone observing is affecting the system.
- There is not a single "objective reality" that describes the system in conflict.
- Our definition of "the system" is arbitrary, since the interconnectedness of people in contact with other people is pervasive.
This is not a challenge to the values and aspirations of those working in the dispute-resolution field. It is instead a call to recognize that anyone who joins a conflict in any role becomes a part of that system.
Does this mean that intervention in difficult or intractable conflict is destined to be fruitless? Of course not. But it does mean that we can never manage a system in conflict as if it were a Boeing 747. How then might we support the resolution of conflict?
The attribute of complex systems that provides direction for intervention is the nonlinear self-organizing property. In these systems, whether a jazz ensemble or an ant colony, agents in the system adjust to every stimulus in ways that are not linear. That is, small input changes can produce large output changes. This is actually very encouraging, for it suggests that small inputs into a protracted or intractable conflict can conceivably produce large effects. People working in the field of dispute resolution need to be willing to embark on "enlightened experiments." That is to say, change something and work with the system while it adjusts to the change. If a positive result is not immediately apparent, wait awhile. It may yet be coming. Many times, these initial changes will not produce a significant reorganization of the system, but some changes will result in reorganization within the system that is beneficial. Such "enlightened experiments" could include altering aspects of the negotiations, such as changing the venue, changing the negotiation teams, adding culture-specific features to the negotiation, etc. Although it is impossible to tell which change will make the biggest difference, small changes in complex adaptive systems can lead to significant changes and potential negotiation breakthroughs.
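The claim that small input changes can produce large output changes has a standard numerical illustration from chaos theory (a toy example chosen here, not drawn from this essay): the logistic map. Two starting values that differ by one part in two hundred thousand quickly produce trajectories that bear no resemblance to each other.

```python
def trajectory(x, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x), recording each value.
    At r = 4.0 the map is chaotic."""
    xs = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        xs.append(x)
    return xs

# Two inputs differing by 0.000001 -- one part in 200,000.
a = trajectory(0.200000)
b = trajectory(0.200001)
gaps = [abs(p - q) for p, q in zip(a, b)]
# The gap between the trajectories starts at about a millionth and
# grows to order 1: a tiny change in input has produced an enormous
# change in output. No linear system behaves this way.
```

In a linear system, `gaps` would stay proportional to the initial millionth-sized difference forever; in this nonlinear one it is amplified many thousands of times within a few dozen steps.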
In summary, making a difference in the midst of intractable conflict will not come from a reductionistic analysis of the system, conducted in hopes of designing and deploying a "definitive" intervention. Instead, evolutionary progress toward resolution can be possible through mindful experiments from within the conflict and then moving with the self-organization that follows.
The impossibility of predicting and controlling conflict need not result in a sense of hopelessness or resignation. It can, instead, propel us to a deeper exploration of the nature of complex adaptive systems and the amazing possibilities that reside within such self-organizing systems for constructive change.
This essay was written in 2003, when few conflict resolution scholars or practitioners were thinking in terms of systems. Indeed, Wendell taught many of us about this topic at the first conference we held to plan the BI Knowledge Base.
Since then, systems and complexity thinking has become much more popular in the conflict and peacebuilding fields. John Paul Lederach, Louise Diamond and John McDonald, William Ury, Peter Coleman, Robert Ricigliano, and Charles Hauss have all written books about complexity and systems approaches to peace.
Policy makers increasingly complain about how complex public policy problems are--and they are right. Unfortunately, however, few really understand the difference between complex adaptive systems (which all significant public policy issues are) and complicated systems. As a result, they often try to implement solutions that are far too simple to be effective.
Building walls, implementing travel bans, and blocking trade are all solutions to static, or at most complicated, problems. In complex adaptive systems, they are all likely to inflict unexpected negative consequences that may well be more harmful than helpful. Until policy makers understand the distinction being made in this essay--and act accordingly--we really are not going to be able to solve any of our vexing policy problems.
Heidi Burgess, Feb. 2017
Use the following to cite this article:
Jones, Wendell. "Complex Adaptive Systems." Beyond Intractability. Eds. Guy Burgess and Heidi Burgess. Conflict Information Consortium, University of Colorado, Boulder. Posted: October 2003 <http://www.beyondintractability.org/essay/complex-adaptive-systems>.