See Full Definitions A – B
Adaptation
Algorithm
Algorithmic Complexity
Anacoluthian Processes
Artificial Life
Attractor
Types of Attractors
Basins of Attraction
Autopoiesis
Benard System
Bifurcation
Boundaries (Containers)
Butterfly Effect
See Full Definitions C
Catastrophe Theory
Cellular Automata
Chaos
Chunking
The Church-Turing Thesis
Co-evolution
Coherence
Complexity
Algorithmic Complexity
Complex Adaptive System (CAS)
Concept of 15%
Containment (see Boundaries)
Correlation Dimension
See Full Definitions D – F
Deterministic System
Difference Questioning
Dissipative Structure
Dynamical System
Edge of Chaos
Emergence
Equilibrium
Far-from-equilibrium
Feedback
Fitness Landscape
Fractal
Fractal Dimension
See Full Definitions G – I
Generative Relationships
Genetic Algorithm
Information
Initial Conditions
Instability
Internal Models
Interactive
See Full Definitions L – N
Logical Depth
Logistic Equation
Mental Models
Minimum Specifications
Neural Nets
N/K Model
Nonlinear System
Novelty (Innovation)
See Full Definitions O – R
Order for Free
Parameters
Phase (State) Space
Phase Portrait
Power Law
Purpose Contrasting
Random Boolean Network
Redundancy
See Full Definitions S – W
Scale (Scaling Law)
Schema
Self-fulfilling Prophecy
Self-organization
Self-organized Criticality (SOC)
Sensitive Dependence on Initial Conditions (SIC)
Shadow Organization
Stability
Swarmware and Clockware
Time Series
Turing Machine
Wicked Questions
Complete References for Glossary Terms and Definitions
Glossary A – B
Terms and definitions are organized in alphabetical order for easy access.
Adaptation
In the theory of Darwinian Evolution, adaptation is the ongoing process by which an organism becomes “fit” to a changing environment. Adaptation occurs when modifications of an organism prove helpful to the continuation of the species in a changed environment. These modifications result both from random mutations and from recombination of genetic material (e.g., by means of sexual reproduction). In general, through the mechanism of natural selection, those modifications that aid in the survival of the species are maintained. However, insights from the study of complex, adaptive systems suggest that natural selection operates on systems which already contain a great deal of order simply as a result of self-organizing processes following the internal dynamics of the system (Kauffman’s “order for free”). A fundamental characteristic of complex, adaptive systems is their capacity to adapt by changing the rules of interaction among their component agents. In that way, adaptation consists of “learning” new rules through accumulating new experiences.
See: Complex, Adaptive Systems; Genetic Algorithm; N/K Model
Bibliography: Holland (1995); Kauffman (1995)
Algorithm
A well-defined method or systematic procedure for solving a problem. In mathematics, an algorithm is a set of rules for performing a calculation or solving a mathematical problem. An example is Euclid’s algorithm for finding the highest common factor of two numbers (the highest common factor of 1365 and 3654 is 21; see Penrose, 1989). In the case of computers and Artificial Intelligence, an algorithm refers to a routine or set of routines in a computer program used to calculate or solve a particular type of problem. In general, an algorithm is a formalized method for solving a problem.
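As an illustration, here is a minimal sketch of Euclid’s algorithm in Python (the function name euclid_gcd is our own label for this illustration, not taken from the sources cited below):

```python
def euclid_gcd(a: int, b: int) -> int:
    """Euclid's algorithm: repeatedly replace one number by the remainder
    of dividing it by the other, until the remainder is zero."""
    while b != 0:
        a, b = b, a % b
    return a

# The example from the definition above:
print(euclid_gcd(1365, 3654))  # -> 21
```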
See: Algorithmic Complexity under Complexity; Genetic Algorithm; Turing Machine
Bibliography: Chaitin (1987); Holland (1995); Penrose (1989)
Algorithmic Complexity
A measure of complexity developed by the mathematician Gregory Chaitin, based on Claude Shannon’s Information Theory and earlier work by the mathematicians Kolmogorov and Solomonoff. Algorithmic complexity measures the complexity of a system by the length of the shortest computer program (set of algorithms) that could generate or compute the measurements of the system. In other words, the algorithmic complexity of a system is the size of the smallest model that captures the essential patterns of that system. For example, the algorithmic complexity of a random system would have to be as large as the system itself, since its random patterns cannot be compressed into a smaller set of algorithms. As such, algorithmic complexity has to do with the mixture of repetition and innovation in a complex system.
For example, imagine a simple system that could be represented by a bit string composed of the sequence 01010101010101010101… It would require only a short program or algorithm, e.g., a command to print a zero, then a one, then a zero, then a one, and so on. Therefore, the complexity of a system represented by the bit string 01010101010101010101… would be very low. However, a system (such as repeated tosses of a fair coin with a 1 on one side and a 0 on the other) represented by a random sequence (e.g., 10110000100110001111…) would require a computer program as long as the bit string itself, since the sequence is randomly produced and no shorter program could predict future 1’s or 0’s. As a result, the algorithmic complexity of an infinite random system would be infinitely large, like the system itself.
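True algorithmic complexity is not computable in general, but off-the-shelf data compression offers a rough, informal proxy for the idea: an ordered sequence shrinks to a short description, while a random one hardly shrinks at all. A minimal sketch of our own in Python, using the standard zlib module and byte strings rather than bit strings, since that is what the compressor operates on:

```python
import random
import zlib

# A highly ordered sequence: "01" repeated, as in the example above.
ordered = b"01" * 5000

# A pseudo-random sequence of the same length (illustration only).
random.seed(42)
noisy = random.randbytes(10000)

for label, data in [("ordered", ordered), ("random", noisy)]:
    print(f"{label}: {len(data)} bytes -> {len(zlib.compress(data))} bytes compressed")

# The ordered sequence compresses to a few dozen bytes (a short "program"),
# while the random sequence hardly compresses at all.
```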
Bibliography: Chaitin (1987); Goertzel (1993)
Anacoluthian Processes
From the Greek “anacoluthon” (an inconsistency in logic), a general term for system processes or methods facilitating self-organization and emergence. In these processes traditional procedures are followed and at the same time transgressed, thereby allowing the emergence of something radically new. An example of an anacoluthian process is the crossing-over of chromosomes from both parents in sexual reproduction. An example in a business or institution is when people from diverse organizational functions are brought together in a project team, ideally resulting in the emergence of an innovative organizational structure.
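As a loose computational analogy (drawn from genetic algorithms rather than from the sources cited here), one-point crossover follows the ordinary copying procedure for part of a “chromosome” while splicing in material from a second parent, so the offspring are copies of neither:

```python
import random

def one_point_crossover(parent_a: str, parent_b: str) -> tuple:
    """Cut both parent strings at the same random point and swap the tails."""
    assert len(parent_a) == len(parent_b)
    point = random.randint(1, len(parent_a) - 1)
    return (parent_a[:point] + parent_b[point:],
            parent_b[:point] + parent_a[point:])

random.seed(7)
print(one_point_crossover("AAAAAAAA", "BBBBBBBB"))
# Two offspring mixing both parents' material, e.g. ('AAABBBBB', 'BBBAAAAA')
```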
See: Far-from-equilibrium; Genetic Algorithm
Bibliography: Goldstein, “Leadership and Emergence…” (in this Resource Guide); Holland (1995)
Artificial Life
The life-like patterns emerging in cellular automata and related electronic arrays. These emergent patterns seem organic in the manner in which they move, grow, change their shape, reproduce themselves, aggregate, and die. Artificial Life was pioneered by the computer scientist Chris Langton and has been experimented with extensively at the Santa Fe Institute. Artificial Life is being used to model various complex systems such as ecosystems, the economy, societies and cultures, the immune system, and so on. The study of Artificial Life promises insights into the natural processes leading to the build-up of structure in self-organizing, complex systems.
See: Cellular Automata; N/K Model; Random Boolean Networks
Bibliography: Langton (1986); Lewin (1992)
Attractor
The evolution of a nonlinear, dynamical, complex system can be marked by a series of phases, each of which constrains the behavior of the system to be in consonance with a reigning attractor(s). Such phases and their attractors can be likened to the stages of human development: infancy, childhood, adolescence, and so on. Each stage has its own characteristic set of behaviors, developmental tasks, cognitive patterns, emotional issues, and attitudes (although, of course, there is some variation among different people). Though a child may sometimes behave like an adult (and vice versa), it is the long-term behavior that falls under the sway of the attractor. Technically, in a dynamical system an attractor is a pattern in phase (state) space, depicted in a phase portrait, toward which the values of the system’s variables settle after transients die out. More generally, an attractor can be considered a circumscribed or constrained range of functioning which seemingly underlies and “attracts” how a system behaves under particular environmental (internal and external) conditions. The dynamics of the system as well as current conditions determine the system’s attractors. When attractors change, the behavior of the system changes because it is operating under a different set of governing principles. The change of attractors is called bifurcation and is brought about by far-from-equilibrium conditions, which can be understood as a change in parameter values toward a critical threshold.
See: Bifurcation; Far-from-equilibrium; Phase (State) Space; Phase Portrait
Bibliography: Abraham et al. (1991); Goldstein (1994); Zimmerman (in this Resource Guide).
Types of Attractors
Fixed Point Attractor:
An attractor which is a particular point in phase space, sometimes called an equilibrium point. As a point it represents a very limited range of possible behaviors in the system. For example, in a pendulum, the fixed point attractor represents the pendulum when the bob is at rest. This state of rest attracts the system because of gravity and friction. In an organization a fixed point attractor would be a metaphor for describing when the organization is “stuck” in a narrow range of possible actions.
Periodic (Limit Cycle) Attractor:
An attractor which consists of a periodic movement back and forth between two or more values. The periodic attractor represents more possibilities for system behavior than the fixed point attractor. An example of a period-two attractor is the oscillating movement of a metronome. In an organization, a periodic attractor might be seen when the general activity level oscillates from one extreme to another. An example from psychiatry might be bipolar disorder, in which a person’s mood shifts back and forth between elation and depression.
Strange Attractor:
An attractor of a chaotic system which is bounded within a circumscribed region of phase space yet is aperiodic, meaning the exact behavior of the system never repeats. The structure of a strange attractor is fractal. A strange attractor can serve as a metaphor for creative activities in an organization in which innovation is possible yet there is a boundary to the activities, determined by the core competencies of the organization as well as its resources and the environmental factors affecting it. A strange attractor displays the characteristic of sensitive dependence on initial conditions (the Butterfly Effect) found in chaos. A simple numerical sketch of all three types of attractors is given below.
See: Butterfly Effect; Chaos; Fractal; Sensitive Dependence on Initial Conditions
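All three attractor types can be glimpsed in a single familiar model, the logistic map x(n+1) = r · x(n) · (1 − x(n)). The sketch below is our own illustration with standard textbook parameter values, not an example taken from the sources above: at r = 2.8 the iterates settle onto a fixed point, at r = 3.2 onto a period-two cycle, and at r = 3.9 they wander aperiodically within a bounded range, as on a chaotic (strange) attractor.

```python
def logistic_orbit(r: float, x0: float = 0.5, skip: int = 1000, keep: int = 8):
    """Iterate the logistic map and return a few values after transients die out."""
    x = x0
    for _ in range(skip):
        x = r * x * (1.0 - x)
    tail = []
    for _ in range(keep):
        x = r * x * (1.0 - x)
        tail.append(round(x, 4))
    return tail

print("fixed point (r=2.8):", logistic_orbit(2.8))  # one value, repeated
print("period two  (r=3.2):", logistic_orbit(3.2))  # two values, alternating
print("chaotic     (r=3.9):", logistic_orbit(3.9))  # bounded but never repeating
```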
Basins of Attraction
If one imagines a complex system as a sink, then the attractor can be considered the drain at the bottom, and the basin of attraction is the sink’s basin. Technically, a basin of attraction is the set of all points in phase space that are attracted to a given attractor; more generally, it is the set of initial conditions of a system that evolve into the range of behavior allowed by the attractor.
When a specific attractor(s) is operative in a system, the behavior of the system will be consonant with that attractor(s), meaning that a measurement of that behavior will lie in the system’s basin of attraction and thereby eventually converge to the attractor(s), no matter how unusual the conditions affecting the system may be.
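A small illustration of our own (not drawn from the sources cited in this glossary): the Newton iteration for solving x^2 − 1 = 0 has two attracting fixed points, +1 and −1. Every positive starting value lies in the basin of +1 and every negative one in the basin of −1, so widely different initial conditions nonetheless converge to the same attractor.

```python
def newton_limit(x0: float, steps: int = 60) -> float:
    """Newton's method for x^2 - 1 = 0; converges to +1 or -1 depending on the start."""
    x = x0
    for _ in range(steps):
        x = x - (x * x - 1.0) / (2.0 * x)
    return x

for start in (0.3, 2.0, 17.0, -0.3, -2.0, -17.0):
    print(f"start {start:6} -> attractor {newton_limit(start):+.6f}")
```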
Autopoiesis
A theory of what life is, developed by the Chilean scientists Humberto Maturana and Francisco Varela. A living organism is understood as a circular, autocatalytic-like process which has its own survival as its main goal. The phenomenon of self-organization is sometimes seen as an autopoietic phenomenon. This theory, with its emphasis on the “closure” of the living organism, has served as a remedy for the overemphasis on “openness” in open systems theory. The management theorist Gareth Morgan points out that an organization’s identity, strategies, and awareness of its market can be seen as forming an autopoietic circularity. That is why organizations can get “stuck” in a rut of activity and become unable to adapt to a changing environment.
See: Self-fulfilling Prophecy
Bibliography: Maturana and Varela (1992); Morgan (1997).
Benard System
A simple physical system, consisting of a liquid in a container heated from the bottom, which has been extensively studied by the Prigogine School because of its demonstration of self-organization and emergence. As the liquid is heated from the bottom, at a critical temperature level (a far-from-equilibrium condition) there is the sudden emergence of striking hexagonally shaped convection cells. Prigogine has termed these hexagonal cells “dissipative structures” since they maintain their structure while dissipating energy through the system and from the system to the environment. These “dissipative structures” are a good example of unpredictable emergent patterns, since the direction of rotation of the convection cells results from the amplification of random currents in the liquid.
See: Dissipative Structures; Emergence; Far-from-equilibrium; Self-organization
Bibliography: Nicolis (1989); Nicolis and Prigogine (1989); Goldstein (1994)
Bifurcation
The emergence of a new attractor(s) in a dynamical, complex system, occurring when some parameter reaches a critical level (a far-from-equilibrium condition). For example, in the logistic equation or map, bifurcation and the emergence of new attractors take place when the parameter representing birth/death rates in a population reaches a critical value. More generally, a bifurcation occurs when a system shows an abrupt change in its typical behavior or functioning that lasts over time. For example, a change in an organizational policy or practice which results in a long-term change in the business’s or institution’s behavior can be considered a bifurcation.
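A minimal sketch of such a parameter-driven change, using the logistic map mentioned above (the code and the particular r values are our own illustration; r = 3 is the standard first period-doubling threshold for this map): as the birth/death parameter crosses the critical value, the long-run behavior jumps from a single fixed point to a two-point cycle.

```python
def distinct_long_run_values(r: float, x0: float = 0.5, skip: int = 2000, keep: int = 64) -> int:
    """Count how many distinct values the logistic map visits after transients die out."""
    x = x0
    for _ in range(skip):
        x = r * x * (1.0 - x)
    seen = set()
    for _ in range(keep):
        x = r * x * (1.0 - x)
        seen.add(round(x, 6))
    return len(seen)

for r in (2.6, 2.9, 3.05, 3.3, 3.5):
    print(f"r = {r}: long-run behavior settles onto {distinct_long_run_values(r)} value(s)")
# Below r = 3 the population settles to a single value; just above it the
# attractor bifurcates into a two-value cycle (and later into four, eight, ...).
```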
See: Attractor; Dynamical System; Far-from-equilibrium
Bibliography: Abraham (1982); Guastello (1995).
Boundaries (Containers or “Closure”)
Processes of self-organization and emergence occur within bounded regions, e.g., the container holding the Benard System so that the liquid remains intact as it undergoes far-from-equilibrium conditions. In cellular automata the container is the electronic network itself, which is “wrapped around” in that cells at the outskirts of the field are hooked back into the field. These boundaries or containers act to demarcate a system from its environment and thereby maintain the identity of a system as it changes. Furthermore, boundaries channel the nonlinear processes at work during self-organization. In human systems, boundaries can refer to the actual physical plant, organizational policies, “rules” of interaction, and whatever else serves to underlie an organization’s identity and distinguish the organization from its environment. Boundaries need to be both permeable, in the sense that they allow exchange between a system and its environments, and impermeable, insofar as they circumscribe the identity of a system in contrast with its environments.
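A small sketch of the “wrapped around” container mentioned above, using a toy one-dimensional cellular automaton of our own choosing (not an example from the sources cited): taking neighbor indices modulo the row length closes the row into a ring, so the cells at the two edges are treated as adjacent and the update rule never runs off the boundary.

```python
def step_wrapped(cells: list) -> list:
    """One update of a toy 1-D cellular automaton with wrap-around boundaries.
    Each cell becomes the XOR of its two neighbors; the modulo arithmetic makes
    the first and last cells neighbors of each other."""
    n = len(cells)
    return [cells[(i - 1) % n] ^ cells[(i + 1) % n] for i in range(n)]

row = [1] + [0] * 11          # a single "on" cell at the left edge
for _ in range(6):
    print("".join("#" if c else "." for c in row))
    row = step_wrapped(row)
# The pattern immediately spreads across the "edge": the leftmost and
# rightmost cells behave as neighbors because the container wraps around.
```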
See: Autopoiesis
Bibliography: Chandler & Van de Vijver (2000); Eoyang & Olson (2001); Goldstein (1994); Luhmann, Bednarz, & Baecker (1996).
Butterfly Effect
A popular image coming out of chaos theory which portrays the concept of sensitive dependence on initial conditions, i.e., a small change having a huge impact, like a butterfly flapping its wings in South America eventually leading to a thunderstorm in North America. Some attribute the term “Butterfly” in “Butterfly Effect” to the butterfly-like shape of the phase portrait of the strange attractor discovered by the meteorologist Edward Lorenz when he first discerned what was later termed “chaos.” The Butterfly Effect introduces a great amount of unpredictability into a system, since one can never determine with perfect accuracy those present conditions which will be amplified and lead to a drastically different outcome than expected. However, since chaotic attractors are deterministic, not truly random, and operate within a circumscribed region of phase or state space, a certain amount of predictability remains in chaotic systems. Thus, although a particular state of the weather may be unpredictable more than a few days in advance, climate and season reduce the range of possible weather states, adding some degree of predictability even to chaotic systems.
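A hedged numerical sketch of this sensitivity, using the standard Lorenz equations with the usual textbook parameters (sigma = 10, rho = 28, beta = 8/3) and a crude Euler integration of our own; nothing here is taken from the cited sources. Two trajectories that start one part in a hundred million apart track each other for a while and then diverge until they bear no resemblance.

```python
def lorenz_step(state, dt=0.001, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One Euler step of the Lorenz equations (rough, but fine for illustration)."""
    x, y, z = state
    return (x + dt * sigma * (y - x),
            y + dt * (x * (rho - z) - y),
            z + dt * (x * y - beta * z))

a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)          # differs only in the eighth decimal place
for step in range(1, 40001):
    a, b = lorenz_step(a), lorenz_step(b)
    if step % 10000 == 0:
        print(f"t = {step * 0.001:4.0f}  separation in x = {abs(a[0] - b[0]):.8f}")
# The separation grows by many orders of magnitude: a difference too small
# to measure eventually dominates the state of the system.
```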
See: Chaos; Sensitive Dependence on Initial Conditions
Bibliography: Abraham (1982); Lorenz (1993); Ott (2003).