[fusion_builder_container hundred_percent=”no” equal_height_columns=”no” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”center center” background_repeat=”no-repeat” fade=”no” background_parallax=”none” parallax_speed=”0.3″ video_aspect_ratio=”16:9″ video_loop=”yes” video_mute=”yes” border_style=”solid” type=”legacy”][fusion_builder_row][fusion_builder_column type=”1_1″ type=”1_1″ layout=”1_1″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” box_shadow=”no” box_shadow_blur=”0″ box_shadow_spread=”0″ animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_title title_type=”text” rotation_effect=”bounceIn” display_time=”1200″ highlight_effect=”circle” loop_animation=”off” highlight_width=”9″ highlight_top_margin=”0″ hide_on_mobile=”small-visibility,medium-visibility,large-visibility” content_align=”center” size=”1″ text_color=”#67c100″ style_type=”none”]
Nonlinear Dynamics & Complexity Glossary
[/fusion_title][fusion_text rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
The description of each term in the glossary is accompanied by bibliographical references, which are either included with the definition or listed in the references at the end of the glossary. This glossary was originally compiled by Jeffrey Goldstein, School of Management and Business, Adelphi University, Garden City, New York.
[/fusion_text][/fusion_builder_column][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_1″ background_position=”left top” border_style=”solid” border_position=”all” spacing=”yes” background_repeat=”no-repeat” margin_top=”0px” margin_bottom=”0px” animation_speed=”0.3″ animation_direction=”left” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” center_content=”no” last=”false” hover_type=”none” first=”true” spacing_right=”2%” min_height=”” link=””][fusion_text columns=”2″ column_spacing=”1em” rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Adaptation
Algorithm
Algorithmic Complexity
Anacoluthian Processes
Artificial Life
Attractor
Types of Attractors
Basins of Attraction
Autopoiesis
Benard System
Bifurcation
Boundaries (Containers)
Butterfly Effect
Catastrophe Theory
Cellular Automata
Chaos
Chunking
The Church-Turing Thesis
Co-evolution
Coherence
Complexity
Algorithmic Complexity
Complex Adaptive System (CAS)
Concept of 15%
Containment (see Boundaries)
Correlation Dimension
Deterministic System
Difference Questioning
Dissipative Structure
Dynamical System
Edge of Chaos
Emergence
Equilibrium
Far-from-equilibrium
Feedback
Fitness Landscape
Fractal
Fractal Dimension
Generative Relationships
Genetic Algorithm
Information
Initial Conditions
Instability
Internal Models
Interactive
Linear System
Logical Depth
Logistic Equation
Mental Models
Minimum Specifications
Neural Nets
N/K Model
Nonlinear System
Novelty (Innovation)
Order for Free
Parameters
Phase (State) Space
Phase Portrait
Power Law
Purpose Contrasting
Random Boolean Network
Redundancy
Scale (Scaling Law)
Self-organization
Self-organized Criticality (SOC)
Sensitive Dependence on Initial Conditions (SIC)
Shadow Organization
Stability
Swarmware and Clockware
Time Series
Turing Machine
Wicked Questions
[/fusion_text][/fusion_builder_column][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”false” spacing_left=”2%” min_height=”” link=””][fusion_global id=”4101″][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container][fusion_builder_next_page][fusion_builder_container hundred_percent=”no” hundred_percent_height=”no” hundred_percent_height_scroll=”no” hundred_percent_height_center_content=”yes” equal_height_columns=”no” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” status=”published” background_position=”center center” background_repeat=”no-repeat” fade=”no” background_parallax=”none” enable_mobile=”no” parallax_speed=”0.3″ video_aspect_ratio=”16:9″ video_loop=”yes” video_mute=”yes” border_style=”solid” type=”legacy”][fusion_builder_row][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”false” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_global id=”4101″][/fusion_builder_column][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” 
animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”false” min_height=”” link=””][fusion_title hide_on_mobile=”small-visibility,medium-visibility,large-visibility” content_align=”center” size=”1″ text_color=”#67c100″ style_type=”default”]
Glossary A – B
[/fusion_title][fusion_text rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Terms & definitions have been organized in alphabetical order for easy access.
[/fusion_text][fusion_text columns=”2″ rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]Adaptation
Algorithm
Algorithmic Complexity
Anacoluthian Processes
Artificial Life
Attractor
Types of Attractors
Basins of Attraction
Autopoiesis
Benard System
Bifurcation
Boundaries (Containers)
Butterfly Effect[/fusion_text][/fusion_builder_column][fusion_builder_column type=”1_1″ type=”1_1″ layout=”1_1″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_separator style_type=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” sep_color=”#8bc34a” top_margin=”20″ bottom_margin=”20″ alignment=”center” /][fusion_content_boxes layout=”icon-with-title” columns=”1″ heading_size=”3″ title_color=”#8bc34a” body_color=”#000000″ icon=”fa-bookmark fas” iconspin=”no” iconcolor=”#00bcd4″ icon_circle=”no” circlecolor=”#ffffff” hover_accent_color=”#ffffff” icon_align=”left” animation_direction=”left” animation_speed=”0.3″ hide_on_mobile=”small-visibility,medium-visibility,large-visibility”][fusion_content_box title=”Adaptation” icon=”fa-bookmark fas” iconflip=”none” iconrotate=”none” iconspin=”no” linktext=”Read More” animation_direction=”left” animation_speed=”0.3″]
In the theory of Darwinian evolution, adaptation is the ongoing process by which an organism becomes “fit” to a changing environment. Adaptation occurs when modifications of an organism prove helpful to the continuation of the species in a changed environment. These modifications result from both random mutations and the recombination of genetic material (e.g., by means of sexual reproduction). In general, through the mechanism of natural selection, those modifications that aid in the survival of the species are maintained. However, insights from the study of complex, adaptive systems suggest that natural selection operates on systems which already contain a great deal of order simply as a result of self-organizing processes following the internal dynamics of a system (Kauffman’s “order for free”). A fundamental characteristic of complex, adaptive systems is their capacity to adapt by changing the rules of interaction among their component agents. In that way, adaptation consists of “learning” new rules through accumulating new experiences.
See: Complex, Adaptive Systems; Genetic Algorithm; N/K Model
Bibliography: Holland (1995); Kauffman (1995)
[/fusion_content_box][fusion_content_box title=”Algorithm” icon=”fa-bookmark fas” iconflip=”none” iconrotate=”none” iconspin=”no” animation_direction=”left” animation_speed=”0.3″]
A well-defined method or systematic procedure for solving a problem. In mathematics, an algorithm is a set of rules for performing a calculation or solving a mathematical problem. An example is Euclid’s algorithm for finding the highest common factor of two numbers (the highest common factor of 1365 and 3654 is 21; see Penrose, 1989). In the case of computers and Artificial Intelligence, an algorithm refers to a routine or set of routines in a computer program used to calculate or solve a particular type of problem. In general, an algorithm is a formalized method for solving a problem.
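The Euclid example above can be sketched in a few lines of Python (an illustrative sketch added here, not part of Penrose’s text):

```python
def highest_common_factor(a, b):
    """Euclid's algorithm: repeatedly replace the pair (a, b)
    by (b, a mod b) until the remainder reaches zero."""
    while b:
        a, b = b, a % b
    return a

print(highest_common_factor(1365, 3654))  # → 21
```

The loop is guaranteed to terminate because the remainder strictly decreases at every step, which is exactly what makes the procedure a well-defined algorithm in the sense given above.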
See: Algorithmic Complexity under Complexity; Genetic Algorithm; Turing Machine
Bibliography: Chaitin (1987); Holland (1995); Penrose (1989)
[/fusion_content_box][fusion_content_box title=”Algorithmic Complexity” animation_direction=”left” animation_speed=”0.3″]
A measure of complexity developed by the mathematician Gregory Chaitin, based on Claude Shannon’s Information Theory and earlier work by the Russian mathematicians Kolmogorov and Solomonoff. Algorithmic complexity measures the complexity of a system by the length of the shortest computer program (set of algorithms) that can generate or compute the measurements of the system. In other words, the algorithmic complexity of a system is the size of the smallest model that captures the essential patterns of that system. For example, the algorithmic complexity of a random system would have to be as large as the system itself, since its random patterns cannot be compressed into a smaller set of algorithms. As such, algorithmic complexity has to do with the mixture of repetition and innovation in a complex system.
For example, imagine a simple system that could be represented by a bit string composed of the following sequence: 0101010101010101… It would only require a short program or algorithm, e.g., a command to print a zero, then a one, then a zero, then a one, and so on. Therefore, the complexity of a system represented by the bit string 0101010101010101… would be very low. However, a system represented by a random sequence (such as the toss of a fair coin with a 1 on one side and a 0 on the other, e.g., 10110000100110001111…) would require a computer program as long as the bit string itself, since the string is randomly produced and no shorter program could predict future 1’s or 0’s. As a result, the algorithmic complexity of an infinite random system would have to be as large as the system itself.
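The contrast between the two bit strings can be made concrete with a short Python sketch. True algorithmic complexity is uncomputable, so the sketch below uses compressed length (via the standard zlib library) as a rough practical stand-in for the length of the shortest generating program; the particular string lengths and random seed are arbitrary choices for illustration:

```python
import random
import zlib

# A highly regular "system": the periodic bit string 010101...
periodic = ("01" * 500).encode()

# A random "system": 1000 bytes from a pseudo-random source.
random.seed(42)
noisy = bytes(random.getrandbits(8) for _ in range(1000))

# Compressed length approximates shortest-description length:
# the periodic string collapses to a handful of bytes, while the
# random string stays close to its original 1000 bytes.
print(len(zlib.compress(periodic)))  # small
print(len(zlib.compress(noisy)))     # nearly 1000
```

The regular string admits a description far shorter than itself (low algorithmic complexity); the random string does not (complexity comparable to its own length), matching the coin-toss example above.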
Bibliography: Chaitin (1987); Goertzel (1993)
[/fusion_content_box][fusion_content_box title=”Anacoluthian Processes” animation_direction=”left” animation_speed=”0.3″]
From the Greek “anacoluthon” (inconsistency in logic), a general term for system processes or methods facilitating self-organization and emergence. In these processes traditional procedures are followed while at the same time being transgressed, thereby allowing the emergence of something radically new. An example of an anacoluthian process is the crossing-over of chromosomes from both parents in sexual reproduction. An example in a business or institution is when people from diverse organizational functions are brought together in a project team, hopefully resulting in the emergence of an innovative organizational structure.
See: Far-from-equilibrium; Genetic Algorithm
Bibliography: Goldstein, “Leadership and Emergence…” (in this Resource Guide); Holland (1995)
[/fusion_content_box][fusion_content_box title=”Artificial Life” animation_direction=”left” animation_speed=”0.3″]
The life-like patterns emerging in cellular automata and related electronic arrays. These emergent patterns seem organic in the manner in which they move, grow, change their shape, reproduce themselves, aggregate, and die. Artificial Life was pioneered by the computer scientist Chris Langton, and experimented with extensively at the Santa Fe Institute. Artificial Life is being used to model various complex systems such as eco-systems, the economy, societies and cultures, the immune system, and so on. The study of Artificial Life is promising insights into natural processes leading to the build-up of structure in self-organizing, complex systems.
See: Cellular Automata; N/K Model; Random Boolean Networks
Bibliography: Langton (1986); Lewin (1992)
[/fusion_content_box][fusion_content_box title=”Attractor” animation_direction=”left” animation_speed=”0.3″]
The evolution of a nonlinear, dynamical, complex system can be marked by a series of phases, each of which constrains the behavior of the system to be in consonance with a reigning attractor(s). Such phases and their attractors can be likened to the stages of human development: infancy, childhood, adolescence, and so on. Each stage has its own characteristic set of behaviors, developmental tasks, cognitive patterns, emotional issues, and attitudes (although, of course, there is some variation among different people). Though a child may sometimes behave like an adult (and vice versa), it is the long-term behavior that falls under the sway of the attractor. Technically, in a dynamical system, an attractor is a pattern in phase or state space to which the values of the system’s variables settle after transients die out; the graph of this pattern is called a phase portrait. More generally, an attractor can be considered a circumscribed or constrained range in a system which seemingly underlies and “attracts” how a system is functioning within particular environmental (internal and external) conditions. The dynamics of the system as well as current conditions determine the system’s attractors. When attractors change, the behavior in the system changes because it is operating under a different set of governing principles. The change of attractors is called bifurcation, and is brought about by far-from-equilibrium conditions, which can be considered as a change in parameter values toward a critical threshold.
See: Bifurcation, Far-from-equilibrium; Phase (State) Space and Phase Portrait
Bibliography: Abraham et al. (1991); Goldstein (1994); Zimmerman (in this Resource Guide).
[/fusion_content_box][fusion_content_box title=”Types of Attractors” animation_direction=”left” animation_speed=”0.3″]
Fixed Point Attractor:
An attractor which is a particular point in phase space, sometimes called an equilibrium point. As a point it represents a very limited range of possible behaviors in the system. For example, in a pendulum, the fixed point attractor represents the pendulum when the bob is at rest. This state of rest attracts the system because of gravity and friction. In an organization a fixed point attractor would be a metaphor for describing when the organization is “stuck” in a narrow range of possible actions.
Periodic (Limit Cycle) Attractor:
An attractor which consists of a periodic movement back and forth between two or more values. The periodic attractor represents more possibilities for system behavior than the fixed point attractor. An example of a period-two attractor is the oscillating movement of a metronome. In an organization, a periodic attractor might be when the general activity level oscillates from one extreme to another. Or, an example from psychiatry might be bipolar disorder, where a person’s mood shifts back and forth from elation to depression.
Strange Attractor:
An attractor of a chaotic system which is bound within a circumscribed region of phase space yet is aperiodic, meaning the exact behavior in the system never repeats. The structure of a strange attractor is fractal. A strange attractor can serve as a metaphor for creative activities in an organization in which innovation is possible yet there is a boundary to the activities determined by the core competencies of the organization as well as its resources and the environmental factors affecting the organization. A strange attractor portrays the characteristic of sensitive dependence on initial conditions (the Butterfly Effect) found in chaos.
See: Butterfly Effect; Chaos; Fractal; Sensitive Dependence on Initial Conditions
[/fusion_content_box][fusion_content_box title=”Basins of Attraction” animation_direction=”left” animation_speed=”0.3″]
If one imagines a complex system as a sink, then the attractor can be considered the drain at the bottom, and the basin of attraction is the sink’s basin. Technically, the set of all points in phase space that are attracted to an attractor. More generally, the initial conditions of a system which evolve into the range of behavior allowed by the attractor.
When a specific attractor(s) is operative in a system, the behavior of the system will be consonant with that attractor(s), meaning that a measurement of that behavior will lie in the system’s basin of attraction and thereby eventually converge to the attractor(s), no matter how unusual the conditions affecting the system are.
[/fusion_content_box][fusion_content_box title=”Autopoiesis” animation_direction=”left” animation_speed=”0.3″]
A theory of what life is, developed by the Chilean scientists Humberto Maturana and Francisco Varela. A living organism is understood as a circular, autocatalytic-like process which has its own survival as its main goal. The phenomenon of self-organization is sometimes seen as an autopoietic phenomenon. This theory, with its emphasis on the “closure” of the living organism, has been a fitting remedy for the overemphasis on “openness” in open systems theory. The management theorist Gareth Morgan points out that an organization’s identity, strategies, and awareness of its market can be seen as forming an autopoietic circularity. That is why organizations can get “stuck” in a rut of activity and become unadaptable to a changing environment.
See: Self-fulfilling Prophecy
Bibliography: Maturana and Varela (1992); Morgan (1997).
[/fusion_content_box][fusion_content_box title=”Benard System” animation_direction=”left” animation_speed=”0.3″]
A simple physical system, consisting of a liquid in a container heated from the bottom, which has been extensively studied by the Prigogine School because of its demonstration of self-organization and emergence. As the liquid is heated from the bottom, at a critical temperature level (a far-from-equilibrium condition) there is the sudden emergence of striking hexagonally-shaped convection cells. Prigogine has termed these hexagonal cells “dissipative structures” since they maintain their structure while dissipating energy through the system and from the system to the environment. These “dissipative structures” are a good example of unpredictable emergent patterns, since the direction of rotation of the convection cells results from the amplification of random currents in the liquid.
See: Dissipative Structures; Emergence; Far-from-equilibrium; Self-organization
Bibliography: Nicolis (1989); Nicolis and Prigogine (1989); Goldstein (1994)
[/fusion_content_box][fusion_content_box title=”Bifurcation” animation_direction=”left” animation_speed=”0.3″]
The emergence of a new attractor(s) in a dynamical, complex system that occurs when some parameter reaches a critical level (a far-from-equilibrium condition). For example, in the logistic equation or map system, bifurcation and the emergence of new attractors take place when the parameter representing birth/death rates in a population reaches a critical value. More generally, a bifurcation is when a system shows an abrupt change in typical behavior or functioning that lasts over time. For example, a change of an organizational policy or practice which results in a long-term change of the business’ or institution’s behavior can be considered a bifurcation.
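The logistic-map bifurcation described above can be observed numerically. In the sketch below (an illustration added here; the parameter values 2.8 and 3.2 are chosen only because they sit on either side of the map’s first bifurcation at r = 3), the long-run behavior settles onto a single value below the critical threshold and oscillates between two values above it:

```python
def logistic_orbit(r, x0=0.5, warmup=1000, n=8):
    """Iterate the logistic map x -> r*x*(1-x), discard the
    transient, and return the next n values of the settled orbit."""
    x = x0
    for _ in range(warmup):
        x = r * x * (1 - x)
    orbit = []
    for _ in range(n):
        x = r * x * (1 - x)
        orbit.append(round(x, 6))
    return orbit

print(logistic_orbit(2.8))  # one repeated value: a fixed-point attractor
print(logistic_orbit(3.2))  # two alternating values: a period-two attractor
```

Crossing the critical parameter value has replaced the fixed-point attractor with a period-two attractor: a bifurcation in the technical sense used above.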
See: Attractor; Dynamical System; Far-from-equilibrium
Bibliography: Abraham (1982); Guastello (1995).
[/fusion_content_box][fusion_content_box title=”Boundaries (Containers or ‘Closure’)” animation_direction=”left” animation_speed=”0.3″]
Processes of self-organization and emergence occur within bounded regions, e.g., the container holding the Benard System so that the liquid stays intact as it undergoes far-from-equilibrium conditions. In cellular automata the container is the electronic network itself, which is “wrapped around” in that cells at the edges of the field are hooked back into the field. These boundaries or containers act to demarcate a system from its environment and thereby maintain the identity of the system as it changes. Furthermore, boundaries channel the nonlinear processes at work during self-organization. In human systems, boundaries can refer to the actual physical plant, organizational policies, “rules” of interaction, and whatever serves to underlie an organization’s identity and distinguish the organization from its environments. Boundaries need to be both permeable, in the sense that they allow exchange between a system and its environments, and impermeable, insofar as they circumscribe the identity of a system in contrast with its environments.
See: Autopoiesis
Bibliography: Chandler & Van de Vijver (2000); Eoyang & Olson (2001); Goldstein (1994); Luhman, Bednarz, & Baecker (1996).
[/fusion_content_box][fusion_content_box title=”Butterfly Effect” animation_direction=”left” animation_speed=”0.3″]
A popular image coming out of chaos theory which portrays the concept of sensitive dependence on initial conditions, i.e., a small change having a huge impact like a butterfly flapping its wings in South America which eventually leads to a thunderstorm in North America. Some attribute the term “Butterfly” in “Butterfly Effect” to the butterfly-like shape of the phase portrait of the strange attractor discovered by the
meteorologist Edward Lorenz when he first discerned what was later termed “chaos.” The Butterfly Effect introduces a great amount of unpredictability into a system, since one can never have perfect accuracy in determining those present conditions which will be amplified and lead to a drastically different outcome than expected. However, since chaotic attractors are deterministic rather than truly random and operate within a circumscribed region of phase or state space, there still exists a certain amount of predictability associated with chaotic systems. Thus, a particular state of the weather may be unpredictable more than a few days in advance; nevertheless, climate and season reduce the range of possible states of the weather, thereby adding some degree of predictability even to chaotic systems.
See: Chaos; Sensitive Dependence on Initial Conditions
Bibliography: Abraham (1982); Lorenz (1993); Ott (2003).
[/fusion_content_box][/fusion_content_boxes][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container][fusion_builder_next_page][fusion_builder_container hundred_percent=”no” hundred_percent_height=”no” hundred_percent_height_scroll=”no” hundred_percent_height_center_content=”yes” equal_height_columns=”no” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” status=”published” background_position=”center center” background_repeat=”no-repeat” fade=”no” background_parallax=”none” enable_mobile=”no” parallax_speed=”0.3″ video_aspect_ratio=”16:9″ video_loop=”yes” video_mute=”yes” border_style=”solid” type=”legacy”][fusion_builder_row][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”false” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_global id=”4101″][/fusion_builder_column][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”false” min_height=”” link=””][fusion_title hide_on_mobile=”small-visibility,medium-visibility,large-visibility” content_align=”center” size=”1″ text_color=”#67c100″ style_type=”default”]
Glossary C
[/fusion_title][fusion_text rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Terms & definitions have been organized in alphabetical order for easy access.
[/fusion_text][fusion_text columns=”2″ rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Catastrophe Theory
Cellular Automata
Chaos
Chunking
The Church-Turing Thesis
Co-evolution
Coherence
Complexity
Algorithmic Complexity
Complex Adaptive System (CAS)
Concept of 15%
Containment (see Boundaries)
Correlation Dimension
[/fusion_text][/fusion_builder_column][fusion_builder_column type=”1_1″ type=”1_1″ layout=”1_1″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_separator style_type=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” sep_color=”#8bc34a” top_margin=”20″ bottom_margin=”20″ alignment=”center” /][fusion_content_boxes layout=”icon-with-title” columns=”1″ heading_size=”2″ title_color=”#8bc34a” body_color=”#000000″ icon=”fa-bookmark fas” iconspin=”no” iconcolor=”#00bcd4″ icon_circle=”no” circlecolor=”#ffffff” hover_accent_color=”#ffffff” icon_align=”left” animation_direction=”left” animation_speed=”0.3″ margin_top=”0″ margin_bottom=”10″ hide_on_mobile=”small-visibility,medium-visibility,large-visibility”][fusion_content_box title=”Catastrophe Theory” animation_direction=”left” animation_speed=”0.3″]
A mathematical theory of discontinuous change in a system, formulated by the French mathematician René Thom from his work in topology. A “catastrophe” is an abrupt change in a variable(s) during the evolution of a system that can be modeled by structural equations and topological folds. Catastrophes are governed by control parameters whose changes of value lead either to smooth transitions at low values or abrupt changes at higher, critical values. Catastrophes indicate points of bifurcation in dynamical systems. For example, the way a dog can change abruptly from a playful mood to an aggressive stance can be modeled by a simple “catastrophe.” In organizations, the presence of sudden change can similarly be modeled using Catastrophe Theory. In recent years, Catastrophe Theory has come to be understood as part of Nonlinear Dynamical Systems Theory in general.
See: Bifurcation
Bibliography: Guastello (1995)
[/fusion_content_box][fusion_content_box title=”Cellular Automata” animation_direction=”left” animation_speed=”0.3″]
Computer programs that are composed of a grid of “cells” connected to their neighboring cells according to certain “rules” of activity, e.g., a cell might be “on” if its four neighbor cells (east, west, north, and south) are also on. The entire array can self-organize into global patterns that may move around the screen. These emergent patterns can be quite complex although they emerge from very simple rules governing the connections among the cells. Cellular automata were originally conceived by the late, eminent mathematician John von Neumann, and were realized more recently by the equally eminent mathematician John Conway in his “Game of Life.” Today, the study of cellular automata goes under the name “Artificial Life” (A-Life) because the exploration of cellular automata and their patterns at such places as the Santa Fe Institute has led to insights into the way structure is built up in biological and other complex systems. Businesses and institutions can be modeled by cellular automata to the extent that they are made up of interactions among people, equipment, and supplies. For example, the strength, number, and quality of connectivities among people or groups can be modeled by cells and rules among cells, so that one can investigate how changing the rules influences the emergence of patterns. These cellular automata models hopefully will yield important insight into the dynamics of human systems.
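Conway’s “Game of Life” mentioned above can be implemented in a few lines. The sketch below (added here for illustration) uses the standard Life rules: a dead cell with exactly three live neighbors is born, and a live cell with two or three live neighbors survives. It steps the “blinker,” a small pattern that oscillates between a horizontal and a vertical row of three cells:

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life.
    `live` is a set of (row, col) coordinates of live cells."""
    # Count live neighbors for every cell adjacent to a live cell.
    counts = Counter(
        (r + dr, c + dc)
        for (r, c) in live
        for dr in (-1, 0, 1)
        for dc in (-1, 0, 1)
        if (dr, dc) != (0, 0)
    )
    # Birth on exactly 3 live neighbors; survival on 2 or 3.
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

blinker = {(1, 0), (1, 1), (1, 2)}               # horizontal row of three
print(life_step(blinker))                        # → {(0, 1), (1, 1), (2, 1)}
print(life_step(life_step(blinker)) == blinker)  # → True (period-two oscillator)
```

Even this tiny example shows the key point of the entry: a global pattern (the oscillation) emerges from purely local rules among the cells.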
See: Complexity; Emergence; N/K Model; Self-organization
Bibliography: Langton (1986); Poundstone (1985); Sulis and Combs (1996)
[/fusion_content_box][fusion_content_box title=”Chaos” animation_direction=”left” animation_speed=”0.3″]
A type of system behavior that appears random-like yet is actually deterministic and constituted by a “hidden” order or pattern. Chaos can be found in certain nonlinear dynamical systems when control parameters surpass certain critical levels. The emergence of chaos suggests that simple rules can lead to complex results. Such systems are constituted by nonlinear, interactive, feedback types of relationships among the variables, components, or processes in the system. Chaotic time series of data from measurements of a system can be reconstructed or graphed in phase or state space as a chaotic or strange attractor with a fractal structure. Chaotic attractors are characterized by sensitive dependence on initial conditions, so that although the behavior is constrained within a range, the future behavior of the system is largely unpredictable. Unlike a random system, however, which is also unpredictable, chaos is brought about by deterministic rules, and there is some measure of predictability due to the way the attractor of the system is constrained to a particular region of phase space. For example, if the weather is a chaotic system, particular states of the weather are unpredictable yet the range of those states is predictable. Thus, it is impossible to predict what the weather will be exactly on August 10, 2000 in New York, yet it is predictable that the temperature will fall within a range of 65-95 degrees Fahrenheit. That is, the climate acts as a constraint on the unpredictability of the state of the weather. In organizations, chaos may show up under certain circumstances, e.g., in inventory or production processes, admission rates, timing of procedures, and so on. As a result, certain aspects of organizational functioning may be unpredictable. Recent research has pointed to ways of “controlling” chaos by introducing particular perturbations into a system.
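Sensitive dependence on initial conditions can be demonstrated with the logistic map at r = 4, a standard chaotic parameter value (an illustrative sketch added here; the starting value 0.2 and the size of the perturbation are arbitrary choices):

```python
def logistic(x, r=4.0):
    """One step of the logistic map, chaotic at r = 4."""
    return r * x * (1 - x)

# Two trajectories whose starting points differ by one part in ten billion.
x, y = 0.2, 0.2 + 1e-10
max_sep = 0.0
for _ in range(50):
    x, y = logistic(x), logistic(y)
    max_sep = max(max_sep, abs(x - y))

# The gap between the trajectories grows roughly geometrically until it
# is as large as the attractor itself, yet both stay bounded in [0, 1].
print(max_sep)
```

The rule is fully deterministic, yet the initially negligible difference is amplified until the two trajectories bear no resemblance to each other, while both remain confined to the same bounded region: unpredictability of states within a predictable range, as described above.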
See: Attractor; The Butterfly Effect; Control Parameter; Logistic Equation; Sensitive Dependence on Initial Conditions; Time Series
Bibliography: Lorenz (1993); Guastello (1995); Peak and Frame (1994)
[/fusion_content_box][fusion_content_box title=”Chunking” animation_direction=”left” animation_speed=”0.3″]
A term coined by the journalist Kevin Kelly to describe how nature constructs complex systems: from the bottom up, with building blocks (systems) that have proven themselves able to work on their own. This concept is widely appreciated by evolutionary biologists and has been highlighted by complexity pioneer John Holland as a key feature of complex adaptive systems. He used the image of children’s building blocks, of different shapes and sizes, combined in a variety of ways to yield new creations like castles and palaces.
See: Emergence; Genetic Algorithms; Self-organization
Bibliography: Holland (1995); Kelly (1994)
[/fusion_content_box][fusion_content_box title=”The Church-Turing Thesis” animation_direction=”left” animation_speed=”0.3″]
A logical/mathematical postulate, independently arrived at by the English mathematician Alan Turing and the American logician Alonzo Church, stating that as long as a procedure is sufficiently clear-cut and mechanical, it can be carried out by some algorithm (such as a computation on a Turing Machine). Thus, some processes or problems are computable according to some set of algorithms, whereas other processes or problems are not computable. A strong form of the Church-Turing Thesis claims that all neural and psychological processes can be simulated as computational processes on a computer.
See: Complexity (Algorithm); Turing Machine
Bibliography: Goertzel (1993); Penrose (1989); Sulis in Robertson and Combs (1995)
[/fusion_content_box][fusion_content_box title=”Co-evolution” animation_direction=”left” animation_speed=”0.3″]
The coordinated and interdependent evolution of two or more systems within a larger ecological system. There is feedback among the systems in terms of competition or cooperation and differing utilization of the same limited resources. For example, Kauffman and Macready cite the way in which alterations in a predator alter the adaptive possibilities of its prey. Businesses or institutions can co-evolve in various ways, such as with their suppliers, receivers, and even competitors. For instance, the numerous types of joint ventures that have recently emerged can be considered a kind of co-evolution.
See: Feedback; Fitness Landscapes
Bibliography: Kauffman (1995); Kauffman and Macready (1995)
[/fusion_content_box][fusion_content_box title=”Coherence” animation_direction=”left” animation_speed=”0.3″]
The cohesiveness, coordination, and correlation characterizing emergent structures in self-organizing systems. For example, laser light is coherent compared to the light emanating from a regular light bulb. That emergent structures show a kind of order not found on the lower level of components suggests that complex systems contain potentials of functioning that have not been recognized before. Businesses and institutions can facilitate and utilize the coherence of emergent structures in place of the imposed kind of order found in the traditional bureaucratic hierarchy.
See: Dissipative Structures; Emergence; Self-organization
Bibliography: Goldstein (1994); Kauffman (1995); Prigogine and Stengers (1984)
[/fusion_content_box][fusion_content_box title=”Complexity” animation_direction=”left” animation_speed=”0.3″]
A description of the complex phenomena demonstrated in systems characterized by nonlinear interactive components, emergent phenomena, continuous and discontinuous change, and unpredictable outcomes. Although there is at present no single accepted definition of complexity, the term can be applied across a range of different yet related system behaviors such as chaos, self-organized criticality, complex adaptive systems, neural nets, nonlinear dynamics, far-from-equilibrium conditions, and so on. Complexity characterizes complex systems as opposed to simple, linear, and equilibrium-based systems. Measures of complexity include algorithmic complexity, fractal dimensionality, Lyapunov exponents, Gell-Mann’s “effective complexity,” and Bennett’s “logical depth.”
See: Anacoluthian Processes; Complex, Adaptive Systems; Nonlinear System; N/K Model; Random Boolean Network; Self-organization; Swarmware
Bibliography: Gell-Mann (1995); Holland (1995); Kauffman (1995); Kelly (1994); Stacey (1996)
[/fusion_content_box][fusion_content_box title=”Algorithmic Complexity” animation_direction=”left” animation_speed=”0.3″]
A measure of complexity developed by the mathematician Gregory Chaitin, based on Claude Shannon’s Information Theory and earlier work by the Russian mathematicians Kolmogorov and Solomonoff. Algorithmic complexity measures the complexity of a system by the length of the shortest computer program (set of algorithms) capable of generating or computing the measurements of the system. In other words, the algorithmic complexity of a system is how small a model of the system would need to be to capture the essential patterns of that system. For example, the algorithmic complexity of a random system is as large as the system itself, since its random patterns cannot be shortened into a smaller set of algorithms. As such, algorithmic complexity has to do with the mixture of repetition and innovation in a complex system.
For example, imagine a simple system represented by the bit string 01010101010101010101… Generating it would require only a short program or algorithm, e.g., a command to print a zero, then a one, then a zero, then a one, and so on; therefore the complexity of such a system would be very low. However, a system represented by a random sequence, e.g., 10110000100110001111… (such as tosses of a fair coin with a 1 on one side and a 0 on the other), would require a computer program as long as the bit string itself, since the sequence is randomly produced and no program could predict future 1’s or 0’s. As a result, the algorithmic complexity of an infinite random system would itself have to be infinitely large.
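The contrast between the two bit strings can be made concrete with off-the-shelf compression, a crude, computable stand-in for algorithmic complexity (which is not itself computable). This Python sketch is illustrative, not Chaitin’s construction:

```python
import random
import zlib

def compressed_size(bits: str) -> int:
    """Length in bytes of the zlib-compressed string: a rough upper
    bound on how short a description of the string can be."""
    return len(zlib.compress(bits.encode()))

periodic = "01" * 5000                 # the highly ordered string above
random.seed(0)
noise = "".join(random.choice("01") for _ in range(10000))  # coin tosses

# The periodic string collapses to a tiny description (in effect,
# "print 01 five thousand times"); the random one barely compresses.
print(compressed_size(periodic), compressed_size(noise))
```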
Bibliography: Chaitin (1987); Goertzel (1993)
[/fusion_content_box][fusion_content_box title=”Complex Adaptive System (CAS)” animation_direction=”left” animation_speed=”0.3″]
A complex, nonlinear, interactive system which has the ability to adapt to a changing environment. Such systems are characterized by the potential for self-organization and exist in a nonequilibrium environment. CASs evolve by random mutation, self-organization, the transformation of their internal models of the environment, and natural selection. Examples include living organisms, the nervous system, the immune system, the economy, corporations, and societies. In a CAS, semi-autonomous agents interact according to certain rules of interaction, evolving to maximize some measure such as fitness. The agents are diverse in both form and capability, and they adapt by changing their rules, and hence their behavior, as they gain experience. Complex adaptive systems evolve historically, meaning that their past, i.e., their experience, is added onto them and helps determine their future trajectory. Their adaptability can be either increased or decreased by the rules shaping their interaction. Moreover, unanticipated emergent structures can play a determining role in the evolution of such systems, which is why they show a great deal of unpredictability. Yet a CAS also has the potential for a great deal of creativity that was not programmed into it from the beginning. Considering an organization, e.g., a hospital, as a CAS shifts how change is enacted: change can be understood as a kind of self-organization resulting from enhanced interconnectivity, connectivity to the environment, the cultivation of diversity of viewpoint among organizational members, and experimentation with alternative “rules” and structures.
See: Adaptation; Emergence; Genetic Algorithm; Self-organization
Bibliography: Dooley (1997); Gell-Mann (1994); Holland (1995); Kauffman (1995)
[/fusion_content_box][fusion_content_box title=”Concept of 15%” animation_direction=”left” animation_speed=”0.3″]
The organizational theorist Gareth Morgan’s term for the amount of discretionary influence a manager has over change processes. One of Morgan’s points is that this 15% can accomplish a great deal in a nonlinear, complex system, where a small change can have a huge outcome. Thus, although one’s discretionary efficacy may be only 15%, these discretionary efforts can still have a large impact.
See: Butterfly Effect; Instability; Nonlinear Systems; Sensitive Dependence on Initial Conditions
Bibliography: Morgan (1997)
[/fusion_content_box][fusion_content_box title=”Correlation Dimension” animation_direction=”left” animation_speed=”0.3″]
A mathematical method for measuring the complexity of a system by analyzing time series data from that system. The correlation dimension measures the degree of correlation among the elements of a system, which can reveal the existence of some kind of hidden order, such as chaos or nonlinear coupling, in what might otherwise have been conceived as a random system. The correlation dimension is related to other measures of complexity such as fractal dimensionality, Kolmogorov entropy, Lyapunov exponents, and structural complexity.
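The correlation sum underlying this measure (the Grassberger-Procaccia approach) can be sketched in a few lines of Python. The time series, the two-dimensional delay embedding, and the scale choices below are illustrative assumptions, not taken from the cited source:

```python
import itertools
import math

def correlation_sum(points, r):
    """C(r): the fraction of point pairs lying closer together than r."""
    pairs = list(itertools.combinations(points, 2))
    return sum(1 for p, q in pairs if math.dist(p, q) < r) / len(pairs)

# Delay-embed a chaotic logistic-map time series in two dimensions.
xs, x = [], 0.4
for _ in range(400):
    x = 4.0 * x * (1 - x)
    xs.append(x)
points = list(zip(xs, xs[1:]))        # (x_t, x_{t+1}) reconstruction

# The slope of log C(r) against log r at small scales estimates the
# correlation dimension of the reconstructed attractor.
r1, r2 = 0.05, 0.2
slope = (math.log(correlation_sum(points, r2)) -
         math.log(correlation_sum(points, r1))) / math.log(r2 / r1)
print(round(slope, 2))                # expected to be near 1 here
```

A value near 1 (rather than the 2 a random scatter filling the plane would give) is what reveals the hidden one-dimensional order in this seemingly random series.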
See: Chaos; Coherence; Time Series
Bibliography: Peak and Frame (1994)
[/fusion_content_box][/fusion_content_boxes][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container][fusion_builder_next_page][fusion_builder_container hundred_percent=”no” hundred_percent_height=”no” hundred_percent_height_scroll=”no” hundred_percent_height_center_content=”yes” equal_height_columns=”no” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” status=”published” background_position=”center center” background_repeat=”no-repeat” fade=”no” background_parallax=”none” enable_mobile=”no” parallax_speed=”0.3″ video_aspect_ratio=”16:9″ video_loop=”yes” video_mute=”yes” border_style=”solid” type=”legacy”][fusion_builder_row][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”false” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_global id=”4101″][/fusion_builder_column][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”false” min_height=”” link=””][fusion_title hide_on_mobile=”small-visibility,medium-visibility,large-visibility” content_align=”center” size=”1″ text_color=”#67c100″ style_type=”default”]
Glossary D – F
[/fusion_title][fusion_text rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Terms & definitions have been organized in alphabetical order for easy access.
[/fusion_text][fusion_text columns=”2″ rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Deterministic System
Difference Questioning
Dissipative Structure
Dynamical System
Edge of Chaos
Emergence
Equilibrium
Far-from-equilibrium
Feedback
Fitness Landscape
Fractal
Fractal Dimension
[/fusion_text][/fusion_builder_column][fusion_builder_column type=”1_1″ type=”1_1″ layout=”1_1″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_separator style_type=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” sep_color=”#8bc34a” top_margin=”20″ bottom_margin=”20″ alignment=”center” /][fusion_content_boxes layout=”icon-with-title” columns=”1″ heading_size=”2″ title_color=”#8bc34a” body_color=”#000000″ icon=”fa-bookmark fas” iconspin=”no” iconcolor=”#00bcd4″ icon_circle=”no” circlecolor=”#ffffff” hover_accent_color=”#ffffff” icon_align=”left” animation_direction=”left” animation_speed=”0.3″ hide_on_mobile=”small-visibility,medium-visibility,large-visibility”][fusion_content_box title=”Deterministic System” icon=”fa-bookmark fas” iconflip=”none” iconrotate=”none” iconspin=”no” linktext=”Read More” animation_direction=”left” animation_speed=”0.3″]
A system in which the later states of the system follow from, or are determined by, the earlier ones. Such a system contrasts with a stochastic or random system, in which future states are not determined by previous ones; examples of stochastic systems include the sequence of heads or tails of an unbiased coin and radioactive decay. That a system is deterministic does not necessarily entail that its later states are predictable from knowledge of the earlier ones. For example, chaos has been termed “deterministic chaos” since, although it is determined by simple rules, its sensitive dependence on initial conditions makes a chaotic system largely unpredictable; in this way, chaos resembles a random system.
See: Chaos; Randomness
Bibliography: Goldstein in Sulis and Combs (1996); Lorenz (1993)
[/fusion_content_box][fusion_content_box title=”Difference Questioning” icon=”fa-bookmark fas” iconflip=”none” iconrotate=”none” iconspin=”no” animation_direction=”left” animation_speed=”0.3″]
A group process technique developed by the organizational/complexity theorist Jeffrey Goldstein that facilitates self-organization by generating far-from-equilibrium conditions in a work group. The process consists of several methods whereby information is amplified by highlighting the differences in perceptions, ideas, opinions, and attitudes among group members. Difference questioning does not aim at increasing or generating conflict but instead tries to uncover already differing standpoints. Moreover, the process takes place within boundaries that ensure the self-organization is channeled in constructive directions. Difference questioning aims at interrupting the tendency toward social conformity, which robs groups of their creative idea-generating and decision-making potential. In other words, it strives to allow a greater flow of information among group members, which has been shown to be correlated with a far-from-equilibrium condition, i.e., a condition in which self-organizing change can take place.
See: Information; Self-organization
Bibliography: Goldstein (1994)
[/fusion_content_box][fusion_content_box title=”Dissipative Structure” animation_direction=”left” animation_speed=”0.3″]
The term used by the Prigogine School (from Ilya Prigogine, winner of the Nobel Prize in chemistry) for emergent structures arising in self-organizing systems. Such structures are called dissipative because they serve to dissipate energy in the system. They arise at a critical threshold of far-from-equilibrium conditions. An example is the hexagonal convection cells that emerge in the Bénard system when it is heated. Another example is the so-called “chemical clocks” demonstrated in the Belousov-Zhabotinsky reaction, which are composed of both temporal structures, such as a shift from one color to another with the regularity of a clock, and spatial structures, such as spiral waves.
See: Coherence; Emergence; Far-from-equilibrium
Bibliography: Prigogine and Stengers (1984); Nicolis in Davies (1989)
[/fusion_content_box][fusion_content_box title=”Dynamical System” animation_direction=”left” animation_speed=”0.3″]
A complex, interactive system evolving over time through multiple modes of behavior, i.e., attractors. From this perspective, entities or events are not static occurrences but changing, evolving processes that follow certain rules and exhibit an increase of complexity. This evolution can show transformations of behavior as new attractors emerge; the changes in system organization and behavior are called bifurcations. Dynamical systems are deterministic systems, although they can be influenced by random events. Time series data of dynamical systems can be graphed as phase portraits in phase space in order to indicate the “qualitative” or topological properties of the system and its attractor(s). For example, various physiological systems, the heart for one, can be conceptualized as dynamical systems; seeing them this way opens up the possibility of studying various attractor regimes. Moreover, certain diseases can now be understood as “dynamical diseases,” meaning that their temporal phasing can be a key to understanding pathological conditions.
See: Attractors; Bifurcation; Logistic Equation
Bibliography: Abraham, et. al. (1991); Guastello (1995); Peak and Frame (1994)
[/fusion_content_box][fusion_content_box title=”Edge of Chaos” animation_direction=”left” animation_speed=”0.3″]
A term made popular by researchers at the Santa Fe Institute to indicate a particularly “pregnant” phase in the evolution of a dynamical, complex system where the creative emergence of new structures is at a maximum. In studies of the behavior of cellular automata and similar electronic arrays, the edge of chaos seems particularly favorable for the emergence of innovative, more adaptive structures and modes of functioning. The edge of chaos is conceived as the zone between too much rigidity and too much laxity. There is controversy over whether natural systems have a tendency to evolve into edge-of-chaos conditions. The edge of chaos can also be considered roughly analogous to far-from-equilibrium conditions, in that both represent critical thresholds where self-organization and emergence are heightened. Organizational applications have to do with processes that encourage organizational innovation by facilitating edge-of-chaos-like conditions.
See: Cellular Automata; Far-from-equilibrium
Bibliography: Kauffman (1995); Lewin (1992); Waldrop (1992)
[/fusion_content_box][fusion_content_box title=”Emergence” animation_direction=”left” animation_speed=”0.3″]
The arising of new, unexpected structures, patterns, or processes in a self-organizing system. These emergents can be understood as existing on a higher level than the lower-level components from which they emerged. Emergents seem to have a life of their own, with their own rules, laws, and possibilities unlike those of the lower-level components. The term was first used by the nineteenth-century philosopher G. H. Lewes and came into greater currency in the scientific and philosophical movement known as Emergent Evolutionism in the 1920s and 1930s. In an important respect, the work connected with the Santa Fe Institute and similar facilities represents a more powerful way of investigating emergent phenomena. In organizations, emergent phenomena happen ubiquitously, yet their significance can be downplayed by control mechanisms grounded in the officially sanctioned corporate hierarchy. One of the keys for leaders from complex systems theory is how to facilitate emergent structures and take advantage of the ones that occur spontaneously.
See: Self-organization
Bibliography: Cohen and Stewart (1994); Goldstein in Sulis and Combs (1996)
[/fusion_content_box][fusion_content_box title=”Equilibrium” animation_direction=”left” animation_speed=”0.3″]
Equilibrium is a term indicating a rest state of a system, for example, when a dynamical system is under the sway of a fixed-point or periodic attractor. The concept originated in ancient Greece, where the great mathematician Archimedes experimented with levers in balance (the Latin roots of “equilibrium” mean “equal balance”). The idea was elaborated through the Middle Ages, the Renaissance, and the birth of modern mathematics and physics in the 17th and 18th centuries. “Equilibrium” has come to mean much the same thing as stability: a system that is largely unaffected by internal or external changes, since it easily returns to its original condition after being perturbed, e.g., a balanced lever on a fulcrum (a see-saw). More generally, equilibrium suggests a system that tends to remain at the status quo.
See: Attractor; Far-from-equilibrium
Bibliography: Goldstein (1994); Prigogine and Stengers (1984).
[/fusion_content_box][fusion_content_box title=”Far-from-equilibrium” animation_direction=”left” animation_speed=”0.3″]
The term used by the Prigogine School for those conditions leading to self-organization and the emergence of dissipative structures. Far-from-equilibrium conditions move the system away from its equilibrium state, activating the nonlinearity inherent in the system. Far-from-equilibrium conditions are another way of talking about the changes in the values of parameters leading up to a bifurcation and the emergence of new attractor(s) in a dynamical system. Furthermore, to some extent, far-from-equilibrium conditions are similar to the “edge of chaos” in cellular automata and random boolean networks.
See: Difference Questioning; Equilibrium; Purpose Contrasting; Self-organization
Bibliography: Goldstein (1994); Nicolis in Davies (1989); Prigogine and Stengers (1984)
[/fusion_content_box][fusion_content_box title=”Feedback” animation_direction=”left” animation_speed=”0.3″]
The mutually reciprocal effect of one system or subsystem on another. In negative feedback, two subsystems act to dampen each other’s output. For example, the relation of predators and prey can be described by a negative feedback loop: more predators lead to a decline in the population of prey, but when prey decrease too far, the population of predators declines as well, since they no longer have enough food. In positive feedback, two subsystems amplify each other’s outputs, e.g., the screech heard in a public address system when the microphone is too close to the speaker: the loudspeaker amplifies the sound from the microphone, which in turn picks up the amplified signal, and around and around. Feedback is a way of talking about the nonlinear interaction among the elements or components in a system and can be modeled by nonlinear differential or difference equations as well as by the activity of cells in a cellular automaton array. The idea of feedback forms the basis of System Dynamics, a way of diagramming the flow of work in an organization, founded by Jay Forrester and made popular by Peter Senge.
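The damping and amplifying regimes can both be shown with a single toy update rule; a minimal Python sketch, with names and gain values invented purely for illustration:

```python
def feedback_loop(x0, target, gain, steps=40):
    """Each step, the output is adjusted in proportion to its deviation
    from a target.  A gain between 0 and 2 shrinks the deviation
    (negative feedback); a negative gain reinforces it (positive
    feedback, like the microphone-speaker screech)."""
    x, history = x0, [x0]
    for _ in range(steps):
        x = x - gain * (x - target)
        history.append(x)
    return history

thermostat = feedback_loop(30.0, target=20.0, gain=0.5)   # settles at 20
screech = feedback_loop(30.0, target=20.0, gain=-0.3)     # deviation explodes
print(thermostat[-1], screech[-1])
```

The same ten-unit initial deviation is erased by the damping loop and blown up by the amplifying one, which is the qualitative difference between the two kinds of feedback described above.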
See: Interactive, Nonlinear
Bibliography: Eoyang (1997)
[/fusion_content_box][fusion_content_box title=”Fitness Landscape” animation_direction=”left” animation_speed=”0.3″]
A “graphical” way to measure and explore the adaptive (fitness) value of different configurations of the elements in a system. Each configuration and its neighbor configurations (i.e., slight modifications of it) are graphed as lower or higher peaks on a landscape-like surface: high fitness is portrayed as mountainous peaks, and low fitness as lower peaks or valleys. Such a display indicates the degree to which various combinations add to or detract from the system’s survivability or sustainability. The use of fitness landscapes in understanding the behavior of complex adaptive systems has been pioneered by Stuart Kauffman in his study of random boolean networks. An important implication of studying fitness landscapes is that there may be many local peaks, or “okay” solutions, instead of one perfect, optimal solution. Thinking in terms of fitness landscapes can also point to foolish adaptation, i.e., a downward trend on the slopes of the peaks. Moreover, studies of N/K models using fitness landscapes demonstrate a decreasing rate of finding fitter configurations as one travels uphill on a fitness landscape. Fitness landscapes can be applied to gain insight into various organizational issues, including which innovative organizational designs, processes, or strategies promise greater potential.
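The point about many “okay” local peaks can be seen in a toy landscape. The random fitness assignment below is a deliberately crude stand-in for Kauffman’s actual N/K construction:

```python
import random

random.seed(1)

N = 10
# Assign every 10-bit configuration a random fitness value.
fitness = {conf: random.random() for conf in range(2 ** N)}

def neighbors(conf):
    """The N configurations reachable by flipping a single bit."""
    return [conf ^ (1 << bit) for bit in range(N)]

def hill_climb(conf):
    """Walk uphill until no one-bit change improves fitness."""
    while True:
        best = max(neighbors(conf), key=fitness.get)
        if fitness[best] <= fitness[conf]:
            return conf            # a local peak
        conf = best

peak = hill_climb(0)
# The climb halts on a local peak: an "okay" solution that need not be
# the single global optimum of the whole landscape.
print(fitness[peak], max(fitness.values()))
```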
See: N/K Model; Random Boolean Networks
Bibliography: Kauffman (1995); Kauffman and Macready (1995); Maguire (1997).
[/fusion_content_box][fusion_content_box title=”Fractal” animation_direction=”left” animation_speed=”0.3″]
A geometrical pattern, structure, or set of points which is self-similar (exhibiting an identical or similar pattern) on different scales. For example, Benoit Mandelbrot, the discoverer of fractal geometry, describes the coast of Britain as a fractal, because as it is observed from closer and closer points of view (i.e., changing the scale), it keeps showing a self-similar kind of irregularity. Another example is the structure of a tree with its self-similarity of branching patterns on different scales of observation, or the structure of the lungs in which self-similar branching provides a greater area for oxygen to be absorbed into the blood. Strange attractors in chaos theory have a fractal structure. The imagery of fractals has been popularized by the fascinating graphical representations of fractals in the form of Mandelbrot and Julia sets on a personal computer. Unlike the whole numbers characteristic of our usual dimensions, e.g., two- or three-dimensional drawings, the dimension of a fractal is not a whole number but includes a fractional part, such as a dimensionality of 2.4678.
[/fusion_content_box][fusion_content_box title=”Fractal Dimension” animation_direction=”left” animation_speed=”0.3″]
A noninteger measure of the irregularity or complexity of a system. Knowing the fractal dimension helps one determine the degree of irregularity and pinpoint the number of variables that are key to determining the dynamics of the system.
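A box-counting estimate makes the noninteger dimension concrete. The middle-thirds Cantor set is used here because its dimension is known exactly, log 2 / log 3 ≈ 0.6309; the code is an illustrative sketch:

```python
import math

def cantor_points(depth):
    """Midpoints of the intervals of the middle-thirds Cantor set after
    `depth` rounds of removing each interval's middle third."""
    intervals = [(0.0, 1.0)]
    for _ in range(depth):
        intervals = [piece
                     for (a, b) in intervals
                     for piece in ((a, a + (b - a) / 3),
                                   (b - (b - a) / 3, b))]
    return [(a + b) / 2 for a, b in intervals]

def box_count(points, eps):
    """How many boxes of side eps are needed to cover the points."""
    return len({math.floor(p / eps) for p in points})

pts = cantor_points(10)               # 1024 sample points
# Comparing box counts at two scales gives the log-log slope, i.e. the
# box-counting (fractal) dimension.
e1, e2 = 3.0 ** -4, 3.0 ** -7
dim = math.log(box_count(pts, e2) / box_count(pts, e1)) / math.log(e1 / e2)
print(round(dim, 4))                  # → 0.6309, i.e. log 2 / log 3
```

The count grows like (1/eps) raised to a noninteger power, which is exactly what a whole-number dimension cannot capture.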
See: Chaos; Correlation Dimension; Scale
Bibliography: Peak and Frame (1994)
[/fusion_content_box][/fusion_content_boxes][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container][fusion_builder_next_page][fusion_builder_container hundred_percent=”no” hundred_percent_height=”no” hundred_percent_height_scroll=”no” hundred_percent_height_center_content=”yes” equal_height_columns=”no” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” status=”published” background_position=”center center” background_repeat=”no-repeat” fade=”no” background_parallax=”none” enable_mobile=”no” parallax_speed=”0.3″ video_aspect_ratio=”16:9″ video_loop=”yes” video_mute=”yes” border_style=”solid” type=”legacy”][fusion_builder_row][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”false” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_global id=”4101″][/fusion_builder_column][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”false” min_height=”” link=””][fusion_title hide_on_mobile=”small-visibility,medium-visibility,large-visibility” content_align=”center” size=”1″ text_color=”#67c100″ style_type=”default”]
Glossary G – I
[/fusion_title][fusion_text rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Terms & definitions have been organized in alphabetical order for easy access.
[/fusion_text][fusion_text columns=”2″ rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Generative Relationships
Genetic Algorithm
Graph Theory (Social Networks)
Information
Initial Conditions
Instability
Internal Models
Interactive
[/fusion_text][/fusion_builder_column][fusion_builder_column type=”1_1″ type=”1_1″ layout=”1_1″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_separator style_type=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” sep_color=”#8bc34a” top_margin=”20″ bottom_margin=”20″ alignment=”center” /][fusion_content_boxes layout=”icon-with-title” columns=”1″ heading_size=”2″ title_color=”#8bc34a” body_color=”#000000″ icon=”fa-bookmark fas” iconspin=”no” iconcolor=”#00bcd4″ icon_circle=”no” circlecolor=”#ffffff” hover_accent_color=”#ffffff” icon_align=”left” animation_direction=”left” animation_speed=”0.3″ hide_on_mobile=”small-visibility,medium-visibility,large-visibility”][fusion_content_box title=”Genetic Algorithm” icon=”fa-bookmark fas” iconflip=”none” iconrotate=”none” iconspin=”no” linktext=”Read More” animation_direction=”left” animation_speed=”0.3″]
A type of evolving computer program, developed by the computer scientist John Holland, whose strategy for arriving at solutions is based on principles taken from genetics. Basically, the genetic algorithm uses the mixing of genetic information in sexual reproduction, random mutations, and natural selection to arrive at solutions. In a manner analogous to the way a genetic algorithm learns better solutions through the mixing of patterns and an openness to random or chance events, a complex adaptive system can adapt to a changing environment through a mixing of its previous internal models of the environment. Thus, genetic algorithms can provide insight into the creative processes of problem solving and decision making.
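A minimal version of the genetic-algorithm loop (selection, crossover, mutation) fits in a page of Python. The “one-max” task of evolving a string of all 1s, and every parameter below, are illustrative choices rather than Holland’s own formulation:

```python
import random

random.seed(42)
BITS, POP, GENERATIONS = 20, 30, 60

def fitness(genome):
    return sum(genome)                 # count of 1 bits; 20 is optimal

def crossover(a, b):
    cut = random.randrange(1, BITS)    # mix "genetic" information
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.02):
    return [bit ^ 1 if random.random() < rate else bit for bit in genome]

pop = [[random.randint(0, 1) for _ in range(BITS)] for _ in range(POP)]
for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:POP // 2]           # natural selection: fitter half
    pop = parents + [mutate(crossover(random.choice(parents),
                                      random.choice(parents)))
                     for _ in range(POP - len(parents))]

best = max(pop, key=fitness)
print(fitness(best))                   # climbs toward the optimum of 20
```

The population discovers good solutions through recombination and chance rather than through any explicit plan, which is the analogy to adaptation drawn in the definition.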
See: Complex, Adaptive System; Randomness
Bibliography: Eoyang & Olson (2001); Holland (1994).
[/fusion_content_box][fusion_content_box title=”Generative Relationships” animation_direction=”left” animation_speed=”0.3″]A concept developed by the complexity researchers David Lane and Robert Maxfield, who define a human relationship as generative if it “produces new sources of value that cannot be foreseen in advance.” Their contention is that organizations, in times of turbulence and change, need to foster multiple generative relationships, within and outside the organization, as a means of discovering new strategies and directions. Fostering such relationships, as well as the preconditions for their success, is, they suggest, a key responsibility of leaders.
See: Edge of Chaos; Emergence; Genetic Algorithm; Self-Organization
Bibliography: Lane and Maxfield (1996).[/fusion_content_box][fusion_content_box title=”Graph Theory (Social Networks)” icon=”fa-bookmark fas” iconflip=”none” iconrotate=”none” iconspin=”no” animation_direction=”left” animation_speed=”0.3″]
The mathematical theory that studies the properties of networks or webs of connections. A graph consists of nodes (the things connected) joined by edges (the linkages between them). Examples of networks studied by graph theory include the internet, the economy, and genetic landscapes. Although graph theory is a purely mathematical discipline, the term is included here because it provides a theoretical foundation for the influential and growing field of social network theory, which is yielding rich insights into the dynamics of complex systems in general. For example, social network analysis built on graph theory has uncovered the complex structures of the internet, of communities, and of employees’ ties within and beyond their work organizations.
See: Scale-free Network; Small World Network
Bibliography: Kilduff & Tsai (2003); Newman, Barabási & Watts (2006); Trudeau (1993); Watts (1999).
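The nodes-and-edges picture can be made concrete with a small sketch in Python. The example graph, its names, and the breadth-first search helper are all illustrative assumptions; the point is simply what "nodes," "edges," "degree," and "connected" mean.

```python
from collections import deque

# A toy social network: nodes are people, edges are ties between them.
edges = [("ana", "bo"), ("bo", "carla"), ("carla", "ana"), ("carla", "dev")]

# Build an adjacency list: node -> set of neighbours.
adj = {}
for u, v in edges:
    adj.setdefault(u, set()).add(v)
    adj.setdefault(v, set()).add(u)

def degree(node):
    # Degree: how many ties a node has.
    return len(adj[node])

def reachable(start):
    # Breadth-first search: every node reachable from `start`.
    seen, queue = {start}, deque([start])
    while queue:
        for nbr in adj[queue.popleft()]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen

print(degree("carla"))                # 3
print(reachable("ana") == set(adj))   # True: one connected component
```

Measures like degree distributions and reachability, computed over far larger graphs, are the raw material of social network analysis.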
[/fusion_content_box][fusion_content_box title=”Information” animation_direction=”left” animation_speed=”0.3″]
Originally, information in the technical sense referred to the bits of a message, as opposed to “noise,” in a communication channel (formulated in information theory by the mathematician Claude Shannon, building on earlier work by Harry Nyquist and Ralph Hartley). Information has since come to mean the bits of data processed by the computer as information processor. “Noise” has a disorganizing effect in the way it disrupts redundant patterns, so that novelty can come about in the emergent structures resulting from self-organizing processes. In terms of organizations, information is the cognate in social systems of what energy is in a physical system. According to Gregory Bateson, information is “a difference that makes a difference.” In social systems this refers to the differences among group members’ perspectives on what is going on in the system. Information is not mere data: it is data that is meaningful to organizational members. An organization that is low in the flow of information is one in equilibrium or tending to maintain its status quo, whereas an organization that is high in informational flow is in a far-from-equilibrium state in which dramatic changes can take place. Recent years have seen the birth of a new field, quantum information science, which is playing an important role in the development of quantum computers and so-called quantum teleportation, both relying on the strange nature of quantum entanglement. These new fields reveal that information is increasingly seen as a basic constituent of the world around us or, as the renowned physicist John Wheeler once put it, “it from bit.”
See: Redundancy
Bibliography: Goldstein (1994); Darling (2005).
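Shannon's measure of information can be computed directly. The sketch below is an illustrative aside (the coin example and the `entropy_bits` name are assumptions, not from the glossary's sources): it shows that a fair coin carries one bit per toss, while a biased — more redundant, more predictable — source carries less.

```python
import math

def entropy_bits(probs):
    # Shannon entropy: the average number of bits per symbol needed to
    # encode a source with the given outcome probabilities.
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(entropy_bits([0.5, 0.5]))        # 1.0  (fair coin: one bit per toss)
print(entropy_bits([0.9, 0.1]) < 1.0)  # True (biased coin: more predictable, fewer bits)
```

The gap between the two values is exactly the redundancy that, per the entry above, compression exploits and noise disrupts.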
[/fusion_content_box][fusion_content_box title=”Initial Conditions” animation_direction=”left” animation_speed=”0.3″]
The state of a system at the beginning of a period of observing or measuring it. The initial conditions are whatever is assessed at that starting time, against which one can compare any later observation, measurement, or assessment of the system as it evolves. For example, chaotic systems demonstrate sensitive dependence on initial conditions, meaning that the nonlinearity strongly amplifies slight differences in initial conditions, thereby making long-term prediction of the system’s later states impossible.
See: Chaos; Sensitive Dependence on Initial Conditions
Bibliography: Lorenz (1993); Ott (2003).
[/fusion_content_box][fusion_content_box title=”Instability” animation_direction=”left” animation_speed=”0.3″]
The condition of a system when it is easily disturbed by internal or external forces or events, in contrast to a stable system, which will return to its previous condition when disturbed. A pencil balanced vertically on its eraser or a coin resting on its edge is an example of a system with the property of instability, because it falls over at the slightest breeze or movement of the surface it is resting on. An unstable system is one whose attractors can change; thus, instability is a characteristic of a system near or at bifurcation (or far-from-equilibrium).
See: Bifurcation; Equilibrium; Far-from-equilibrium
Bibliography: Nicolis (1989)
[/fusion_content_box][fusion_content_box title=”Internal Models” animation_direction=”left” animation_speed=”0.3″]In complex, adaptive systems theory, a system functions according to its internal representation or model of its environment. This internal model is encoded in a set of internal mechanisms or processes (for example, memory structures). For a system to adapt to a changing environment, the internal models must have a means for changing as well. Thus, one of the most important functions of “change agents” in a business or institution is to expedite reconsiderations of an organization’s internal model of its environment.
See: Complex, Adaptive Systems
Bibliography: Gell-Mann (1994); Holland (1995)[/fusion_content_box][fusion_content_box title=”Interaction” animation_direction=”left” animation_speed=”0.3″]The mutual effect of components, subsystems, or systems on each other. This interaction can be thought of as feedback between the components, since there is a reciprocal influence. In contrast, the effect of a pool cue on a cue ball is not interactive, since the cue ball’s movement does not immediately affect the pool cue itself. In cellular automata, for example, it is the programmed rules that shape the kind of interaction occurring among neighboring cells. Complex adaptive systems are nonlinear, interactive systems.
See: Feedback; Nonlinear
Bibliography: Eoyang and Olson (2001).[/fusion_content_box][/fusion_content_boxes][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container][fusion_builder_next_page][fusion_builder_container hundred_percent=”no” hundred_percent_height=”no” hundred_percent_height_scroll=”no” hundred_percent_height_center_content=”yes” equal_height_columns=”no” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” status=”published” background_position=”center center” background_repeat=”no-repeat” fade=”no” background_parallax=”none” enable_mobile=”no” parallax_speed=”0.3″ video_aspect_ratio=”16:9″ video_loop=”yes” video_mute=”yes” border_style=”solid” type=”legacy”][fusion_builder_row][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”false” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_global id=”4101″][/fusion_builder_column][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”false” min_height=”” link=””][fusion_title hide_on_mobile=”small-visibility,medium-visibility,large-visibility” content_align=”center” size=”1″ text_color=”#67c100″ style_type=”default”]
Glossary L – N
[/fusion_title][fusion_text rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Terms & definitions are organized alphabetically for easy access.
[/fusion_text][fusion_text columns=”2″ rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Linear System
Logical Depth
Logistic Equation
Mental Models
Minimum Specifications
Neural Nets
N/K Model
Nonlinear System
Novelty (Innovation)
[/fusion_text][/fusion_builder_column][fusion_builder_column type=”1_1″ type=”1_1″ layout=”1_1″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_separator style_type=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” sep_color=”#8bc34a” top_margin=”20″ bottom_margin=”20″ alignment=”center” /][fusion_content_boxes layout=”icon-with-title” columns=”1″ heading_size=”2″ title_color=”#8bc34a” body_color=”#000000″ icon=”fa-bookmark fas” iconspin=”no” iconcolor=”#00bcd4″ icon_circle=”no” circlecolor=”#ffffff” hover_accent_color=”#ffffff” icon_align=”left” animation_direction=”left” animation_speed=”0.3″ hide_on_mobile=”small-visibility,medium-visibility,large-visibility”][fusion_content_box title=”Linear System” animation_direction=”left” animation_speed=”0.3″]
Technically, any system in which the changing values of its variables can be represented as a series of points suggesting a straight line on a coordinate plane, hence the term “linear,” for line. More generally, a linear system is one in which small changes result in small effects, and large changes in large effects. In a linear system, the components are isolated and noninteractive. Truly linear systems are rare in nature, since living organisms and their components are not isolated but richly interacting.
See: Nonlinear System
Bibliography: Abraham (1982)
[/fusion_content_box][fusion_content_box title=”Logical Depth” animation_direction=”left” animation_speed=”0.3″]A measure of complexity defined by the computer scientist and mathematician Charles Bennett. Like algorithmic complexity, with which it is often contrasted, logical depth is based on the algorithms needed to generate the data coming from a system; roughly speaking, it measures the running time of the shortest such program rather than that program’s length.
See: Complexity (Algorithmic)
Bibliography: Bennett (1982)[/fusion_content_box][fusion_content_box title=”Logistic Equation (Map)” animation_direction=”left” animation_speed=”0.3″]An equation that has been applied to various natural systems, such as the changes in a system’s population over time, which show a convergence to some fixed value or values (i.e., attractors). The logistic equation (also called a “map”) is technically a nonlinear difference equation that has been used to model the changes in a population whose growth is checked by limited resources. The remarkable behavior of this particular equation was studied by the physicist-turned-ecologist Robert May and the physicist Mitchell Feigenbaum. May traced the succession of attractors as the control parameter is increased on the route to chaos, while Feigenbaum discovered his famous constant: the limiting ratio of successive parameter intervals along the period-doubling cascade. The logistic map and its changes of attractors have become somewhat emblematic of chaos theory, since they show how a relatively simple equation can generate such fascinating complexity.
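The map's change of attractors as the control parameter r grows can be watched directly. A minimal Python sketch (the starting point, transient length, and the particular r values are illustrative assumptions):

```python
def attractor(r, x=0.3, transient=500, keep=4):
    # Iterate the logistic map x_{n+1} = r * x_n * (1 - x_n),
    # discard the transient, and return the next few values --
    # the attractor the orbit has settled onto.
    for _ in range(transient):
        x = r * x * (1 - x)
    out = []
    for _ in range(keep):
        x = r * x * (1 - x)
        out.append(round(x, 4))
    return out

print(attractor(2.5))  # fixed-point attractor at 1 - 1/r = 0.6
print(attractor(3.2))  # period-2 attractor: the orbit alternates between two values
```

Pushing r higher repeats the story: period 4, period 8, and so on, until chaos sets in near r ≈ 3.57 — the period-doubling route Feigenbaum quantified.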
See: Attractors
Bibliography: Peak and Frame (1994)[/fusion_content_box][fusion_content_box title=”Mental Models” animation_direction=”left” animation_speed=”0.3″]Images, representations, or thought schemes of how we perceive and cognize the world around us. We follow our mental models in getting about in the world, but can become trapped in limiting behaviors by being overly attached to certain mental models. That is why we need occasionally to be jogged out of the ruts of our dominant mental models by investigating new ways of looking at things. Complexity science has the promise of being a powerful tool to get us to look at our work and organizations in a new way, thereby changing our mental models of how to go about our business in the most effective manner.
See: Complex, Adaptive Systems; Internal Model
Bibliography: Senge (1990), Stacey (1996)[/fusion_content_box][fusion_content_box title=”Minimum Specifications” animation_direction=”left” animation_speed=”0.3″]
The management theorist Gareth Morgan’s term for processes encouraging self-organization by avoiding an overly top-down, imposed design on an organization or work group. These processes can include such elements as mission statements, guiding principles, boundaries, creative challenges, and so on. The key is for leadership to provide the minimum specifications so a work group itself has creative space to accomplish its work. Minimum specifications are analogous to the simple rules governing cell interactions in studies of cellular automata.
See: Cellular Automata; N/K Model
Bibliography: Morgan (1997); Zimmerman, Lindberg, Plsek (2001).
[/fusion_content_box][fusion_content_box title=”Neural Networks” animation_direction=”left” animation_speed=”0.3″]
Electronic automatons, similar in some ways to cellular automata, that offer a simplified model of a brain. As such, neural networks are devices of machine learning based on associative theories of human cognition. Using various algorithms and weightings of the connections between “neurons,” they are set up to learn to recognize a pattern: a voice, a visual image, some form of robotic control, symbol manipulation, decision making, and so on. Generally, neural nets are composed of three layers: input neurons, output neurons, and an in-between layer where information passing from input to output is processed. Initially the network’s connection weights are set at random; the output is then measured against a desired output, the weights are adjusted in response to the “error” between the actual and desired output, and this cycle is repeated many times. In this way, the neural network learns; in a sense, it has to discover its own rules. Changing the rules of interaction between the “neurons” in the network can lead to interesting emergent behavior. Hence, neural nets are another tool for investigating self-organization and emergence.
See: Adaptation
Bibliography: Allman (1989).
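The measure-error-and-adjust-weights cycle described above can be shown with a single artificial neuron. This sketch uses the classic perceptron rule on the logical AND function — an illustrative assumption, much simpler than the three-layer networks the entry describes, but the learning loop is the same in miniature:

```python
# Training data: the logical AND function.
data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

w = [0.0, 0.0]   # connection weights, adjusted in response to error
b = 0.0          # bias term
rate = 0.1       # learning rate

def predict(x):
    # One "neuron": weighted sum of inputs, thresholded at zero.
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Repeat many times: compare actual output with desired output and
# nudge each weight in proportion to the error.
for _ in range(25):
    for x, target in data:
        error = target - predict(x)
        w[0] += rate * error * x[0]
        w[1] += rate * error * x[1]
        b += rate * error

print([predict(x) for x, _ in data])  # [0, 0, 0, 1]
```

No rule for AND was ever programmed in; the weights that encode it were discovered through repeated error correction — the sense in which a neural net "discovers its own rules."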
[/fusion_content_box][fusion_content_box title=”N/K Model” animation_direction=”left” animation_speed=”0.3″]Stuart Kauffman’s model for understanding the evolution of complex, adaptive systems, based on the number of fitness-bearing traits (or genes) of an organism (N) and the number of other traits that provide inputs to each one (K). One can then observe the fitness landscapes obtained by manipulating the values of N and K. Emergent patterns are thus understood in terms of the rules that led to them and the implications they have for fitness.
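A toy N/K landscape can be built in a few lines. This is a minimal sketch under simplifying assumptions (small N and K, random fitness contributions, and helper names like `contribution` are all illustrative, not Kauffman's own code):

```python
import random
from itertools import product

random.seed(1)
N, K = 5, 2  # N traits; each trait's fitness depends on K other traits

# Wire each trait to K randomly chosen other traits.
neighbours = {i: random.sample([j for j in range(N) if j != i], K)
              for i in range(N)}
table = {}  # memoized random fitness contributions

def contribution(i, genotype):
    # A trait's contribution depends on its own state plus its K inputs.
    key = (i, genotype[i]) + tuple(genotype[j] for j in neighbours[i])
    if key not in table:
        table[key] = random.random()
    return table[key]

def fitness(genotype):
    # Overall fitness: the average contribution across the N traits.
    return sum(contribution(i, genotype) for i in range(N)) / N

# The fitness landscape: one fitness value for each of the 2**N genotypes.
landscape = {g: fitness(g) for g in product((0, 1), repeat=N)}
best = max(landscape, key=landscape.get)
print(len(landscape))  # 32 genotypes in the landscape
```

Raising K couples the traits more tightly, making the landscape more "rugged" — more local peaks, and harder for evolution to climb — which is the central lesson of the model.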
See: Adaptation; Fitness Landscape; Self-organization
Bibliography: Kauffman (1995); Kauffman and Macready (1995)[/fusion_content_box][fusion_content_box title=”Nonlinear System (Nonlinearity)” animation_direction=”left” animation_speed=”0.3″]
Technically, any system where the data points representing values of its variables can be represented as a curvilinear pattern on a coordinate plane, hence, “nonlinear” for not-a-line. That is, the system’s dynamics are more appropriately represented by nonlinear and not linear functions. More generally, a system in which small changes can result in large effects, and large changes in small effects. Thus, sensitive dependence on initial conditions (the butterfly effect) in chaotic systems illustrates the extreme nonlinearity of these systems. In a nonlinear system the components are interactive, interdependent, and exhibit feedback effects. Complex adaptive systems are nonlinear systems.
Before the advent of chaos and complexity theories during the past thirty-five years, nonlinear functions were mostly relegated to appendices in textbooks because of their refractoriness to analytic solutions. But now, because of both advances in computational approaches to nonlinear functions and the recognition of the crucial role played by interactions in most systems of interest, it is increasingly recognized that examples of nonlinear systems are all but endless. Thus, human physiology is replete with nonlinear systems, such as the cardiac, circulatory, and immune systems. In social systems, too, nonlinearity seems to be the norm, since they are constituted by mutually reciprocal interactions among the members of social groupings, and such interactions are appropriately modeled by nonlinear rather than linear equations.
See: Linear; Sensitive Dependence on Initial Conditions
Bibliography: Eoyang and Olson (2001); Goldstein (1994); Guastello (1995); Scott (2005); West (1990); West (2006).
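The defining contrast — proportional versus disproportionate response — can be stated in two lines of code. The particular functions are illustrative assumptions, chosen only to make the contrast visible:

```python
def linear(x):
    # A linear response: doubling the input doubles the output.
    return 3 * x

def nonlinear(x):
    # A nonlinear response: doubling the input quadruples the output.
    return x ** 2

print(linear(2) / linear(1))        # 2.0 -- proportional
print(nonlinear(2) / nonlinear(1))  # 4.0 -- disproportionate
```

In a genuinely nonlinear system this disproportion compounds through feedback, which is how small perturbations can snowball into large effects.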
[/fusion_content_box][fusion_content_box title=”Novelty” animation_direction=”left” animation_speed=”0.3″]
One of the defining characteristics of emergent patterns is their novelty or innovative character. Indeed, that is why such phenomena are termed “emergent”: they introduce new qualities into the system that were not pre-existing in it. An example is the novel nature of the “dissipative structures” that arise in nonlinear systems at far-from-equilibrium conditions. This novelty is neither expected, predictable, nor deducible from the pre-existing components. Moreover, it is not reducible to the lower-level components without losing its essential characteristics. An issue, therefore, for practitioners working with complex systems is to determine which system processes are necessary for the emergence of novelty. That is, novel outcomes demand novel processes that prompt a system to produce novel structures and practices. In organizations, novel emergent outcomes are typically termed innovations. The study of the diffusion of innovations was pioneered by the late Everett Rogers (see the bibliography).
See: Bifurcation; Emergence; Far-from-equilibrium; Self-organization
Bibliography: Goldstein (2006); Rogers (2003); Van de Ven & Garud (1994).
[/fusion_content_box][/fusion_content_boxes][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container][fusion_builder_next_page][fusion_builder_container hundred_percent=”no” hundred_percent_height=”no” hundred_percent_height_scroll=”no” hundred_percent_height_center_content=”yes” equal_height_columns=”no” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” status=”published” background_position=”center center” background_repeat=”no-repeat” fade=”no” background_parallax=”none” enable_mobile=”no” parallax_speed=”0.3″ video_aspect_ratio=”16:9″ video_loop=”yes” video_mute=”yes” border_style=”solid” type=”legacy”][fusion_builder_row][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”false” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_global id=”4101″][/fusion_builder_column][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”false” min_height=”” link=””][fusion_title hide_on_mobile=”small-visibility,medium-visibility,large-visibility” content_align=”center” size=”1″ text_color=”#67c100″ style_type=”default”]
Glossary O – R
[/fusion_title][fusion_text rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Terms & definitions are organized alphabetically for easy access.
[/fusion_text][fusion_text columns=”2″ rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Order for Free
Parameters
Phase (State) Space
Phase Portrait
Positive Deviance
Power Law
Purpose Contrasting
Random Boolean Network
Redundancy
[/fusion_text][/fusion_builder_column][fusion_builder_column type=”1_1″ type=”1_1″ layout=”1_1″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_separator style_type=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” sep_color=”#8bc34a” top_margin=”20″ bottom_margin=”20″ alignment=”center” /][fusion_content_boxes layout=”icon-with-title” columns=”1″ heading_size=”2″ title_color=”#8bc34a” body_color=”#000000″ icon=”fa-bookmark fas” iconspin=”no” iconcolor=”#00bcd4″ icon_circle=”no” circlecolor=”#ffffff” hover_accent_color=”#ffffff” icon_align=”left” animation_direction=”left” animation_speed=”0.3″ hide_on_mobile=”small-visibility,medium-visibility,large-visibility”][fusion_content_box title=”Order for Free” animation_direction=”left” animation_speed=”0.3″]Stuart Kauffman’s term for the way the internal dynamics of a system generate order spontaneously under the right conditions. This order is “for free” in that it does not need to be imposed or imported from outside the system. It is Kauffman’s conjecture that natural selection during the course of evolution takes place on already self-organized order. An implication is that particular biological adaptations or forms may result from constraints on possible designs due to the inherent mathematical dynamics of a system. In terms of organizations, it may be the case that spontaneously emerging structures provide a crucial understanding of such systems’ functioning.
See: Emergence; Self-organization
Bibliography: Kauffman (1995)[/fusion_content_box][fusion_content_box title=”Parameters” icon=”fa-bookmark fas” iconflip=”none” iconrotate=”none” iconspin=”no” linktext=”Read More” animation_direction=”left” animation_speed=”0.3″]
Quantities in the mathematical equations used to model a system’s behavior. Unlike the system’s variables, parameters are typically held fixed during any one run of a model, but changes in their values can alter the system’s behavior, sometimes dramatically.
Control Parameters: These parameters often model some kind of external influence on a system that facilitates a far-from-equilibrium condition or, in other words, expedites a bifurcation. An example is temperature in the Bénard system, which at a critical value prompts self-organization and the emergence of hexagonal convection cells when a liquid in a container is heated from the bottom.
Order Parameters: Parameters that represent some global emergent characteristic of a system as opposed to variables of lower level components. The shift to order parameters signifies recognition that emergent phenomena need to be investigated on their own terms.
Lambda Parameter: A parameter used by the computer scientist Chris Langton to get at the range where self-organization is most likely in cellular automata. As such the lambda parameter is a control parameter.
See: Bifurcation; Cellular Automata
Bibliography: Haken (1981); Langton (1986).
[/fusion_content_box][fusion_content_box title=”Phase (State) Space” icon=”fa-bookmark fas” iconflip=”none” iconrotate=”none” iconspin=”no” animation_direction=”left” animation_speed=”0.3″]
An abstract mathematical space used to display time series data from the measurements of a system. The dimensions of phase or state space correspond to the number of variables used to characterize the state of the system. For example, the phase space of a pendulum would consist of two dimensions: the velocity of the bob and its displacement from the vertical resting position. Phase space is very helpful for observing the patterns that result as systems evolve over time. Note that time is usually not one of the explicit dimensions of the phase space, a role that time does play in a straight graphical depiction of a time series.
See: Attractors; Chaos
Bibliography: Abraham, et al. (1991); Guastello (1995)
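The pendulum example can be traced numerically. In the sketch below (step size, initial angle, and the small-angle approximation are illustrative assumptions), the system's state at every instant is a single point (angle, velocity), and the trajectory of such points through the two-dimensional phase space is the pattern one would plot as a phase portrait:

```python
# A frictionless pendulum in the small-angle (harmonic) approximation.
# State = (angle, angular velocity): one point in a 2-D phase space.
theta, omega = 0.5, 0.0   # initial angle (radians) and angular velocity
dt, g_over_l = 0.001, 9.8 # time step and gravity/length ratio

trajectory = []
for _ in range(10000):
    alpha = -g_over_l * theta   # restoring acceleration
    omega += alpha * dt         # semi-implicit Euler step: velocity first,
    theta += omega * dt         # then position -- keeps the orbit bounded
    trajectory.append((theta, omega))

# Time never appears as an axis; it is implicit in the ordering of the
# points. For this system the portrait is (nearly) a closed loop, so the
# amplitude stays close to its initial value.
print(max(abs(t) for t, _ in trajectory))  # ~0.5
```

Plotting `trajectory` with angle on one axis and velocity on the other would show the closed-loop portrait characteristic of a periodic attractor-free conservative oscillation.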
[/fusion_content_box][fusion_content_box title=”Phase Portrait” animation_direction=”left” animation_speed=”0.3″]The geometrical patterns shown in phase space as a system evolves. These portraits may be attractors such as fixed point, periodic, and strange attractors. They can also include repellors (the opposite of attractors) and such interesting patterns as saddles (combination of attractor in one direction and repellor in another direction) and separatrices, or boundaries between two basins of attraction.
See: Attractors; Chaos
Bibliography: Abraham, et. al. (1991); Guastello (1995)[/fusion_content_box][fusion_content_box title=”Positive Deviance” animation_direction=”left” animation_speed=”0.3″]
Experiments or deviations from the norm in a social system that can lead to positive change. The phrase “positive deviance” is itself a kind of oxymoron, since it pairs the constructive term “positive” with the negative term “deviance,” a word carrying strong pejorative associations precisely because it pertains to deviations from the norm. For example, members of a society practicing “fringe behavior” are often called deviants. However, branding “deviance” with such derogatory connotations sets up a bias that protects the norm with a halo of righteousness while condemning deviations from the norm as degenerate. If this bias were strictly enforced, we would never have gained most of the great scientific and social advances of human history. All such major innovations and transformations, in one way or another, relied on radical departures from the norm. Both the Copernican and the American Revolutions are cases in point.
A social interventional method termed “Positive Deviance” developed by Jerry and Monique Sternin identifies novel experiments in complex social systems—deviations from the norm—and harnesses them to generate positive outcomes. “Radical” ideas from organizational outliers are reframed as solutions with the potential of bringing about significant social system change. According to Jerry Sternin, in many communities facing seemingly intractable problems, there are certain individuals or groups (positive deviants) with the same access to resources as other community members whose special practices, strategies or behaviors generate better results.
Bibliography: Sternin & Choo (2000); Sternin (2003).
[/fusion_content_box][fusion_content_box title=”Purpose Contrasting” animation_direction=”left” animation_speed=”0.3″]A method developed by the management/complexity theorist Jeffrey Goldstein which consists of a work group highlighting the discrepancy between their original purpose and the current activities or procedures being done. The purpose of purpose contrasting is similar to difference questioning: to increase the amount of information in a system and thereby facilitate self-organization. For example, in a bureaucratically organized business or institution, the bureaucracy itself becomes its own purpose, obscuring the original purpose of making a product or providing a service.
See: Difference Questioning; Information; Far-from-equilibrium
Bibliography: Goldstein (1994)[/fusion_content_box][fusion_content_box title=”Power Laws” animation_direction=”left” animation_speed=”0.3″]
A type of mathematical pattern in which the frequency of an occurrence of a given size is inversely proportional to some power (or exponent) of its size. For example, in the case of avalanches or earthquakes, large ones are fairly rare, smaller ones are much more frequent, and between these extremes are cascades of different sizes and frequencies that take place a moderate number of times. Power laws define the distribution of catastrophic events in self-organized critical systems. Systems with power-law distributions are marked by invariance with respect to scale and by universality, the latter term referring to remarkably similar dynamics across quite different systems. Power laws are associated with fractal-like patterns, since the pattern is self-similar with respect to scale. In this regard, power-law signatures have been discovered in heart inter-beat variability and are suspected in many other physiological phenomena.
See: Fractal; Scale-free Network; Self-organized Criticality; Sensitive Dependence on Initial Conditions
Bibliography: Bak (1996); Barabási (2002); Schroeder (1991); West (2006).
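Both the inverse-power relation and the scale invariance it implies can be checked in a few lines. The exponent value below is an illustrative assumption:

```python
# A power law: the frequency of events of size s falls off as s**(-a).
a = 2.0

def frequency(size):
    return size ** (-a)

# Scale invariance: doubling the size cuts the frequency by the same
# factor (2**a = 4) no matter where on the scale we start --
# small avalanches and large ones obey the same rule.
print(frequency(2) / frequency(4))             # 4.0
print(round(frequency(50) / frequency(100), 6))  # 4.0
```

That constant ratio across scales is exactly what "invariance with respect to scale" means, and it is why power-law data form a straight line on a log-log plot.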
[/fusion_content_box][fusion_content_box title=”Redundancy” animation_direction=”left” animation_speed=”0.3″]
The existence of repetitive patterns or structures. In an important sense, redundancy refers to order in a complex system, insofar as order is defined as the existence of structures that maintain themselves over time (i.e., they are stable). In information theory, redundancy refers to repetition in the patterns of messages in a communication channel. If a message contains such redundancies, it can be compressed. For example, if a message consists of a series of two hundred fifty 1s, it could be compressed into a command that effectively says “repeat 1 250 times” instead of writing out all two hundred fifty 1s. Self-organizing processes demand some element of redundancy, which can be considered a “fuel” for the processes leading to emergence. In other words, novel patterns come from a recombination of redundant patterns.
See: Information; Novelty
Bibliography: Campbell (1982); Poundstone (1985).
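The "repeat 1 250 times" compression mentioned above is run-length encoding, which takes only a few lines to implement (the function name and output format are illustrative choices):

```python
def run_length_encode(msg):
    # Compress runs of repeated symbols: "aaab" -> [("a", 3), ("b", 1)].
    out = []
    for ch in msg:
        if out and out[-1][0] == ch:
            out[-1] = (ch, out[-1][1] + 1)
        else:
            out.append((ch, 1))
    return out

message = "1" * 250              # two hundred fifty 1s
encoded = run_length_encode(message)
print(encoded)                   # [('1', 250)] -- "repeat 1 250 times"
print(len(message), len(str(encoded)))  # 250 12: the description is far shorter
```

A message with no redundancy — no repeated patterns at all — would gain nothing from such a scheme, which is the information-theoretic sense in which redundancy and compressibility go together.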
[/fusion_content_box][fusion_content_box title=”Random Boolean Networks” animation_direction=”left” animation_speed=”0.3″]Electronic arrays developed by the medical researcher and evolutionary biologist Stuart Kauffman. These arrays are used to study self-organizing processes and the emergence of new structures. It is from his study of these Boolean networks, plus borrowings from Sewall Wright’s idea of fitness landscapes and from work in solid-state physics on spin glasses, that Kauffman derived his N/K model of complex systems. The networks are random to the extent that input rules are set and changed at random, so as not to bias the system in the direction of specifically planned structures.
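A tiny random Boolean network can be simulated directly. The sketch below is a minimal illustration (network size, wiring, and the fixed seed are assumptions): each on/off node reads two randomly chosen inputs through a randomly chosen Boolean rule, and since the state space is finite, the dynamics must eventually fall onto a repeating cycle — a self-organized attractor that nobody designed in.

```python
import random

random.seed(0)
N, K = 6, 2  # six on/off nodes, each reading K random inputs

# Wire each node to K randomly chosen input nodes and give it a random
# Boolean rule (a lookup table over its two inputs' states).
inputs = [random.sample(range(N), K) for _ in range(N)]
rules = [{(a, b): random.randint(0, 1) for a in (0, 1) for b in (0, 1)}
         for _ in range(N)]

def step(state):
    # Synchronous update: every node applies its rule to its inputs.
    return tuple(rules[i][(state[inputs[i][0]], state[inputs[i][1]])]
                 for i in range(N))

# Iterate from a random start. With only 2**6 = 64 possible states the
# network must revisit a state, i.e. settle onto an attractor cycle.
state = tuple(random.randint(0, 1) for _ in range(N))
seen = []
while state not in seen:
    seen.append(state)
    state = step(state)
cycle_start = seen.index(state)
print(len(seen) - cycle_start)  # length of the attractor cycle
```

Kauffman's finding was that the number and length of such attractors depend strikingly on K: sparse wiring tends to yield short, orderly cycles, while dense wiring yields long, chaotic ones.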
See: Cellular Automata; Fitness Landscape; N/K Model; Parameters
Bibliography: Kauffman (1995)[/fusion_content_box][/fusion_content_boxes][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container][fusion_builder_next_page][fusion_builder_container hundred_percent=”no” hundred_percent_height=”no” hundred_percent_height_scroll=”no” hundred_percent_height_center_content=”yes” equal_height_columns=”no” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” status=”published” background_position=”center center” background_repeat=”no-repeat” fade=”no” background_parallax=”none” enable_mobile=”no” parallax_speed=”0.3″ video_aspect_ratio=”16:9″ video_loop=”yes” video_mute=”yes” border_style=”solid” type=”legacy”][fusion_builder_row][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”false” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_global id=”4101″][/fusion_builder_column][fusion_builder_column type=”1_2″ type=”1_2″ layout=”1_2″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”false” min_height=”” link=””][fusion_title hide_on_mobile=”small-visibility,medium-visibility,large-visibility” content_align=”center” size=”1″ text_color=”#67c100″ style_type=”default”]
Glossary S – W
[/fusion_title][fusion_text rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Terms & definitions are organized alphabetically for easy access.
[/fusion_text][fusion_text columns=”2″ rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]Scale (Scaling Law)
Self-organization
Self-organized Criticality (SOC)
Sensitive Dependence on Initial Conditions (SIC)
Shadow Organization
Small World Network
Stability
Swarmware and Clockware
Time Series
Turing Machine
Wicked Questions[/fusion_text][/fusion_builder_column][fusion_builder_column type=”1_1″ type=”1_1″ layout=”1_1″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_separator style_type=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” sep_color=”#8bc34a” top_margin=”20″ bottom_margin=”20″ alignment=”center” /][fusion_content_boxes layout=”icon-with-title” columns=”1″ title_size=”Schema” heading_size=”3″ title_color=”#8bc34a” body_color=”#000000″ icon=”fa-bookmark fas” iconspin=”no” iconcolor=”#00bcd4″ icon_circle=”no” circlecolor=”#ffffff” hover_accent_color=”#ffffff” icon_align=”left” animation_direction=”left” animation_speed=”0.3″ hide_on_mobile=”small-visibility,medium-visibility,large-visibility”][fusion_content_box title=”Scale” icon=”fa-bookmark fas” iconflip=”none” iconrotate=”none” iconspin=”no” linktext=”Read More” animation_direction=”left” animation_speed=”0.3″]
The level at which a system is observed. For example, one can observe the coast of England from a satellite, from a jet liner, from a low-flying plane, while walking along the coast, or while peering down at the sand and rocks of a cove beach. Each of these perspectives observes the actual coast of England at a different scale. Fractals are geometric patterns that are self-similar across different scales.
See: Fractal; Power Law
Bibliography: Kaye (1989); Schroeder (1991)
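The coastline example can be made concrete with a fractal curve: the measured length depends on the scale of measurement. The Python sketch below uses the Koch curve as a stand-in for a coastline; each deeper recursion level corresponds to a finer scale of observation, and the measured length grows by a factor of 4/3 each time:

```python
import math

def koch(p, q, depth):
    """Vertices of the Koch curve between points p and q at a given depth."""
    if depth == 0:
        return [p, q]
    (x0, y0), (x1, y1) = p, q
    dx, dy = (x1 - x0) / 3, (y1 - y0) / 3
    a = (x0 + dx, y0 + dy)                 # one third of the way along
    b = (x0 + 2 * dx, y0 + 2 * dy)         # two thirds of the way along
    # apex of the equilateral bump: the middle third rotated by 60 degrees
    peak = (a[0] + dx * math.cos(math.pi / 3) - dy * math.sin(math.pi / 3),
            a[1] + dx * math.sin(math.pi / 3) + dy * math.cos(math.pi / 3))
    points = []
    for s, e in [(p, a), (a, peak), (peak, b), (b, q)]:
        points += koch(s, e, depth - 1)[:-1]
    return points + [q]

def length(points):
    return sum(math.dist(u, v) for u, v in zip(points, points[1:]))

# Each extra level of detail multiplies the measured length by 4/3.
for depth in range(5):
    print(depth, round(length(koch((0, 0), (1, 0), depth)), 4))
```

This is the Richardson effect made famous by Mandelbrot: for a fractal coastline, finer rulers keep finding more wiggles, so the measured length never converges.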
[/fusion_content_box][fusion_content_box title=”Scale-free Network” icon=”fa-bookmark fas” iconflip=”none” iconrotate=”none” iconspin=”no” animation_direction=”left” animation_speed=”0.3″]
A type of network, studied by means of graph theory, in which a few nodes act as highly connected hubs while most nodes have a much lower degree of connectivity. Such networks are called “scale-free” because their structure and dynamics are independent of the system’s size as measured by the number of nodes: a scale-free network has the same features no matter how many nodes it contains. The degrees of connectivity in a scale-free network follow a power law distribution, and such networks may be more resilient to the random loss of connections than centralized hub-and-spoke networks, which cannot withstand the loss of their single hub (though scale-free networks remain vulnerable to the targeted removal of their most connected hubs).
See: Fractal; Scale; Graph Theory; Small World Network
Bibliography: Barabási (2002); Watts (1999).
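One simple generative mechanism for scale-free structure is Barabási’s preferential attachment: new nodes are more likely to link to already well-connected nodes, so hubs beget hubs. A hedged Python sketch (network size, links per node, and seed are arbitrary choices):

```python
import random

def barabasi_albert(n=500, m=2, seed=42):
    """Grow a network by preferential attachment: each new node links to m
    distinct existing nodes chosen with probability proportional to degree."""
    rng = random.Random(seed)
    edges = [(i, j) for i in range(m + 1) for j in range(i)]  # small connected core
    targets = [v for e in edges for v in e]   # each node appears once per edge end
    for new in range(m + 1, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets))   # degree-proportional sampling
        for t in chosen:
            edges.append((new, t))
            targets += [new, t]
    return edges

edges = barabasi_albert()
degree = {}
for u, v in edges:
    degree[u] = degree.get(u, 0) + 1
    degree[v] = degree.get(v, 0) + 1
hub = max(degree.values())
med = sorted(degree.values())[len(degree) // 2]
print("max degree:", hub, "median degree:", med)
```

Running this, the best-connected hub ends up with a degree many times the median, the heavy-tailed signature of a scale-free network.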
[/fusion_content_box][fusion_content_box title=”Self-organization” animation_direction=”left” animation_speed=”0.3″]
A process in a complex system whereby new emergent structures, patterns, and properties arise without being externally imposed on the system. Not controlled by a centralized, hierarchical “command and control” center, self-organization is usually distributed throughout a system. Self-organization requires a complex, nonlinear system under appropriate conditions, variously described as “far-from-equilibrium” or criticalization. Studied in physical systems by Ilya Prigogine and his followers, as well as the Synergetics School founded by Hermann Haken, self-organization is now studied primarily through computer simulations such as cellular automata, Boolean networks, and other phenomena of artificial life. Self-organization is recognized as a crucial way for understanding emergent, collective behavior in a large variety of systems including: the economy; the brain and nervous system; the immune system; ecosystems; and the modern large corporation or institution. The emergence of new system order via self-organization is thought to be a primary tendency of complex systems in contrast to the past emphasis on the degrading of order in association with the principle of entropy (second law of thermodynamics). In recent perspectives, rather than fighting against entropy, self-organization can be understood as a way that the total entropy of a complex system along with its environment(s) increases.
Now that we have a better scientific handle on how self-organization takes place, it is easier to recognize instances of it in the world around us. For example, self-organization could be an appropriate way of understanding how a hospital staff may spontaneously re-organize itself to respond more effectively to a sudden influx of critically ill patients. This seems to be what happened at Beekman Downtown Hospital in Manhattan during the tragedy of 9-11-2001, when the staff coalesced into novel treatment teams to handle the tremendous inflow of seriously wounded victims. Self-organization may also take place in innumerable other ways: for example, the change in family dynamics that results when a family member enters a hospice program, or the emergence of novel ways of caring for a seriously ill patient with multiple chronic diseases that arises from interactions among the patient, nurses, physicians, other healthcare professionals, support staff, and family members.
See: Coherence; Dissipative Structures; Emergence; Far-from-equilibrium
Bibliography: Eoyang & Olson (2001); Goldstein (1994); Nicolis (1989); Nicolis & Prigogine (1989).
[/fusion_content_box][fusion_content_box title=”Self-organized Criticality” animation_direction=”left” animation_speed=”0.3″]
Formulated by the late physicist Per Bak, a phenomenon of sudden change in physical systems in which they evolve naturally to a critical state at which abrupt changes can occur. When these systems are not in the critical state (i.e., when they are stable), output follows from input in a linear fashion; but in the critical state, systems characterized by self-organized criticality act like nonlinear amplifiers, similar to, though not as extreme as, the exponential divergence in chaos due to sensitive dependence on initial conditions. That is, the nonlinear amplification in a self-organized critical system follows a power law instead of an exponential law. Such systems are self-organized in the sense that they reach the critical state on their own. Examples of such systems include avalanches, plate tectonics leading to earthquakes, and stock markets leading to crashes. Because these systems follow power laws, and because fractals show a similar mathematical pattern, it may be that many naturally occurring fractals, such as tree growth, the structure of the lungs, and so on, are generated by some form of self-organized criticality.
See: Bifurcation; Catastrophe; Instability; Power Law; Self-organization
Bibliography: Bak (1996).
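Bak’s canonical illustration is the sandpile: grains are dropped one at a time, and a cell holding too many grains topples onto its neighbours, sometimes triggering chain-reaction avalanches of all sizes. A compact Python sketch of the Bak–Tang–Wiesenfeld sandpile (grid size, grain count, and seed are arbitrary choices):

```python
import random

def sandpile(size=11, grains=5000, seed=0):
    """Drop grains at random cells; a cell holding 4 or more grains topples,
    sending one grain to each neighbour (grains fall off the edge).
    Avalanche size = number of topplings caused by a single dropped grain."""
    rng = random.Random(seed)
    grid = [[0] * size for _ in range(size)]
    sizes = []
    for _ in range(grains):
        r, c = rng.randrange(size), rng.randrange(size)
        grid[r][c] += 1
        topples = 0
        unstable = [(r, c)] if grid[r][c] >= 4 else []
        while unstable:
            i, j = unstable.pop()
            if grid[i][j] < 4:
                continue
            grid[i][j] -= 4
            topples += 1
            for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ni, nj = i + di, j + dj
                if 0 <= ni < size and 0 <= nj < size:
                    grid[ni][nj] += 1
                    if grid[ni][nj] >= 4:
                        unstable.append((ni, nj))
            if grid[i][j] >= 4:
                unstable.append((i, j))
        sizes.append(topples)
    return sizes

sizes = sandpile()
print("largest avalanche:", max(sizes),
      "share with no toppling:", round(sizes.count(0) / len(sizes), 2))
```

After an initial loading period, the pile hovers at its critical state: most grains cause nothing, but occasionally one triggers a system-spanning avalanche, and the avalanche sizes are distributed roughly as a power law.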
[/fusion_content_box][fusion_content_box title=”Sensitive Dependence on Initial Conditions” animation_direction=”left” animation_speed=”0.3″]
The property of chaotic systems in which a small change in initial conditions can have a hugely disproportionate effect on outcome. Sensitive dependence on initial conditions is popularly captured by the image of the butterfly effect. It makes the behavior of chaotic systems largely unpredictable, because measurements of initial conditions always contain some amount of error. The late mathematical meteorologist Edward Lorenz uncovered this concept in his work on weather forecasting: he noticed that a seemingly insignificant difference in an initial parameter of a forecasting model running on his computer led to very different forecasts.
See: Chaos; The Butterfly Effect
Bibliography: Lorenz (1993); Ott (2003).
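The effect is easy to demonstrate with the logistic map, a standard example of a chaotic system. In the sketch below (starting value, offset, and thresholds are illustrative choices), two trajectories that begin one part in a billion apart become macroscopically different within a few dozen iterations:

```python
def divergence_time(x0, delta=1e-9, r=4.0, threshold=0.1, max_steps=100):
    """Iterate two copies of the logistic map x -> r*x*(1-x) from starting
    points delta apart; return the step at which they visibly separate."""
    a, b = x0, x0 + delta
    for step in range(max_steps):
        if abs(a - b) > threshold:
            return step
        a, b = r * a * (1 - a), r * b * (1 - b)
    return None

# A billionth of a difference at the start becomes a macroscopic difference.
print("diverged after", divergence_time(0.4), "steps")
```

The separation grows roughly exponentially (the map’s Lyapunov exponent is positive), which is why long-range prediction fails no matter how precisely the initial condition is measured.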
[/fusion_content_box][fusion_content_box title=”Shadow Organization” animation_direction=”left” animation_speed=”0.3″]The management/complexity theorist Ralph Stacey’s term for the set of informal relationships or networks among people in an organization, which exists in tandem with the official and “legitimate” network or hierarchy. The shadow organization is not focused on the same stabilizing objective as the official organization, so it is ripe ground for the instability required for self-organization and the emergence of more adaptable organizational structures and processes. Effective leaders take into consideration both the mainstream and the shadow systems, even capitalizing, according to Stacey, on the potential friction between them.
See: Edge of Chaos; Far-from-equilibrium
Bibliography: Stacey (1996)[/fusion_content_box][fusion_content_box title=”Small World Network” animation_direction=”left” animation_speed=”0.3″]
A type of network in which the pattern of connectivity yields surprisingly short pathways linking any two of its many nodes. The small world phenomenon was made famous by the play (and film) “Six Degrees of Separation” and by the “Kevin Bacon number,” the idea that any actor can be linked through his or her film roles to the actor Kevin Bacon in just a few steps. Experiments and mathematical models suggest that nearly everyone on the planet is linked to everyone else by a remarkably small number of intermediaries, on the order of six, within the network comprising all the relationships between people.
See: Graph Theory; Scale-free Network
Bibliography: Barabási (2002); Watts (1999).
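The path-shortening at the heart of the small world effect can be illustrated, in the spirit of the Watts–Strogatz model, by adding a few random long-range shortcuts to an ordered ring of acquaintances. A hedged Python sketch (ring size, neighbourhood width, number of shortcuts, and seed are arbitrary choices):

```python
import random
from collections import deque

def avg_path_length(adj):
    """Mean shortest-path length over all node pairs (BFS from each node)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        queue = deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def ring_lattice(n=100, k=2):
    """A ring where each node is linked to its k nearest neighbours per side."""
    adj = {i: set() for i in range(n)}
    for i in range(n):
        for d in range(1, k + 1):
            adj[i].add((i + d) % n)
            adj[(i + d) % n].add(i)
    return adj

random.seed(3)
ring = ring_lattice()
before = avg_path_length(ring)
for _ in range(10):                      # a handful of random long-range shortcuts
    u, v = random.sample(range(100), 2)
    ring[u].add(v)
    ring[v].add(u)
after = avg_path_length(ring)
print(round(before, 2), "->", round(after, 2))
```

Only ten shortcuts among a hundred nodes noticeably shrink the average separation, which is why a few long-range acquaintances make the social world feel small.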
[/fusion_content_box][fusion_content_box title=”Stability” animation_direction=”left” animation_speed=”0.3″]
The opposite of “instability”: the property of a system that returns to much the same state after being disturbed by internal or external forces or events. For example, the deeper the keel of a sailboat, the more stable the boat is against wind and currents; a spinning gyroscope is stable with respect to perturbations of the orientation of its axis of rotation. “Stability” is sometimes used as a synonym for equilibrium, or for the state of a system circumscribed within a particular attractor regime.
See: Equilibrium; Far-from-Equilibrium; Instability
Bibliography: Nicolis (1989); Nicolis & Prigogine (1989); Ott (2003).
[/fusion_content_box][fusion_content_box title=”Swarmware and Clockware” animation_direction=”left” animation_speed=”0.3″]
Two terms coined by Wired magazine editor Kevin Kelly for two antithetical management processes. “Clockware” refers to rational, standardized, controlled, measured processes, whereas “swarmware” refers to processes involving experimentation, trial and error, and risk-taking. Clockware processes are seen in linear systems, whereas swarmware is what happens in complex systems undergoing self-organization as a result of the nonlinear interaction among components.
See: Cellular Automata; Complex Adaptive System; Self-organization
Bibliography: Kelly (1994).
[/fusion_content_box][fusion_content_box title=”Symbiogenesis” animation_direction=”left” animation_speed=”0.3″]
A theory about the emergence of new biological forms, put forward by Lynn Margulis, which posits that cooperation or symbiosis between two or more distinct types of organisms can lead to the emergence of radically novel types of organisms. It is believed that primitive prokaryotic cells incorporated aerobic bacteria they had ingested and that, out of this symbiotic relationship, the more advanced eukaryotic cells resulted, with novel features such as nuclei, internal membranes, and organelles. Symbiogenesis represents a new interpretation of evolution whereby mechanisms besides variation and selection may be at work. It also reflects a growing recognition of the importance of cooperative relationships among species, in contrast to the more typical emphasis on competition and predator-prey relationships.
See: Co-evolution; Emergence
Bibliography: De Duve (2005); Margulis & Sagan (2002); Reid (2007)
[/fusion_content_box][fusion_content_box title=”Synchronization” animation_direction=”left” animation_speed=”0.3″]
A phenomenon that can occur in complex systems in which system components or agents align themselves in a startling coherence. A striking example is the dramatic synchronization of flashing in certain species of fireflies (what we used to call “lightning bugs” as children). This can be seen inside Great Smoky Mountains National Park near Elkmont, Tennessee, during mid-June at about 10 PM, when thousands of fireflies flash together in a highly synchronized pattern: after six seconds of total darkness, thousands of lights flash in perfect synchrony six times in three short seconds; the pattern then repeats over and over. A similar synchronization of flashing among fireflies can be observed in parts of Thailand. Research has shown that synchronization takes place without any “leader” firefly. Instead, synchronization develops out of the interaction among the fireflies: under the right conditions, the insects’ signals reinforce one another until they pulse in concert. In human systems, synchronization is evident at sporting events when fans in a stadium combine their movements into the famous “wave.”
A destructive kind of synchronization was responsible for the collapse of the Tacoma Narrows Bridge on November 7, 1940. A confluence of high winds and the structural characteristics of the bridge produced self-reinforcing oscillations whose resonance ultimately tore the structure apart.
See: Coherence
Bibliography: Strogatz (2003)
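Firefly-style synchronization is commonly modeled with the Kuramoto model of coupled oscillators, analyzed at length by Strogatz. In the illustrative Python sketch below (oscillator count, coupling strength, step size, and seed are arbitrary choices), each oscillator’s phase is nudged toward the others, and the order parameter r measures coherence (0 = incoherent, 1 = perfect synchrony):

```python
import math
import random

def order_parameter(phases):
    """r = 1 means all phases aligned; r near 0 means incoherence."""
    n = len(phases)
    c = sum(math.cos(p) for p in phases) / n
    s = sum(math.sin(p) for p in phases) / n
    return math.hypot(c, s)

def kuramoto(n=50, coupling=2.0, dt=0.05, steps=400, seed=7):
    """Euler-integrate dtheta_i/dt = omega_i + (K/n) * sum_j sin(theta_j - theta_i).
    With coupling strong relative to the frequency spread, the group entrains."""
    rng = random.Random(seed)
    omega = [rng.gauss(0, 0.1) for _ in range(n)]            # natural frequencies
    theta = [rng.uniform(0, 2 * math.pi) for _ in range(n)]  # random initial phases
    r_start = order_parameter(theta)
    for _ in range(steps):
        theta = [
            theta[i] + dt * (omega[i] + coupling / n *
                             sum(math.sin(theta[j] - theta[i]) for j in range(n)))
            for i in range(n)
        ]
    return r_start, order_parameter(theta)

r_start, r_end = kuramoto()
print(round(r_start, 2), "->", round(r_end, 2))
```

No oscillator leads: coherence emerges purely from the pairwise pulls, mirroring the leaderless synchronization observed in firefly populations.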
[/fusion_content_box][fusion_content_box title=”Time Series” animation_direction=”left” animation_speed=”0.3″]
A collection of measurements of the variable(s) of a system as it evolves over time. Traditionally, time series data were graphed with time on the x-axis and some system variable on the y-axis. For example, the time series of an oscillating (periodic) system such as a forced pendulum or a metronome would show a curve depicting the speed of the pendulum bob rising and falling like hills and valleys over time. As a result of dynamical systems theory, however, time series are now usually graphed in phase (or state) space, with two or more variables marking the dimensions, or with one variable mapped against a time-lagged version of itself. By graphing time series data in phase space, attractors can be identified more easily. Our ability to graph such time series and to determine their attractors has been greatly accelerated by the rise of the personal computer.
See: Attractor; Phase Space
Bibliography: Guastello (1995); Ott, Sauer, & Yorke (1994)
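The time-lag construction mentioned above, usually called delay embedding, is a few lines of Python; the sine wave, lag, and dimension below are illustrative choices:

```python
import math

def delay_embed(series, lag=1, dim=2):
    """Turn a scalar time series into phase-space points
    (x[t], x[t-lag], ..., x[t-(dim-1)*lag])."""
    start = (dim - 1) * lag
    return [tuple(series[t - k * lag] for k in range(dim))
            for t in range(start, len(series))]

# A noiseless sine wave embeds as a closed loop: the signature of a
# periodic (limit-cycle) attractor.
wave = [math.sin(0.1 * t) for t in range(200)]
points = delay_embed(wave, lag=15, dim=2)
print(len(points), points[0])
```

Plotting the resulting points reveals the attractor’s geometry: a periodic signal traces a closed loop, while a chaotic one fills out a strange attractor.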
[/fusion_content_box][fusion_content_box title=”Turing Machine” animation_direction=”left” animation_speed=”0.3″]
A hypothetical “universal” computer envisioned by the great English mathematician and founding figure of modern computer science, Alan Turing (who, incidentally, helped the British break the Germans’ famous “Enigma” code during World War II). Turing used this concept of a universal computer to prove that there are some well-defined mathematical problems that cannot be solved by any “mechanical” procedure (or algorithm) carried out on a computer; that is, certain mathematical problems are not computable.
See: Algorithm; Church-Turing Thesis
Bibliography: Goertzel (1993); Penrose (1989); Sulis in Robertson and Combs (1995)
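A Turing machine itself is simple enough to emulate in a few lines. The sketch below uses one common convention (a state/symbol rule table, not Turing’s original notation) and runs a tiny machine that increments a binary number:

```python
def run_turing_machine(rules, tape, state="start", blank=" ", max_steps=1000):
    """A minimal Turing machine: rules maps (state, symbol) to
    (new_state, symbol_to_write, move), where move is -1 or +1.
    The machine halts when the current (state, symbol) has no rule."""
    tape = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        symbol = tape.get(head, blank)
        if (state, symbol) not in rules:
            break
        state, write, move = rules[(state, symbol)]
        tape[head] = write
        head += move
    lo, hi = min(tape), max(tape)
    return "".join(tape.get(i, blank) for i in range(lo, hi + 1)).strip()

# Binary increment: walk to the rightmost digit, then carry 1s into a 0.
rules = {
    ("start", "0"): ("start", "0", +1),
    ("start", "1"): ("start", "1", +1),
    ("start", " "): ("carry", " ", -1),
    ("carry", "1"): ("carry", "0", -1),
    ("carry", "0"): ("done", "1", -1),
    ("carry", " "): ("done", "1", -1),
}
print(run_turing_machine(rules, "1011"))  # prints 1100 (11 + 1 = 12)
```

The point of Turing’s construction is that one fixed machine like the interpreter above can, given a suitable rule table, simulate any other; his uncomputability results then show that no rule table can solve certain well-defined problems, such as the halting problem.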
[/fusion_content_box][fusion_content_box title=”Wicked Questions” animation_direction=”left” animation_speed=”0.3″]
The management/complexity theorist Brenda Zimmerman’s term for the kind of hard-hitting challenges to which managers need to subject their plans and organizing schemes. Wicked questions serve to dislodge self-fulfilling prophecies, open the ground for new experimental possibilities, and increase information in a system, thereby facilitating far-from-equilibrium conditions and self-organization.
See: Difference Questioning; Information; Purpose Contrasting
Bibliography: Zimmerman, Lindberg, & Plsek (2001)
[/fusion_content_box][/fusion_content_boxes][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container][fusion_builder_next_page][fusion_builder_container hundred_percent=”no” hundred_percent_height=”no” hundred_percent_height_scroll=”no” hundred_percent_height_center_content=”yes” equal_height_columns=”no” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” status=”published” background_position=”center center” background_repeat=”no-repeat” fade=”no” background_parallax=”none” enable_mobile=”no” parallax_speed=”0.3″ video_aspect_ratio=”16:9″ video_loop=”yes” video_mute=”yes” border_style=”solid” flex_column_spacing=”0px” type=”legacy”][fusion_builder_row][fusion_builder_column type=”1_1″ type=”1_1″ layout=”1_1″ center_content=”no” target=”_self” hide_on_mobile=”small-visibility,medium-visibility,large-visibility” background_position=”left top” background_repeat=”no-repeat” hover_type=”none” border_style=”solid” border_position=”all” animation_direction=”left” animation_speed=”0.3″ last=”true” border_sizes_top=”0px” border_sizes_bottom=”0px” border_sizes_left=”0px” border_sizes_right=”0px” first=”true” min_height=”” link=””][fusion_title hide_on_mobile=”small-visibility,medium-visibility,large-visibility” content_align=”center” size=”1″ text_color=”#67c100″ style_type=”default”]
References
[/fusion_title][fusion_text columns=”2″ rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Abraham, R. (1982). Dynamics: The Geometry of Behavior (Four Volumes). Santa Cruz, CA: Aerial Press.
Allman, W. F. (1989). Apprentices of Wonder: Inside the Neural Network Revolution. NY: Bantam Books.
Anderson, P. (1972). More is different: Broken symmetry and the nature of the hierarchical structure of science. Science, 177 (4047), 393-396.
Anderson, Ruth A.; Issel, L. Michele; McDaniel , Reuben R. Jr. (2003). Nursing Homes as Complex Adaptive Systems: Relationship Between Management Practice and Resident Outcomes. Nursing Research. 52(1):12-21.
Axelrod, R. (1984). The Evolution of Cooperation. New York, NY: Basic Books.
Axelrod, R. & Cohen, M. (2000). Harnessing Complexity: Organizational Implications of a Scientific Frontier. NY: Basic Books.
Bak, P. (1996). How Nature Works: The Science of Self-organized Criticality. NY: Springer-Verlag
Barabási, A. L. (2002). Linked: The New Science of Networks. Cambridge, MA: Perseus.
Campbell, J. (1982). Grammatical Man: Information Theory, Entropy, Languages, and Life. NY: Simon and Schuster.
Carroll, S. (2005). Endless Forms Most Beautiful: The New Science of Evo Devo. NY: W. W. Norton & Co.
Chandler, J. & Van de Vijver, G. (2000). Closure: Emergent Organizations and their Dynamics. NY: The New York Academy of Sciences.
Darling, D. (2005). Teleportation: The Impossible Leap. Hoboken, NJ: John Wiley and Sons.
De Duve, C. (2005). Singularities: Landmarks on the Pathways of Life. NY: Cambridge University Press.
Dyson, G. (1998). Darwin among the Machines: The Evolution of Global Intelligence. NY: Basic Books.
Emergence: Complexity and Organization. (2004 to present: quarterly journal, also available as yearly volumes, edited by K. Richardson et al.)
Eoyang & Olson (2001). Facilitating Organizational Change: Lessons from Complexity Science. San Francisco, CA: Jossey-Bass/Pfeiffer.
Epstein, J. (2007). Generative Social Science: Studies in Agent-Based Computational Modeling (Princeton Studies in Complexity): Princeton, NJ: Princeton University Press.
Field, M. & Golubitsky, M. (1996). Symmetry in Chaos: A Search for Pattern in Mathematics, Art, and Nature. NY: Oxford University Press.
Goldberger, A. (1996). Nonlinear dynamics for clinicians: chaos theory, fractals and complexity at the bedside, The Lancet 347, 1312-1314.
Goldstein, J. (1994). The Unshackled Organization. Portland, Oregon: Productivity Press.
Goldstein, J. (1999). Emergence as a Construct: History and Issues, Emergence, 1(1): 49-62.
Goldstein, J. (2006). Emergence, Creative Process, and Self-transcending Constructions,” In K. Richardson (Ed.), Managing Organizational Complexity: Philosophy, Theory, and Application, pp. 63-78. Greenwich, CT: Information Age Press.
Goldstein, J. (2007). A New Model for Emergence and its Leadership Implications, J. Hazy, J. Goldstein, B. Lichtenstein (Eds.), Complex Systems Leadership Theory: New Perspectives from Complexity Science on Social and Organizational Effectiveness, pp. 62-93. Mansfield, MA: ISCE Publishing.
Guastello, S. (1995). Chaos, Catastrophe, and Human Affairs: Applications of Nonlinear Dynamics To Work, Organizations, and Social Evolution. Mahwah, NJ: Lawrence Erlbaum Associates.
Guastello, S. (2001). Managing Emergent Phenomena: Nonlinear Dynamics in Work Organizations. Mahwah, NJ: Lawrence Erlbaum Associates.
Griffeath, D. & Moore, C. (2003). New Constructions in Cellular Automata. (Santa Fe Institute Studies in the Sciences of Complexity. NY: Oxford University Press.
Haken, H. (1981). The Science of Structure: Synergetics. NY: Van Nostrand Reinhold.
Hazy, J., Goldstein, J., & Lichtenstein, B. (2007). Complex Systems Leadership Theory: New Perspectives from Complexity Science on Social and Organizational Effectiveness. Mansfield, MA: ISCE Publishing.
Holland, John. (1994). Hidden order: How adaptation builds complexity. Reading, MA: Addison-Wesley.
Holland, J. (1998). Emergence: From Chaos to Order. Reading, MA: Addison-Wesley.
Kauffman, S. (1993), The Origins of Order: Self-organization and Selection in Evolution. NY: Oxford University Press.
Kauffman, S. (1995), At Home in the Universe: The Search for Laws of Self-organization and Complexity. NY: Oxford University Press.
Kelly, K. (1994). Out of Control: The New Biology of Machines, Social Systems, and the Economic World. Reading, MA: Addison-Wesley.
Kilduff, M. & Tsai, W. (2003). Social Networks and Organizations. Thousand Oaks, CA: Sage.
Langton, C. G. (1986). Studying Artificial Life with Cellular Automata. In D. Farmer, A. Lapedes, N. Packard, & B. Wendroff (Eds.), Evolution, Games, and Learning: Models for Adaptation in Machines and Nature (Proceedings of the Fifth Annual Conference of the Center for Nonlinear Studies, Los Alamos, NM, May 20-24, 1985), pp. 120-149. Amsterdam: North-Holland.
Laughlin, R. (2006) A Different Universe: Reinventing Physics from the Bottom Down. NY: Basic Books.
Luhmann, N., Bednarz, J., & Baecker, D. (1996). Social Systems. Palo Alto, CA: Stanford University Press.
Mandelbrot, B.B. (1977). Fractals, Form, Chance and Dimension, W.H. Freeman and Co., San Francisco.
Mandelbrot, B.B. (1982). The Fractal Geometry of Nature, W.H. Freeman and Co., San Francisco.
Margulis, L., & Sagan, D. (2002). Acquiring Genomes: A Theory of the Origins of Species. NY: Basic Books.
Maturana, H. & Varela, F. (1980). Autopoiesis and Cognition. Boston: D. Reidel.
Mead, G. H. (2002). The Philosophy of the Present. Prometheus Books.
Morgan, G. (1997). Images of Organization. Thousand Oaks, CA: Sage.
Newman, M., Barabasi, A., Watts, D. (Eds.). (2006). The Structure and Dynamics of Networks. Princeton, NJ: Princeton University Press.
Nicolis, G. (1989). Physics of Far-from-equilibrium Systems, In P. Davies (Ed.) The New Physics. Cambridge, England: Cambridge University Press.
Nicolis, G. and Prigogine, I. (1989). Exploring complexity: An introduction. NY: W. H. Freeman and Company.
Ott, E. (2003). Chaos in Dynamical Systems. NY: Cambridge University Press.
Ott, E., Sauer, T., Yorke, J. (1994). Coping with Chaos: Analysis of Chaotic Data and The Exploitation of Chaotic Systems. Somerset, NJ: Wiley-Interscience.
Page, S. (2007). The Difference: How the Power of Diversity Creates Better Groups, Firms, Schools, and Societies. Princeton, NJ: Princeton University Press.
Poundstone, W. (1985). The Recursive Universe: Cosmic Complexity and the Limits of Scientific Knowledge. Chicago: Contemporary Books.
Reid, R. G. B. (2007). Biological Emergences: Evolution by Natural Experiment. Cambridge, MA: MIT Press.
Richardson, K. & Goldstein, J. (Eds.). (2007). Classic Complexity: From the Abstract to the Concrete (Exploring Complexity, Volume 2). Mansfield, MA: ISCE Publishing.
Rogers, E. (2003). Diffusion of Innovation (5th Edition). NY: Free Press.
Scott, A., (Ed). (2005). Encyclopedia of Nonlinear Science. NY: Routledge.
Schroeder, M. (1991). Fractals, Chaos, Power Laws: Minutes from an Infinite Paradise. NY: W. H. Freeman & Co.
Sole, R. & Goodwin, B. (2000). Signs of Life: How Complexity Pervades Biology. NY: Basic Books.
Stacey, R. (2001). Complex Responsive Processes in Organizations: Learning and Knowledge Creation (Complexity and Emergence in Organizations). London, UK: Routledge.
Stacey, R. (2007). Strategic Management and Organisational Dynamics: The Challenge of Complexity to Ways of Thinking About Organisations, 5th Ed., London: Pearson Education.
Sternin, J. & Choo, R. (2000). The Power of Positive Deviance. Harvard Business Review. January – February 2000, 14, 15. (available at http://www.positivedeviance.org/)
Sternin, J. (2003). Positive Deviance for Extraordinary Social and Organizational Change. In D. Ulrich, M. Goldsmith, L. Carter, J. Bolt, N. Smallwood (Eds.). The Change Champion’s Fieldguide. 20-37. NY: Best Practice Publications LLC.
Strogatz, S. (2003). Sync: The Emerging Science of Spontaneous Order. NY: Hyperion.
Trudeau, R. (1993). Introduction to Graph Theory. NY: Dover Publications.
Van de Ven, A. & Garud, R. (1994). The Coevolution of Technical and Institutional Events in the Development of an Innovation. In J. Baum & J. Singh (Eds.), Evolutionary Dynamics of Organizations. pp. 425-443. NY: Oxford University Press.
Watts, D. (1999). Small Worlds: The Dynamics of Networks between Order and Randomness. Princeton, NJ: Princeton University Press.
Depew, D. J., & Weber, B. H. (1995). Darwinism Evolving: Systems Dynamics and the Genealogy of Natural Selection. Cambridge, MA: MIT Press.
West, B. J. (1990). Fractal Physiology and Chaos in Medicine (Studies of Nonlinear Phenomena in Life Science, Vol.1), Singapore: World Scientific.
West, B. J. (2006). Where Medicine Went Wrong: Rediscovering the Path to Complexity. Singapore: World Scientific.
Zimmerman, B., Lindberg, C., Plsek, P. (2001). Edgeware: Insights from Complexity Science for Health Care Leaders. VHA.
[/fusion_text][fusion_text rule_style=”default” hide_on_mobile=”small-visibility,medium-visibility,large-visibility”]
Bibliographical References to Glossary
(not included elsewhere)
- Bennett, C. (1982). The Thermodynamics of Computation: A Review. International Journal of Theoretical Physics, 21: 905-940.
- Chaitin, G. (1987). Algorithmic Information Theory. NY: Cambridge University Press.
- Maturana, H. & Varela, F. (1992). The Tree of Knowledge. Boston: Shambhala.
- Penrose, R. (1989). The Emperor’s New Mind. NY: Oxford University Press.
- Senge, P. (1990). The Fifth Discipline. NY: Doubleday.
[/fusion_text][/fusion_builder_column][/fusion_builder_row][/fusion_builder_container][fusion_builder_next_page]