Complex adaptive systems
Complex adaptive systems are special cases of complex systems, often defined as a 'complex macroscopic collection' of relatively 'similar and partially connected micro-structures' – formed in order to adapt to the changing environment and to increase its survivability as a macro-structure.
- Quotes are arranged alphabetically by author
A - F
- The inductive-reasoning system I have described above consists of a multitude of “elements” in the form of belief-models or hypotheses that adapt to the aggregate environment they jointly create. Thus it qualifies as an adaptive complex system. After some initial learning time, the hypotheses or mental models in use are mutually co-adapted. Thus we can think of a consistent set of mental models as a set of hypotheses that work well with each other under some criterion—that have a high degree of mutual adaptedness. Sometimes there is a unique such set, it corresponds to a standard rational expectations equilibrium, and beliefs gravitate into it. More often there is a high, possibly very high, multiplicity of such sets. In this case we might expect inductive reasoning systems in the economy—whether in stock-market speculating, in negotiating, in poker games, in oligopoly pricing, in positioning products in the market—to cycle through or temporarily lock into psychological patterns that may be non-recurrent, path-dependent, and increasingly complicated. The possibilities are rich.
- W. Brian Arthur (1994). "Inductive Reasoning and Bounded Rationality (The El Farol Problem)", American Economic Review (Papers and Proceedings), Vol. 84, p. 406.
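Arthur's El Farol setting can be rendered as a toy simulation. The sketch below is illustrative only (agent counts, predictor forms, and scoring are assumptions, not Arthur's original model): each agent holds several belief-models predicting next week's bar attendance, acts on whichever has scored best so far, and re-scores all of them against the realized crowd, so the hypotheses co-adapt to the aggregate they jointly create.

```python
import random

N_AGENTS = 100
CAPACITY = 60  # the bar is enjoyable only if fewer than 60 agents attend
HISTORY = [44, 78, 56, 15, 23, 67, 84, 34, 45, 76]  # seed attendance history

def make_predictor():
    """A hypothetical belief-model: a random weighted sum over recent weeks."""
    weights = [random.uniform(-1, 1) for _ in range(4)]
    bias = random.uniform(0, 100)
    def predict(history):
        raw = bias + sum(w * h for w, h in zip(weights, history[-4:]))
        return min(max(raw, 0), N_AGENTS)  # clamp to a feasible attendance
    return predict

class Agent:
    def __init__(self, n_predictors=6):
        self.predictors = [make_predictor() for _ in range(n_predictors)]
        self.scores = [0.0] * n_predictors
    def decide(self, history):
        # act on the currently best-scoring hypothesis
        best = max(range(len(self.predictors)), key=lambda i: self.scores[i])
        return self.predictors[best](history) < CAPACITY  # go if predicted uncrowded
    def update(self, history, actual):
        # penalize every hypothesis by its forecast error on the realized crowd
        for i, p in enumerate(self.predictors):
            self.scores[i] -= abs(p(history) - actual)

random.seed(1)
agents = [Agent() for _ in range(N_AGENTS)]
for week in range(50):
    prior = list(HISTORY)
    attendance = sum(a.decide(prior) for a in agents)
    for a in agents:
        a.update(prior, attendance)
    HISTORY.append(attendance)
print("last 10 weeks of attendance:", HISTORY[-10:])
```

In runs of this kind, attendance tends to fluctuate around the comfort threshold even though no agent knows the others' models, which is the path-dependent, mutually adapted behavior the quotation describes.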
- A more viable model, one much more faithful to the kind of system that society is more and more recognized to be, is in process of developing out of, or is in keeping with, the modern systems perspective (which we use loosely here to refer to general systems research, cybernetics, information and communication theory, and related fields). Society, or the sociocultural system, is not, then, principally an equilibrium system or a homeostatic system, but what we shall simply refer to as a complex adaptive system.
- Walter F. Buckley (1968) "Society as a complex adaptive system" in: Modern systems research for the behavioral scientist. Walter Buckley ed. p. 49
- The notion of system we are interested in may be described generally as a complex of elements or components directly or indirectly related in a network of interrelationships of various kinds, such that it constitutes a dynamic whole with emergent properties.
- Walter F. Buckley (1998). Society: A Complex Adaptive System--Essays in Social Theory. p. 35
- A basic assumption within these theories is that organizations are complex adaptive systems (Anderson, 1999; Axelrod and Cohen, 1999), composed of semiautonomous agents that seek to maximize fitness by adjusting interpretative and action-oriented schema that determine how they view and interact with other agents and the environment.
- Kevin Dooley et al. (2003), p. 62; cited in: Antonacopoulou, E. P., and Ricardo Chiva, "Social complex evolving systems: Implications for organizational learning", 6th International Organizational Knowledge, Learning and Capabilities Conference, Boston, MA, USA, 2005.
G - L
- A complex adaptive system acquires information about its environment and its own interaction with that environment, identifying regularities in that information, condensing those regularities into a kind of "schema", or model, and acting in the real world on the basis of that schema.
- Murray Gell-Mann (1995) The Quark and the Jaguar: Adventures in the Simple and the Complex, p. 17
- Many of the core ideas of cybernetics have been assimilated by other disciplines, where they continue to influence scientific developments. Other important cybernetic principles seem to have been forgotten, though, only to be periodically rediscovered or reinvented in different domains. Some examples are the rebirth of neural networks, first invented by cyberneticists in the 1940's, in the late 1960's and again in the late 1980's; the rediscovery of the importance of autonomous interaction by robotics and AI in the 1990's; and the significance of positive feedback effects in complex systems, rediscovered by economists in the 1990's. Perhaps the most significant recent development is the growth of the complex adaptive systems movement, which, in the work of authors such as John Holland, Stuart Kauffman and Brian Arthur and the subfield of artificial life, has used the power of modern computers to simulate and thus experiment with and develop many of the ideas of cybernetics. It thus seems to have taken over the cybernetics banner in its mathematical modelling of complex systems across disciplinary boundaries, however, while largely ignoring the issues of goal-directedness and control.
- A Complex Adaptive System (CAS) is a dynamic network of many agents (which may represent cells, species, individuals, firms, nations) acting in parallel, constantly acting and reacting to what the other agents are doing. The control of a CAS tends to be highly dispersed and decentralized. If there is to be any coherent behavior in the system, it has to arise from competition and cooperation among the agents themselves. The overall behavior of the system is the result of a huge number of decisions made every moment by many individual agents.
- John Holland, cited in: M. Mitchell Waldrop (1994), Complexity: The Emerging Science at the Edge of Order and Chaos.
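Holland's picture of dispersed, decentralized control can be illustrated with a minimal sketch (the ring topology, majority rule, and all names below are assumptions made for illustration): agents on a ring each update a binary state in parallel from their immediate neighbors, with no central controller, and coherent blocks of agreement emerge purely from local decisions.

```python
import random

random.seed(0)
N = 40
# each agent holds a binary state; there is no central controller
states = [random.choice([0, 1]) for _ in range(N)]

def step(states):
    """All agents act in parallel: adopt the majority of self + two neighbors."""
    nxt = []
    for i in range(len(states)):
        neighborhood = states[i - 1] + states[i] + states[(i + 1) % len(states)]
        nxt.append(1 if neighborhood >= 2 else 0)
    return nxt

for _ in range(10):
    states = step(states)
print("".join(map(str, states)))  # coherent runs of 0s and 1s form locally
```

The "overall behavior" here (contiguous domains of agreement) is not decided anywhere; it is the aggregate of many simultaneous local decisions, which is the point of the quotation.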
M - Z
- Complexly structured, non-additive behavior emerges out of interactive networks.... Interactive agents unite in an ordered state of sorts, and the behavior of the resulting whole is more than the sum of individual behaviors. Ordered states... [arise] … when a unit adapts its individual behaviors to accommodate the behaviors of units with which it interacts. Poincaré observed this phenomenon mathematically among colliding particles, which impart some of their resonance to each other leading to a degree of synchronized resonance. Interacting people and organizations tend similarly to adjust their behaviors and worldviews to accommodate others with whom they interact. Networks with complex chains of interaction allow large systems to correlate, or self-order.
- Marion, R. and J. Bacon (2000). "Organizational Extinction and Complex Systems", Emergence 1(4), p. 75; as cited in: Begun, James W., Brenda Zimmerman, and Kevin Dooley, "Health care organizations as complex adaptive systems", Advances in Health Care Organization Theory 253 (2003): 288.
- A complex adaptive system is composed of interacting 'agents' following rules, exchanging influence with their local and global environments and altering the very environment they are responding to by virtue of their simple actions.
- Sherman, H. and Schultz, R. (1998). Open Boundaries. New York: Perseus Books; cited in: Antonacopoulou & Chiva (2005).