Welcome to the Principia Cybernetica Web

Author: Editors

Updated: Mar 23, 1998

Filename: DEFAULT.html

To get started, there is an introduction with background and motivation, and an overview, summarizing the project as a whole.

Recent Changes: "What's new" on this server.
Searchable Index: keyword search of all documents (titles and full text).
Table of Contents: a long hierarchical outline, which provides a "standard ordering" through the main material.
Random Link: jump to an arbitrary node; useful for getting unusual suggestions of areas to explore.
"Hit Parade": nodes ordered according to popularity, with other usage statistics for the server (out of date).
Map: the picture below is a clickable map of the most important nodes of the PCP hierarchy.

Although Principia Cybernetica Web has received very positive reviews, the work is of course never finished. The material in this web is continuously being added to and improved. Nodes followed by the mention "[empty]" don't contain any text yet, only a menu of linked nodes. Some important results have not yet been converted to hypertext, but may be found in the papers in our FTP-archive.

Comments about content and presentation of the information are appreciated. If you have any technical problems, questions or suggestions on our Web, please contact the "Webmaster", Francis Heylighen (PCP@vub.ac.be). Comments about the content of a node can be addressed to its author(s). You can also directly annotate each node separately, or add general comments to the User Annotations.

We apologize for difficulties you might have in getting files from this server: Internet connections between Belgium and especially America are often overloaded. Try to avoid the busiest periods: 15.00 to 0.00 hrs (European time), i.e. 9.00 to 18.00 (US East Coast) or 6.00 to 15.00 (US West Coast), on weekdays. We would like to establish a mirror site in the US to avoid this problem in the future: proposals are welcome! At present we only have a Belgian back-up FTP server with WWW documents at ftp.vub.ac.be for emergencies, but it is not kept up to date. These servers are part of the network of the Free University of Brussels.

If you plan to regularly consult this server, you might keep a copy of this home page on your own computer.

Introduction to Principia Cybernetica

Author: Heylighen, Joslyn, Turchin

Updated: Apr 1, 1996

Filename: INTRO.html

Every era has its own approach to the eternal philosophical questions, an approach derived from its knowledge and technology. We hold that in our time, the age of information, it is systems science and cybernetics, as the general sciences of organization and communication, that can provide the basis for a contemporary philosophy. This philosophical system is therefore derived from, and further develops, the basic principles of cybernetics.

Moreover, we start from the thesis that systems at all levels have been constructed by evolution, which we see as a continuing process of self-organization, based on variation and natural selection of the "fittest" configuration. Evolution continuously creates complexity and makes systems more adaptive by giving them better control over their environments. We consider the emergence of a new level of control as the quantum of evolution, and call it a "metasystem transition".

As cybernetic theory informs our philosophy, so cybernetic technology lets us do things that philosophers of other times could only dream of. Using computer technology, we develop a large philosophical text from many nodes which are linked together with different relationships. Readers can navigate among the many concepts, guided by their individual understanding and interests. Disparate material can be integrated together while being written and read by collaborators from all around the world, undergoing variation and selection. Thus we apply theories about the evolution of cybernetic systems to the practical development of this very system of philosophy.

We hold that PCP is more than an interesting experiment, and that there is an acute need for an approach similar to PCP. The on-going explosion and fragmentation of knowledge demands a renewed effort at integration. This has always been the dream of the systems theorists; all they lacked was the appropriate technology to attack the complexity of the task.

PCP draws its inspiration from many predecessors in intellectual history, including philosophers, systems scientists and cyberneticians, and others who have tried to collaboratively develop complex systems of thought.

This effort has been ongoing since 1989 and is now in the implementation stage (see our history). Of course, the task is enormous, and we are only at the beginning. If you are really interested in our project, we invite you to join our efforts and become a contributor.

For further introductory reading, see the following documents:

Eternal Philosophical Questions

Author: F. Heylighen

Updated: Nov 5, 1997

Filename: ETERQUES.html

The present document brings these different questions and answers together in the form of a "FAQ" (Frequently Asked Questions). The answers given here are necessarily short; they barely scratch the surface of profound and complex issues. Where available, however, we have included links to other documents which discuss each problem in more detail. The present document can thus be seen as a roadmap that will help philosophically interested readers to better explore the Principia Cybernetica world view.

Philosophy

Author: C. Joslyn

Updated: Aug 1993

Filename: PHILOSI.html

We begin with the idea that philosophy is a kind of clear, deep thought; essentially putting our thought and language in order. This apparently analytic and linguistic understanding arises from the explicit recognition that all expression and communication, in particular all works of philosophy, the body of Principia Cybernetica, and this article itself, exist in a physical form as a series of symbol tokens in a particular modality and interpretable in a specific language and interpretational framework. It is impossible to consider philosophy in particular outside of the context of its processes and products. In that respect, philosophy must be understood as a process of philosophizing in which linguistic symbol tokens are produced and received. This includes the normal linguistic forms of speaking, hearing, reading, and writing, but also other linguistic forms such as diagrams, mathematics, and sign language. The authors of this paper philosophize as they write it; the readers philosophize as they read it. This article itself cannot have any existence "as philosophy" outside of this context of its production and/or reception.

What then distinguishes philosophical linguistic productions from any other? It is tempting to distinguish philosophy on the basis of its content, that is its referents, or what it is "about". Then we would believe, as some cybernetic philosophers have suggested \cite{BAA53a}, that philosophy is linguistic thought which refers to specific deep questions, e.g. about existence and knowledge, the nature of thought, and the ultimate good. We do not deny this, but do not believe that it is a good place to start in finding a definition.

Rather, the focus on philosophizing as a process leads us to consider philosophy as any use of language conducted in a certain manner. In particular, whenever we deal with issues in depth, continually asking "why" and "how" in order to critically analyze underlying assumptions and move toward the foundations of our complex knowledge structures, that is necessarily philosophy. Thus we construct a philosophy of language, of mind, or of law when we consider these specific subjects in their depth. Surely we could have a philosophy of plumbing or of gum chewing, should we wish.

As we proceed in this question-asking mode towards deep thought, and thus philosophy, we are naturally drawn to the traditional philosophical questions outlined above. What distinguishes them as the quintessential philosophical problems is their generality. Thus if we restrict ourselves specifically to (say) the philosophy of law or of plumbing, then perhaps we can avoid certain general philosophical issues. Philosophy per se is simply the result of philosophizing in an unrestricted domain of discourse.

See also: Cybernetics and Philosophy (paper by Turchin in TeX format)

Links on Philosophy

Epistemology, introduction

Author: F. Heylighen

Updated: Sep 1993

Filename: EPISTEMI.html

When we look at the history of epistemology, we can discern a clear trend, in spite of the confusion of many seemingly contradictory positions. The first theories of knowledge stressed its absolute, permanent character, whereas the later theories put the emphasis on its relativity or situation-dependence, its continuous development or evolution, and its active interference with the world and its subjects and objects. The whole trend moves from a static, passive view of knowledge towards a more and more adaptive and active one.

Let us start with the Greek philosophers. In Plato's view, knowledge is merely an awareness of absolute, universal Ideas or Forms, existing independently of any subject trying to apprehend them. Though Aristotle puts more emphasis on logical and empirical methods for gathering knowledge, he still accepts the view that such knowledge is an apprehension of necessary and universal principles. Following the Renaissance, two main epistemological positions dominated philosophy: empiricism, which sees knowledge as the product of sensory perception, and rationalism, which sees it as the product of rational reflection.

The implementation of empiricism in the newly developed experimental sciences led to a view of knowledge which is still explicitly or implicitly held by many people today: the reflection-correspondence theory. According to this view, knowledge results from a kind of mapping or reflection of external objects, through our sensory organs, possibly aided by various observation instruments, onto our brain or mind. Though such knowledge has no a priori existence, as in Plato's conception, but has to be developed by observation, it is still absolute, in the sense that any piece of proposed knowledge is supposed to either truly correspond to a part of external reality or not. On this view, we may in practice never reach complete or absolute knowledge, but such knowledge is still conceivable as a limit of ever more precise reflections of reality.

The next important theory developed in that period was the Kantian synthesis of rationalism and empiricism. According to Kant, knowledge results from the organization of perceptual data on the basis of inborn cognitive structures, which he calls "categories". Categories include space, time, objects and causality. This epistemology does accept the subjectivity of basic concepts, like space and time, and the impossibility of reaching purely objective representations of things-in-themselves. Yet the a priori categories are still static or given.

The next stage in the development of epistemology may be called pragmatic. Parts of it can be found in early twentieth-century approaches such as logical positivism, conventionalism, and the "Copenhagen interpretation" of quantum mechanics. This philosophy still dominates most present work in cognitive science and artificial intelligence. According to pragmatic epistemology, knowledge consists of models that attempt to represent the environment in such a way as to maximally simplify problem-solving. It is assumed that no model can ever hope to capture all relevant information, and that even if such a complete model existed, it would be too complicated to use in any practical way. Therefore we must accept the parallel existence of different models, even though they may seem contradictory. Which model is to be chosen depends on the problems to be solved. The basic criterion is that the model should produce correct (or approximately correct) predictions, which may be tested, or problem solutions, and be as simple as possible. Further questions about the "Ding an sich" or ultimate reality behind the model are meaningless.
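The pragmatic criterion just described, correct predictions combined with maximal simplicity, can be made concrete with a toy sketch. Everything below (the data, the candidate models, and the penalty weight) is invented purely for illustration; it is not part of the Principia Cybernetica material itself.

```python
# Toy illustration of the pragmatic criterion: among competing models,
# prefer the one that predicts well AND stays simple.

data = [(1, 2.1), (2, 3.9), (3, 6.2), (4, 7.8)]   # (input, observation)

# Each candidate model: a prediction function plus a count of parameters.
models = {
    "linear (2 params)": (lambda x: 2.0 * x, 2),
    "cubic (4 params)":  (lambda x: 0.1 * x**3 + 1.7 * x, 4),
}

def score(predict, n_params, complexity_penalty=0.5):
    """Lower is better: squared prediction error plus a complexity penalty."""
    error = sum((predict(x) - y) ** 2 for x, y in data)
    return error + complexity_penalty * n_params

best = min(models, key=lambda name: score(*models[name]))
print(best)   # the simpler linear model wins on this data
```

The penalty term is a crude stand-in for "be as simple as possible"; real model selection criteria weigh error and complexity in more principled ways, but the trade-off is the same.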

Pragmatic epistemology does not give a clear answer to the question of where knowledge or models come from. There is an implicit assumption that models are built from parts of other models and from empirical data, on the basis of trial and error complemented by some heuristics or intuition. A more radical point of departure is offered by constructivism. It assumes that all knowledge is built up from scratch by the subject of knowledge. There are no 'givens': neither objective empirical data or facts, nor inborn categories or cognitive structures. The idea of a correspondence or reflection of external reality is rejected. Because of this missing connection between models and the things they represent, the danger with constructivism is that it may lead to relativism: the idea that any model constructed by a subject is as good as any other, and that there is no way to distinguish adequate or 'true' knowledge from inadequate or 'false' knowledge.

We can distinguish two approaches that try to avoid such an 'absolute relativism'. The first may be called individual constructivism. It assumes that an individual attempts to reach coherence among the different pieces of knowledge. Constructions that are inconsistent with the bulk of the individual's other knowledge will tend to be rejected, while constructions that succeed in integrating previously incoherent pieces of knowledge will be maintained. The second, social constructivism, sees consensus between different subjects as the ultimate criterion for judging knowledge. 'Truth' or 'reality' will be accorded only to those constructions on which most people of a social group agree.

In these philosophies, knowledge is seen as largely independent of a hypothetical 'external reality' or environment. As the 'radical' constructivists Maturana and Varela argue, the nervous system of an organism cannot in any absolute way distinguish between a perception (caused by an external phenomenon) and a hallucination (a purely internal event). The only basic criterion is that different mental entities or processes within or between individuals should reach some kind of equilibrium.

Though these constructivist approaches put much more emphasis on the changing and relative character of knowledge, they are still absolutist in the primacy they give to either social consensus or internal coherence, and their description of construction processes is quite vague and incomplete. A broader, more synthetic outlook is offered by the different forms of evolutionary epistemology. Here it is assumed that knowledge is constructed by the subject or group of subjects in order to adapt to their environment in the broad sense. That construction is an ongoing process at different levels: biological as well as psychological or social. Construction happens through blind variation of existing pieces of knowledge, and the selective retention of those new combinations that somehow contribute most to the survival and reproduction of the subject(s) within their given environment. Hence the 'external world' again enters the picture, although no objective reflection or correspondence is assumed, only an equilibrium between the products of internal variation and different (internal or external) selection criteria. Any form of absolutism or permanence has disappeared in this approach, but knowledge is basically still a passive instrument developed by organisms to help them in their quest for survival.

A more recent, and perhaps the most radical, approach extends this evolutionary view so that knowledge actively pursues goals of its own. This approach, which has not yet had the time to develop a proper epistemology, may be called memetics. It notes that knowledge can be transmitted from one subject to another, and thereby loses its dependence on any single individual. A piece of knowledge that can be transmitted or replicated in such a way is called a 'meme'. The death of an individual carrying a certain meme no longer implies the elimination of that piece of knowledge, as evolutionary epistemology would assume. As long as a meme spreads to new carriers more quickly than its carriers die, the meme will proliferate, even though the knowledge it induces in any individual carrier may be wholly inadequate, and even dangerous to survival. In this view a piece of knowledge may be successful (in the sense that it is common or has many carriers) even though its predictions may be totally wrong, as long as it is sufficiently 'convincing' to new carriers. Here even the subject of knowledge has lost its primacy, and knowledge becomes a force of its own, with its own goals and ways of developing itself. That this is realistic can be illustrated by the many superstitions, fads, and irrational beliefs that have spread over the globe, sometimes with frightening speed.

Like social constructivism, memetics draws attention to communication and social processes in the development of knowledge, but instead of seeing knowledge as constructed by the social system, it sees social systems as constructed by knowledge processes. Indeed, a social group can be defined by the fact that all its members share the same meme (Heylighen, 1992). Even the concept of 'self', that which distinguishes a person as an individual, can be considered a piece of knowledge constructed through social processes (Harré, 19), and hence a result of memetic evolution. From a constructivist approach, where knowledge is constructed by individuals or society, we have moved to a memetic approach, which sees society and even individuality as byproducts constructed by an ongoing evolution of independent fragments of knowledge competing for domination.

We have come very far indeed from Plato's immutable and absolute Ideas, residing in an abstract realm far from concrete objects or subjects, or from the naive realism of the reflection-correspondence theory, where knowledge is merely an image of external objects and their relations. At this stage, the temptation is strong to lapse into a purely anarchistic or relativistic attitude: to state that 'anything goes', and that it is impossible to formulate any reliable and general criteria to distinguish 'good' or adequate pieces of knowledge from bad or inadequate ones. Yet in most practical situations our intuition does help us to distinguish perceptions from dreams or hallucinations, and unreliable predictions ('I am going to win the lottery') from reliable ones ('The sun will come up tomorrow morning'). And an evolutionary theory still assumes a natural selection which can be understood to a certain degree. Hence we may assume that it is possible to identify selection criteria, but one of the lessons of this historical overview is that we should avoid formulating one absolute criterion too quickly. Neither correspondence, nor coherence or consensus, nor even survivability is sufficient to ground a theory of knowledge. At this stage we can only hope to find multiple, independent, and sometimes contradictory criteria, whose judgments may quickly become obsolete. Yet if we succeed in formulating these criteria clearly, within a simple and general conceptual framework, we will have an epistemology that synthesizes and extends all of the traditional and less traditional philosophies above.

Metaphysics, introduction

Author: Turchin, Joslyn, Heylighen

Updated: Aug 1993

Filename: METAPHI.html

Such a theory would obviously be priceless for judging and constructing more specific physical theories. When we understand language as a hierarchical model of reality, i.e. a device which produces predictions, and not as a true static picture of the world, metaphysics is understood as much more valuable than just the "free fantasy" of philosophers. To say that the real nature of the world is a certain way means to propose the construction of a model of the world along those lines. Metaphysics creates a linguistic model (logical or conceptual structure) to serve as a basis for further refinements. Even though a mature physical theory fastidiously distinguishes itself from metaphysics by formalizing its basic notions and introducing verifiable criteria, metaphysics, in a very important sense, is physics.

Philosophies traditionally start with an ontology or metaphysics: a theory of being in itself, of the essence of things, of the fundamental principles of existence and reality. In a traditional systemic philosophy, "organization" might be seen as the fundamental principle of being, rather than God, matter, or the laws of nature. However, this still leaves open the question of where this organization comes from. In a constructive systemic philosophy, on the other hand, the essence is the process through which this organization is created.

Process Metaphysics

Author: F. Heylighen

Updated: Jan 24, 1997

Filename: PROCMETA.html

See further:

Ontology, introduction

Author: F. Heylighen

Updated: Aug 15, 1995

Filename: ONTOLI.html

- a branch of metaphysics relating to the nature and relations of being
- a particular theory about the nature of being or the kinds of existence

Recently, the term "(formal) ontology" has been taken up by researchers in Artificial Intelligence, who use it to designate the building blocks out of which models of the world are made (see e.g. "What is an ontology?"). An agent (e.g. an autonomous robot) using a particular model will only be able to perceive the part of the world that its ontology is able to represent. In a sense, only the things in its ontology can exist for that agent. In that way, an ontology becomes the basic level of a knowledge representation scheme. See for example my set of link types for a semantic network representation, which is based on a set of "ontological" distinctions: changing-invariant and general-specific.
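To make the idea concrete, here is a minimal sketch of such a semantic network with typed links. Only the two link types (general-specific, changing-invariant) come from the text above; the class, node names, and methods are illustrative assumptions, not the actual PCP implementation.

```python
# A toy semantic network: knowledge stored as (source, link_type, target)
# triples, where the link types encode ontological distinctions.

class SemanticNetwork:
    def __init__(self):
        self.links = set()   # set of (source, link_type, target) triples

    def add(self, source, link_type, target):
        self.links.add((source, link_type, target))

    def targets(self, source, link_type):
        """All nodes reachable from `source` over one link of `link_type`."""
        return {t for (s, lt, t) in self.links if s == source and lt == link_type}

    def knows(self, node):
        """A node 'exists' for the agent only if some link mentions it."""
        return any(node in (s, t) for (s, lt, t) in self.links)

net = SemanticNetwork()
net.add("organism", "general-specific", "dog")       # a dog is a kind of organism
net.add("dog", "general-specific", "poodle")
net.add("organism", "changing-invariant", "metabolism")

print(net.targets("organism", "general-specific"))   # {'dog'}
print(net.knows("cat"))                              # False
```

Note how the last line illustrates the point made above: "cat" appears in no link, so for an agent using this ontology, cats simply do not exist.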

Ethics, introduction



Updated: Aug 1993

Filename: ETHICSI.html

What is a world view?

Author: F. Heylighen

Updated: Dec 9, 1996

Filename: WORLVIEW.html

What we need is a framework that ties everything together, that allows us to understand society, the world, and our place in it, and that could help us to make the critical decisions which will shape our future. It would synthesize the wisdom gathered in the different scientific disciplines, philosophies and religions. Rather than focusing on small sections of reality, it would provide us with a picture of the whole. In particular, it would help us to understand, and therefore cope with, complexity and change. Such a conceptual framework may be called a "world view".

The Belgian philosopher Leo Apostel devoted his life to the development of such an integrating world view. As he quickly understood, the complexity of this task is too great for one man. Therefore, a major part of Apostel's efforts was directed at gathering other people, with different scientific and cultural backgrounds, to collaborate on this task. Only in the last years of his life, after several failed attempts, did he manage to create such an organization: the "Worldviews" group, which includes people from disciplines as diverse as engineering, psychiatry, theology, theoretical physics, sociology and biology.

Their first major product was a short book entitled "World views, from fragmentation to integration". This booklet is a call to arms, a program listing objectives rather than achievements. Its main contribution is a clear definition of what a world view is and what its necessary components are. The "Worldviews" group has continued to work on different components and aspects of this general objective. Many of its members are also involved in a new interdisciplinary research center at the Free University of Brussels, named after Leo Apostel: the "Center Leo Apostel".

The book lists seven fundamental components of a world view. I will discuss them one by one, using a formulation which is slightly different from the one in the book, but which captures the main ideas.

1. A model of the world. It should allow us to understand how the world functions and how it is structured. "World" here means the totality: everything that exists around us, including the physical universe, the Earth, life, mind, society and culture. We ourselves are an important part of that world. Therefore, a world view should also answer the basic question: "Who are we?"

2. Explanation. The second component is supposed to explain the first one. It should answer the questions: "Why is the world the way it is? Where does it all come from? Where do we come from?" This is perhaps the most important part of a world view. If we can explain how and why a particular phenomenon (say life or mind) has arisen, we will be able to better understand how that phenomenon functions. It will also help us to understand how that phenomenon will continue to evolve.

3. Futurology. This extrapolation of past evolution into the future defines a third component of a world view: futurology. It should answer the question: "Where are we going?" It should give us a list of possibilities, of more or less probable future developments. But this will confront us with a choice: which of the different alternatives should we promote, and which should we avoid?

4. Values. This is the more fundamental issue of value: "What is good and what is evil?" The theory of values defines the fourth component of a world view. It includes morality or ethics, the system of rules which tells us how we should or should not behave. It also gives us a sense of purpose, a direction or set of goals to guide our actions. Together with the answer to the question "why?", the answer to the question "what for?" may help us to understand the real meaning of life.

5. Action. Knowing what to strive for does not yet mean knowing how to get there. The next component must be a theory of action (praxiology). It would answer the question: "How should we act?" It would help us to solve practical problems and to implement plans of action.

6. Knowledge. Plans are based on knowledge and information, on theories and models describing the phenomena we encounter. Therefore, we need to understand how we can construct reliable models. This is the component of knowledge acquisition. It is equivalent to what in philosophy is called "epistemology" or "the theory of knowledge". It should allow us to distinguish better theories from worse ones, and answer the traditional philosophical question: "What is true and what is false?"

7. Building blocks. The final point on the agenda of a world view builder is not meant to answer any fundamental question. It simply reminds us that world views cannot be developed from scratch: you need building blocks to start with. These can be found in existing theories, models, concepts, guidelines and values, scattered over the different disciplines and ideologies. This defines the seventh component: fragments of world views as a starting point.

Cybernetics and Systems Theory

Author: F. Heylighen

Updated: Apr 29, 1996

Filename: CYBSYSTH.html

Cybernetics and Systems Theory is an interdisciplinary academic domain. Although there are relatively few research centers and even fewer educational programs devoted to the domain, a lot of activity goes on between established departments, as shown by the number of associations, conferences and journals active in the field.

The best way of getting acquainted with the main ideas of cybernetics and systems theory is to read a few of the classic books or papers defining the domain. Other, specific bibliographic references can be found in the library database of the Department of Medical Cybernetics and AI at the University of Vienna. There also exists more general reference material, including our own Web Dictionary of basic concepts.

You can get in touch with cybernetics and systems people via existing mailing lists and newsgroups, personal or departmental home pages, or by visiting conferences in the field (see the Calendar of events from the International Federation of Systems Research).

What are Cybernetics and Systems Science?

Author: F. Heylighen, C. Joslyn, V. Turchin

Updated: Feb 18, 1998

Filename: CYBSWHAT.html

Systems theory or systems science argues that however complex or diverse the world we experience may be, we will always find different types of organization in it, and that such organization can be described by principles which are independent of the specific domain we are looking at. Hence, if we could uncover those general laws, we would be able to analyse and solve problems in any domain, pertaining to any type of system. The systems approach distinguishes itself from the more traditional analytic approach by emphasizing the interactions and connectedness of the different components of a system.

Many of the concepts used by system scientists come from the closely related approach of cybernetics: information, control, feedback, communication... Cybernetics, deriving from the Greek word for steersman (kybernetes), was first introduced by the mathematician Wiener, as the science of communication and control in the animal and the machine (to which we now might add: in society and in individual human beings). It grew out of Shannon's information theory, which was designed to optimize the transmission of information through communication channels, and the feedback concept used in engineering control systems. In its present incarnation of "second-order cybernetics", its emphasis is on how observers construct models of the systems with which they interact (see constructivism).
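The core cybernetic notions of control and feedback can be sketched in a few lines: a controller repeatedly compares an observed value with a goal and feeds the error back into its action. The thermostat-like setting, the numbers, and the proportional rule below are illustrative assumptions, not drawn from Wiener's or Shannon's actual formalisms.

```python
# Minimal negative-feedback loop: a proportional controller drives an
# observed temperature toward a goal by acting on the error at each step.

def regulate(temperature, goal=20.0, gain=0.5, steps=30):
    """Return the trajectory of the temperature under feedback control."""
    history = []
    for _ in range(steps):
        error = goal - temperature   # compare observation with the goal
        action = gain * error        # control action proportional to the error
        temperature += action        # the action changes the environment
        history.append(temperature)
    return history

trace = regulate(temperature=5.0)
print(round(trace[-1], 3))   # converges toward the goal of 20.0
```

Because the action opposes the error, each step halves the remaining deviation (with gain 0.5), which is exactly the stabilizing behavior that "control" and "feedback" name in the paragraph above.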

In fact cybernetics and systems theory study essentially the same problem, that of organization independent of the substrate in which it is embodied. Insofar as it is meaningful to make a distinction between the two approaches, we might say that systems theory has focused more on the structure of systems and their models, whereas cybernetics has focused more on how systems function, that is to say how they control their actions, how they communicate with other systems or with their own components, ... Since structure and function of a system cannot be understood in separation, it is clear that cybernetics and systems theory should be viewed as two facets of a single approach.

This insight has had as a result that the two domains have in practice almost merged: many, if not most, of the central associations, journals and conferences in the field include both terms, "systems" and "cybernetics", in their title.

The following links should provide plenty of introductory material and references. An excellent, easy to read overview of the systems approach can be found in our web edition of the book "The Macroscope". Together with our dictionary, and list of basic books and papers, this should be sufficient for an introductory course in the domain:

Outside links:

What is Systems Theory?

Author: F. Heylighen, C. Joslyn

Updated: Nov 1, 1992

Filename: SYSTHEOR.html

Synopsis: Systems Theory: the transdisciplinary study of the abstract organization of phenomena, independent of their substance, type, or spatial or temporal scale of existence. It investigates both the principles common to all complex entities and the (usually mathematical) models which can be used to describe them.

Systems theory was proposed in the 1940s by the biologist Ludwig von Bertalanffy (General Systems Theory, 1968) and furthered by Ross Ashby (Introduction to Cybernetics, 1956). Von Bertalanffy was both reacting against reductionism and attempting to revive the unity of science. He emphasized that real systems are open to, and interact with, their environments, and that they can acquire qualitatively new properties through emergence, resulting in continual evolution. Rather than reducing an entity (e.g. the human body) to the properties of its parts or elements (e.g. organs or cells), systems theory focuses on the arrangement of, and relations between, the parts which connect them into a whole (cf. holism). This particular organization determines a system, independently of the concrete substance of the elements (e.g. particles, cells, transistors, people, etc.). Thus the same concepts and principles of organization underlie the different disciplines (physics, biology, technology, sociology, etc.), providing a basis for their unification. Systems concepts include: system-environment boundary, input, output, process, state, hierarchy, goal-directedness, and information.

The developments of systems theory are diverse (Klir, Facets of Systems Science, 1991), including conceptual foundations and philosophy (e.g. the philosophies of Bunge, Bahm and Laszlo); mathematical modeling and information theory (e.g. the work of Mesarovic and Klir); and practical applications. Mathematical systems theory arose from the development of isomorphies between the models of electrical circuits and other systems. Applications include engineering, computing, ecology, management, and family psychotherapy. Systems analysis, developed independently of systems theory, applies systems principles to aid a decision-maker with problems of identifying, reconstructing, optimizing, and controlling a system (usually a socio-technical organization), while taking into account multiple objectives, constraints and resources. It aims to specify possible courses of action, together with their risks, costs and benefits. Systems theory is closely connected to cybernetics, and also to system dynamics, which models changes in a network of coupled variables (e.g. the "world dynamics" models of Jay Forrester and the Club of Rome). Related ideas are used in the emerging "sciences of complexity", studying self-organization and heterogeneous networks of interacting actors, and associated domains such as far-from-equilibrium thermodynamics, chaotic dynamics, artificial life, artificial intelligence, neural networks, and computer modeling and simulation.
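The system-dynamics idea mentioned above, modelling changes in a network of coupled variables, can be illustrated with a minimal simulation. The model below is purely hypothetical (the variables, equations and parameter values are invented for illustration and are not taken from Forrester's models): a "population" grows on a slowly renewing "resource", and the coupled rates of change are integrated with simple Euler steps.

```python
# Minimal system-dynamics sketch: two coupled variables integrated
# with Euler steps. All names and parameter values are illustrative.

def simulate(steps=1000, dt=0.01):
    population, resources = 1.0, 10.0
    history = []
    for _ in range(steps):
        # Population grows in proportion to available resources;
        # resources are depleted by the population and slowly renew.
        d_pop = 0.1 * population * resources - 0.2 * population
        d_res = 0.5 * (10.0 - resources) - 0.05 * population * resources
        population += d_pop * dt
        resources += d_res * dt
        history.append((population, resources))
    return history

final_pop, final_res = simulate()[-1]
print(round(final_pop, 3), round(final_res, 3))
```

Even this toy model shows the characteristic system-dynamics behavior: the two variables overshoot and then settle toward a joint equilibrium that neither equation determines on its own.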

Francis Heylighen and Cliff Joslyn

Prepared for the Cambridge Dictionary of Philosophy. (Copyright Cambridge University Press.)

Analytic vs. Systemic Approaches

Author: J. de Rosnay

Updated: Feb 17, 1997

Filename: ANALSYST.html

The analytic and the systemic approaches are more complementary than opposed, yet neither one is reducible to the other.

The analytic approach seeks to reduce a system to its elementary elements in order to study in detail and understand the types of interaction that exist between them. By modifying one variable at a time, it tries to infer general laws that will enable one to predict the properties of a system under very different conditions. To make this prediction possible, the laws of the additivity of elementary properties must be invoked. This is the case in homogeneous systems, those composed of similar elements and having weak interactions among them. Here the laws of statistics readily apply, enabling one to understand the behavior of the multitude, of disorganized complexity.

The laws of the additivity of elementary properties do not apply in highly complex systems composed of a large diversity of elements linked together by strong interactions. These systems must be approached by new methods such as those which the systemic approach groups together. The purpose of the new methods is to consider a system in its totality, its complexity, and its own dynamics. Through simulation one can "animate" a system and observe in real time the effects of the different kinds of interactions among its elements. The study of this behavior leads in time to the determination of rules that can modify the system or design other systems.

The following table compares, one by one, the traits of the two approaches.

Analytic Approach | Systemic Approach
isolates, then concentrates on the elements | unifies and concentrates on the interaction between elements
studies the nature of interaction | studies the effects of interactions
emphasizes the precision of details | emphasizes global perception
modifies one variable at a time | modifies groups of variables simultaneously
remains independent of duration of time; the phenomena considered are reversible | integrates duration of time and irreversibility
validates facts by means of experimental proof within the body of a theory | validates facts through comparison of the behavior of the model with reality
uses precise and detailed models that are less useful in actual operation (example: econometric models) | uses models that are insufficiently rigorous to be used as bases of knowledge but are useful in decision and action (example: models of the Club of Rome)
has an efficient approach when interactions are linear and weak | has an efficient approach when interactions are nonlinear and strong
leads to discipline-oriented (juxtadisciplinary) education | leads to multidisciplinary education
leads to action programmed in detail | leads to action through objectives
possesses knowledge of details, poorly defined goals | possesses knowledge of goals, fuzzy details

This table, while useful in its simplicity, is nevertheless a caricature of reality. The presentation is excessively dualist; it confines thought to an alternative from which it seems difficult to escape. Numerous other points of comparison deserve to be mentioned. Yet without being exhaustive, the table has the advantage of effectively contrasting the two complementary approaches, one of which, the analytic approach, has been favored disproportionately in our educational system.

The Nature of Cybernetic Systems

Author: C. Joslyn,

Updated: Jan 1992

Filename: CYBSNAT.html

Complexity: Cybernetic systems are complex structures, with many heterogeneous interacting components.

Mutuality: These many components interact in parallel, cooperatively, and in real time, creating multiple simultaneous interactions among subsystems.

Complementarity: These many simultaneous modes of interaction lead to subsystems which participate in multiple processes and structures, rendering any single dimension of description incomplete, and requiring multiple complementary, irreducible levels of analysis.

Evolvability: Cybernetic systems tend to evolve and grow in an opportunistic manner, rather than be designed and planned in an optimal manner.

Constructivity: Cybernetic systems are constructive, in that as they tend to increase in size and complexity, they become historically bound to previous states while simultaneously developing new traits.

Reflexivity: Cybernetic systems are rich in internal and external feedback, both positive and negative. Ultimately, they can enter into the "ultimate" feedback of reflexive self-application, in which their components are operated on simultaneously from complementary perspectives, for example as entities and processes. Such situations may result in the reflexive phenomena of self-reference, self-modeling, self-production, and self-reproduction.

Cybernetics and Systems Science in Academics

Author: C. Joslyn, F. Heylighen,

Updated: Jan 1992

Filename: CYBSACAD.html

Some recent fashionable approaches have their roots in ideas that were proposed by cyberneticians many decades ago: e.g. artificial intelligence, neural networks, complex systems, human-machine interfaces, self-organization theories, systems therapy, etc. Most of the fundamental concepts and questions of these approaches had already been formulated by cyberneticians such as Wiener, Ashby, von Bertalanffy \cite{VL56}, Boulding, von Foerster, von Neumann, McCulloch, and Pask in the 1940's through 1960's.

But since its founding, Cybernetics and Systems Science has struggled to find a degree of "respectability" in the academic community. Little interdisciplinary work has prospered recently, and cyberneticians in particular have failed to find homes in academic institutions, or to create their own. Very few academic programs in Cybernetics and Systems Science exist, and those working in the new disciplines described above seem to have forgotten their cybernetic predecessors.

Why does cybernetics not get the recognition it deserves? What distinguishes cyberneticians from researchers in the previously mentioned areas is that the former stubbornly stick to their objective of building general, domain-independent theories, whereas the latter focus on very specific applications: expert systems, psychotherapy, thermodynamics, pattern recognition, etc. General integration remains too abstract, and is not sufficiently successful to be really appreciated.

As an interdisciplinary field, Cybernetics and Systems Science sees common concepts used in multiple traditional disciplines and attempts to achieve a consensual unification by finding common terms for similar concepts in these multiple disciplines. Thus Cybernetics and Systems Science sometimes abstracts away from the concepts, theories, and terminologies of specific disciplines towards general, and perhaps idiosyncratic, usages. These new conceptual categories may not be recognizable to traditional researchers, or they may find no utility in the use of the general concepts.

Clearly the problem of building a global theory is much more complex than any of the more down-to-earth goals of the fashionable approaches. But we may also say that the generality of the approach is dangerous in itself if it leads to being "stuck" in abstractions which are so far removed from the everyday world that it is difficult to use them, interact with them, or test them on concrete problems; in other words, to get a feel for how they behave and what their strengths and weaknesses are.

Although there are many exceptions, researchers in Cybernetics and Systems Science tend to be trained in a traditional specialty (like biology, management, or psychology) and then come to apply themselves to problems in other areas, perhaps a single other area. Thus their exposure to Cybernetics and Systems Science concepts and theory tends to be somewhat ad hoc and specific to the two or three fields they apply themselves to.

Existing Cybernetic Foundations

Author: C. Joslyn,

Filename: CYBFOUND.html

Few have even attempted to address foundational theoretical and methodological issues in anything other than an ad hoc manner. Some conceptual "frameworks" exist at the formal, mathematical level \cite{KLG85c,MEMTA88}. Some researchers have presented integrated conceptual frameworks for major areas of systems science \cite{JAE80a,ODH83,POW73,TUV77}, and there have been some attempts to develop the foundations of the philosophy underlying cybernetics and systems theory \cite{BUM74,LAE72}. Yet these works focus specifically on cybernetics and systems theory from the perspectives of the traditional fields of mathematics or philosophy respectively; they are still locked into the traditional forms of development of academic work. There is as yet no systems theory of systems theories.

There is at the same time a lack of researchers who are willing or able to address themselves to the general problems and theories encompassed by cybernetics and systems theory. The lack of a coherent terminology and methodology is reflected in a lack of basic textbooks and glossaries (with some exceptions \cite{ASR56,KLG91a,WEG75}), and further in a failure to establish even primary educational programs to instruct upcoming generations. What little interdisciplinary work has prospered has profited from the developments in cybernetics and systems theory over the past few decades while either ignoring or deliberately avoiding any reliance on cybernetics and systems theory (e.g. \cite{SFI,WOS88}).

The lack of a strong foundation for or consensus within cybernetics and systems theory extends to the very basic information about the field. How do we describe ourselves, and what can we tell new students and outsiders? Cybernetics and systems theory has been variously described as a science, a point of view, a world-view, an approach, an outlook, or a kind of applied philosophy or applied mathematics. There are those in our community who approve of and even champion this state of affairs. They focus on the creativity of the maverick academics who are drawn to cybernetics and systems theory, and decry any attempts to structure or build a solid theory. (Again, with some notable exceptions \cite{UMS90}.) Clearly this lack of balance has led to rather poor review standards in systems journals and conferences, and a low "signal-to-noise ratio".

What can account for the current state of affairs in cybernetics and systems theory, the lack of a consensually held fundamental theory? Is it inherent in the field, and necessary in any broad interdisciplinary studies? Or is it an historical accident, exacerbated by the personalities and careers of individual researchers? The Principia Cybernetica Project holds that there are in fact fundamental and foundational concepts, principles, and theories immanent in the body and literature of cybernetics and systems theory which do apply to general information systems, including all living and evolving systems at all levels of analysis. We contend that the lack of a fundamental theory is due to a lack of investment in the field. Support for and investment in a field are mutually reinforcing: a lack of either will lead to a lack of the other.

Cybernetic Technology

Author: Heylighen,

Updated: Oct 18, 1993

Filename: CYBTECH.html

The domain of computing applications has grown so quickly that labeling anything that uses a computer as "cybernetic" is more obscuring than enlightening. Therefore we would restrict the label "cybernetic technology" to those information processing and transmitting tools that somehow increase the general purpose "intelligence" of the user, that is to say the control the user has over information and communication.

In particular, all "value-added" computer-supported communication technologies (electronic mailing lists such as PRNCYB-L, newsgroups and bulletin boards, various forms of groupware, electronic publishing tools such as FTP or WWW) fall under this heading. They make it possible to exchange information in a very fast, simple and reliable way, so that it is automatically stored and ready for immediate further processing or transfer. The practical implication is that communication channels between far-away locations become so flexible and direct that they remind us of nerves, connecting and controlling different parts of an organism. A group of cooperators can thus behave more like a single system, with a vastly increased knowledge and intelligence, than like a collection of scattered individuals who now and then exchange limited messages that need a lot of time to reach their destination and be processed.

In addition to communication, there is the aspect of increased control over information. This is especially obvious in computing tools that offer some kind of additional intelligence to the user: 1) everything deriving from artificial intelligence and its daughter fields, such as expert systems, machine learning, and neural networks, where certain cognitive processes are automated and thus taken over from the user; 2) the different tools that offer better ways to organize and represent information or knowledge, i.e. that support the user in building useful models. This category includes all types of computer simulation (e.g. virtual reality), knowledge representation tools, hypertext and multimedia, databases and information retrieval. The two features of computer intelligence and modelling merge in what may be called "knowledge structuring": the use of computer programs that reorganize models in order to make them more adequate (more correct, simple, rich, easy-to-use, ...). (See a short paper by me, suggesting a possible way to introduce knowledge structuring in hypertexts.)

The merging of the twin cybernetic dimensions of communication and control leads us to envision an all-encompassing, "intelligent" communication network, cyberspace, which may form the substrate for an emerging world-wide super-brain.

See also: Cybermedia

Cyberspace

Author: Heylighen,

Updated: Oct 17, 1994

Filename: CYBSPACE.html

"Cyberspace is the `place` where a telephone conversation appears to occur. Not inside your actual phone, the plastic device on your desk. Not inside the other person's phone, in some other city. _The_place_between_ the phones. The indefinite place _out_there_, where the two of you, human beings, actually meet and communicate." Bruce Sterling [The Hacker Crackdown]

The word "cyberspace" was coined by the science fiction author William Gibson, when he sought a name to describe his vision of a global computer network, linking all people, machines and sources of information in the world, and through which one could move or "navigate" as through a virtual space.

The word "cyber", apparently referring to the science of cybernetics, was well-chosen for this purpose, as it derives from the Greek verb "Kubernao", which means "to steer" and which is the root of our present word "to govern". It connotes both the idea of navigation through a space of electronic data, and of control which is achieved by manipulating those data. For example, in one of his novels Gibson describes how someone, by entering cyberspace, could steer computer-controlled helicopters to a different target. Gibson's cyberspace is thus not a space of passive data, such as a library: its communication channels connect to the real world, and allow cyberspace navigators to interact with that world. The reference to cybernetics is important in a third respect: cybernetics defines itself as a science of information and communication, and cyberspace's substrate is precisely the joint network of all existing communication channels and information stores connecting people and machines.

The word "space", on the other hand, connotes several aspects. First, a space has a virtually infinite extension, including so many things that they can never be grasped all at once. This is a good description of the already existing collections of electronic data, on e.g. the Internet. Second, space connotes the idea of free movement, of being able to visit a variety of states or places. Third, a space has some kind of a geometry, implying concepts such as distance, direction and dimension.

The most direct implementation of the latter idea is the technology of virtual reality, where a continuous three-dimensional space is generated by computer, which reacts to the user's movements and manipulations like a real physical space would. In a more metaphorical way, the geometry (or at least topology) of space can be found in the network of links and references characterizing a hypertext (which can be seen as the most general form for a collection of interlinked data). Nodes in a hypertext can be close or distant, depending on the number of links one must traverse in order to get from the one to the other. Moreover, the set of links in a given node defines a number of directions in which one can move. However, a hypertext does not seem to have any determinate number of dimensions (except perhaps infinity), it is not continuous but "chunky", and, because links are one-way, the distance from one node to another is in general different from the distance back.
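The topological notion of distance sketched above can be made concrete by treating a hypertext as a directed graph and counting links with a breadth-first search. The mini-web below is invented for illustration; it also shows the asymmetry of hypertext distance, since links are one-way.

```python
# A hypertext as a directed graph: nodes are pages, links are one-way.
# Breadth-first search gives the "distance" between nodes as the
# number of links one must traverse.
from collections import deque

links = {              # hypothetical mini-web, for illustration only
    "home":   ["intro", "map"],
    "intro":  ["theory"],
    "theory": ["home"],
    "map":    [],
}

def distance(start, goal):
    """Fewest links to follow from start to goal (None if unreachable)."""
    queue, seen = deque([(start, 0)]), {start}
    while queue:
        node, d = queue.popleft()
        if node == goal:
            return d
        for nxt in links.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, d + 1))
    return None

print(distance("home", "theory"))   # 2: home -> intro -> theory
print(distance("theory", "home"))   # 1: a direct link back
print(distance("map", "intro"))     # None: "map" has no outgoing links
```

Note how the distance from "home" to "theory" differs from the distance back, and how a node with no outgoing links is a dead end even though it can be reached: exactly the "chunky", asymmetric geometry described in the text.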

One of the challenges for the researchers who are trying to make present computer networks look more like a Gibsonian cyberspace is to integrate the intuitive geometry of 3-D virtual reality with the more general, but cognitively confusing, infinite dimensionality of hypertext nets (see e.g. NCSA's project on navigation through information space). A first step in that direction is provided by the extensions to the World-Wide Web which allow the user to do hypermedia navigation in a two-dimensional image (e.g. a map of Internet Resources), by associating clicks in different areas of the image with different hyperlinks. More ambitious proposals to develop a Virtual Reality interface to the World-Wide Web are being discussed.

As a description for what presently exists, the word "cyberspace" is used in a variety of significations, which each emphasize one or more of the meanings sketched above. Some use it as a synonym for virtual reality, others as a synonym for the World-Wide Web hypermedia network, or for the Internet as a whole (sometimes including the telephone, TV, and other communication networks).

None of these uses yet seems to incorporate the most intrinsically cybernetic aspect of the concept: that of a shared medium through which one can exert control over one's environment. Control can apply both to objects in cyberspace (e.g. when you alter the information in a database through a Web form interface), and to objects in the real world (telepresence or teleoperation). As a first example of the control possibilities offered by the World-Wide Web, it is already possible to steer a remotely operated robot arm to do excavations. I would venture that it is that last dimension which will turn out to be the most important one in the future, as it may form the substrate for a cybernetic "superbeing" or "metabeing"...

See also:

Cybernetic Theory and Cybernetic Practice

Author: C. Joslyn,

Updated: Jan 1992

Filename: CYBTHPRA.html

It is therefore not surprising that the use of this same technology is the bedrock of the practice of cyberneticians, and further holds the promise of resolving some of these conflicts between the objects and nature of cybernetic theory and the nature of academic work. In particular, it is now possible to develop representational media which share the characteristics of the systems being studied:

Complexity: The miniaturization and speed of computer components allow the representation of models and systems of great complexity, with many interacting elements at a variety of scales.

Complementarity: Not only automated indexing and look-up mechanisms, but especially the recent developments in hypertext and hypermedia, have allowed representations of complex systems which can have multiple orderings, and thus a nonlinear structure.

Mutuality: There is a great deal of current research in parallel processes and cooperative work amongst researchers. Such systems allow real-time, simultaneous interaction among many agents (either programs or people). The nonlinear structure of hypermedia allows for the representation of the work of all cooperating agents.

Evolvability: A hallmark of electronic representations is their plasticity. Dynamic memories (such as electronic RAMs) are designed for minimal time to change their state, while even more static memories (such as tape drives) are easily modified. Furthermore, the multiple orderings available through hypermedia allow information to be located and changed easily. The result is systems which can readily be modified to reflect changing conditions or the desires of their creators.

Constructivity: Again, partly thanks to these nonlinear representations, it is quite feasible to maintain dynamically changing representations which record and preserve the history of their development. Edits, updates, and general change and growth can be represented directly, and revealed or concealed as desired.

Reflexivity: Another hallmark of computer technology is that it is fundamentally reflexive. The ability to treat a given piece of information either as an object for manipulation or as representing something is the essence of the program/data distinction which allows for programmable machines. Some computer systems (e.g. Lisp, Smalltalk, and Refal) make this reflexivity explicit, representing programs as data, or a data type as a data object, yielding programming environments which are extensible. Furthermore, the mathematical bases of computational theory, Turing machines and recursive functions, are also inherently reflexive. Recursiveness in formal systems is used to represent feedback in cybernetic systems.
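The program/data distinction described above can be illustrated in Python as well as in Lisp: a snippet of source code is parsed into a syntax tree, manipulated as ordinary data, and then executed again as a program. The snippet and the variable renaming below are invented purely for illustration.

```python
# Reflexivity sketch: code treated first as data, then as program.
import ast

source = "result = 2 + 3"
tree = ast.parse(source)          # the program, now a data object

# Walk the tree *as data*: rename the assignment target.
for node in ast.walk(tree):
    if isinstance(node, ast.Name):
        node.id = "answer"

namespace = {}
exec(compile(tree, "<reflexive>", "exec"), namespace)  # data back to program
print(namespace["answer"])        # 5
```

The same duality, a structure that is at one moment inspected and modified and at the next moment executed, is what the text calls the essence of programmable machines.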

Cybernetics and Systems Science and Academic Work

Author: C. Joslyn,

Updated: Aug 1993

Filename: CYBSWORK.html

Traditional analytic methods tend to focus on individual, simple subsystems in isolation, while only occasionally (and frequently inaccurately) extrapolating to group traits. Temporal and physical levels of analysis are abstracted and isolated, and disciplinary divisions cut off consideration of their interaction.

This inadequacy is reflected in the actual products of academic and scientific work, the books, papers, and lectures which are the stock-in-trade of academic workers. Such works (like all traditional publications) have a linear structure, ranging from long treatises to collections of short paragraphs or sections (e.g. the work of Aristotle \cite{AR43} or Wittgenstein \cite{WIL58}). Various indexing and other methods are available to gain "random access" within documents. Dictionaries, encyclopedias, and other reference works partially introduce nonlinear structures through internal references (e.g. \cite{EDP67,KRK84,FLA79}). Some authors have made halting efforts in the direction of nonlinear documents \cite{MIM86}; others have used pictures and graphical notation to aid in understanding \cite{VOH81,ABRSHC85,VAF75,HAD88}. And certainly the use of formal systems (mathematical and logical notations) has given the ability to construct large, complex linguistic systems.

Nevertheless, over the years the fundamental linear textual form has been maintained. Works are produced by single authors or at most small groups of authors; collaborative work among more than two people remains next to impossible. Work proceeds almost entirely in natural language, and the development of large, complex systems of philosophical thought in non-formal domains has been difficult. Once published, works sit on library shelves in mute inactivity. They are not even open to revision except through further publications and errata. The connections among and within works are revealed only through laborious reference searches and synthetic works by diligent authors. Tracing the historical development of ideas is as laborious as tracing bibliographical relations. The physical form of texts requires that the products of one author, or the writings on one subject, be physically scattered throughout a vast published literature, leading to a cacophonous din of argument and discourse.

The disciplinary divisions of academic work also impose a regimented, linear, and highly specific structure on the categorization of published books and papers. Cybernetics and systems science researchers, on the other hand, typically draw on a great deal of the library shelves, including mathematics, all the traditional sciences, psychology and sociology, philosophy, linguistics, etc. In fact, ultimately there can be little doubt that cybernetics and systems science is not an "academic discipline" at all in the traditional sense of the word. As the trans- (inter-, meta-, anti-) disciplinary study of general systems and information systems, cybernetics and systems science has long fought against the traditional disciplinary divisions of intellectual specialization.

This critique can be extended to the ultimate reflexivity of cybernetics and systems science, in which the academic milieu in which they operate is regarded as another cybernetic system, and therefore an object of study which itself should be understood through cybernetic principles. (Similarly, Turchin \cite{TUV77} describes the ultimate end of science as the reflexive study of the scientific process.)

Relation to other disciplines

Author: F. Heylighen,

Updated: Nov 12, 1996

Filename: CYBSREL.html

Unfortunately, few practitioners in these recent disciplines seem to be aware that many of their concepts and methods were proposed or used by cyberneticians many years earlier. Subjects like complexity, self-organization, connectionism and adaptive systems were already extensively studied in the 1940's and 1950's by researchers like Wiener, Ashby, von Neumann and von Foerster, and in discussion forums like the famous Josiah Macy meetings on cybernetics [Heims, 1991]. Some recent popularizing books on "the sciences of complexity" (e.g. Waldrop, 1992) seem to ignore this fact, creating the false impression that work on complex adaptive systems only started in earnest with the creation of the Santa Fe Institute in the 1980's.

Reference: S. Heims. The Cybernetics Group. MIT Press, Cambridge MA, 1991.

Complex Adaptive Systems

Author: F. Heylighen,

Updated: Nov 12, 1996

Filename: CAS.html

Two popular science books, one by the science writer Mitchell Waldrop and one by the Nobel laureate and co-founder of the Santa Fe Institute Murray Gell-Mann, offer good reviews of the main ideas underlying the CAS approach. Another Santa Fe collaborator, the systems analyst John Casti, has written several popular science books, discussing different issues in the modelling of complex systems, while integrating insights from the CAS approach with the two older traditions.

John Holland is the founder of the domain of genetic algorithms. These are parallel, computational representations of the processes of variation, recombination and selection on the basis of fitness that underlie most processes of evolution and adaptation (Holland, 1992). They have been successfully applied to general problem solving, control and optimization tasks, inductive learning (classifier systems, Holland et al., 1986), and the modelling of ecological systems (the ECHO model, Holland, 1996). The biologist Stuart Kauffman has tried to understand how networks of mutually activating or inhibiting genes can give rise to the differentiation of organs and tissues during embryological development. This led him to investigate the properties of Boolean networks of different sizes and degrees of connectedness. Through a reasoning reminiscent of Ashby, he proposes that the self-organization exhibited by such networks of genes or chemical reactions is an essential factor in evolution, complementary to Darwinian selection by the environment.
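The genetic-algorithm cycle of variation, recombination and selection that Holland describes can be sketched in a few lines. The toy program below evolves bit strings toward a trivial fitness function (counting 1-bits); the representation, operators and parameter values are illustrative choices, not Holland's own.

```python
# Minimal genetic-algorithm sketch: selection, single-point crossover
# and bit-flip mutation on a population of bit strings. All parameter
# values are illustrative.
import random

random.seed(0)
LENGTH, POP, GENS = 20, 30, 60

def fitness(genotype):
    return sum(genotype)              # toy "ones-counting" fitness

def crossover(a, b):
    cut = random.randrange(1, LENGTH) # single-point recombination
    return a[:cut] + b[cut:]

def mutate(genotype, rate=0.01):
    return [bit ^ (random.random() < rate) for bit in genotype]

def select(pop):
    # Tournament selection: the fitter of two random individuals.
    return max(random.sample(pop, 2), key=fitness)

population = [[random.randint(0, 1) for _ in range(LENGTH)]
              for _ in range(POP)]
for _ in range(GENS):
    population = [mutate(crossover(select(population), select(population)))
                  for _ in range(POP)]

best = max(population, key=fitness)
print(fitness(best))
```

Despite its simplicity, the sketch contains the essential loop: fitness-biased reproduction plus blind variation drives the population toward ever-fitter genotypes without any explicit plan.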

Holland's and Kauffman's work, together with Dawkins' simulations of evolution and Varela's models of autopoietic systems, provide essential inspiration for the new discipline of artificial life. This approach, initiated by Chris Langton (1989, 1992), tries to develop technological systems (computer programs and autonomous robots) that exhibit lifelike properties, such as reproduction, sexuality, swarming, and co-evolution. Tom Ray's Tierra program provides perhaps the best example of a complex, evolving ecosystem, with different species of "predators", "parasites" and "prey", that exists only in a computer.

Backed by Kauffman's work on co-evolution, Wolfram's cellular automata studies, and Bak's investigations of self-organized criticality, Langton (1990) has proposed the general thesis that complex systems emerge and maintain themselves on the edge of chaos, the narrow domain between frozen constancy and chaotic turbulence. The "edge of chaos" idea is another step towards an elusive general definition of complexity. Another widely cited attempt at a definition in computational terms was proposed by Charles Bennett.

Another investigation which has strongly influenced the artificial life community is Robert Axelrod's game theoretic simulation of the evolution of cooperation. By letting different strategies compete in a repeated Prisoner's Dilemma game, Axelrod (1984) showed that mutually cooperating, "tit-for-tat"-like strategies tend to dominate purely selfish ones in the long run. This transition from biological evolution to social exchanges naturally leads into the modelling of economic processes (Anderson, Arrow & Pines, 1988). W. Brian Arthur has systematically investigated self-reinforcing processes in the economy, where the traditional law of decreasing returns is replaced by a law of increasing returns, leading to the path-dependence and lock-in of contingent developments. More recently (1994), he has simulated the seemingly chaotic behavior of stock exchange-like systems by programming agents that continuously try to guess the future behavior of the system to which they belong, and use these predictions as the basis for their actions. The conclusion is that the different predictive strategies cancel each other out, so that the long term behavior of the system becomes intrinsically unpredictable. This result leads back to von Foerster's second-order cybernetics, according to which models of social systems change the very systems they intend to model.
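The logic of Axelrod's repeated Prisoner's Dilemma is easy to sketch. The payoff values below are the standard ones used in Axelrod (1984); the two strategies shown, tit-for-tat and unconditional defection, illustrate why reciprocation scores well over repeated play.

```python
# Iterated Prisoner's Dilemma sketch with Axelrod's standard payoffs.

PAYOFF = {            # (my move, opponent's move) -> my score
    ("C", "C"): 3, ("C", "D"): 0,
    ("D", "C"): 5, ("D", "D"): 1,
}

def tit_for_tat(history):
    # Cooperate first, then mirror the opponent's previous move.
    return "C" if not history else history[-1]

def always_defect(history):
    return "D"

def play(strat_a, strat_b, rounds=200):
    """Return the total scores of the two strategies over repeated play."""
    hist_a, hist_b = [], []   # each side's record of the *other's* moves
    score_a = score_b = 0
    for _ in range(rounds):
        move_a, move_b = strat_a(hist_a), strat_b(hist_b)
        score_a += PAYOFF[(move_a, move_b)]
        score_b += PAYOFF[(move_b, move_a)]
        hist_a.append(move_b)
        hist_b.append(move_a)
    return score_a, score_b

print(play(tit_for_tat, tit_for_tat))    # (600, 600): sustained cooperation
print(play(tit_for_tat, always_defect))  # (199, 204): exploited only once
```

Two tit-for-tat players sustain mutual cooperation (3 points each per round), while against a pure defector tit-for-tat loses only the first round and then holds the defector to the low mutual-defection payoff, which is the mechanism behind Axelrod's result.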

Bibliography: see the "classic publications on complex, evolving systems".

See also: Web servers on complexity and self-organization

Self-organization and complexity in the natural sciences

Author: F. Heylighen,

Updated: Nov 12, 1996

Filename: COMPNATS.html

The physicist Hermann Haken (1978) has suggested the label synergetics for the field that studies the collective patterns emerging from many interacting components, as they are found in chemical reactions, crystal formations or lasers. Another Nobel laureate, Manfred Eigen (1992), has focused on the origin of life, the domain where chemical self-organization and biological evolution meet. He has introduced the concepts of the hypercycle, an autocatalytic cycle of chemical reactions containing other cycles, and of the quasispecies, the fuzzy distribution of genotypes characterizing a population of quickly mutating organisms or molecules (1979).

The modelling of non-linear systems in physics has led to the concept of chaos, a deterministic process characterized by extreme sensitivity to its initial conditions (Crutchfield, Farmer, Packard & Shaw, 1986). Although chaotic dynamics is not strictly a form of evolution, it is an important aspect of the behavior of complex systems. The science journalist James Gleick has written a popular history of, and introduction to, the field. Cellular automata, mathematical models of distributed dynamical processes characterized by a discrete space and time, have been widely used in computer simulations to study phenomena such as chaos, attractors and the analogy between dynamics and computation. Stephen Wolfram has made a fundamental classification of their types of behavior. Catastrophe theory proposes a mathematical classification of the critical behavior of continuous mappings. It was developed by René Thom (1975) in order to model the (continuous) development of (discontinuous) forms in organisms, thus extending the much older work by the biologist D'Arcy Thompson (1917).
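
The "extreme sensitivity to initial conditions" that defines deterministic chaos can be demonstrated with the logistic map, a standard one-line example not taken from the text above; the parameter r = 4 puts the map in its fully chaotic regime. Two trajectories whose starting points differ by one part in a million soon become completely uncorrelated:

```python
def logistic_orbit(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r*x*(1-x) and return the full trajectory."""
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_orbit(0.400000)
b = logistic_orbit(0.400001)  # initial conditions differ by one millionth

# The trajectories track each other at first, then diverge completely.
for t in (0, 10, 20, 30):
    print(t, round(abs(a[t] - b[t]), 6))
```

The tiny initial difference is stretched by roughly a factor of two per step, so after a few dozen iterations the two orbits are as far apart as any two random points in the interval — the system is deterministic yet practically unpredictable.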

Another French mathematician, Benoit Mandelbrot (1983), has founded the field of fractal geometry, which models the recurrence of similar patterns at different scales that characterizes most natural systems. Such self-similar structures exhibit power laws, like the famous Zipf's law governing the frequency of words. By studying processes such as avalanches and earthquakes, Per Bak (1988, 1991) has shown that many complex systems will spontaneously evolve to the critical edge between order (stability) and chaos, where the size of disturbances obeys a power law, large disturbances being less frequent than small ones. This phenomenon, which he called self-organized criticality, may also provide an explanation for the punctuated equilibrium dynamics seen in biological evolution.

Bibliography: see the "classic publications on complex, evolving systems".

See also: Web servers on complexity and self-organization

History of Cybernetics and Systems Science

Author: J. de Rosnay

Updated: Nov 6, 1996

Filename: CYBSHIST.html

In illustrating a new current of thought, it is often useful to follow a thread. Our thread will be the Massachusetts Institute of Technology (MIT). In three steps, each of about ten years, MIT was to go from the birth of cybernetics to the most critical issue, the debate on limits to growth. Each of these advances was marked by many travels back and forth--typical of the systemic approach--between machine, man, and society. In the course of this circulation of ideas there occurred transfers of method and terminology that later fertilized unexplored territory.

In the forties the first step forward led from the machine to the living organism, transferring from one to the other the ideas of feedback and finality and opening the way for automation and computers. In the fifties came the return from the living organism to the machine, with the emergence of the important concepts of memory and pattern recognition, of adaptive phenomena and learning, and new advances in bionics (the attempt to build electronic machines that imitate the functions of certain organs of living beings): artificial intelligence and industrial robots. There was also a return from the machine to the living organism, which accelerated progress in neurology, perception, and the mechanisms of vision. In the sixties MIT saw the extension of cybernetics and system theory to industry, society, and ecology.

Three men can be regarded as the pioneers of these great breakthroughs: the mathematician Norbert Wiener, who died in 1964; the neurophysiologist Warren McCulloch, who died in 1969; and Jay Forrester, professor at the Sloan School of Management at MIT. There are of course other men, other research teams, other universities--in the United States as well as in the rest of the world--that have contributed to the advance of cybernetics and system theory. I will mention them whenever their course of research blends with that of the MIT teams.

In 1940 Wiener worked with a young engineer, Julian H. Bigelow, to develop automatic range finders for antiaircraft guns. Such servomechanisms are able to predict the trajectory of an airplane by taking into account the elements of past trajectories. During the course of their work Wiener and Bigelow were struck by two astonishing facts: the seemingly "intelligent" behavior of these machines and the "diseases" that could affect them. The behavior appeared "intelligent" because the machines dealt with "experience" (the recording of past events) and predictions of the future. There was also a strange defect in performance: if one tried to reduce the friction, the system entered into a series of uncontrollable oscillations.

Impressed by this disease of the machine, Wiener asked the physiologist Arturo Rosenblueth whether such behavior was found in man. The response was affirmative: in the event of certain injuries to the cerebellum, the patient cannot lift a glass of water to his mouth; the movements are amplified until the contents of the glass spill on the ground. From this Wiener inferred that in order to control a finalized action (an action with a purpose) the circulation of information needed for control must form "a closed loop allowing the evaluation of the effects of one's actions and the adaptation of future conduct based on past performances." This is typical of the guidance system of the antiaircraft gun, and it is equally characteristic of the nervous system when it orders the muscles to make a movement whose effects are then detected by the senses and fed back to the brain.

Thus Wiener and Bigelow discovered the closed loop of information necessary to correct any action--the negative feedback loop--and they generalised this discovery in terms of the human organism.
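
The oscillation Wiener and Bigelow observed when friction was reduced is generic to feedback loops of this kind. The sketch below is my own minimal illustration, not their servomechanism: a unit mass is steered toward a target by proportional negative feedback, with velocity damping playing the role of "friction". With damping present the error is corrected smoothly; with the damping removed, the same loop overshoots and oscillates indefinitely:

```python
def track(gain=1.0, friction=0.8, steps=150, dt=0.1):
    """Steer a unit mass from 0 toward a target at 1.0 using
    proportional negative feedback; `friction` damps the velocity."""
    pos, vel = 0.0, 0.0
    trace = []
    for _ in range(steps):
        error = 1.0 - pos                     # the fed-back "information"
        force = gain * error - friction * vel
        vel += force * dt                     # semi-implicit Euler step
        pos += vel * dt
        trace.append(pos)
    return trace

damped = track(friction=0.8)    # enough "friction": smooth approach to 1.0
undamped = track(friction=0.0)  # friction removed: sustained oscillation

print(round(max(damped), 2), round(max(undamped), 2))
```

The feedback signal is the same in both runs; only the dissipation differs, which is why reducing friction in the range finder (or damage to the damping circuits of the cerebellum) turns a corrective loop into an amplifier of its own errors.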

During this period the multidisciplinary teams of Rosenblueth were being formed and organized. Their purpose was to approach the study of living organisms from the viewpoint of a servomechanisms engineer and, conversely, to consider servomechanisms with the experience of the physiologist. An early seminar at the Institute for Advanced Study at Princeton in 1942 brought together mathematicians, physiologists, and mechanical and electrical engineers. In light of its success, a series of ten seminars was arranged by the Josiah Macy Foundation. One man working with Rosenblueth in getting these seminars under way was the neurophysiologist Warren McCulloch, who was to play a considerable role in the new field of cybernetics. In 1948 two basic publications marked an epoch already fertile with new ideas: Norbert Wiener's Cybernetics, or Control and Communication in the Animal and the Machine, and The Mathematical Theory of Communication by Claude Shannon and Warren Weaver. The latter work founded information theory.

The ideas of Wiener, Bigelow, and Rosenblueth spread like wildfire. Other groups were formed in the United States and around the world, notably the Society for General Systems Research, whose publications deal with disciplines far removed from engineering, such as sociology, political science, and psychiatry.

The seminars of the Josiah Macy Foundation continued, opening to new disciplines: anthropology with Margaret Mead, economics with Oskar Morgenstern. Mead urged Wiener to extend his ideas to society as a whole. Above all, the period was marked by the profound influence of Warren McCulloch, director of the Neuropsychiatric Institute at the University of Illinois.

At the conclusion of the work of his group on the organization of the cortex of the brain, and especially after his discussions with Walter Pitts, a brilliant, twenty-two-year-old mathematician, McCulloch understood that a beginning of the comprehension of cerebral mechanisms (and their simulation by machines) could come about only through the cooperation of many disciplines. McCulloch himself moved from neurophysiology to mathematics, from mathematics to engineering.

Walter Pitts became one of Wiener's disciples and contributed to the exchange of ideas between Wiener and McCulloch; it was he who succeeded in convincing McCulloch to install himself at MIT in 1952 with his entire team of physiologists.

Paralleling the work of the teams of Wiener and McCulloch at MIT, another group tried to utilize cybernetics on a wider scope. This was the Society for General Systems Research, created in 1954 and led by the biologist Ludwig von Bertalanffy. Many researchers were to join him: the mathematician A. Rapoport, the biologist W. Ross Ashby, the biophysicist N. Rashevsky, the economist K. Boulding. In 1954 the General Systems Yearbooks began to appear; their influence was to be profound on all those who sought to expand the cybernetic approach to social systems and the industrial firm in particular.

During the fifties a tool was developed and perfected that would permit organized complexity to be approached from a totally new angle--the computer. The first ones were ENIAC (1946) and EDVAC or EDSAC (1947). One of the fastest was Whirlwind II, constructed at MIT in 1951. It used--for the first time--a superfast magnetic memory invented by a young electronics engineer from the servomechanisms laboratory, Jay W. Forrester.

As head of the Lincoln Laboratory, Forrester was assigned by the Air Force in 1952 to coordinate the implementation of an alert and defense system, the SAGE system, using radar and computers for the first time. Its mission was to detect and prevent possible attack on American territory by enemy rockets. Forrester realized the importance of the systemic approach in the conception and control of complex organizations involving men and machines in "real time": the machines had to be capable of making vital decisions as the information arrived.

In 1961, having become a professor at the Sloan School of Management at MIT, Forrester created Industrial Dynamics. His object was to regard all industries as cybernetic systems in order to simulate and to try to predict their behavior.

In 1964, confronted with the problems of the growth and decay of cities, he extended the industrial dynamics concept to urban systems (Urban Dynamics). Finally, in 1971, he generalized his earlier works by creating a new discipline, system dynamics, and published World Dynamics. This book was the basis of the work of Dennis H. Meadows and his team on the limits to growth. Financed by the Club of Rome, these works were to have worldwide impact under the name of the MIT Report.

See also: the origin of cybernetics and the biographies of the most important cybernetic thinkers at the cybernetics page of the ASC

Cybernetics and Systems Thinkers

Author: F. Heylighen,

Updated: Jan 14, 1998

Filename: CSTHINK.html

This list was provided as a special service to our readers, since we noticed that the names of these people were among the most common strings entered in our search engine. Therefore, the list is directly searchable through the PCP title search. The [Search PCP] link after each name will find all references to the name in other Principia Cybernetica Web pages, while [find books] will give you a list of books by or on the author, available through the Amazon web bookshop.

"The Macroscope", a book on the systems approach

Author: F. Heylighen,

Updated: Feb 26, 1997

Filename: MACRBOOK.html

Dr. Joël de Rosnay, a molecular biologist, systems theorist, science writer, and futurologist, is presently Director of Strategy of the Cité des Sciences et de l'Industrie at La Villette (near Paris). He is an associate of the Principia Cybernetica Project.

This book is an excellent, easy to read introduction to cybernetics and systems thinking, with applications to living organisms, the economy and the world as a whole. The main theme is that the complex systems which govern our life should be looked at as a whole, rather than be taken apart into their constituents. The different systems, processes and mechanisms are beautifully illustrated with examples and pictures. Although the text is over 20 years old, this visionary document is still highly relevant to our present situation and state of knowledge. It is particularly recommended to people who wish to get an understanding of the basic concepts and applications of systems theory and cybernetics. The chapters below can be read independently of each other.

TABLE OF CONTENTS

Introduction: The Macroscope

One. Through the Macroscope 1. Ecology (The Economics of Nature: Production, Consumption, and Decomposition; Regulation and Maintenance of Equilibriums) 2. The Economy (A Short History of the Economy; The Economic Machine; Recession and Inflation) 3. The City 4. Business and Industry 5. The Living Organism 6. The Cell (Linking the Cell and the Body; The Work of the Enzymes)

Two. The Systemic Revolution: A New Culture 1. History of a Global Approach (The Systemic Approach; The Search for New Tools; "Intelligent" Machines; From Cybernetics to System Dynamics) 2. What Is a System? (Open Systems and Complexity; Structural and Functional Aspects of Systems) 3. System Dynamics: The Internal Causes (Positive and Negative Feedback; Flows and Reservoirs) 4. Applications of the Systemic Approach (Analysis and Synthesis, Models and Simulation; The Dynamics of Maintenance and Change, The "Ten Commandments" of the Systemic Approach; Avoiding the Dangers of the Systemic Approach)

Three. Energy and Survival 1. The Domestication of Energy 2. The Great Laws of Energy (Entropy and the Science of Heat; Energy and Power) 3. Metabolism and Waste in the Social Organism 4. Economics and Ecology (Universal Currency: The Kilocalorie; Energy Analysis; Energy Analysis and Food Production; The Competition Between Energy and Work) 5. Birth of the Bioindustry (New Jobs for Microbes; The Domestication of Enzymes; Controlling Fermentation and Photosynthesis; Ecoengineering)

Four. Information and the Interactive Society 1. Supports of Communication (Measuring Information; Information and Entropy; The History of Communications; Descending and Ascending Information) 2. The New Interactive Networks (Communications Hardware; Services in Real Time; Social Impact of Services in Real Time) 3. Social Feedback (The Imbalance in Communications; The Media and Electronic Participation; Problems of Representation; Advantages and Dangers of Society in Real Time)

Five. Time and Evolution 1. Knowledge of Time (Time in the Evolution of Thought; Time in Contemporary Theories) 2. The Prison of Time (The Link Between Chronology and Causality; Irreducible Points of View; The Causal Explanation: Divergence; The Final Explanation: Convergence; Complementarity: A Third Route) 3. Evolution: Genesis of the Improbable (The Genesis of Form; Exclusion and Divergence; Equilibrium and Zero Growth; The Conquest of Time)

Six. Values and Education 1. Birth of a Global Vision 2. The Emergence of New Values (Criticism of Authority; Criticism of Work; Criticism of Reason; Criticism of Human Relationships; Criticism of the Plan for Society) 3. Systemic Education (The Illusions of Educational Technology; The Basis of Systemic Education; The Principles of Systemic Education; The Methods of Systemic Education; Possible Structures of Parallel Education)

Seven. Scenario for a World




Basic Books on Cybernetics and Systems Science

Author: C. Joslyn,

Updated: Jul 10, 1996

Filename: CSBOOKS.html

Other, specific bibliographic references to books and a selected number of papers can be found in the library database of the Department of Medical Cybernetics and AI at the University of Vienna. A number of more recent books and papers can be found in our bibliography on complex, evolving systems.

The books with links below can be securely ordered and paid for over the web from Amazon.com, the largest bookstore on the Net.

Key: ** Required

* Recommended

Excellent graphical introduction to dynamic systems theory.

Ackoff, Russell: (1972) On Purposeful Systems, Aldine Press, Chicago

Grand philosophy of human systems as teleological, goal-seeking. Structure, function, and purpose. Cognitive models and action in psychology; linguistics and semantics; conflict and cooperation; social systems.

Allen, TFH, and Starr, TB: (1982) Hierarchy: Perspective for Explaining Ecological Complexity, U. Chicago, Chicago

Anderson, PW, and Arrow, KJ, et al.: eds. (1988) Economy as an Evolving, Complex System, Addison-Wesley, New York

Critical anthology of system economic theory: applied mathematical techniques, dynamical theory, bounded rationality. Kauffman on "web searching"; Holland; Ruelle on nonlinear dynamics; Baum on neural nets.

Angyal, A: (1969) Logic of Systems, Penguin

Arbib, Michael A: (1972) Metaphorical Brain, Wiley, New York,

* Ashby, Ross: (1952) Design for a Brain, Wiley, New York.

A classic book, introducing fundamental systems concepts with examples related to the brain.

** (1956) Introduction to Cybernetics, Methuen, London

** (1981) Mechanisms of Intelligence: Writings of Ross Ashby, ed. Roger Conant

Atkin, RH: (1976) Mathematical Structure in Human Affairs, Heineman, London

Introduces Q-analysis, a methodology for identifying structures in data. The methodology uses some ideas of differential geometry.

Auger, Pierre: (1990) Dynamics and Thermodynamics in Hierarchically Organized Systems, to appear

Aulin, AV: (1989) Foundations of Mathematical System Dynamics, Pergamon, Oxford

Causal recursion and its application to social science and economics, fundamental dynamics, self-steering, self-regulation, origins of life and mind.

* Aulin, AY: (1982) Cybernetic Laws of Social Progress, Pergamon, Oxford

Cybernetic social theory, including the Law of Requisite Hierarchy.

Barnsley, MF: (1988) Fractals Everywhere, Academic Press, San Diego

Best text on fractal geometry.

** Bateson, Gregory: (1972) Steps to an Ecology of Mind, Ballantine, New York

Bateson's critical essays.

..... (1979) Mind and Nature, Bantam, New York

Unlike Steps to an Ecology of Mind, Mind and Nature is an attempt at a coherent, popular statement of Bateson's philosophy.

Bayraktar, BA, et al.: eds. (1979) Education in Systems Science, Taylor and Francis, London

* Beer, Stafford: (1975) Platform for Change, Wiley, London

Foundational work in management cybernetics.

Bellman, Richard: (1972) Adaptive Control Processes: A Guided Tour, Princeton U, Princeton

An excellent book covering fundamental concepts of systems science.

Beltrami, Edward: (1987) Mathematics for Dynamic Modeling, Academic Press, Orlando

Excellent mathematical introduction to dynamic systems theory, including catastrophe theory. Key results and theorems, examples. Many typos.

Blalock, HM: (1969) Systems Theory: From Verbal to Mathematical Formulation, Prentice Hall, Eng.Cliffs NJ

* Blauberg, IV, and Sadovsky, VN: (1977) Systems Theory: Philosophy and Methodological Problems, Progress, Moscow

One of the best overviews of philosophical and methodological developments in systems theory, both in the Soviet Union and in the West.

Bogdanov, A.: (1980) Essays in Tektology, Intersystems

Translation of historical foundation of systems science.

Booth, TL: (1967) Sequential Machines and Automata Theory, Wiley, New York

One of the most comprehensive books on finite state machines, both deterministic and probabilistic.

* Boulding, Ken: (1978) Ecodynamics, Sage, Beverly Hills

Unified theory of economics and social systems theory in terms of communicative processes.

..... (1985) World as a Total System, Sage, Beverley Hills

Brillouin, Leon: (1964) Scientific Uncertainty and Information, Academic Press, New York

Classic work on the relation between thermodynamics, information theory, and the necessary conditions for observability.

Brooks, DR, and Wiley, EO: (1988) Evolution as Entropy, 2nd edition, U. of Chicago, Chicago

Recent treatise on entropy as a general measure for biological study. Definitions of non-thermodynamic, non-informational entropies at multiple levels of analysis. Severely criticized.

Brown, G. Spencer: (1972) Laws of Form, Julian Press, New York

Philosophy of and notational system for propositional logic.

Basis for a whole school of graphical approaches to classical logic.

Brunner, RD, and Brewer, GD: (1971) Organized Complexity, Free Press, New York

Buckley, W: ed. (1968) Modern Systems Research for the Behavioral Scientist, Aldine, Chicago

Bunge, Mario: Method, Model, and Matter, D. Reidel

Campbell, Jeremy: (1982) Grammatical Man, Simon and Schuster, New York

Popular treatment of many aspects of cognitive science, information theory, and linguistics.

Cariani, Peter A: (1989) On the Design of Devices with Emergent Semantic Functions, SUNY-Binghamton, Binghamton NY, NOTE: PhD Dissertation

Casti, John: (1979) Connectivity, Complexity and Catastrophe in Large-Scale Systems, J. Wiley, New York

..... * (1989) Alternate Realities: Mathematical Models of Nature and Man, Wiley, New York

Modern and very comprehensive text on mathematical modeling.

* Cavallo, Roger E: (1979) Role of Systems Methodology in Social Science Research, Martinus Nijhoff, Boston

Introduces the GSPS framework and discusses how it can be utilized in social science research.

* Checkland, Peter: (1981) Systems Thinking, Systems Practice, Wiley, New York

Foundations of an area called soft systems methodology, for social systems management.

Christensen, Ronald: (1980) Entropy Minimax Sourcebook, Entropy Limited, Lincoln, MA, NOTE: Four volumes

..... (1983) Multivariate Statistical Modeling, Entropy Limited, Lincoln MA

Churchman, CW: (1968) Systems Approach, Delta, New York

General introduction to systems thinking in management.

..... (1971) Design of Inquiring Systems, Basic Books, New York

..... (1979) Systems Approach and its Enemies, Basic Books, New York,

Social systems philosophy. But also really about logic and mathematical description, excluded middles as "enemies"; relation of epistemics to action. Lucid, entertaining, critical.

Clemson, Barry: (1984) Cybernetics: A New Management Tool, Abacus Press, Kent

Guide to the theory and practice of management cybernetics. Based on Beer.

Codd, EF: (1968) Cellular Automata, Academic Press, New York,

Csanyi, V: (1982) General Theory of Evolution, Akademia Kiado, Budapest

On universal evolution. Ambitious, non-technical discussion.

Davies, Paul: (1988) Cosmic Blueprint, Simon and Schuster, New York

Excellent popular survey of complex systems theory.

De Chardin, Teilhard: (1959) The Phenomenon of Man, Harper and Row, New York

Early systemic evolutionary theology.

Denbigh, Kenneth G: (1975) An Inventive Universe, Hutchinson, London

On emergence and thermodynamics.

Denbigh, Kenneth G, and Denbigh, JS: (1985) Entropy in Relation to Incomplete Knowledge, Cambridge U., Cambridge

Good survey of quantum statistical dynamics, objectivity and subjectivity, basis of the fundamental assumption of thermodynamics, resolution of the Gibbs paradox, relation to information theory.

Distefano, JJ, et al.: (1967) Feedback and Control Systems, Schaum, New York

Dretske, Fred: (1982) Knowledge and the Flow of Information, MIT Press, Cambridge

Treatise on information theory, syntax, and semantics.

Edelman, G: (1987) Neural Darwinism, Basic Books, New York

Theory of selectional processes at the neural level.

Eigen, M, and Schuster, P: (1979) The Hypercycle, Springer-Verlag, Heidelberg

Now classic work on the autocatalysis in chemical cycles: the cybernetic basis of metabolism.

Eigen, M, and Winkler-Oswatitsch, R: (1996) Steps Towards Life: A Perspective on Evolution

Erickson, Gary J: ed. (1988) Maximum-Entropy and Bayesian Methods in Science and Engineering, v. 1,2, Kluwer

Proceedings of the 5th, 6th, and 7th MaxEnt workshops. Foundations and applications. Spectral analysis, inductive reasoning, uncertainty and measurement, information theory in biology, etc.

Farlow, SJ: (1984) Self-Organizing Methods in Modeling, Marcel Dekker, New York

Feistel, Rainer, and Ebeling, Werner: (1988) Evolution of Complex Systems, Kluwer, New York

Oscillation and chaos in mechanical, electrical, chemical, and biological systems. Thermodynamics and spatial structures. Sequences, information, and language. Self-reproducing systems, Lotka-Volterra systems.

Forrester, JW: (1961) Industrial Dynamics, MIT Press, Cambridge

..... (1971) World Dynamics, Wright and Allen, Cambridge

Influential early attempt at modeling the "world problem": the global economic-ecological web. Basis for the Club of Rome's Limits to Growth.

..... * ed. (1975) Collected Papers of Jay W. Forrester, Wright-Allen, Cambridge

Papers by the author of the "DYNAMO" differential systems tool, used for global ecological modeling.

Garey, MR, and Johnson, DS: (1979) Computers and Intractability: Guide to NP-Completeness, WH Freeman, San Francisco

One of the best monographs on computational complexity, NP-completeness and hardness, etc.

Gatlin, L: (1972) Information Theory and the Living System, Columbia U., New York

Classic work on the use of information theory in the analysis of genetic structure, evolution, and general biology.

* Gleick, James: (1987) Chaos: Making of a New Science, Viking, New York

Solid popular introduction to chaotic dynamics and fractal theory.

Glushkov, VM: (1966) Introduction to Cybernetics, Academic Press, New York

Excellent book on cybernetics, translated from Russian.

Greniewski, H: Cybernetics Without Mathematics, Pergamon, Oxford

Gukhman, AA: (1965) Introduction to the Theory of Similarity, Academic Press, New York

One of the excellent books on the theory of similarity.

* Haken, Hermann: (1978) Synergetics, Springer-Verlag, Heidelberg

Original work by this unique developer of a "competitor" to systems science as the study of natural complex systems.

..... (1988) Information and Self-Organization, Springer-Verlag, New York

On synergetics as the science of complex systems. Integrates information theory, bifurcation theory, maximum entropy theory, and semantics.

Hall, AD: (1989) Metasystems Methodology, Pergamon, Oxford

Halme, A, et al.: eds. (1979) Topics in Systems Theory, Acta Polytechnica, Scandinavia

Hammer, PC: ed. (1969) Advances in Mathematical Systems Theory, Penn St. U, U. Park, PA

Hanken, AFG, and Reuver, HA: (1981) Social Systems and Learning Systems, Martinus Nijhoff, Boston

Happ, HH: ed. (1973) Gabriel Kron and Systems Theory, Union College Press, Schenectady NY

Hartnett, WE: ed. (1977) Systems: Approaches, Theories, Applications, Reidel, Boston

Herman, GT, and Rozenberg, G: (1975) Developmental Systems and Languages, North-Holland, New York

* Holland, John: (1975) Adaptation in Natural and Artificial Systems, U. Michigan, Ann Arbor

On the genetic algorithms method of modeling adaptive systems.

..... Hidden Order: How Adaptation Builds Complexity

..... Induction: Processes of Inference, Learning and Discovery

Kanerva, Pentti: (1988) Sparse Distributed Memory, MIT Press, Cambridge

On the geometry of high dimensional, low cardinality spaces; application to associative memory.

Klir, George: (1969) An Approach to General Systems Theory, van Nostrand, New York

An early book that describes the nucleus of what is known now as the General Systems Problem Solver.

..... ed. (1972) Trends in General Systems Theory, Wiley, New York

Contains overviews of systems conceptual frameworks of Mesarovic, Wymore, and Klir; and other papers on some fundamental issues of systems science.

..... ed. (1981) Special Issue on Reconstructibility Analysis, in: Int. J. Gen. Sys., v. 7:1, pp. 1-107

Broekstra, Cavallo, Conant, Klir, Krippendorff

..... (1985) Architecture of Systems Problem Solving, Plenum, New York

Vast, general theory of epistemological systems, outline of a platform for general systems modeling and inductive inference.

..... **(1992) Facets of Systems Science, Plenum, New York

Reprints of most classical papers in systems science with an up-to-date introduction. Recommended for everyone as a general introduction to the domain.

Klir, George, and Folger, Tina: (1987) Fuzzy Sets, Uncertainty, and Information, Prentice Hall

Primary text on fuzzy systems theory and extended information theory.

Koestler, Arthur, and Smythes, J.R.: eds. (1968) Beyond Reductionism, Hutchinson, London

Classical anthology on holism and reductionism.

Krinsky, VI: ed. (1984) Self-Organization: Autowaves and Structures Far From Equilibrium, Springer-Verlag, New York

Langton, Chris: ed. (1988) Artificial Life, Addison-Wesley

Proceedings from first artificial life conference. Pattee, Goel, Hufford, Klir.

Lerner, D: (1963) Parts and Wholes, Free Press, New York

* Lilienfeld, Robert: (1978) Rise of Systems Theory: An Ideological Analysis, Wiley-Interscience, New York

A good critical view of some undesirable developments in the systems movement.

Lumsden, Charles, and Wilson, Edward: (1981) Genes, Mind, and Culture: the Coevolutionary Process, Harvard, Cambridge

Non-systemic attempt at unified biological evolutionary theory. Mind as necessary explanatory component from genes to culture. Sociobiology, biological constraint and cause of behavior. Epigenetic rules, epigenesis as coevolution. Mathematical, culturgens. Euculture as human culture, vs. protoculture. Bibliography, no thermodynamics.

Mandelbrot, BB: (1982) Fractal Geometry of Nature, WH Freeman, San Francisco

Classical work on the implications of fractal geometry for modeling physical systems.

Margalef, D Ramon: (1968) Perspectives in Ecological Theory, U. Chicago, Chicago

Maturana, HR, and Varela, F: (1987) Tree of Knowledge, Shambhala

On cybernetics and constructivist psychology.

McCulloch, Warren: (1965) Embodiments of Mind, MIT Press, Cambridge

Meadows, Donella H, and Meadows, Dennis L: (1972) Limits to Growth, Signet, New York, and its follow-up Beyond the Limits

Famous report of the Club of Rome. First systems dynamics model of world ecology.

Mesarovic, MD: (1964) Views of General Systems Theory, Wiley, New York

Mesarovic, MD, and Macko, D: (1970) Theory of Hierarchical Multi-Level Systems, Academic Press, New York

Mesarovic, MD, and Takahara, Y: (1975) General Systems Theory: Mathematical Foundations, Academic Press, New York

Mesarovic, MD, and Takahara, : (1988) Abstract Systems Theory, Springer-Verlag, Berlin

Grand formalism for Systems Science. Fundamental behaviorism. Teleological (functional) and material, causal (structural) descriptions as equivalent in a system-description language. Defense of formalism as a kind of language. Systems as proper relations. Cybernetic systems as goal-seeking. Complexity as meta-systems (nesting). Introductions to category theory, topology, etc. Fuzzy systems as open systems.

* Miller, James G: (1978) Living Systems, McGraw Hill, New York

General synthetic theory of biological systems. On functional self-similarity across levels of analysis.

Miser, HJ, and Quade, ES: eds. (1985) Handbook of Systems Analysis, North-Holland, New York

Monod, Jacques: (1971) Chance and Necessity, Vantage, New York

Famous essay on philosophical problems concerning theories of biological systems.

Morowitz, Harold J: (1968) Energy Flow in Biology, Academic Press, New York

On the thermodynamics and informational (entropic) dynamics of biological processes.

Morrison, P: (1982) Powers of Ten, Scientific American Books, WH Freeman, New York

"Guided tour" through the spatial scales of natural structure.

Negoita, CV, and Ralescu, DA: (1975) Applications of Fuzzy Sets to Systems Analysis, Birkhauser, Stuttgart

* Negoita, CV: (1981) Fuzzy Systems, Abacus Press, Tunbridge Wells

Simple, coherent introduction to fuzzy systems theory.

Nicolis, G, and Prigogine, Ilya: (1977) Self-Organization in Non-Equilibrium Systems, Wiley, New York

Technical work on self-organization in flow systems, thermodynamic systems, and others describable in terms of partial differential equations.

* Odum, HT: (1983) Systems Ecology, Wiley, New York

Grand theory of global ecology. Thermodynamic basis of economy.

Pattee, Howard: ed. (1973) Hierarchy Theory, George Braziller, New York

Phillips, DC: (1976) Holistic Thought in Social Sciences

On the synthesis of holism and reductionism.

Pines, David: ed. (1988) Emerging Syntheses in Science, Addison-Wesley, New York

Includes key articles by Charles Bennett and interesting looks at spin-glasses and solitons.

Powers, WT: (1973) Behavior: The Control of Perception, Aldine, Chicago

Radical constructivist cybernetic psychological theory.

* Prigogine, Ilya: (1980) From Being to Becoming, WH Freeman, San Francisco

On the whole Prigogine program for explanation of evolution in thermodynamic terms.

..... (1984) Order Out of Chaos, Bantam, New York

Famous, almost-popular treatment of the relation between far-from-equilibrium thermodynamics, general evolutionary theory, and natural philosophy.

Rapoport, Anatol: (1984) General Systems Theory: Essential Concepts and Applications, Abacus, Cambridge

Rescher, Nicholas: Scientific Explanation

Uses stochastic automata in a philosophy of theory.

..... (1979) Cognitive Systematization, Rowman and Littlefield, Totowa, NJ

Treatment of coherentist epistemology and a formal development of the necessary limits to knowledge.

Rosen, Robert: (1970) Dynamical Systems Theory in Biology, Wiley-Interscience, New York

..... (1985) Anticipatory Systems, Pergamon, Oxford

The only book on anticipatory systems at present.

Rosenkrantz, Roger D: ed. (1989) ET Jaynes: Papers on Probability, Statistics and Statistical Physics, Kluwer

Collection of Jaynes's best papers.

Sage, AP: (1977) Methodology for Large Scale Systems, McGraw-Hill, New York

Sandquist, GM: (1985) Introduction to Systems Science, Prentice Hall, Englewood Cliffs, NJ

* Sayre, Kenneth: (1976) Cybernetics and the Philosophy of Mind, Humanities Press, Atlantic Highlands, NJ

Grand cybernetic evolutionary theory of mind.

Schrodinger, E: (1967) What is Life?, Cambridge U., Cambridge