SPT v4n1 - Computer Technology and Evolution: From Artificial Intelligence to Artificial Life


Volume 4, Number 1
Fall 1998


COMPUTER TECHNOLOGY AND EVOLUTION: FROM ARTIFICIAL INTELLIGENCE TO ARTIFICIAL LIFE

Klaus Mainzer, University of Augsburg


1. INTRODUCTION

There is a traditional belief in the computability of nature and society, but it has run into severe difficulties. Computability is simple only for linear problems. But most problems in the world are nonlinear and complex, from the biological evolution of life to the ecological, economic, and social dynamics of human society. Classical physics and Artificial Intelligence (AI) have often been inspired by the linear world view of the Laplacean demon.

At the end of this century, the life sciences have taken a dominant or leading role. Biological, gene, and brain research are delivering the most exciting applications of medical technologies. Understanding biological phenomena demands a multidisciplinary and nonlinear approach. The sciences of complexity inspire the computational methods of Artificial Life (AL) research, which does not merely use artificial media to model, replicate, and investigate life processes. Artificial life research is now slowly slipping into the mainstream of technology. Biologically inspired techniques of computer science are used to solve problems, optimize solutions, simulate real-life situations, and create virtual reality.

In the following, I analyze the foundations and perspectives of computational technologies in the framework of both linear and nonlinear complex dynamic systems. I finish with an outlook on the technical evolution of computer-assisted complex societies at the turn of the century.

2. COMPUTABILITY AND LINEAR WORLD VIEW

The idea of universal computability dates back to the concept of a mathesis universalis in the 17th century. According to Leibniz, every scientific problem can be decided by a calculating (algorithmic) procedure (ars iudicandi). Every solution of a scientific problem can be sought and enumerated by a calculating procedure (ars inveniendi). Every lawful process of nature should be realizable by a mechanical calculating machine (automaton).

According to Church's thesis, every computational procedure (algorithm) can be calculated by a Turing machine. So every recursive function, as a kind of machine program, can be calculated by a general purpose computer. Then we are able to define effective procedures of decision and enumerability, which were already demanded by Leibniz's program of a mathesis universalis. It can easily be proved that every recursive (decidable) set is recursively enumerable. But there are recursively enumerable sets which are not decidable; the halting problem for Turing machines is the classic example. These are the first hints that there are limits to Leibniz's originally optimistic program based on a belief in universal decision procedures.

Obviously, recursiveness or Turing computability is a theoretical limit of computability according to Church's thesis. Below this limit there are many practical problems concerning certain limitations on how much the speed of an algorithm can be increased. Especially among mathematical problems there are some classes of problems that are intrinsically more difficult to solve algorithmically than others. Thus, there are degrees of computability for Turing machines which are made precise in the complexity theory of computer science. Complexity classes of problems (or corresponding functions) can be characterized by complexity degrees, which give the order of functions describing the computational time (or number of elementary computational steps) of algorithms (or computational programs) depending on the length of their inputs. The length of inputs may be measured by the number of decimal digits.
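
To make these complexity degrees concrete (the figures below are standard textbook illustrations, not taken from the article): adding two n-digit numbers by the school method takes on the order of n elementary steps, and multiplying them on the order of n^2 steps, while the best known exact algorithm for the traveling salesman problem with n cities requires exponential time, roughly of order 2^n * n^2. Problems of the first two kinds remain tractable as inputs grow; problems of the last kind quickly exceed any conceivable computing power.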

In the tradition of Artificial Intelligence (AI), the paradigm of effective computability implies that mind is represented by program-controlled machines, and mental structures refer to symbolic data structures, while mental processes implement algorithms. Even knowledge-based expert systems are conceived as algorithmic representations in highly developed AI programming languages. Today, program-controlled computers and information-processing automata have become tools of everyday life. Computer scientists distinguish several generations of hardware and software in the historical development of their machines. In artificial intelligence research one speaks of the "second computer age," meaning the transition from number-processing machines to knowledge-processing systems such as expert systems, which are said to simulate human experts, at least partially.

In the natural sciences, computability is fine for linear dynamic systems such as a harmonic oscillator with, e.g., a linear relation between the position of a mass and a force. In this case, according to Newton's second law, we get a linear differential equation of the second order. A solution of the equation can be represented by a time series of the mass position. All states of the dynamical system with position and velocity at a certain time can be represented as points in a phase space. Each solution with different initial conditions determines a trajectory in the phase space. We get a completely computable linear world like a Laplacean demon.
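
A minimal sketch in Python (not from the article; mass, spring constant, and step size are illustrative assumptions) shows how such a linear system is computed step by step, tracing a closed ellipse in phase space:

    # Linear harmonic oscillator: Newton's second law gives x'' = -(k/m) x.
    # Semi-implicit Euler integration; all parameter values are illustrative.
    m, k = 1.0, 1.0            # mass and spring constant
    dt, steps = 0.01, 2000     # time step and number of steps
    x, v = 1.0, 0.0            # initial position and velocity

    trajectory = []            # points (x, v) of the phase space trajectory
    for _ in range(steps):
        v += -(k / m) * x * dt # linear restoring force updates the velocity
        x += v * dt            # velocity updates the position
        trajectory.append((x, v))
    # The points trace a closed ellipse: the completely computable,
    # predictable behavior of the linear world view.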

In the nonlinear differential equation of Verhulst dynamics (e.g., the growth of a population), the initial exponential growth is damped by a quadratic feedback term. In 1845, Verhulst already analyzed a nonlinear difference equation as a recursive iteration scheme for modeling the growth of a population x_n at discrete time points n = 1, 2, 3, . . . We get different time series of the nonlinear Verhulst dynamics depending on an increasing control parameter of growth. Weak growth leads to the well-known logistic S-shaped curve with saturation in equilibrium. Increasing growth leads to an oscillating curve between two local equilibria. Strong growth displays chaotic fluctuations depending sensitively on the initial data. Verhulst dynamics can easily be visualized in a phase space with different attractors: a fixed point, periodic oscillation between two points, and complete irregularity without any periodicity. It is rather astonishing that a simple mathematical law like the logistic map produces such a complexity of bifurcations and chaos.
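
The three regimes are easy to reproduce with the logistic map x_{n+1} = r x_n (1 - x_n); a minimal Python sketch, with illustrative values of the control parameter r:

    # Logistic (Verhulst) map; the chosen values of r are illustrative.
    def logistic_series(r, x0=0.2, n=60):
        xs = [x0]
        for _ in range(n):
            xs.append(r * xs[-1] * (1 - xs[-1]))
        return xs

    print(logistic_series(2.8)[-4:])  # weak growth: converges to a fixed point
    print(logistic_series(3.2)[-4:])  # stronger growth: oscillates between two values
    print(logistic_series(3.9)[-4:])  # strong growth: chaotic, sensitive to x0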

Another famous example of nonlinear dynamics is celestial mechanics. Since Poincaré's famous work on the many-body problem, it has been known among mathematicians that nonlinear mechanical systems can display chaotic motion. But as long as scientists did not have suitable tools to deal with nonintegrable systems, deterministic chaos was considered a mere curiosity. During the first decades of this century, many numerical procedures were developed to deal with the mathematical complexity of nonlinear differential equations at least approximately. The calculating power of modern high speed computers and refined experimental techniques have supported the recent successes of the nonlinear complex system approach in natural and social sciences. The visualizations of nonlinear models by computer-assisted techniques promote interdisciplinary applications with far-reaching consequences in many branches of science.

In general, the causality of chaotic systems is still deterministic: any tiny change of a cause displays exponentially expanding trajectories of effects. The degree of exponential deviation can be measured mathematically by a (positive) Lyapunov exponent. In the computational version, predicting the future states of a digitized chaotic dynamics to a given accuracy requires computation time that grows exponentially with the length of the forecast, which limits long-term predictions in practice.
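
For the logistic map above, the Lyapunov exponent can be estimated numerically as the orbit average of ln |f'(x_n)|, where f'(x) = r(1 - 2x); a Python sketch under the same illustrative assumptions:

    import math

    # Estimate the Lyapunov exponent of the logistic map as the orbit
    # average of ln |f'(x)|, with f'(x) = r * (1 - 2x).
    def lyapunov(r, x0=0.2, n=10000, discard=100):
        x, total = x0, 0.0
        for i in range(n):
            x = r * x * (1 - x)
            if i >= discard:                 # skip the initial transient
                total += math.log(abs(r * (1 - 2 * x)))
        return total / (n - discard)

    print(lyapunov(2.8))  # negative: nearby trajectories converge (fixed point)
    print(lyapunov(3.9))  # positive: nearby trajectories diverge (chaos)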

3. NATURAL LIFE AND NONLINEAR WORLD VIEW

Nonlinearity is a necessary condition not only of chaos, but also of self-organization and order in complex multicomponent systems. We have to distinguish self-organization of closed (conservative) systems near to thermal equilibrium and open (dissipative) systems far from thermal equilibrium. A typical example of conservative self-organization can be illustrated by a ferromagnet. This kind of solid body is a multicomponent system of many atomic dipoles with two possible states, up and down. If the system is very hot, there is a nearly homogeneous distribution of up and down states on the atomic microlevel. If the system is cooled down to the Curie point, the dipoles seem to organize themselves in a pattern of aligned dipoles with the same up or down orientation corresponding to the macroscopic state of magnetization. Thus, the distribution function of atomic states on the microlevel is defined as an order parameter corresponding to macroscopic pattern formation with low energy and entropy. Physically, the self-organization or emergence of order is caused by a phase transition with bifurcation of two possible equilibrium states, up and down. It is a kind of symmetry breaking, because the bifurcation of both possible ordering types (attractors) depends on tiny initial fluctuations preventing their prediction. Another nice example of conservative self-organization is the emergence of snow crystals near to the freezing point. Nanostructures such as Buckminsterfullerenes forming large balls of carbon molecules are examples of chemical self-assemblies near thermal equilibrium.
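
The alignment of dipoles below the Curie point can be imitated by a small Monte Carlo simulation of an Ising-type lattice; a minimal Python sketch (lattice size, coupling, and temperatures are illustrative assumptions, not from the article):

    import math, random

    # Ising-type dipole lattice with Metropolis dynamics. Each site is +1
    # (up) or -1 (down); aligned neighbors lower the energy.
    N, J = 20, 1.0
    spins = [[random.choice((-1, 1)) for _ in range(N)] for _ in range(N)]

    def sweep(T):
        for _ in range(N * N):
            i, j = random.randrange(N), random.randrange(N)
            nb = (spins[(i + 1) % N][j] + spins[(i - 1) % N][j] +
                  spins[i][(j + 1) % N] + spins[i][(j - 1) % N])
            dE = 2 * J * spins[i][j] * nb        # energy cost of a flip
            if dE <= 0 or random.random() < math.exp(-dE / T):
                spins[i][j] *= -1

    for T in (5.0, 1.0):   # hot lattice, then far below the critical point
        for _ in range(500):
            sweep(T)
        m = sum(map(sum, spins)) / (N * N)       # the order parameter
        print(T, round(m, 2))  # near 0 when hot; typically near +1 or -1 when cold

Which of the two ordered states is reached when the lattice is cooled depends on tiny initial fluctuations, the symmetry breaking described above.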

Far from thermal equilibrium, order in complex systems can emerge from an increasing input of energy and matter. In a Bénard experiment, a fluid layer between two plates is heated from below. If the difference of temperature between the plates is increased to a critical value, an order formation of two possible convection rolls emerges. The molecules seem to organize themselves in a macroscopic pattern of rolls with two possible rolling directions. Thus, again, we have a bifurcation scheme with two possible order states (attractors). It is a kind of symmetry breaking, because the realized direction of a roll depends on tiny molecular fluctuations during the initial state of the phase transition which cannot be forecast. If the system is driven farther and farther away from thermal equilibrium, we get a growing bifurcation scheme of possible local order states, from oscillating to finally chaotic patterns.

Macroscopic patterns arise from the complex nonlinear cooperation of microscopic elements when the energetic interaction of the dissipative (open) system with its environment reaches some critical value. The stability of the emergent structures is guaranteed by some balance of nonlinearity and dissipation. Too much nonlinear interaction or dissipation would destroy the structure. As the conditions of dissipative phase transition are very general, there is a broad variety of interdisciplinary applications. A typical physical example is the laser. In chemistry, the concentric rings or moving spirals of the Belousov-Zhabotinsky (BZ) reaction arise when specific chemicals are combined at critical concentrations. The competition of the separated ring waves shows the nonlinearity of these phenomena very clearly, because under a superposition principle the ring waves would penetrate each other like optical waves. In the language of a chemist, the nonlinearity of the process is shown by autocatalytic reactions in the chemical reaction scheme. In the language of a mathematician, we get nonlinear differential equations for the concentration rates of chemical substances. The phase transitions to new order states far from thermal equilibrium are represented by time series (or energetic spectra) ranging from periodically oscillating and bifurcating to chaotic patterns.

In general, we may say that old structures become unstable and break down when control parameters change. On the microscopic level the stable modes of the old states are dominated by unstable modes. These determine order parameters which describe the macroscopic structure and patterns of systems. There are different final patterns of phase transitions corresponding to different attractors. The different attractors may be pictured by a stream whose velocity is increased step by step. At the first level a homogeneous state of equilibrium is shown (fixed point). At a higher level of velocity the bifurcation of two or more vortices can be observed, corresponding to periodic and quasi-periodic attractors. Finally, the order decays into chaos as a fractal attractor of complex systems.

In the framework of complex systems the emergence of life is not contingent, but necessary and lawful. Only the conditions for the emergence of life (for instance on planet Earth) may be contingent in the universe. Open (dissipative) physical and chemical systems lose their structure when the input of energy and matter is stopped or changed (e.g., laser, BZ-reaction). Organismic systems (like cells) are able to conserve much of their structure at least for a relatively long time. On the other hand, they need energy and matter in a certain interval of time to keep their structure more or less far from thermal equilibrium. Thus, biological systems combine conservative and dissipative structures which are determined and reproducible by DNA-replication. After thermodynamic (conservative and dissipative) self-organization of atomic and molecular systems, DNA-replication is understood as a new kind of cellular self-organization. It is an open question of prebiotic evolution how the phase transition from conservative and dissipative to DNA-reproducible structures has been arranged. In any case, we have complex systems the development of which can be explained by the evolution of (macroscopic) order parameters caused by nonlinear (microscopic) interactions of molecules, cells, etc., in phase transitions near to or far from thermal equilibrium. Forms of biological systems (plants, animals, etc.) are described by order parameters. Spencer's idea that the evolution of life is characterized by increasing complexity can be made precise in the context of complex systems. It is well known that Turing analyzed a mathematical model of organisms represented as complex cellular systems. The evolution of the order parameter corresponds to the aggregation forms during the phase transition of the macroscopic organism. The mature multicellular body can be interpreted as the goal or (better) attractor of organic growth.

Even the ecological growth of biological populations may be simulated using the concepts of complex systems. Ecological systems are complex dissipative systems of plants or animals with mutual nonlinear metabolic interactions with each other and with their environment. The symbiosis of two populations with their source of nutrition can be described by three coupled differential equations of the kind Edward Lorenz used to describe the development of weather in meteorology. In the early 20th century, the mathematicians Lotka and Volterra described the development of two populations in ecological competition. The nonlinear interactions of the two complex populations are determined by two coupled differential equations of prey and predator species. The evolution of the coupled systems has stationary points of equilibrium. The attractors of the evolution are periodic oscillations (limit cycles).
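
A minimal Python sketch of the Lotka-Volterra equations dx/dt = ax - bxy (prey) and dy/dt = -cy + dxy (predator), with illustrative coefficients not taken from the article:

    # Lotka-Volterra predator-prey dynamics, integrated with small Euler
    # steps; all coefficients and initial values are illustrative.
    a, b, c, d = 1.0, 0.1, 1.5, 0.075
    x, y = 10.0, 5.0        # initial prey and predator populations
    dt = 0.001
    for step in range(200001):
        dx = (a * x - b * x * y) * dt
        dy = (-c * y + d * x * y) * dt
        x, y = x + dx, y + dy
        if step % 25000 == 0:
            print(round(x, 2), round(y, 2))
    # The two populations cycle around the equilibrium point
    # (c/d, a/b) = (20, 10): the periodic oscillation named in the text.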

The human organism is one of the most complex dynamic systems in biological evolution. The nonlinear network of metabolism, e.g., in a single liver cell, constitutes a complex system with sensitive equilibria and feedback loops. The heart is a complex cellular organ with oscillating and chaotic patterns of behavior. The electric interactions of cells initiate action potentials with oscillating contractions as macroscopic patterns (order parameters). Their periodic pattern is represented in the time series of an electrocardiogram.

Perhaps the most speculative interdisciplinary application of complex systems is the human brain as a multi-cellular system. The emergence of mental states (for instance pattern recognition, feelings, thoughts) is explained by the evolution of (macroscopic) order parameters of cerebral assemblies which are caused by nonlinear (microscopic) interactions of neural cells in learning strategies far from thermal equilibrium. After conservative and dissipative self-organization of atomic and molecular systems, and after DNA-codified self-organization of cellular systems, learning strategies are understood as a neural self-organization of the brain which is not directed by the DNA code. Cell assemblies with mental states are interpreted as attractors (fixed points, periodic, quasi-periodic, or chaotic) of phase transitions. If the brain is regarded as a complex system of neural cells, then its dynamics is assumed to be described by the nonlinear mathematics of neural networks. Pattern recognition, for instance, is interpreted as a kind of phase transition by formal analogy with the evolution equations which are used for pattern emergence in physics, chemistry, and biology.
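
A standard way to make "cell assemblies as attractors" computationally concrete (a common textbook model, not claimed to be the author's own) is a Hopfield-type network, in which a stored pattern is a fixed-point attractor and pattern recognition is the relaxation of a noisy input onto it; a Python sketch:

    import random

    # Tiny Hopfield-type network with one stored pattern. The pattern is a
    # fixed-point attractor: a corrupted version relaxes back onto it.
    pattern = [1, -1, 1, -1, 1, 1, -1, -1]
    n = len(pattern)
    # Hebbian weights w[i][j] = p_i * p_j, with no self-connections
    w = [[pattern[i] * pattern[j] if i != j else 0 for j in range(n)]
         for i in range(n)]

    state = pattern[:]
    for k in random.sample(range(n), 2):     # corrupt two components
        state[k] *= -1

    for _ in range(5):                       # asynchronous updates
        for i in range(n):
            s = sum(w[i][j] * state[j] for j in range(n))
            state[i] = 1 if s >= 0 else -1

    print(state == pattern)  # True: the noisy input reached the attractor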

In general, it is the aim of a learning algorithm to diminish the information-theoretic measure of the discrepancy between the brain's internal model of the world and the real environment via self-organization. The recent revival of interest in the field of neural networks is mainly inspired by the successful technical applications of statistical mechanics and nonlinear dynamics to solid state physics, spin glass physics, chemical parallel computers, optical parallel computers, and laser systems. Other reasons are the recent development of computing resources and the level of technology which make a computational treatment of nonlinear systems more and more feasible.

4. ARTIFICIAL LIFE AND NONLINEAR WORLD VIEW

Besides Artificial Intelligence (AI) as a classical discipline of computer science, Artificial Life (AL) is a new, growing field of research in the framework of the sciences of complexity. Natural life on earth is organized on the molecular, cellular, organism, and population-ecosystem levels. Artificial life aims at modeling tools powerful enough to capture the key concepts of living systems on these levels of increasing complexity.

John von Neumann's concept of cellular automata gave the first hints of mathematical models of living organisms conceived as self-reproducing networks of cells. The state space is a homogeneous lattice which is divided into equal cells like a chess board. An elementary cellular automaton is a cell which can have different states, for instance, "occupied" (by a mark), "free," or "colored." An aggregation of elementary automata is called a composite automaton or configuration. Each automaton is characterized by its environment, i.e., the neighboring cells. The dynamics of the automata is determined by synchronous transformation rules. Von Neumann proved that the typical feature of living systems, their tendency to reproduce themselves, can be simulated by an automaton with 200,000 cells (in the plane), where each cell has 29 possible states and the four orthogonal neighboring cells as environment. Although this idea is justified by a mathematically precise proof, it is hard to realize in technical computers. The reason is von Neumann's requirement that the self-reproducing structure must be a universal computer which has the complexity degree of a universal Turing machine. Obviously, self-reproducing molecules of prebiotic evolution were hardly capable of universal constructions. If one drops the requirement of universality, very simple cellular automata can be designed that can reproduce themselves.

In general, cellular automata have turned out to be discrete models of complex systems whose evolution dynamics is described by nonlinear differential equations. The transformation rules characterizing the dynamics of cellular automata produce very different cellular patterns, starting from random initial conditions. Computer experiments give rise to the following classes of attractors toward which the evolving cellular patterns tend. After a few steps, systems of class 1 reach a homogeneous state of equilibrium independently of the initial conditions. This final state of equilibrium is visualized by a totally white plane and corresponds to a fixed point as attractor. Systems of class 2, after a few time steps, show a constant or periodic pattern of evolution which is relatively independent of the initial conditions. Specific positions of the pattern may depend on the initial conditions, but not the global pattern structure itself. Systems of class 3 evolve toward chaotic states as final attractors without any global periodicity. These chaotic patterns depend sensitively on the initial conditions, and show self-similar behavior with fractal dimension. Systems of class 4 produce highly complex structures with locally propagating forms. Systems of classes 3 and 4 are sensitive to tiny fluctuations that can effect a global change of order (the butterfly effect). Thus, in these cases, the process of evolution cannot be forecast in the long run.
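
One-dimensional "elementary" cellular automata show these classes with minimal means; a Python sketch (the rule numbering follows Wolfram's convention, and the chosen rules are standard illustrations, not taken from the article):

    # Elementary cellular automaton: each cell is 0 or 1, and its next
    # state depends on itself and its two neighbors, encoded by an
    # eight-bit rule number.
    def step(cells, rule):
        n = len(cells)
        return [(rule >> (4 * cells[(i - 1) % n] +
                          2 * cells[i] +
                          cells[(i + 1) % n])) & 1
                for i in range(n)]

    cells = [0] * 31
    cells[15] = 1            # a single occupied cell in the middle
    # Rule 254 behaves like class 1 (homogeneous final state), rule 30
    # like class 3 (chaotic), rule 110 like class 4 (complex local forms).
    for rule in (254, 30, 110):
        row = cells[:]
        print("rule", rule)
        for _ in range(12):
            print("".join(".#"[c] for c in row))
            row = step(row, rule)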

Obviously, these four classes of cellular automata model the attractor behavior of nonlinear complex systems, which is well known from self-organizing processes in nature. They can easily be related to our concept of complex systems with order and control parameters. Order parameters correspond to macroscopic spatio-temporal properties of cellular patterns such as fixed point, periodic, complex or chaotic attractors. In the complex systems approach, stable and unstable collective motions (modes) can be distinguished close to critical points of instability. This instability is caused by a change of a control parameter and leads to new macroscopic spatio-temporal patterns. In cellular automata, stable and unstable motions can be characterized by transition rules of unchanging and changing previous states. Thus, a control parameter of cellular automata must refer to their transition rules and the transient length of their patterns. It must correspond to the intuitive idea that complex systems lie somewhere in the continuum between order (like ice crystals and carbon Buckminsterfullerenes) and chaos (like molecules in a gas): organisms and brains are complex, but neither wholly ordered nor wholly disordered.

Since Darwin, selection has been emphasized as a key mechanism of biological evolution. But there are also selection processes in dissipative systems like the laser in physics or the Belousov-Zhabotinsky reaction in chemistry. The nonlinearity of their dynamics shows that there is no superposition of pattern waves, but a competition of stable and unstable modes. In the late 1960s, Ingo Rechenberg already used evolutionary strategies as blueprints for optimizing technical systems. John Holland applied the process of selection to algorithmic learning procedures. His genetic algorithms have become a key concept in artificial life research. In general, a genetic algorithm is a method for transforming one population of genotypes into a new, "fitter" population. The genotypes may be represented by, for example, bit strings for the genes of an organism or by possible solutions of a problem.

The basic concept of a genetic algorithm works as follows: (1) Start with a randomly generated population of genotypes. (2) Calculate the fitness of each genotype in the population. (3) Apply selection and genetic operators to the population to create a new population. (4) Go to step 2. Selection chooses those genotypes in the population that can reproduce and decides how many offspring each is likely to have. The fitter genotypes produce on average more offspring than less fit ones. Holland distinguishes the genetic operators of crossover, mutation, and inversion. Crossover exchanges subparts of two genotypes. Mutation randomly changes the values of some locations in the genotype. Inversion reverses the order of a contiguous section of the genotype.
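
A minimal Python sketch of steps (1)-(4), with bit-string genotypes and a toy fitness function (the number of ones); every concrete parameter choice here is an illustrative assumption:

    import random

    LENGTH, POP, GENERATIONS, MUTATION = 20, 30, 50, 0.01

    def fitness(genotype):               # step (2): a toy fitness function
        return sum(genotype)

    population = [[random.randint(0, 1) for _ in range(LENGTH)]
                  for _ in range(POP)]   # step (1): random initial population

    for _ in range(GENERATIONS):         # step (4): loop back to step (2)
        new_population = []
        while len(new_population) < POP: # step (3): selection and operators
            a, b = random.choices(population, k=2,
                                  weights=[fitness(g) + 1 for g in population])
            cut = random.randrange(1, LENGTH)           # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - bit if random.random() < MUTATION else bit
                     for bit in child]                  # mutation
            new_population.append(child)
        population = new_population

    print(max(fitness(g) for g in population))  # a highly fit genotype emerges

(Inversion is omitted for brevity; it would reverse a random contiguous slice of the child.)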

If the algorithmic procedure (1)-(4) is repeated over many time steps, it produces a series of populations (generations) until one or more highly fit genotypes emerge. Genetic algorithms have been applied successfully to many scientific and engineering problems. Neural networks sometimes work with competitive learning procedures and learning classifier systems. Eigen's self-optimizing strategy of genetics can be understood as a kind of genetic algorithm, too.

Extremely interesting applications of genetic algorithms are computer models and computer experiments in gene technology. Another example is the medical simulation of immune systems in which learning takes place. In recent years, immunology has become a tremendously important field of medical research, especially with respect to therapies for cancer. The human immune system is capable of recognizing on the order of 10^16 different foreign molecules. The genome encoding the construction rules of the immune system only contains about 10^5 genes. Thus, the immune system is an example of a complex dynamic system which is highly effective for pattern recognition with relatively few instruction rules. There is no central organ to control it. Thus, pattern recognition by an immune system is a macroeffect of local molecular interactions. It can be characterized by order parameters. They represent the extent of molecular binding between the cells performing the recognition (antibodies) and the foreign material (antigens). Genetic algorithms may constitute artificial immune systems in the medium of a computer by representing antigens and antibodies with binary strings. The binary immune systems have been used to study the ability to detect typical patterns in an environment of randomly distributed antigens or to learn new patterns if the pathogenic material is rapidly evolving by mutations.
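
In such binary immune-system models, "binding" between antibody and antigen strings is usually scored by a string-matching rule; the Python sketch below uses the r-contiguous-bits rule (a common choice in this literature, assumed here for illustration):

    import random

    # Binary immune-system model: an antibody "binds" an antigen if at
    # least R successive bit positions match. All parameters illustrative.
    L, R = 16, 6

    def binds(antibody, antigen, r=R):
        run = 0
        for x, y in zip(antibody, antigen):
            run = run + 1 if x == y else 0
            if run >= r:
                return True
        return False

    antigens = [[random.randint(0, 1) for _ in range(L)] for _ in range(20)]
    antibodies = [[random.randint(0, 1) for _ in range(L)] for _ in range(50)]

    detected = sum(any(binds(ab, ag) for ab in antibodies) for ag in antigens)
    print(detected, "of", len(antigens), "antigens detected")
    # A genetic algorithm would now select and mutate the best-binding
    # antibodies, so the repertoire tracks an evolving antigen population.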

In biology, artificial life systems are applied as software, hardware and wetware systems. Wetware means the techniques of a biochemical laboratory which, for example, are used to model the molecular origins of prebiotic evolution in an evolutionary reactor. These are, of course, empirical experiments reproducing the biochemical evolution of life or creating new forms of biological life. They are only called "artificial" because the conditions for the self-organization of biological life are arranged by human experimenters. Software systems are computer programs or networks modeling key concepts of life and evolution. We already analyzed the cellular automata approach. There are also computer programs modeling the evolution of animals. In the program GENESYS, for example, animals are represented as neural nets or finite automata. The genes of each organism are represented as bit strings encoding the weights of a neural net or the transition lists of a finite automaton. In order to increase the modeling capacity of the program, it is executed on a massively parallel connection machine.

Clearly, the artificial life approach is not a homogeneous field of research. But it seems to provide many fruitful modeling instruments for chemical and biological research. Besides these practical applications, there is a visionary dream of AL with a hard scientific core. Actual biological evolution is only one realization of a complex dynamics which is governed by highly nonlinear equations. Today, we only know some properties of these equations, and where we know them, we have no analytical tools to solve them exactly. Even numerical approximations would be restricted by their immense degree of computational complexity. Nevertheless, computer models may allow computer experiments to become familiar with possible scenarios under several restricted constraints. These experiences with computer experiments may be used to create particular conditions under which new materials may construct themselves and new forms of life may organize themselves. The first steps have already been taken. Even the emergence of artificial consciousness cannot be excluded in principle. If we know the complex neural dynamics behind conscious states, then the wetware and hardware of the human brain is only one particular model. It is not matter itself that makes possible new materials, life, and consciousness, but its particular organization and dynamic laws, which can be modeled in more or less complex systems. Thus, in the long run, it becomes an ethical question whether we want an artificial evolution with an open end which we cannot forecast.

5. THE TECHNICAL EVOLUTION OF COMPUTER-ASSISTED COMPLEX SOCIETIES

Life science and computer science are not only the highlights of future technologies; they are growing together, with common objects of research in artificial life and artificial evolution.

Literally, the term artificial life means life made by humans rather than by nature. Natural life on earth is organized on the molecular, cellular, organism, and population-ecological levels. Artificial life aims to find modeling tools powerful enough to capture the key concepts of living systems on these levels of increasing complexity at the edge of chaos. The attempt to recreate biological phenomena in alternative media (not based on carbon-chain chemistry) will result not only in a better theoretical understanding of the phenomena under study, but also in practical applications of biological principles in the technology of computer hardware and software, mobile robots, spacecraft, medicine, nanotechnology, industrial fabrication and assembly, and other vital engineering projects.

Obviously, nonlinear complex systems have become a successful problem solving approach, from the life sciences to AL-technology. It is now recognized that many of our social, economic, ecological, and political problems are also of a global, complex, and nonlinear nature. The capability to manage the complexity of modern societies depends decisively on an effective computer-assisted communication network. Like the neural nets of biological brains, these networks determine the learning capability of human societies. Thus, we speak of informational and computational ecologies. These large artificial ecological networks, emerging from the increasing connectivity between diverse computer-assisted information centers, are becoming more or less self-organizing systems which differ from their individual program-controlled components. Incomplete knowledge and delayed information are the typical features of open computational systems which have no central controls. Their unplanned growth leads to an immense diversity of technical composition and use, with increasing difficulties of interoperability. The horrifying vision of a separated robotic world enslaving human culture may be the ultimate consequence of many possible uncomfortable scenarios.

As the worldwide growth of local information and computation centers cannot be planned by a centralized processor, it seems to have a nonlinear dynamics which should be studied in the framework of complex systems. Even simplified case studies would provide crucial insights into the complex dynamics of modern sociocultural evolution. The complex interdependencies of computational ecologies violate the traditional requirements for a hierarchical decomposition into technical, industrial, or administrative modules as used in traditional management. Modern technical communication networks are growing open systems which must be used without central control, synchronicity, or consistent data from other agents like machines or humans. Thus, the dynamic theory of informational and computational ecologies, which incorporates the features of incomplete knowledge and delayed information, will display well-known evolutionary patterns such as fixed points, oscillations, or chaos.

Under the conditions of complexity, more and more control functions and human activities must be ceded to, and replaced by, artificially intelligent systems. Human responsibility is not abolished, but it is restricted by collective and nonlinear effects of complex systems which cannot be forecast or controlled in the long run. Thus, we do not promote any kind of biologism or reductionism. But, under the conditions of complexity, it is not enough to have good individual intentions. We have to consider the nonlinear dynamic effects of computer technology, Artificial Life, and Artificial Intelligence for the future of human society.

REFERENCES

Holland, J. H. 1992. Adaptation in Natural and Artificial Systems, 2d ed. Cambridge, MA: MIT Press.

Langton, C. G. 1989. Artificial Life. Redwood City, CA: Addison Wesley.

Lindenmayer, A., and G. Rozenberg, eds. 1976. Automata, Languages, Development. Amsterdam: North Holland.

Mainzer, K. 1995. Computer. Neue Flügel des Geistes?, 2d ed. Berlin and New York: De Gruyter.

_______. 1997a. Thinking in Complexity: The Complex Dynamics of Matter, Mind, and Mankind, 3d ed. Berlin, Heidelberg, and New York: Springer.

_______. 1997b. Gehirn, Computer, Komplexität. Berlin, Heidelberg, and New York: Springer.

Rechenberg, I. 1973. Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution. Stuttgart: Frommann-Holzboog.