Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

V. Systems Evolution: A 21st Century Genesis Synthesis

C. An Evolutionary Intelligence

    The image is the cover of the March 26, 2016 issue of New Scientist, in which features editor Kate Douglas reports on a novel approach to evolutionary theory, which we have been posting for some time, whereby life seems to develop and emerge akin to a learning, informed brain. Its origins may be traced to Richard Watson’s computer and living systems group at the University of Southampton, UK. Lately, as these citations convey, the insightful parallel has been endorsed by authorities such as Eors Szathmary and Gunter Wagner. These qualities are then seen as similar to the algorithmic operations that the previous section describes. While some randomness is involved, an oriented emergence occurs by virtue of referring to, and building upon, prior, remembered experience. This section will also report how (artificial) neural networks are being found to apply to quantum, chemical, biological, and linguistic phases.


Evolutionary Biology and the Theory of Computing. simons.berkeley.edu/programs/evolution2014. Reported more in Natural Algorithms, this is a Simons Foundation program held in the spring of 2014, whose organizers included Christos Papadimitriou and Leslie Valiant. Its seminars included Evolutionary Biology Boot Camp (a January tutorial) with Eugene Koonin; Computational Theories of Evolution, organized by Nick Barton, with speakers such as Richard Watson, Chrisantha Fernando, Adi Livnat, and Steven Frank; and New Directions in Probabilistic Models of Evolution.

Azulay, Aharon, et al. The C. elegans Connectome Consists of Homogenous Circuits with Defined Functional Roles. PLoS Computational Biology. Online September 2016. At the frontier of integral syntheses across cerebral, social, and cognitive evolution, Hebrew University, Jerusalem, geneticists and neuroscientists distill from neural network complexities a common, independent principle at work from Internet media and human brains to this multicellular roundworm. In reflection, a designated “connectome” can thus be seen in ramifying effect across life’s episodic emergence. See also in this section The Multilayer Connectome of Caenorhabditis elegans by Bentley, et al, for a similar study.

A major goal of systems neuroscience is to decipher the structure-function relationship in neural networks. Here we study network functionality in light of the common-neighbor-rule (CNR) in which a pair of neurons is more likely to be connected the more common neighbors it shares. Focusing on the fully-mapped neural network of C. elegans worms, we establish that the CNR is an emerging property in this connectome. Moreover, sets of common neighbors form homogenous structures that appear in defined layers of the network. Simulations of signal propagation reveal their potential functional roles: signal amplification and short-term memory at the sensory/inter-neuron layer, and synchronized activity at the motoneuron layer supporting coordinated movement. A coarse-grained view of the neural network based on homogenous connected sets alone reveals a simple modular network architecture that is intuitive to understand. These findings provide a novel framework for analyzing larger, more complex, connectomes once these become available. (Abstract)

While the emergence of the CNR in social networks might be intuitive to understand, the benefit of such a design in neural networks is not trivially apparent. In the rat cortex the observed CNR is thought to be an organizing principle that clusters neurons into elementary building blocks of cortical computations and memory. Our study provides several novel insights to this phenomenon: we established that the CNR is indeed an emerging organizing principle in the C. elegans neural network, and that sets of common neighbor neurons can be viewed as building blocks found in defined layers of the network exerting valuable functional roles (e.g., signal amplification, synchronization, and robust information processing). These novel findings may explain the emergence of the CNR in mammalian neural networks as well. For example, signal amplification and robust information processing are essential for efficient cortical computations. Thus, it will be fascinating to study these cortical building blocks in light of the observed CNR once such connectomes become available. (10)
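As a hedged illustration of the common-neighbor rule described above, the toy graph and tallies below (hypothetical edges, not the actual C. elegans wiring) show how one can check whether node pairs sharing more neighbors are more often connected:

```python
import itertools

# Toy undirected graph (illustrative edges only, not a real connectome).
edges = {(0, 1), (0, 2), (1, 2), (1, 3), (2, 3), (3, 4), (4, 5)}
nodes = {n for e in edges for n in e}
adj = {n: set() for n in nodes}
for a, b in edges:
    adj[a].add(b)
    adj[b].add(a)

def common_neighbors(a, b):
    """Number of neighbors shared by nodes a and b."""
    return len(adj[a] & adj[b])

# Tally connection frequency as a function of shared-neighbor count.
stats = {}  # common-neighbor count -> [connected pairs, total pairs]
for a, b in itertools.combinations(sorted(nodes), 2):
    k = common_neighbors(a, b)
    linked = (a, b) in edges or (b, a) in edges
    stats.setdefault(k, [0, 0])
    stats[k][0] += linked
    stats[k][1] += 1

for k in sorted(stats):
    linked, total = stats[k]
    print(f"{k} common neighbors: {linked}/{total} pairs connected")
```

On real connectome data the same tally, run over all neuron pairs, is what would reveal (or refute) a CNR trend.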

Bentley, Barry, et al. The Multilayer Connectome of Caenorhabditis elegans. arXiv:1608.08793. A six person team from the MRC Laboratory of Molecular Biology, Cambridge, UK, and Cambridge University, including neuroscientist Edward Bullmore, explores some ways in which neural networks, as a prototypical nonlinear system, can equally apply to the rudimentary sensory circuit of this roundworm phylum. These many 2016 correlations across natural scales seem to find a generic dynamic complexity in the guise of a cerebral icon with an “omics” suffix, which is in effect from a quickening uniVerse to our human epitome.

Connectomics represents an effort to map brain structure at the level of individual neurons and their synaptic connections. However, neural circuits also depend on other types of interneuronal signalling, such as extrasynaptic modulation by monoamines and peptides. Here we present a draft monoamine connectome, along with a partial neuropeptide connectome, for the nematode C. elegans, based on new and published expression data for biosynthetic genes and receptors. We describe the structural properties of these "wireless" networks, including their topological features and modes of interaction with the wired synaptic and gap-junction connectomes. This multilayer connectome of C. elegans can serve as a prototype for understanding the multiplex networks comprising larger nervous systems, including the human brain. (Summary)

Betti, Alessandro and Marco Gori. The Principle of Least Cognitive Action. Theoretical Computer Science. 633/83, 2016. In a special issue on Biologically Inspired Processes in Neural Computation, edited by UM Amherst computer scientist Hava Siegelmann, a University of Pisa physicist and a University of Siena mathematician consider how nature’s evolutionary ascent of cerebral cognizance can be traced to and rooted in an intrinsic physical lawfulness. We even wonder if a new field such as “physioinformatics” (suggest a better term) could be added? See also in this issue The Grammar of Mammalian Brain Capacity, Topological Evolution for Embodied Cellular Automata, and A Circuit Basis for Morphogenesis.

By and large, the interpretation of learning as a computational process taking place in both humans and machines is primarily provided in the framework of statistics. In this paper, we propose a radically different perspective in which the emergence of learning is regarded as the outcome of laws of nature that govern the interactions of intelligent agents with their own environment. We introduce a natural learning theory based on the principle of least cognitive action, which is inspired to the related mechanical principle, and to the Hamiltonian framework for modeling the motion of particles. The introduction of the kinetic and of the potential energy leads to a surprisingly natural interpretation of learning as a dissipative process. The kinetic energy reflects the temporal variation of the synaptic connections, while the potential energy is a penalty that describes the degree of satisfaction of the environmental constraints. The theory gives a picture of learning in terms of the energy balancing mechanisms, where the novel notions of boundary and bartering energies are introduced. (Abstract)

In this paper, we investigate the emergence of learning as the outcome of laws of nature that govern the interactions of intelligent agents with their own environment, regardless of their nature. The underlying principle is that the acquisition of cognitive skills by learning obeys information-based laws on these interactions, which hold regardless of biology. In this new perspective, in particular, we introduce a natural learning theory aimed at discovering the fundamental temporally-embedded mechanisms of environmental interaction. (83-84) More than looking for smart algorithmic solutions to deal with temporal coherence, in this paper we rely on the idea of regarding learning as a process which is deeply interwound with time, just like in most laws of nature. (84)
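The mechanical analogy in the abstract can be sketched schematically; the symbols below are illustrative placeholders, not the authors' exact notation:

```latex
% Schematic "cognitive action" over synaptic weights w(t):
% the kinetic term measures temporal variation of the connections,
% the potential term penalizes unsatisfied environmental constraints.
S[w] = \int_{0}^{T} \Big( \underbrace{\tfrac{1}{2}\,\|\dot{w}(t)\|^{2}}_{\text{kinetic}}
       \;-\; \underbrace{U\big(w(t), t\big)}_{\text{potential}} \Big)\, dt,
\qquad \delta S = 0 .
```

Stationarity of S then plays the role that least action plays in mechanics, with learning unfolding as the resulting dissipative trajectory of the weights.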

Cabessa, Jeremie and Hava Siegelmann. The Computational Power of Interactive Recurrent Neural Networks. Network: Computation in Neural Systems. 24/4, 2012. Reviewed more in Universal Darwinism, these University of Massachusetts, Amherst, computational neuroscientists take these cerebral complexities to exemplify how nature evolves, develops and learns. We are then invited to realize that the same dynamical trial and error, feedback to move forward, iterative process is in effect everywhere. See also Turing on Super-Turing and Adaptivity by Hava Siegelmann in Progress in Biophysics and Molecular Biology (113/117, 2013).

Chastain, Erick, et al. Algorithms, Games, and Evolution. Proceedings of the National Academy of Sciences. 111/10620, 2014. This entry is cited in the Kate Douglas report that headlines the section. Reported more in Natural Algorithms, with Adi Livnat, Christos Papadimitriou, and Umesh Vazirani, these theoretical biologists pose a working comparison between natural selection, seen as a search, iterate, and optimize process, and a machine learning procedure called the “multiplicative weights update algorithm” (MWUA). Both modes involve explorations and samplings of diverse populations, subject to trials, errors, and retests, so as to reach a “good enough” state or solution. A Commentary in the same issue, Diverse Forms of Selection in Evolution and Computer Science by Nicholas Barton, et al, supports the finding.
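A minimal sketch of MWUA, with illustrative payoffs and learning rate (not the paper's parameters), shows the multiplicative update and renormalization that parallel selection acting on a gene pool:

```python
import random

def mwua(payoffs, eta=0.1, rounds=200, seed=0):
    """Multiplicative weights update: each round, multiply every
    alternative's weight by (1 + eta * payoff), then renormalize."""
    rng = random.Random(seed)
    n = len(payoffs)
    w = [1.0] * n
    for _ in range(rounds):
        for i in range(n):
            # noisy payoff around each alternative's mean fitness
            p = payoffs[i] + rng.uniform(-0.05, 0.05)
            w[i] *= (1.0 + eta * p)
        total = sum(w)
        w = [x / total for x in w]  # renormalize, as in a gene pool
    return w

weights = mwua([0.2, 0.5, 0.9])   # third alternative has highest payoff
print(weights)                     # mass concentrates on the best option
```

The renormalized weights behave like allele frequencies: alternatives with persistently higher payoff come to dominate, which is the formal bridge the paper draws.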

Churchill, Alexander, et al. Learning to Generate Genotypes with Neural Networks. arXiv:1604.04153. Reviewed more in Natural Algorithms, Queen Mary University of London researchers enter another way to understand how genomes and connectomes (neuromes) are basically similar in form and function.

de Vladar, Harold and Eors Szathmary. Neuronal Boost to Evolutionary Dynamics. Interface Focus. 5/6, 2015. In an issue asking Are There Limits to Evolution?, biologists at the Parmenides Foundation, Germany, and Eotvos University, Hungary, who have been collaborators on a novel view of life’s gestation akin to neural networks (search Watson), here view brain development via Darwinian genetic inheritance as a replicative neurodynamics. In this sense, analogous rugged landscapes serve a “fitness climbing and learning” function. A notable precursor cited is Hebb and Darwin by Paul Adams in the Journal of Theoretical Biology (195/419, 1998). A current companion would be Understanding Emergent Dynamics by John Hopfield, a 1980s founder of neural network theory (Neural Computation 27/10, 2015).

Standard evolutionary dynamics is limited by the constraints of the genetic system. A central message of evolutionary neurodynamics is that evolutionary dynamics in the brain can happen in a neuronal niche in real time, despite the fact that neurons do not reproduce. We show that Hebbian learning and structural synaptic plasticity broaden the capacity for informational replication and guided variability provided a neuronally plausible mechanism of replication is in place. The synergy between learning and selection is more efficient than the equivalent search by mutation selection. We also consider asymmetric landscapes and show that the learning weights become correlated with the fitness gradient. That is, the neuronal complexes learn the local properties of the fitness landscape, resulting in the generation of variability directed towards the direction of fitness increase, as if mutations in a genetic pool were drawn such that they would increase reproductive success. (Abstract)

In higher animals, complex and robust behaviors are produced by the microscopic details of large structured ensembles of neurons. I describe how the emergent computational dynamics of a biologically based neural network generates a robust natural solution to the problem of categorizing time-varying stimulus patterns such as spoken words or animal stereotypical behaviors. (Hopfield Abstract) Robust emergent behaviors of large systems (whether physical or biological) are a logical consequence of the behaviors and interactions of the more microscopic elements making up the large system. (2030).
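The Hebbian storage-and-recall mechanism these entries invoke can be sketched with a toy Hopfield-style network (an illustration under simple assumptions, not the authors' model): experience is written into the weights, and a corrupted cue is pulled back to the stored memory.

```python
def hebbian_weights(patterns, n):
    """Hebb rule: w[i][j] += x[i] * x[j] for each stored +/-1 pattern."""
    w = [[0.0] * n for _ in range(n)]
    for x in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += x[i] * x[j]
    return w

def recall(w, cue, steps=5):
    """Synchronously update +/-1 states by the sign of the local field."""
    x = list(cue)
    n = len(x)
    for _ in range(steps):
        x = [1 if sum(w[i][j] * x[j] for j in range(n)) >= 0 else -1
             for i in range(n)]
    return x

stored = [1, -1, 1, 1, -1, -1, 1, -1]
w = hebbian_weights([stored], n=8)
noisy = list(stored)
noisy[0] = -noisy[0]           # flip one bit of the memory
print(recall(w, noisy))        # recovers the stored pattern
```

This error-correcting pull toward a remembered attractor is the cerebral behavior that the neurodynamics papers above map onto evolutionary search.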

Fernando, Chrisantha, et al. Selectionist and Evolutionary Approaches to Brain Function. Frontiers in Computational Neuroscience. 6/Art. 24, 2012. With Eors Szathmary and Phil Husbands, another contribution that articulates the deep affinity of neural activities with life’s long iterative development. As Richard Watson, Hava Siegelmann, John Mayfield, Steven Frank, and an increasing number contend, this achieves a 21st century appreciation of how “natural selection” actually applies. While a winnowing optimization toward “good enough to survive” goes on, the discovery of dynamic, learning-like algorithms can now provide a prior genetic-like guidance.

We consider approaches to brain dynamics and function that have been claimed to be Darwinian. These include Edelman’s theory of neuronal group selection, Changeux’s theory of synaptic selection and selective stabilization of pre-representations, Seung’s Darwinian synapse, Loewenstein’s synaptic melioration, Adam’s selfish synapse, and Calvin’s replicating activity patterns. Except for the last two, the proposed mechanisms are selectionist but not truly Darwinian, because no replicators with information transfer to copies and hereditary variation can be identified in them. Bayesian models and reinforcement learning are formally in agreement with selection dynamics. A classification of search algorithms is shown to include Darwinian replicators (evolutionary units with multiplication, heredity, and variability) as the most powerful mechanism for search in a sparsely occupied search space. Finally, we review our recent attempts to construct and analyze simple models of true Darwinian evolutionary units in the brain in terms of connectivity and activity copying of neuronal groups. (Abstract)

The production of functional molecules is critical for life and also for an increasing proportion of industry. It is also important that genes represent what in cognitive science has been called a “physical symbol system.” Today, the genetic code is an arguably symbolic mapping between nucleotide triplets and amino acids. Moreover, enzymes “know” how to transform a substrate into a product, much like a linguistic rule “knows” how to act on some linguistic constructions to produce others. How can such functionality arise? Combinatorial chemistry is one of the possible approaches. The aim is to generate-and-test a complete library of molecules up to a certain length. (9)

In summary we have distinguished between selectionist and truly Darwinian theories, and have proposed a truly Darwinian theory of Darwinian Neurodynamics. The suggestion that true Darwinian evolution can happen in the brain during, say, complex thinking, or the development of language in children, is ultimately an empirical issue. Three possible outcomes are possible: (i) nothing beyond the synapse level undergoes Darwinian evolution in the brain; (ii) units of evolution will be identified that are very different from our “toy model” suggestions in this paper (and elsewhere); and (iii) some of the units correspond, with more complex details, to our suggested neuronal replicators. (17)

Fernando, Chrisantha, et al. The Neuronal Replicator Hypothesis. Neural Computation. 22/2809, 2010. As this decadal review expresses, many 21st century cross-fertilizations between natural and social fields are underway by our collaborative humankind. Here neuroscientist Fernando, with Richard Goldstein and Eors Szathmary, proposes the presence of evolutionary algorithms in cerebral functions, which search, improve, and select as we learn and think. See also Evolvable Neuronal Paths: A Novel Basis for Information and Search in the Brain by CF, et al, in PLoS One (6/8, 2011).

We propose that replication (with mutation) of patterns of neuronal activity can occur within the brain using known neurophysiological processes. Thereby evolutionary algorithms implemented by neuronal circuits can play a role in cognition. Replication of structured neuronal representations is assumed in several cognitive architectures. Replicators overcome some limitations of selectionist models of neuronal search. (Abstract)
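A minimal mutation-and-selection sketch over bitstrings (illustrative parameters, not the authors' neuronal implementation) conveys the replicator dynamics being proposed: patterns are copied with variation, and the fitter copies persist.

```python
import random

def evolve(length=20, pop_size=30, generations=60, mut_rate=0.02, seed=1):
    """Toy evolutionary algorithm: truncation selection plus
    replication with per-bit mutation; fitness = number of 1s."""
    rng = random.Random(seed)
    fitness = lambda g: sum(g)
    pop = [[rng.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]      # selection: keep the fitter half
        offspring = []
        for p in parents:                    # replication with mutation
            child = [1 - b if rng.random() < mut_rate else b for b in p]
            offspring.append(child)
        pop = parents + offspring
    return max(pop, key=fitness)

best = evolve()
print(sum(best), "ones out of 20")
```

Substituting copied neuronal activity patterns for the bitstrings is, in essence, the replicator substrate the hypothesis asks the brain to supply.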

Hochberg, Michael, et al. Innovation: An Emerging Focus from Cells to Societies. Philosophical Transactions of the Royal Society B. 372/1736, 2017. Hochberg, University of Montpellier, Pablo Marquet, Santa Fe Institute, Robert Boyd, Arizona State University, and Andreas Wagner, University of Zurich introduce a focus issue with this title. We place its 16 papers by leading researchers and theorists in this section because they attempt to specify an important tendency of life’s developmental evolution to seek behavioral, communal, and artificial novelties for survival and thrival. With a notice of the major transitions scale, an intensifying cumulative culture and intelligent knowledge can be traced from microbe colonies to civilizations. A. Wagner has been an advocate (search), which he reviews in Information Theory, Evolutionary Innovations and Evolvability. Douglas Erwin follows with The Topology of Evolutionary Novelty. Some other entries are The Origin of Heredity in Protocells, Nascent Life Cycles and the Emergence of Higher-Level Individuality, and Innovation and the Growth of Human Population.

This insight into life’s personal and communal cleverness, as the episodic tandem of complexity and sentience arises to our global retrospect, alludes to a quickening individuality. Although not noted, a companion effort is the Open-Ended Creativity school of Wolfgang Banzhaf, Hector Zenil, Sara Walker, Ricard Sole and company, search each. While the ratio of men to women authors remains 10 to 1, a luminous entry is Innovation and Social Transmission in Experimental Micro-Societies: Exploring the Scope of Cumulative Culture in Young Children by Nicola McGuigan, Emily Burdet, Vanessa Burgess, Lewis Dean, Amanda Lucas, Gillian Vale, and Andrew Whiten. An Abstract for this iconic microcosm is the third quote.

Innovations are generally unexpected, often spectacular changes in phenotypes and ecological functions. The contributions to this theme issue are the latest conceptual, theoretical and experimental developments, addressing how ecology, environment, ontogeny and evolution are central to understanding the complexity of the processes underlying innovations. Here, we set the stage by introducing and defining key terms relating to innovation and discuss their relevance to biological, cultural and technological change. Discovering how the generation and transmission of novel biological information, environmental interactions and selective evolutionary processes contribute to innovation as an ecosystem will shed light on how the dominant features across life come to be, generalize to social, cultural and technological evolution, and have applications in the health sciences and sustainability. (Main Abstract)

How difficult is it to ‘discover’ an evolutionary adaptation or innovation? I here suggest that information theory, in combination with high-throughput DNA sequencing, can help answer this question by quantifying a new phenotype's information content. I apply this framework to compute the phenotypic information associated with novel gene regulation and with the ability to use novel carbon sources. The framework can also help quantify how DNA duplications affect evolvability, estimate the complexity of phenotypes and clarify the meaning of ‘progress’ in Darwinian evolution. (Wagner Abstract)

The experimental study of cumulative culture and the innovations essential to it is a young science, with child studies so rare that the scope of cumulative cultural capacities in childhood remains largely unknown. Here we report a new experimental approach to the inherent complexity of these phenomena. Groups of 3–4-year-old children were presented with an elaborate array of challenges affording the potential cumulative development of a variety of techniques to gain increasingly attractive rewards. We found evidence for elementary forms of cumulative cultural progress, with inventions of solutions at lower levels spreading to become shared innovations, and some children then building on these to create more advanced but more rewarding innovations. This contrasted with markedly more constrained progress when children worked only by themselves, or if groups faced only the highest-level challenges from the start. Our results show children are not merely ‘cultural sponges’, but when acting in groups, display the beginnings of cycles of innovation and observational learning that sustain cumulative progress in problem solving. (McGuigan Abstract)

Khajehabdollahi, Sina and Olaf Witkowski. Critical Learning vs. Evolution. Ikegami, Takashi, et al, eds. ALIFE 2018 Conference Proceedings. Cambridge: MIT Press, 2018. A select paper from this online volume by University of Western Ontario and Earth-Life Science Institute, Tokyo, biophysicists who seek better insights into life’s quickening sentience by way of inherent complexity principles. By current turns, these dynamic phenomena appear to be increasingly cerebral in kind and function. In an extension of this view, just as brains are found to prefer and reside in a critically poised optimum state, so too, it seems, does evolutionary developmental emergence.

Criticality is thought to be crucial for complex systems to adapt, at the boundary between regimes with different dynamics, where the system may transition from one phase to another. Numerous systems, from sandpiles to gene regulatory networks, to swarms and human brains, seem to work towards preserving a precarious balance right at their critical point. Understanding criticality therefore seems strongly related to a broad, fundamental theory for the physics of life as it could be, which still lacks a clear description of how it can arise and maintain itself in complex systems. (Abstract excerpt)

Understanding the utility of criticality in artificial life systems is important for understanding how complexity can self-organize into predictable but adaptive systems. This project applied the methods of critical learning to a community of Ising-embodied organisms subject to evolutionary selection pressures in order to understand how criticality affects the behavior and genotypes of the organisms and how these changes in turn affect the fitness and adaptability of the community. (53)
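The critically poised balance discussed here can be illustrated with a toy 2D Ising model, the workhorse system for phase transitions; lattice size, temperatures, and sweep counts below are illustrative choices, not the paper's setup:

```python
import math
import random

def magnetization(T, size=16, sweeps=300, seed=2):
    """Metropolis simulation of a 2D Ising lattice at temperature T;
    returns |magnetization| per spin after the given number of sweeps."""
    rng = random.Random(seed)
    s = [[1 for _ in range(size)] for _ in range(size)]  # ordered start
    for _ in range(sweeps * size * size):
        i, j = rng.randrange(size), rng.randrange(size)
        nb = (s[(i + 1) % size][j] + s[(i - 1) % size][j]
              + s[i][(j + 1) % size] + s[i][(j - 1) % size])
        dE = 2 * s[i][j] * nb                 # energy cost of flipping
        if dE <= 0 or rng.random() < math.exp(-dE / T):
            s[i][j] = -s[i][j]
    m = sum(sum(row) for row in s) / size ** 2
    return abs(m)

print("T=1.5:", magnetization(1.5))   # below Tc: ordered, |m| near 1
print("T=4.0:", magnetization(4.0))   # above Tc: disordered, |m| near 0
```

Near the critical temperature (about 2.27 for this model) the system sits between these regimes, with large correlated fluctuations, which is the poised state the paper's Ising-embodied organisms are evolved toward.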

Olaf Witkowski’s research tackles distributed intelligence in living systems and societies, employing the tools of artificial life, connectionist learning, and information theory, to reach a better understanding of the following triptych of complex phenomena: the emergence of information flows that led to the origins of life, the evolution of intelligence in the major evolutionary transitions, the expansion of communication and cooperation in the future of the bio- and technosphere. (OW website)
