Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

VI. Life’s Cerebral Cognizance Becomes More Complex, Smarter, Informed, Proactive, Self-Aware

1. Intelligence Evolution and Knowledge Gain as a Central Course

Evolutionary Biology and the Theory of Computing. simons.berkeley.edu/programs/evolution2014. Reported more in Natural Algorithms, this is a Simons Foundation program held in spring 2014, whose organizers included Christos Papadimitriou and Leslie Valiant. Its seminars included an Evolutionary Biology Boot Camp (January tutorial) with Eugene Koonin; Computational Theories of Evolution, organized by Nick Barton, with speakers such as Richard Watson, Chrisantha Fernando, Adi Livnat, and Steven Frank; and New Directions in Probabilistic Models of Evolution.

Azulay, Aharon, et al. The C. elegans Connectome Consists of Homogenous Circuits with Defined Functional Roles. PLoS Computational Biology. Online September, 2016. At the frontier of integral syntheses across cerebral, social, and cognitive evolution, Hebrew University, Jerusalem, geneticists and neuroscientists distill from neural network complexities a common, independent principle at work from Internet media and human brains to this multicellular roundworm. In reflection, a designated "connectome" can be seen in ramifying effect across life's episodic emergence. See also in this section The Multilayer Connectome of Caenorhabditis elegans by Bentley, et al., for a similar study.

A major goal of systems neuroscience is to decipher the structure-function relationship in neural networks. Here we study network functionality in light of the common-neighbor-rule (CNR) in which a pair of neurons is more likely to be connected the more common neighbors it shares. Focusing on the fully-mapped neural network of C. elegans worms, we establish that the CNR is an emerging property in this connectome. Moreover, sets of common neighbors form homogenous structures that appear in defined layers of the network. Simulations of signal propagation reveal their potential functional roles: signal amplification and short-term memory at the sensory/inter-neuron layer, and synchronized activity at the motoneuron layer supporting coordinated movement. A coarse-grained view of the neural network based on homogenous connected sets alone reveals a simple modular network architecture that is intuitive to understand. These findings provide a novel framework for analyzing larger, more complex, connectomes once these become available. (Abstract)

While the emergence of the CNR in social networks might be intuitive to understand, the benefit of such a design in neural networks is not trivially apparent. In the rat cortex the observed CNR is thought to be an organizing principle that clusters neurons into elementary building blocks of cortical computations and memory. Our study provides several novel insights to this phenomenon: we established that the CNR is indeed an emerging organizing principle in the C. elegans neural network, and that sets of common neighbor neurons can be viewed as building blocks found in defined layers of the network exerting valuable functional roles (e.g., signal amplification, synchronization, and robust information processing). These novel findings may explain the emergence of the CNR in mammalian neural networks as well. For example, signal amplification and robust information processing are essential for efficient cortical computations. Thus, it will be fascinating to study these cortical building blocks in light of the observed CNR once such connectomes become available. (10)
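
A minimal sketch of how the common-neighbor-rule can be checked on any connectome, in Python, assuming an undirected adjacency given as node-to-neighbor sets; the toy graph below is illustrative, not the C. elegans data:

    # Test the common-neighbor-rule (CNR): are pairs of neurons with more
    # shared neighbors more likely to be directly connected?
    from itertools import combinations
    from collections import defaultdict

    # Toy adjacency (a stand-in for a real connectome): node -> neighbor set
    adj = {
        0: {1, 2, 3}, 1: {0, 2, 3}, 2: {0, 1, 3, 4},
        3: {0, 1, 2, 4}, 4: {2, 3, 5}, 5: {4},
    }

    stats = defaultdict(lambda: [0, 0])   # common-neighbor count -> [edges, pairs]
    for u, v in combinations(adj, 2):
        k = len(adj[u] & adj[v])          # number of shared neighbors
        stats[k][0] += v in adj[u]        # 1 if the u-v edge exists
        stats[k][1] += 1

    for k in sorted(stats):               # under the CNR this ratio rises with k
        edges, pairs = stats[k]
        print(f"{k} common neighbors: P(connected) = {edges}/{pairs}")

Under the CNR, the printed connection probability should climb with the number of common neighbors, which is the emergent property the authors establish for the worm's fully mapped network.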

Bentley, Barry, et al. The Multilayer Connectome of Caenorhabditis elegans. arXiv:1608.08793. A six-person team from the MRC Laboratory of Molecular Biology, Cambridge, UK, and Cambridge University, including neuroscientist Edward Bullmore, explores ways in which neural networks, as a prototypical nonlinear system, can equally apply to the rudimentary sensory circuit of this roundworm phylum. These many 2016 correlations across natural scales seem to find a generic dynamic complexity in the guise of a cerebral icon with an "omics" suffix, in effect from a quickening uniVerse to our human epitome.

Connectomics represents an effort to map brain structure at the level of individual neurons and their synaptic connections. However, neural circuits also depend on other types of interneuronal signalling, such as extrasynaptic modulation by monoamines and peptides. Here we present a draft monoamine connectome, along with a partial neuropeptide connectome, for the nematode C. elegans, based on new and published expression data for biosynthetic genes and receptors. We describe the structural properties of these "wireless" networks, including their topological features and modes of interaction with the wired synaptic and gap-junction connectomes. This multilayer connectome of C. elegans can serve as a prototype for understanding the multiplex networks comprising larger nervous systems, including the human brain. (Summary)
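
A brief sketch of the multilayer idea in Python, with each layer held as a set of directed edges; the neuron names are real C. elegans classes but the edges themselves are placeholders, not the published maps:

    # A multilayer connectome as named edge-set layers: how much does a
    # "wireless" (monoamine) layer overlap the wired synaptic one?
    layers = {
        "synaptic":     {("ASH", "AVA"), ("AVA", "AVB"), ("AVB", "VB1")},
        "gap_junction": {("AVA", "AVB"), ("VB1", "VB2")},
        "monoamine":    {("ASH", "AVB"), ("AVA", "AVB")},  # extrasynaptic
    }

    def overlap(a, b):
        """Fraction of layer a's edges that also appear in layer b."""
        return len(layers[a] & layers[b]) / len(layers[a])

    for name in layers:
        print(f"{name} vs synaptic: {overlap(name, 'synaptic'):.2f}")

Interlayer overlap statistics of this kind are among the "modes of interaction" between wired and wireless networks that the paper characterizes.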

Betti, Alessandro and Marco Gori. The Principle of Least Cognitive Action. Theoretical Computer Science. 633/83, 2016. In a special issue on Biologically Inspired Processes in Neural Computation, edited by UM Amherst computer scientist Hava Siegelmann, a University of Pisa physicist and a University of Siena mathematician consider how nature's evolutionary ascent of cerebral cognizance can be traced to and rooted in an intrinsic physical lawfulness. We even wonder whether a new field such as "physioinformatics" (pending a better term) might be in the offing. See also in this issue The Grammar of Mammalian Brain Capacity, Topological Evolution for Embodied Cellular Automata, and A Circuit Basis for Morphogenesis.

By and large, the interpretation of learning as a computational process taking place in both humans and machines is primarily provided in the framework of statistics. In this paper, we propose a radically different perspective in which the emergence of learning is regarded as the outcome of laws of nature that govern the interactions of intelligent agents with their own environment. We introduce a natural learning theory based on the principle of least cognitive action, which is inspired by the related mechanical principle and by the Hamiltonian framework for modeling the motion of particles. The introduction of the kinetic and of the potential energy leads to a surprisingly natural interpretation of learning as a dissipative process. The kinetic energy reflects the temporal variation of the synaptic connections, while the potential energy is a penalty that describes the degree of satisfaction of the environmental constraints. The theory gives a picture of learning in terms of the energy balancing mechanisms, where the novel notions of boundary and bartering energies are introduced. (Abstract)

In this paper, we investigate the emergence of learning as the outcome of laws of nature that govern the interactions of intelligent agents with their own environment, regardless of their nature. The underlying principle is that the acquisition of cognitive skills by learning obeys information-based laws on these interactions, which hold regardless of biology. In this new perspective, in particular, we introduce a natural learning theory aimed at discovering the fundamental temporally-embedded mechanisms of environmental interaction. (83-84) More than looking for smart algorithmic solutions to deal with temporal coherence, in this paper we rely on the idea of regarding learning as a process which is deeply interwound with time, just like in most laws of nature. (84)
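
The mechanical analogy can be made concrete. As a one-weight sketch in the spirit of, though not necessarily identical to, the authors' functional, take kinetic energy from the rate of synaptic change, a potential V as the environmental-constraint penalty, and an explicit dissipation factor:

    \mathcal{A}[w] \;=\; \int_0^T e^{\theta t} \left[ \frac{m}{2}\,\dot{w}(t)^2 - V\big(w(t),t\big) \right] dt

    \frac{d}{dt}\!\left( e^{\theta t}\, m\, \dot{w} \right) + e^{\theta t}\, \partial_w V = 0
    \;\;\Longrightarrow\;\;
    m\ddot{w} + m\theta\,\dot{w} = -\,\partial_w V

In the overdamped limit the stationary trajectory reduces to \dot{w} \propto -\partial_w V, ordinary gradient-descent learning, which is one concrete reading of the abstract's claim that learning emerges as a dissipative process.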

Brun-Usan, Miguel, et al. How to Fit In: The Learning Principles of Cell Differentiation. PLoS Computational Biology. April, 2020. University of Southampton, UK, computer scientists including Richard Watson continue their revisionary studies of biological metabolisms by viewing them through a learning lens. A cerebral perspective, as this section reports, can provide better insights into cellular processes because both evolution and learning are explorations in search of solutions. A further step is to integrate this view with gene regulatory networks so that these common models can reinforce each other. Altogether this approach implies that life's oriented emergence is trying to achieve some manner of its own self-description and comprehension.

Cell differentiation in multicellular organisms requires cells to respond to complex combinations of extracellular cues, such as morphogen concentrations. But a general theory describing how cells integrate multi-dimensional signals is still lacking. In this work, we propose a framework from learning theory to understand the relationships between environmental cues (inputs) and phenotypic responses (outputs) underlying cell plasticity. … Altogether, these results illustrate the functional parallelisms between learning in neural networks and the action of natural selection on environmentally sensitive gene regulatory networks. This offers a theoretical framework for responses that integrate information from multiple cues, a phenomenon that underpins the evolution of multicellularity and developmental robustness. (Abstract excerpt)
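
A toy rendering of the cue-integration idea in Python: a gene regulatory response is modeled as a small sigmoidal map from morphogen inputs to a phenotype, and its "learning" proceeds by mutation plus selection rather than backpropagation; the cue-target pairs are invented for illustration:

    # A gene regulatory network as a learner: a sigmoidal cue -> response map
    # whose weights improve by mutation + selection, not backpropagation.
    import math
    import random

    cues    = [(0.1, 0.9), (0.8, 0.2), (0.5, 0.5)]   # morphogen concentrations
    targets = [1.0, 0.0, 0.5]                        # favored response per cue

    def respond(w, x):                               # sigmoidal gene response
        return 1 / (1 + math.exp(-(w[0]*x[0] + w[1]*x[1] + w[2])))

    def fitness(w):                                  # selection's score
        return -sum((respond(w, x) - t) ** 2 for x, t in zip(cues, targets))

    w = [0.0, 0.0, 0.0]
    for _ in range(5000):                            # mutate; keep if fitter
        m = [wi + random.gauss(0, 0.1) for wi in w]
        if fitness(m) > fitness(w):
            w = m
    print("evolved weights:", [round(wi, 2) for wi in w])

Selection acting on such an environmentally sensitive map behaves like supervised training, which is the parallelism the abstract draws.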

Cabessa, Jeremie and Hava Siegelmann. The Computational Power of Interactive Recurrent Neural Networks. Neural Computation. 24/4, 2012. Reviewed more in Universal Darwinism, University of Massachusetts, Amherst, computational neuroscientists take these cerebral complexities to exemplify how nature evolves, develops, and learns. We are then invited to realize that the same dynamical trial and error, feedback to move forward, iterative process is in effect everywhere. See also Turing on Super-Turing and Adaptivity by Hava Siegelmann in Progress in Biophysics and Molecular Biology (113/117, 2013).

Chastain, Erick, et al. Algorithms, Games, and Evolution. Proceedings of the National Academy of Sciences. 111/10620, 2014. This entry is cited in the Kate Douglas report that headlines the section. Reported more in Natural Algorithms, theoretical biologists, here with Adi Livnat, Christos Papadimitriou, and Umesh Vazirani, pose a working comparison between natural selection, seen as a search, iterate, and optimize process, and a machine learning procedure called the "multiplicative weight updates algorithm" (MWUA). Both modes involve explorations and samplings of diverse populations, subject to trials, errors, then retests, so as to reach a "good enough" state or solution. A Commentary in the same issue, Diverse Forms of Selection in Evolution and Computer Science by Nicholas Barton, et al., supports the finding.
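
For readers unfamiliar with MWUA, a minimal Python sketch with per-round gains as placeholders: each option's weight is multiplied by (1 + eps) raised to its payoff and then renormalized, the same bookkeeping by which allele frequencies grow in proportion to fitness:

    # Multiplicative weight updates (MWUA): multiply, then renormalize --
    # structurally akin to allele frequencies rising with fitness.
    def mwua(gains_per_round, eps=0.1):
        w = [1.0] * len(gains_per_round[0])
        for gains in gains_per_round:
            w = [wi * (1 + eps) ** g for wi, g in zip(w, gains)]
            total = sum(w)
            w = [wi / total for wi in w]   # weights as "population" shares
        return w

    rounds = [[1.0, 0.0, 0.5], [1.0, 0.2, 0.4], [0.9, 0.1, 0.6]]
    print(mwua(rounds))                    # the steadier performer dominates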

Churchill, Alexander, et al. Learning to Generate Genotypes with Neural Networks. arXiv:1604.04153. Reviewed more in Natural Algorithms, Queen Mary University of London researchers enter another way to understand how genomes and connectomes (neuromes) are basically similar in form and function.

Csermely, Peter, et al. Learning of Signaling Networks. arXiv:2001.11679. We cite this paper by five Semmelweis University, Budapest, system scientists as an example of how cerebral facilities can be readily grafted onto, and found evident in, all manner of genetic and metabolic anatomy and physiology, because they naturally spring from and manifest one and the same source. By this perception, life's long evolutionary development can increasingly appear as an oriented encephalization and cosmic education.

Molecular processes of neuronal learning have been well-described. However, learning mechanisms of non-neuronal cells have not been fully understood. Here, we discuss molecular mechanisms of cellular learning, including conformational memory of intrinsically disordered proteins and prions, signaling cascades, protein translocation, RNAs, and chromatin memory. We hypothesize that these processes constitute the learning of signaling networks and correspond to a generalized Hebbian learning process of single, non-neuronal cells. We then discuss how cellular learning may open novel directions in drug design and inspire new artificial intelligence methods. (Abstract)
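
The generalized Hebbian process the authors hypothesize can be caricatured in a few lines of Python: a link between two signaling components strengthens whenever both are co-active. The node names and activity traces are hypothetical:

    # Hebb-like strengthening on a cellular signaling network: a link grows
    # when its two endpoints are active together. All values are invented.
    nodes = ["receptor", "kinase", "TF"]            # TF: transcription factor
    activity = [                                    # activity per time step
        {"receptor": 1.0, "kinase": 0.8, "TF": 0.1},
        {"receptor": 0.9, "kinase": 0.9, "TF": 0.2},
        {"receptor": 0.1, "kinase": 0.2, "TF": 0.9},
    ]

    eta = 0.5                                       # learning rate
    w = {(a, b): 0.0 for a in nodes for b in nodes if a < b}
    for state in activity:
        for (a, b) in w:                            # dw = eta * x_a * x_b
            w[(a, b)] += eta * state[a] * state[b]

    print(max(w, key=w.get), "is the most reinforced link")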

Czegel, Daniel, et al. Bayes and Darwin: How Replicator Populations Implement Bayesian Computations. BioEssays. 44/4, 2022. DC and Eors Szathmary, Institute of Evolution, Budapest, Hamza Giaffar, Cold Spring Harbor Laboratory, and Joshua Tenenbaum, MIT, continue their project to perceive and identify life's developmental emergence as primarily a cerebral, cognitive, knowledge-gaining occasion. It is argued that every organism across all Metazoan domains must be aware of, learn about, and be able to predict their ever-changing environs so as to survive. Here bodily evolution (Darwin) and proactive mind (Bayes) are seen to advance in a parallel, reciprocal way. Their 2020s version can now be informed by novel probabilistic, iterative, cognitive models. Writ large, a semblance of a self-educating, making, affirming, autocatalytic, proactive reality becomes evident. A premier feature is then a consilience of past reference and open future. See also Novelty and Imitation within the Brain by this group in Nature Scientific Reports (11:12513, 2021).

Bayesian learning theory and evolutionary theory both formalize adaptive competition dynamics in variable environments. What do they have in common and how do they differ? In this paper, we discuss structural and process analogies at a computational and an algorithmic-mechanical level. We point out mathematical equivalence and isomorphism between Bayesian update and replicator dynamics. We discuss how these mechanisms provide similar ways to adapt to stochastic conditions at multiple timescales. We thus find replicator populations to encode regularities so as to predict future environments. As a notable result, a unified view of the theories of learning and evolution can be achieved. (Abstract)
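
The asserted equivalence can be written in one line each way (generic notation, not the paper's):

    \text{Bayes:}\;\; P(h_i \mid d) \;=\; \frac{P(d \mid h_i)\, P(h_i)}{\sum_j P(d \mid h_j)\, P(h_j)}
    \qquad
    \text{Replicator:}\;\; x_i' \;=\; \frac{f_i\, x_i}{\sum_j f_j\, x_j}

Identifying type frequencies x_i with prior probabilities P(h_i) and fitnesses f_i with likelihoods P(d | h_i), one generation of selection is formally one Bayesian update; this is the sense in which a replicator population encodes regularities so as to predict future environments.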

Czegel, Daniel, et al. Novelty and Imitation within the Brain: A Darwinian Neurodynamic Approach to Combinatorial Problems. Nature Scientific Reports. 11:12513, 2021. DC, Eors Szathmary, Marton Csillag, and Balint Futo, Institute of Evolution, Budapest, along with Hamza Giaffar, Cold Spring Harbor Laboratory, post a latest version of their studies of life's creaturely evolution as most involved with progressively gaining intelligence and knowledge so as to best survive. See also Bayes and Darwin: How Replicator Populations Implement Bayesian Computations by this collegial team in BioEssays (44/4, 2022).

Efficient search in combinatorial spaces, such as those of possible action sequences, linguistic structures, or causal explanations, is an essential component of intelligence. Based on our prior work, we propose that a Darwinian process, operating over sequential cycles of imperfect copying and selection of neural informational patterns, is a promising candidate. In teacher and learner settings, we demonstrate that the emerging Darwinian population of readout activity patterns can maintain and continually improve upon existing solutions. A novel analysis method, neural phylogenies, is then proposed that displays the unfolding of the neural-evolutionary process. (Abstract excerpt)
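
A toy of the copy-mutate-select cycle in Python, with bit strings standing in for the paper's neural readout patterns and a simple target-matching score as the combinatorial problem:

    # Darwinian search: imperfect copying plus selection over a population.
    # Bit strings are a stand-in for neural readout activity patterns.
    import random

    TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]          # toy combinatorial goal
    fitness = lambda s: sum(a == b for a, b in zip(s, TARGET))

    pop = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
    for gen in range(200):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:10]                            # selection
        children = [[b ^ (random.random() < 0.05)     # imperfect copying
                     for b in random.choice(parents)] for _ in range(10)]
        pop = parents + children
        if max(map(fitness, pop)) == len(TARGET):
            print("solved at generation", gen)
            break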

de Vladar, Harold and Eors Szathmary. Neuronal Boost to Evolutionary Dynamics. Interface Focus. 5/6, 2015. In an issue on Are There Limits to Evolution?, Parmenides Foundation, Germany, and Eotvos University, Hungary, biologists, who have been collaborators on a novel view of life's gestation akin to neural networks (search Watson), here view brain development via Darwinian genetic inheritance as a replicative neurodynamics. In this sense, analogous rugged landscapes serve a "fitness climbing and learning" function. A notable precursor cited is Hebb and Darwin by Paul Adams in the Journal of Theoretical Biology (195/419, 1998). A current companion would be Understanding Emergent Dynamics by John Hopfield, a 1980s founder of neural network theory (Neural Computation, 27/10, 2015).

Standard evolutionary dynamics is limited by the constraints of the genetic system. A central message of evolutionary neurodynamics is that evolutionary dynamics in the brain can happen in a neuronal niche in real time, despite the fact that neurons do not reproduce. We show that Hebbian learning and structural synaptic plasticity broaden the capacity for informational replication and guided variability provided a neuronally plausible mechanism of replication is in place. The synergy between learning and selection is more efficient than the equivalent search by mutation selection. We also consider asymmetric landscapes and show that the learning weights become correlated with the fitness gradient. That is, the neuronal complexes learn the local properties of the fitness landscape, resulting in the generation of variability directed towards the direction of fitness increase, as if mutations in a genetic pool were drawn such that they would increase reproductive success. (Abstract)
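
The claim that learning weights come to track the fitness gradient can be sketched in Python: keep a running "drift" that correlates past mutations with their payoff, then bias new mutations toward it, so variation is drawn as if anticipating fitness increase (a caricature, not the paper's neural model):

    # Guided variability: a Hebb-like running correlation between mutation
    # direction and fitness payoff biases where the next mutation is drawn.
    import random

    def fitness(x):                          # smooth toy landscape, peak at 3
        return -(x - 3.0) ** 2

    x, bias = 0.0, 0.0
    for _ in range(300):
        step = random.gauss(bias, 0.5)       # mutation biased by learned drift
        gain = fitness(x + step) - fitness(x)
        if gain > 0:                         # selection keeps improvements
            x += step
        bias += 0.2 * gain * step            # correlate step with its payoff
        bias = max(-1.0, min(1.0, bias))     # keep the drift bounded
    print("reached x =", round(x, 2), "learned bias =", round(bias, 2))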

In higher animals, complex and robust behaviors are produced by the microscopic details of large structured ensembles of neurons. I describe how the emergent computational dynamics of a biologically based neural network generates a robust natural solution to the problem of categorizing time-varying stimulus patterns such as spoken words or animal stereotypical behaviors. (Hopfield Abstract) Robust emergent behaviors of large systems (whether physical or biological) are a logical consequence of the behaviors and interactions of the more microscopic elements making up the large system. (2030).
