VI. Life’s Cerebral Cognizance Becomes More Complex, Smarter, Informed, Proactive, Self-Aware

1. Intelligence Evolution and Knowledge Gain as a Central Course

Evolutionary Biology and the Theory of Computing. simons.berkeley.edu/programs/evolution2014. Reported more in Natural Algorithms, this is a Simons Foundation program in the spring of 2014 whose organizers include Christos Papadimitriou and Leslie Valiant. Various seminars were Evolutionary Biology Boot Camp (January tutorial) with Eugene Koonin; Computational Theories of Evolution, Nick Barton organizer, with speakers such as Richard Watson, Chrisantha Fernando, Adi Livnat, and Steven Frank; and New Directions in Probabilistic Models of Evolution.

Azulay, Aharon, et al. The C. elegans Connectome Consists of Homogenous Circuits with Defined Functional Roles. PLoS Computational Biology. Online September, 2016. At the frontier of integral syntheses across cerebral, social, and cognitive evolution, Hebrew University, Jerusalem, geneticists and neuroscientists distill from neural network complexities a common, independent principle which is at work from Internet media and human brains to this multicellular roundworm. In reflection, a designated “connectome” can be seen in ramifying effect across life’s episodic emergence. See also in this section The Multilayer Connectome of Caenorhabditis elegans by Bentley, et al for a similar study. A major goal of systems neuroscience is to decipher the structure-function relationship in neural networks. Here we study network functionality in light of the common-neighbor-rule (CNR), in which a pair of neurons is more likely to be connected the more common neighbors it shares. Focusing on the fully mapped neural network of C. elegans worms, we establish that the CNR is an emerging property in this connectome. Moreover, sets of common neighbors form homogenous structures that appear in defined layers of the network. Simulations of signal propagation reveal their potential functional roles: signal amplification and short-term memory at the sensory/inter-neuron layer, and synchronized activity at the motoneuron layer supporting coordinated movement. A coarse-grained view of the neural network based on homogenous connected sets alone reveals a simple modular network architecture that is intuitive to understand. These findings provide a novel framework for analyzing larger, more complex connectomes once these become available. (Abstract)

Bates, Timothy. Cognitive Rationality is Heritable and Lies under General Cognitive Ability. Intelligence. 108/101895, 2025. In this Elsevier journal for cerebral capabilities, a senior Edinburgh psychologist seeks to clarify and sort various terminologies in this endeavor. See also Possible Evidence for the Law of General Intelligence in Honey Bees in the 106/101856, 2024 issue. If one might take a Planatural PhiloSopher view, a long evolutionary emergent arrow of ever smarter communal knowledge becomes evident, as it may just now reach our Earthuman retrospective and potential self-realization. Intelligence and rationality both predict optimal decision making. However, whether cognitive rationality (CR) and general cognitive ability (CA) are identical or reflect fundamentally distinct processes is still debated. Here, we report a twin study to distinguish the cognitive mechanisms involved in CR and CA. Multivariate modelling of CA scales and CR indicated that CR was due to a latent g-factor, which itself was strongly heritable. We conclude that CR is not distinct from CA, but instead that the reflexive and reflective aspects of cognitive ability make CR a robust and efficient test of general cognitive ability. (Abstract)

Bentley, Barry, et al. The Multilayer Connectome of Caenorhabditis elegans. arXiv:1608.08793. A six person team from the MRC Laboratory of Molecular Biology, Cambridge, UK and Cambridge University, including neuroscientist Edward Bullmore, explore some ways in which neural networks, as a prototypical nonlinear system, can equally apply to the rudimentary sensory circuit of this roundworm phylum. These many 2016 correlations across natural scales seem to find a generic dynamic complexity in the guise of a cerebral icon with an “omics” suffix, which is in effect from a quickening uniVerse to our human epitome. Connectomics represents an effort to map brain structure at the level of individual neurons and their synaptic connections. However, neural circuits also depend on other types of interneuronal signalling, such as extrasynaptic modulation by monoamines and peptides. Here we present a draft monoamine connectome, along with a partial neuropeptide connectome, for the nematode C. elegans, based on new and published expression data for biosynthetic genes and receptors. We describe the structural properties of these "wireless" networks, including their topological features and modes of interaction with the wired synaptic and gap-junction connectomes. This multilayer connectome of C. elegans can serve as a prototype for understanding the multiplex networks comprising larger nervous systems, including the human brain. (Summary)

Bettencourt, Luis, et al. Redefining Fitness: Evolution as a Dynamic Learning Process. arXiv:2503.09057. Into 2025, University of Chicago physicists and evolutionary biologists add an array of further insights into how life’s boney and now brainy development can reveal its central, whole-scale trajectory of advances via iterative stages of individual and communal accumulated knowledge. Evolution is a process of optimal adaptation of biological populations to their living environments via the concept of fitness. Here, we show that this function by way of mapping population dynamics to Bayesian learning can provide a more suitable explanation. We show how probabilistic models of fitness can easily be constructed and how their averages acquire meaning as information. The approach creates a bridge between population dynamics under selection, statistical learning theory, and emerging models of artificial intelligence. (Excerpt)

Betti, Alessandro and Marco Gori. The Principle of Least Cognitive Action. Theoretical Computer Science. 633/83, 2016. In a special issue on Biologically Inspired Processes in Neural Computation, edited by UMass Amherst computer scientist Hava Siegelmann, a University of Pisa physicist and a University of Siena mathematician consider how nature’s evolutionary ascent of cerebral cognizance can be traced to and rooted in an intrinsic physical lawfulness. We even wonder if a new field such as “physioinformatics” (suggest a better term) could be added. See also in this issue The Grammar of Mammalian Brain Capacity, Topological Evolution for Embodied Cellular Automata, and A Circuit Basis for Morphogenesis. By and large, the interpretation of learning as a computational process taking place in both humans and machines is primarily provided in the framework of statistics. In this paper, we propose a radically different perspective in which the emergence of learning is regarded as the outcome of laws of nature that govern the interactions of intelligent agents with their own environment. We introduce a natural learning theory based on the principle of least cognitive action, which is inspired by the related mechanical principle and by the Hamiltonian framework for modeling the motion of particles. The introduction of the kinetic and the potential energy leads to a surprisingly natural interpretation of learning as a dissipative process.
The kinetic energy reflects the temporal variation of the synaptic connections, while the potential energy is a penalty that describes the degree of satisfaction of the environmental constraints. The theory gives a picture of learning in terms of the energy balancing mechanisms, where the novel notions of boundary and bartering energies are introduced. (Abstract)
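The Betti and Gori picture lends itself to a brief numerical sketch. The following toy is our own construction, not code from the paper: the loss function, variable names, and parameters are all illustrative assumptions. Weights evolve under damped second-order dynamics, where a “potential” penalizes unmet environmental constraints (here, fitting a simple linear target) and friction dissipates the “kinetic” energy of weight change until the constraints are satisfied.

```python
import numpy as np

# Hypothetical illustration of learning as damped Hamiltonian dynamics.
# The "potential" V(w) is a penalty on violated environmental constraints
# (here, a squared fitting error); the "kinetic" term is the temporal
# variation of the weights; friction (gamma) makes the process dissipative.

def potential(w, X, y):
    """Penalty for unsatisfied constraints: squared error of a linear map."""
    r = X @ w - y
    return 0.5 * float(r @ r)

def grad_potential(w, X, y):
    return X.T @ (X @ w - y)

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

w = np.zeros(3)          # initial weights
v = np.zeros(3)          # "velocity" = dw/dt
dt, gamma = 0.01, 2.0    # time step and friction (dissipation)

for _ in range(2000):
    a = -grad_potential(w, X, y) - gamma * v   # force = -dV/dw - friction
    v += dt * a
    w += dt * v

print(potential(w, X, y))   # ~0: constraints satisfied as energy dissipates
```

The friction term is what turns a conservative oscillation into learning: without it the weights would orbit the solution forever rather than settle into it.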
Brun-Usan, Miguel, et al. How to Fit In: The Learning Principles of Cell Differentiation. PLoS Computational Biology. April, 2020. University of Southampton, UK, computer scientists including Richard Watson continue their revisionary studies of biological metabolisms by viewing them through a learning lens. A cerebral perspective, as this section reports, can provide better insights into cellular processes because both evolution and learning are explorations in search of solutions. A further step is to integrate this view with gene regulatory networks so these common models can reinforce each other. Altogether this approach implies that life’s oriented emergence is trying to achieve some manner of its own self-description and comprehension. Cell differentiation in multicellular organisms requires cells to respond to complex combinations of extracellular cues, such as morphogen concentrations. But a general theory describing how cells integrate multi-dimensional signals is still lacking. In this work, we propose a framework from learning theory to understand the relationships between environmental cues (inputs) and phenotypic responses (outputs) underlying cell plasticity. Altogether, these results illustrate the functional parallelisms between learning in neural networks and the action of natural selection on environmentally sensitive gene regulatory networks. This offers a theoretical framework for responses that integrate information from multiple cues, a phenomenon that underpins the evolution of multicellularity and developmental robustness. (Abstract excerpt)

Cabessa, Jeremie and Hava Siegelmann. The Computation Power of Interactive Recurrent Neural Networks. Network: Computation in Neural Systems. 24/4, 2012. Reviewed more in Universal Darwinism, University of Massachusetts, Amherst, computational neuroscientists take these cerebral complexities to exemplify how nature evolves, develops and learns. We are then invited to realize that the same dynamical trial and error, feedback to move forward, iterative process is in effect everywhere.
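That shared trial-and-error loop, whereby both evolution and learning propose a variant, test it, and keep what works, can be sketched in a few lines. This toy hill climber is our own schematic, with an illustrative “onemax” fitness function (count the 1-bits), not a model from any cited paper; read “fitness” for evolution or “objective” for learning and the iterative search is the same.

```python
import random

# A minimal trial-and-error loop shared by evolution and learning:
# propose a random variation, keep it if it scores at least as well.

def fitness(bits):
    """Toy objective: number of 1-bits (the 'onemax' landscape)."""
    return sum(bits)

random.seed(42)
n = 40
genotype = [random.randint(0, 1) for _ in range(n)]

for _ in range(5000):
    mutant = genotype[:]
    mutant[random.randrange(n)] ^= 1          # random variation (mutation)
    if fitness(mutant) >= fitness(genotype):  # selection: keep if not worse
        genotype = mutant

print(fitness(genotype))  # climbs to the optimum n on this simple landscape
```

On rugged landscapes the same loop only finds a “good enough” local solution, which is exactly the satisficing character these entries attribute to both natural selection and learning.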
See also Turing on Super-Turing and Adaptivity by Hava Siegelmann in Progress in Biophysics and Molecular Biology (113/117, 2013).

Chastain, Erick, et al. Algorithms, Games, and Evolution. Proceedings of the National Academy of Sciences. 111/10620, 2014. This entry is cited in the Kate Douglas report that headlines the section. Reported more in Natural Algorithms, with Adi Livnat, Christos Papadimitriou, and Umesh Vazirani, theoretical biologists pose a working comparison between natural selection, seen as a search, iterate, and optimize process, and a machine learning procedure called the “multiplicative weights update algorithm” (MWUA). Both modes involve explorations and samplings of diverse populations, subject to trials, errors, then retests, so as to reach a “good enough” state or solution. A Commentary in the same issue, Diverse Forms of Selection in Evolution and Computer Science by Nicholas Barton, et al, supports the finding.

Churchill, Alexander, et al. Learning to Generate Genotypes with Neural Networks. arXiv:1604.04153. Reviewed more in Natural Algorithms, Queen Mary University of London researchers enter another way to understand how genomes and connectomes (neuromes) are basically similar in form and function.

Csermely, Peter, et al. Learning of Signaling Networks. arXiv:2001.11679. We cite this paper by five Semmelweis University, Budapest system scientists as an example of how cerebral facilities can be easily grafted onto and made evident in all manner of genetic and metabolic anatomy and physiology, because they naturally spring from and manifest one same source. By this perception, life’s long evolutionary development can increasingly appear as an oriented encephalization and cosmic education. Molecular processes of neuronal learning have been well-described. However, learning mechanisms of non-neuronal cells have not been fully understood. Here, we discuss molecular mechanisms of cellular learning, including conformational memory of intrinsically disordered proteins and prions, signaling cascades, protein translocation, RNAs, and chromatin memory. We hypothesize that these processes constitute the learning of signaling networks and correspond to a generalized Hebbian learning process of single, non-neuronal cells. We then discuss how cellular learning may open novel directions in drug design and inspire new artificial intelligence methods. (Abstract)

Czegel, Daniel, et al. Bayes and Darwin: How Replicator Populations Implement Bayesian Computations. BioEssays. 44/4, 2022. DC and Eors Szathmary, Institute of Evolution, Budapest, Hamza Giaffar, Cold Spring Harbor Laboratory, and Joshua Tenenbaum, MIT, continue their project to perceive and identify life’s developmental emergence as primarily a cerebral, cognitive, knowledge-gaining occasion. It is argued that every organism across all Metazoan domains must be aware of, learn about, and be able to predict its ever-changing environs so as to survive. Here bodily evolution (Darwin) and proactive mind (Bayes) are seen to advance in a parallel, reciprocal way. Their 2020s version can now be informed by novel probabilistic, iterative, cognitive models. Writ large, a semblance of a self-educating, making, affirming, autocatalytic, proactive reality becomes evident. A premier feature is then a consilience of past reference and open future. See also Novelty and Imitation within the Brain by this group in Nature Scientific Reports (11:12513, 2021). Bayesian learning theory and evolutionary theory both formalize adaptive competition dynamics in variable environments. What do they have in common and how do they differ? In this paper, we discuss structural and process analogies at a computational and an algorithmic-mechanical level. We point out mathematical equivalence and isomorphism between Bayesian update and replicator dynamics.
We discuss how these mechanisms provide similar ways to adapt to stochastic conditions at multiple timescales. We thus find replicator populations to encode regularities so as to predict future environments. As a notable result, a unified view of the theories of learning and evolution can be achieved. (Abstract)
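The equivalence that Czegel, et al point out, and the multiplicative weights comparison of Chastain, et al above, can be checked numerically in a few lines. In this toy sketch (the frequencies and fitness values are our own illustrative numbers), one round of discrete replicator dynamics, one Bayesian update, and one multiplicative weights step turn out to be the same renormalized product.

```python
import numpy as np

# One round each of replicator dynamics, Bayesian updating, and
# multiplicative weights, applied to the same vector: all three are
# "multiply by a score, then renormalize", hence numerically identical.

prior = np.array([0.5, 0.3, 0.2])      # type frequencies / prior / weights
fitness = np.array([1.0, 2.0, 4.0])    # fitnesses / likelihoods / payoffs

# Replicator: p_i' = p_i * w_i / (mean population fitness)
replicator = prior * fitness / np.dot(prior, fitness)

# Bayes: posterior_i = prior_i * likelihood_i / evidence
bayes = prior * fitness / np.sum(prior * fitness)

# Multiplicative weights (learning rate eta = 1): scale by payoff, renormalize
mwua = prior * fitness
mwua /= mwua.sum()

print(np.allclose(replicator, bayes) and np.allclose(bayes, mwua))  # True
```

The mean population fitness in the replicator denominator plays exactly the role of the Bayesian “evidence” term, which is the isomorphism the abstract refers to.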