Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: Independent Complex Network Systems, Computational Programs, Genetic Ecode Scripts

2. Biteracy: Natural Algorithmic Computation

Hein, Andrew, et al. Natural Search Algorithms as a Bridge between Organisms, Evolution, and Ecology. Proceedings of the National Academy of Sciences. 113/9413, 2016. A team from Princeton and MIT, including Simon Levin, is in search of commonalities between creaturely explorations of niche environments for optimal resources and reproduction. A synthesis of cellular and animal strategies is broached as a convergent dynamic state.

The ability to navigate is a hallmark of living systems, from single cells to higher animals. Searching for targets, such as food or mates in particular, is one of the fundamental navigational tasks many organisms must execute to survive and reproduce. Here, we argue that a recent surge of studies of the proximate mechanisms that underlie search behavior offers a new opportunity to integrate the biophysics and neuroscience of sensory systems with ecological and evolutionary processes, closing a feedback loop that promises exciting new avenues of scientific exploration at the frontier of systems biology. (Abstract)
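
As a plain illustration of the search strategies at issue, the Python sketch below (our own toy construction, not from the paper) compares a Brownian walker with a Levy-style walker collecting randomly placed targets; the field size, target density, and step-length exponent are arbitrary assumptions.

import math, random

# Toy comparison of two search strategies (illustrative only, not from Hein et al.):
# a Brownian walker with unit steps versus a Levy-style walker whose step lengths
# follow a heavy-tailed distribution. Both count targets found within a step budget.

def make_targets(n=200, size=100.0, seed=1):
    rng = random.Random(seed)
    return [(rng.uniform(0, size), rng.uniform(0, size)) for _ in range(n)]

def search(targets, levy=False, steps=2000, radius=1.0, size=100.0, seed=2):
    rng = random.Random(seed)
    x, y = size / 2, size / 2
    remaining = set(range(len(targets)))
    found = 0
    for _ in range(steps):
        angle = rng.uniform(0, 2 * math.pi)
        # Heavy-tailed step lengths for the Levy-style walker, unit steps otherwise.
        step = min(rng.paretovariate(1.5), 20.0) if levy else 1.0
        x = (x + step * math.cos(angle)) % size
        y = (y + step * math.sin(angle)) % size
        for i in list(remaining):
            tx, ty = targets[i]
            if (x - tx) ** 2 + (y - ty) ** 2 <= radius ** 2:
                remaining.discard(i)
                found += 1
    return found

targets = make_targets()
print("Brownian walker found:", search(targets, levy=False))
print("Levy-style walker found:", search(targets, levy=True))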

Hernandez-Orozco, Santiago, et al. Algorithmically Probable Mutations Reproduce Aspects of Evolution, such as Convergence Rate, Genetic Memory and Modularity. Royal Society Open Science. August, 2018. Algorithmic Dynamics Lab, SciLifeLab, Centre for Molecular Medicine, Stockholm computational scientists take up Gregory Chaitin’s polymath project (search) to quantify a deep mathematical source for life’s long evolution. Co-author Hector Zenil, with many colleagues including Chaitin, has pursued this task with technical finesse for some time (search here and arXiv eprints). As the quotes convey, this entry makes a strong case, reaching beyond even an extended evolutionary synthesis (Laland), that selective effects alone are insufficient to adequately explain an oriented biological emergence from origins to us. This radical expansion is at odds with the vested Darwinian paradigm, but understandable (we add) if located within a procreative organic ecosmos as its consequent natural genetic code. Here the contrast is set between a “classical” randomness and a novel algorithmic, informational guidance. Similar work, such as that by Sara Walker, Paul Davies and others, also explores ways to qualify and integrate this vital missing dimension. See also a concurrent paper An Algorithmic Information Calculus for Causal Discovery and Reprogramming Systems by Zenil and seven collaborators at arXiv:1709.05429. For more see Reprogramming Matter, Life, and Purpose by Zenil (2017) in Cosmocene Destiny.

Natural selection explains how life has evolved over millions of years from more primitive forms. The speed at which this happens, however, has sometimes defied formal explanations when based on random mutations. Here, we investigate the application of a simplicity bias based on a natural but algorithmic distribution of mutations in various examples, particularly binary matrices, in order to compare evolutionary convergence rates. Results both on synthetic and on small biological examples indicate an accelerated rate when mutations are not statistically uniform but algorithmically uniform. We show that algorithmic distributions can evolve modularity and genetic memory by preservation of structures when they first occur sometimes leading to an accelerated production of diversity but also to population extinctions. These results validate some suggestions in the direction that computation may be an equally important driver of evolution. We also show that inducing the method on problems of optimization, such as genetic algorithms, has the potential to accelerate convergence of artificial evolutionary algorithms. (Abstract excerpts)

On the other hand, random mutation implies no evidence for a directing force and in artificial genetic algorithms, mutation has traditionally been uniform even if other strategies are subject to continuous investigation and have been introduced as a function of, for example, time or data size. More recently, it has been suggested that the deeply informational and computational nature of biological organisms makes them amenable to being studied or considered as computer programs following (algorithmic) random walks in software space, that is, the space of all possible—and valid—computer programs. Here, we numerically test this hypothesis and explore the consequences vis-a-vis our understanding of the biological aspects of life and natural evolution by natural selection, as well as for applications to optimization problems in areas such as evolutionary programming. (2)
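
To make the contrast between statistically uniform and algorithmically biased mutation concrete, here is a minimal Python sketch of our own devising, not the authors' code: zlib compressed length serves as a crude stand-in for the algorithmic-probability (coding-theorem) estimates the paper employs, and the 64-bit target pattern and (1+1) hill climb are arbitrary assumptions.

import random, zlib

# Toy hill climb on a binary string toward a regular target, comparing uniform
# single-bit mutation with mutation biased toward "simple" offspring. Compressed
# length via zlib is only an illustrative proxy for the algorithmic-probability
# (BDM / coding-theorem) measures used by Hernandez-Orozco et al.

TARGET = ("0" * 8 + "1" * 8) * 4          # a simple, regular 64-bit pattern
rng = random.Random(0)

def fitness(g):
    return sum(a == b for a, b in zip(g, TARGET))

def complexity(g):
    return len(zlib.compress(g.encode()))

def mutate_uniform(g):
    i = rng.randrange(len(g))
    return g[:i] + ("1" if g[i] == "0" else "0") + g[i + 1:]

def mutate_simplicity_biased(g, candidates=16):
    # Draw several candidate mutations and sample one with probability
    # proportional to 2^(-compressed length), favouring simpler offspring.
    pool = [mutate_uniform(g) for _ in range(candidates)]
    weights = [2.0 ** (-complexity(c)) for c in pool]
    return rng.choices(pool, weights=weights, k=1)[0]

def evolve(mutate, generations=3000):
    g = "".join(rng.choice("01") for _ in TARGET)
    for t in range(generations):
        child = mutate(g)
        if fitness(child) >= fitness(g):   # simple (1+1) hill climb
            g = child
        if g == TARGET:
            return t
    return generations

print("uniform mutations, generations to reach target:", evolve(mutate_uniform))
print("simplicity-biased, generations to reach target:", evolve(mutate_simplicity_biased))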

We introduce a conceptual framework and an interventional calculus to steer, manipulate, and reconstruct the dynamics and generating mechanisms of dynamical systems from partial and disordered observations based on the algorithmic contribution of each of the systems elements to the whole by exploiting principles from the theory of computability and algorithmic randomness. This calculus entails finding and applying controlled interventions to an evolving object to estimate how its algorithmic information content is affected in terms of positive or negative shifts towards and away from randomness in connection to causation. The approach is an alternative to statistical approaches for inferring causal relationships and formulating theoretical expectations from perturbation analysis. We find that the algorithmic information landscape of a system runs parallel to its dynamic landscape, affording an avenue for moving systems on one plane so they may have controlled effects on the other plane. (1709.05429 Abstract)
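
The perturbation idea in this interventional calculus can likewise be caricatured in a few lines. In the Python sketch below (our own toy, with zlib again standing in for the paper's block-decomposition complexity estimates), each edge of a small ring-plus-chords graph is deleted in turn and the resulting shift in estimated complexity is recorded.

import itertools, zlib

# Minimal sketch of the perturbation step in an algorithmic interventional calculus:
# delete each element (here, an edge of a small graph) and record how an estimated
# complexity of the whole system shifts. zlib compressed length of the adjacency
# string is a crude, illustrative stand-in for the estimates used by Zenil et al.

N = 16
# A regular ring lattice plus two chords, as an arbitrary example system.
edges = {(i, (i + 1) % N) for i in range(N)} | {(0, 5), (3, 11)}

def adjacency_string(edge_set):
    return "".join(
        "1" if (i, j) in edge_set or (j, i) in edge_set else "0"
        for i, j in itertools.combinations(range(N), 2)
    )

def complexity(edge_set):
    return len(zlib.compress(adjacency_string(edge_set).encode()))

base = complexity(edges)
print("baseline complexity estimate:", base)
for e in sorted(edges):
    shift = complexity(edges - {e}) - base
    # Negative shift: removing e moves the graph toward simplicity and regularity;
    # positive shift: the removal pushes the description toward randomness.
    print("remove edge", e, "shift", shift)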

Hillberry, Logan, et al. Entangled Quantum Cellular Automata (QCA), Physical Complexity, and Goldilocks Rules. arXiv:2005.01763. We cite this entry by a nine member team based at UT Austin, Caltech, and the University of Padova, including Nicole Yunger Halpern, as a current example of how disparate classical and quantum domains, along with mathematical computations, are joining up and cross-informing on the way to a phenomenal synthesis. The Abstract and quote convey an essential sense of this frontier project. So into the 2020s can we begin to realize (again) that an extant nature does have its own encoded reality and procreative purpose that we peoples can philosophize about? A good part of the project would be to translate the arcane terms into a human-familial image, which is what this site attempts to do.


Cellular automata are interacting classical bits that display diverse behaviors, from fractals to random-number generators to Turing-complete computation. We introduce entangled quantum cellular automata subject to Goldilocks rules, tradeoffs of the kind underpinning biological, social, and economic complexity. Tweaking digital and analog quantum-computing protocols generates persistent entropy fluctuations; robust dynamical features, including an entangled breather; and network structure and dynamics consistent with complexity. Present-day quantum platforms (Rydberg arrays, trapped ions, and superconducting qubits) can implement Goldilocks protocols, which generate quantum many-body states with rich entanglement and structure. Moreover, the complexity studies reported here underscore an emerging idea in many-body quantum physics: some systems fall outside the integrable/chaotic dichotomy. (Abstract)

We have discovered a physically potent feature of entangled quantum cellular automata: the emergence of complexity under Goldilocks rules. Goldilocks rules balance activity and inactivity. This tradeoff produces new, highly entangled, yet highly structured, quantum states. These states are persistently dynamic and neither uniform nor random. (14) Moreover, we have demonstrated that our QCA time-evolution protocols are implementable in extant digital and analog quantum computers. (14)
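
For readers who want a hands-on feel, here is a purely classical caricature of a Goldilocks rule in Python, our own construction and not the authors' quantum protocol: a cell is updated only when its neighborhood shows a balance of activity and inactivity (exactly one live neighbor).

# Classical caricature of a "Goldilocks" cellular-automaton rule (illustrative only;
# the Hillberry et al. protocols act on entangled qubits, not classical bits): a cell
# flips only when its two neighbors show exactly one active site, i.e. a balance of
# activity and inactivity; too little or too much activity leaves the cell unchanged.

WIDTH, STEPS = 63, 32

def goldilocks_step(cells):
    out = []
    for i, c in enumerate(cells):
        left = cells[(i - 1) % WIDTH]
        right = cells[(i + 1) % WIDTH]
        out.append(1 - c if left + right == 1 else c)
    return out

cells = [0] * WIDTH
cells[WIDTH // 2] = 1                      # a single seed in the middle
for _ in range(STEPS):
    print("".join(".#"[c] for c in cells))
    cells = goldilocks_step(cells)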

Hsu, Sheryl, et al. A Physarum-inspired Approach to the Euclidean Steiner Tree Problem. Nature Scientific Reports. 12/14536, 2022. University of Illinois Chicago researchers including Laura Schaposnik describe a latest instance of how the individual and colonial cognizance of slime-mold microbes can well serve to study and improve complex situations (the US highway system is another case). See also the work of Tanya Latty, University of Sydney, which was profiled on the PBS NOVA show Secret Mind of Slime.

This paper presents a novel biologically-inspired, explore-and-fuse approach to a large array of problems. The inspiration comes from Physarum, a unicellular slime mold capable of solving complex situations. These characteristics of Physarum imply that many such organisms can explore the problem space in parallel, each individual gathering information and partial solutions. When the organisms meet, they fuse and share information, eventually forming one entity with a relative overview and find an overall solution. Here we develop the Physarum Steiner Algorithm which can find feasible ways to deal with Euclidean Steiner tree issues. (Abstract excerpt)
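
A classical caricature of this explore-and-fuse strategy can be written in a few dozen lines of Python (our own simplification, not the authors' Physarum Steiner Algorithm, and without Steiner points, so the result is a spanning tree that upper-bounds the Steiner length): several organisms each span a random subset of terminals, then fuse by pooling what they have seen into one overall tree.

import math, random

# Explore-and-fuse caricature: each "organism" spans a random half of the terminals,
# then organisms merge pairwise until a single entity spans everything it has seen.

rng = random.Random(3)
terminals = [(rng.random(), rng.random()) for _ in range(30)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def mst_length(points):
    # Prim's algorithm over the complete Euclidean graph on the given points.
    if len(points) < 2:
        return 0.0
    in_tree = {0}
    total = 0.0
    while len(in_tree) < len(points):
        d, j = min(
            (dist(points[i], points[k]), k)
            for i in in_tree for k in range(len(points)) if k not in in_tree
        )
        total += d
        in_tree.add(j)
    return total

# Explore: four organisms, each gathering a random half of the terminals.
organisms = [rng.sample(terminals, len(terminals) // 2) for _ in range(4)]
print("partial tree lengths:", [round(mst_length(o), 3) for o in organisms])

# Fuse: merge organisms until one entity remains, then span its pooled terminals.
while len(organisms) > 1:
    a, b = organisms.pop(), organisms.pop()
    organisms.append(list(set(a) | set(b)))
print("fused tree length:", round(mst_length(organisms[0]), 3))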

Ibsen-Jensen, Rasmus, et al. Computational Complexity of Ecological and Evolutionary Spatial Dynamics. Proceedings of the National Academy of Sciences. 112/15636, 2015. Institute of Science and Technology Austria and Harvard University theorists, including Krishnendu Chatterjee and Martin Nowak, contribute to novel understandings of living systems in terms of, and guided by, algorithmic programs. And as this missing or underweighted dimension becomes more evident, it gains a role as a natural genetic code.

There are deep, yet largely unexplored, connections between computer science and biology. Both disciplines examine how information proliferates in time and space. Central results in computer science describe the complexity of algorithms that solve certain classes of problems. An algorithm is deemed efficient if it can solve a problem in polynomial time, which means the running time of the algorithm is a polynomial function of the length of the input. Another criterion is the space requirement of the algorithm. There is a crucial distinction between algorithms that can find a solution, verify a solution, or list several distinct solutions in given time and space. The complexity hierarchy that is generated in this way is the foundation of theoretical computer science. (Abstract)
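
The find-versus-verify distinction drawn here can be made concrete with subset sum, a standard example of our own choosing rather than one from the paper: checking a proposed certificate takes time linear in the input, while the naive search examines exponentially many subsets.

from itertools import combinations

# Find versus verify, illustrated with subset sum: verifying a certificate is fast
# (polynomial time), while naive search over all subsets grows exponentially with n.

values = [3, 34, 4, 12, 5, 2]
target = 9

def verify(values, certificate, target):
    # Polynomial-time check: are the chosen elements drawn from the input,
    # and do they really sum to the target?
    return all(v in values for v in certificate) and sum(certificate) == target

def find(values, target):
    # Exponential-time search: try every one of the 2^n subsets.
    for r in range(len(values) + 1):
        for subset in combinations(values, r):
            if sum(subset) == target:
                return list(subset)
    return None

certificate = find(values, target)
print("found certificate:", certificate)                 # [4, 5] for this input
print("verified:", verify(values, certificate, target))  # True, checked quickly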

Ikegami, Takashi, et al., eds. ALIFE 2018 Conference Proceedings. Cambridge: MIT Press, 2018. The edition is from a combined meeting of the European Conference on Artificial Life and the Synthesis and Simulation of Living Systems groups held in Tokyo in July. It is available online as abstracts and full papers at www.mitpressjournals.org/toc/isal/30. Typical sessions are Hybrid Life: Approaches to Integrate Biological, Artificial and Cognitive Systems, and Evolutionary Dynamics and Information Theory and Flow. Amongst the titles are The Self-Assembling Brain by Robin Hiesinger, Integrated Information and Autonomy in the Thermodynamic Limit by Miguel Aguilera and Ezequiel Di Paolo, Interfacing Synthetic Cells with Biological Cells by Giordano Rampioni, Socio-Technical Evolution by Martin Rosenlyst, et al, Holonomic Cellular Automata by Ada Diaconescu, et al, and An Iterated Learning Approach to the Origins of the Standard Genetic Code by Tom Froese, et al. In addition, we review (search) Major Transitions in Planetary Evolution by Hikaru Furukawa and Sara Walker, and Critical Learning vs. Evolution by Sina Khajehabdollahi and Olaf Witkowski.

Karig, David, et al. Stochastic Turing Patterns in a Synthetic Bacterial Population. Proceedings of the National Academy of Sciences. 115/6572, 2018. Johns Hopkins University, MIT, and University of Illinois researchers including Nigel Goldenfeld report evidence, across all manner of living phenomena, of an inherent computational, program-like source. While natural selection is often invoked and stretched to explain, a worldwise effort is proceeding to articulate and quantify this missing natural, genotype-like mathematical dimension. Circa 2018, a 21st century genesis evolutionary synthesis by way of this major expansion gains a robust credence. See also, for example, Robust Stochastic Turing Patterns in the Development of a One-Dimensional Cyanobacterial Organism by Francesca Di Patti, et al in PLoS Biology (May 2018) and Post-Turing Tissue Pattern Formation by Felix Brinkmann, et al in PLoS Computational Biology (July 2018).

The origin of biological morphology and form is one of the deepest problems in science, underlying our understanding of development and the functioning of living systems. In 1952, Alan Turing showed that chemical morphogenesis could arise from a linear instability of a spatially uniform state, giving rise to periodic pattern formation in reaction–diffusion systems but only those with a rapidly diffusing inhibitor and a slowly diffusing activator. These conditions are disappointingly hard to achieve in nature, and the role of Turing instabilities in biological pattern formation has been called into question. Recently, the theory was extended to include noisy activator–inhibitor birth and death processes. Surprisingly, this stochastic Turing theory predicts the existence of patterns over a wide range of parameters, in particular with no severe requirement on the ratio of activator–inhibitor diffusion coefficients. (Abstract excerpt)
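
As a rough, hands-on picture of such pattern formation, the Python sketch below runs a 1-D Gray-Scott reaction-diffusion toy with a small added noise term; the model choice, parameters, and noise strength are our own assumptions and not the synthetic bacterial circuit of Karig et al.

import random

# 1-D Gray-Scott reaction-diffusion with a small noise kick on the substrate, as a
# rough illustration of Turing-type patterning from a nearly uniform initial state.

N, STEPS = 120, 8000
DU, DV, F, K, NOISE = 0.16, 0.08, 0.04, 0.06, 0.002
rng = random.Random(5)

u = [1.0] * N
v = [0.0] * N
for k in range(N // 2 - 3, N // 2 + 3):    # a small central perturbation seeds the pattern
    u[k], v[k] = 0.5, 0.25

def lap(f, k):
    return f[(k - 1) % N] - 2.0 * f[k] + f[(k + 1) % N]

for _ in range(STEPS):
    nu, nv = [], []
    for k in range(N):
        uvv = u[k] * v[k] * v[k]
        nu.append(u[k] + DU * lap(u, k) - uvv + F * (1.0 - u[k]) + NOISE * rng.gauss(0, 1))
        nv.append(v[k] + DV * lap(v, k) + uvv - (F + K) * v[k])
    u, v = nu, nv

# Text rendering of the final activator-like profile: '#' marks the pattern spots.
print("".join("#" if x > 0.2 else "." for x in v))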

Kasabov, Nikola, ed. Springer Handbook of Bio-/Neuroinformatics. Berlin: Springer, 2014. The editor of this 62 chapter, 1200 page volume is the Auckland University of Technology chair of knowledge engineering. Its unique synthesis of bio/genetic and neural phases by way of informational qualities covers cellular, genomic, proteomic, morphogenetic, and bacterial aspects, machine learning and computational analysis, regulatory networks in systems biology, and ontologies and databases for medicine and health. Dynamic cerebral phenomena involve information processing by synapses, spiking neural networks, fMRI imaging, nonlinear signaling, and onto perception, sensation, and cognition. With all this in place, a final section engages Nature Inspired Integrated Information Technologies for brain, gene, quantum intelligence and creativity futures. Might we thus consider an astroinformatics and cosmoinformatics, a naturomics, quantomics, and anthropomics, as our human epitome begins to self-sequence a genesis uniVerse?

The Springer Handbook of Bio-/Neuro-Informatics is the first published book in one volume that explains together the basics and the state-of-the-art of two major science disciplines in their interaction and mutual relationship, namely: information sciences, bioinformatics and neuroinformatics. Bioinformatics is the area of science which is concerned with the information processes in biology and the development and applications of methods, tools and systems for storing and processing of biological information, thus facilitating new knowledge discovery. Neuroinformatics is the corresponding area concerned with the information processes in the brain and nervous system, and with methods, tools and systems for storing and processing that information, again in service of new knowledge discovery.

Kaznatcheev, Artem. Evolution is Exponentially More Powerful with Frequency-Dependent Selection. An Oxford University computer scientist posts a latest appreciation of how living systems appear to evolve and develop as stochastic explore-and-exploit optimization processes. In any event, the insight admits a deep mathematical presence of operative programs, which are modified along the way. See also Computational Complexity as an Ultimate Constraint on Evolution by AK in Genetics (212/245, 2019).

In 2009 (Leslie) Valiant (search) proposed to treat Darwinian evolution as a special kind of computational learning by way of statistical queries which represent a genotype’s fitness over a distribution of challenges. His model distinguished various environments that are “adaptable-to” from those that are not, but omits vital ecological interactions between different evolving agents. Here I extend an algorithmic Darwinism to include the ecological exigencies of frequency-dependent selection as a population-dependent bias and develop a game landscape view of evolution so as to generalize the popular fitness landscape. The evolutionary game dynamics of finite populations are essential for finding a short adaptive path to the global fitness peak during the second stage of the adaptation process. This highlights the rich interface between computational learning theory, evolutionary games, and long-term evolution. (Abstract excerpt)

Prior to Oxford, I was at the Department of Integrated Mathematical Oncology at the Moffitt Cancer Center, and at McGill University, where I developed my interest in evolutionary dynamics, computer science, mathematical oncology and computational learning theory. In this regard, I marvel at the world through algorithmic lenses. My theoretical work aims to ground the extended evolutionary synthesis in algorithmic game theory, computational learning theory, and combinatorial optimization. (Artem K. website)
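
The frequency-dependent selection at the heart of this entry can be pictured with replicator dynamics for a Hawk-Dove game, a standard textbook case of our own choosing rather than Kaznatcheev's model: each strategy's fitness depends on the current population mix, and the dynamics settle at a mixed equilibrium instead of fixing one type.

# Replicator dynamics for a Hawk-Dove game, illustrating frequency-dependent selection.
# The payoff values, step size, and starting frequency are arbitrary assumptions.

V, C = 2.0, 3.0          # value of the contested resource and cost of a fight
# Payoff matrix: rows = focal strategy (Hawk, Dove), columns = opponent strategy.
payoff = [[(V - C) / 2, V],
          [0.0,         V / 2]]

x = 0.1                  # initial fraction of Hawks
dt = 0.1
for step in range(200):
    f_hawk = x * payoff[0][0] + (1 - x) * payoff[0][1]
    f_dove = x * payoff[1][0] + (1 - x) * payoff[1][1]
    f_mean = x * f_hawk + (1 - x) * f_dove
    x += dt * x * (f_hawk - f_mean)          # discretized replicator equation
    if step % 40 == 0:
        print("step", step, "hawk fraction", round(x, 3))

print("predicted mixed equilibrium V/C =", V / C)   # where hawk and dove payoffs agree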

Krause, Andrew, et al. Introduction to “Recent Progress and Open Frontiers for Turing’s Theory of Morphogenesis.” Philosophical Transactions of the Royal Society A. November, 2021. Oxford University editors including Philip Maini post a latest survey of Alan Turing’s visionary 1950s insights into such natural generative computations, which by now are ubiquitously evident. Some entries are Modern Perspectives on Near-Equilibrium Analysis of Turing Systems, Turing Pattern Design Principles, and Insights from Chemical Systems. Altogether into the 2020s an evidential case ever builds for a revolutionary procreative ecosmic uniVerse.

How patterns naturally form is an on-going task for the physical, chemical and biological sciences. Alan Turing's contribution went on to spawn many mathematical projects to understand the ways that patterning effects can emerge from homogeneous chemical mixtures due to the reaction and diffusion of chemical species. This theme issue discusses the latest foundational work in chemical and synthetic settings, along with the ways that Turing's method serves as a developmental basis for morphogenesis. (Abstract excerpt)

Kubota, Tomoyuki, et al. Reservoir Computing Generalized. arXiv:2412.12104. We record this work by University of Tokyo and National Institute of Advanced Industrial Science and Technology, Japan analysts as an example of how even a domain long seen as insensate matter can now be found to express cerebral and cognitive qualities. With quantum phenomena similarly suffused with and treatable by neural nets, one is led to consider an entire atomic, cosmic and humanic existence which could be brain-like with relative sentience and knowledge.

A physical neural network (PNN) has the potential to solve machine learning tasks and learn intrinsic physical properties. Reservoir computing (RC) is an apt method to advance information processing with dynamical systems by way of PNNs. Here we propose a novel approach called generalized reservoir computing (GRC) by making conventional RC a special case. We propose a way to obtain a reliable output, show that processed inputs are retrievable, and note that spatiotemporal chaos can be used to emulate complex nonlinear dynamics. Overall, our framework can guide the construction of an information processing device: a computational system capable of exploiting a wider variety of physical dynamics. (Excerpt)

Reservoir computing is a framework derived from recurrent neural network theory in which input signals are mapped into higher dimensional spaces through a fixed, non-linear system called a reservoir. After the input signal is fed into the reservoir, a simple readout is trained to map the reservoir state to the desired output.
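
That recipe can be written out as a minimal echo state network in Python, the conventional reservoir-computing formulation rather than the generalized GRC framework of the paper; the delay-recall task, reservoir size, and other parameters below are our own arbitrary choices.

import numpy as np

# Minimal echo state network: a fixed random recurrent reservoir expands a scalar
# input stream, and only a linear readout is trained, here to reproduce the input
# delayed by a few steps (a simple memory task chosen for illustration).

rng = np.random.default_rng(0)
T, N_RES, DELAY = 2000, 100, 5

u = rng.uniform(-1, 1, T)                     # scalar input stream
target = np.roll(u, DELAY)                    # task: recall the input DELAY steps ago

# Fixed random reservoir, rescaled so its spectral radius stays below 1.
W = rng.normal(0, 1, (N_RES, N_RES))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))
W_in = rng.uniform(-0.5, 0.5, N_RES)

states = np.zeros((T, N_RES))
x = np.zeros(N_RES)
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])          # reservoir update: fixed and nonlinear
    states[t] = x

# Train only the linear readout by ridge regression, after an initial washout.
washout = 100
A, y = states[washout:], target[washout:]
W_out = np.linalg.solve(A.T @ A + 1e-6 * np.eye(N_RES), A.T @ y)

pred = states @ W_out
err = np.sqrt(np.mean((pred[washout:] - target[washout:]) ** 2))
print("readout error (RMSE) on the delay task:", round(float(err), 4))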

Lamm, Ehud and Ron Unger. Biological Computation. London: Chapman & Hall, 2011. A Tel Aviv University philosopher of science and Bar-Ilan University computational biologist provide a comprehensive text on this salient, growing 21st century synthesis. Its sections are Introduction and Biological Background, Cellular Automata, Evolutionary Computation, Artificial Neural Networks, Molecular Computation, and The Never-Ending Story: the Interface between Biology and Computation.
