Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: Independent, UniVersal, Complex Network Systems and a Genetic Code-Script Source

2. Biteracy: Natural Algorithmic Computation

Ibsen-Jensen, Rasmus, et al. Computational Complexity of Ecological and Evolutionary Spatial Dynamics. Proceedings of the National Academy of Sciences. 112/15636, 2015. With coauthors Krishnendu Chatterjee and Martin Nowak, Institute of Science and Technology Austria and Harvard University theorists contribute to novel understandings of living systems in terms of, and guided by, algorithmic programs. As this missing or underweighted dimension becomes more evident, it gains a role as a natural genetic code.

There are deep, yet largely unexplored, connections between computer science and biology. Both disciplines examine how information proliferates in time and space. Central results in computer science describe the complexity of algorithms that solve certain classes of problems. An algorithm is deemed efficient if it can solve a problem in polynomial time, which means the running time of the algorithm is a polynomial function of the length of the input. Another criterion is the space requirement of the algorithm. There is a crucial distinction between algorithms that can find a solution, verify a solution, or list several distinct solutions in given time and space. The complexity hierarchy that is generated in this way is the foundation of theoretical computer science. (Abstract)
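The find/verify distinction in the abstract can be made concrete with a toy subset-sum instance (a hypothetical illustration, not from the paper): verifying a proposed certificate takes linear time in the input, while the naive search enumerates exponentially many subsets.

```python
from itertools import combinations

def verify(nums, target, certificate):
    """Check a claimed solution in polynomial (here linear) time."""
    return all(x in nums for x in certificate) and sum(certificate) == target

def search(nums, target):
    """Brute-force search: examines up to 2^n subsets (exponential time)."""
    for r in range(len(nums) + 1):
        for subset in combinations(nums, r):
            if sum(subset) == target:
                return list(subset)
    return None

nums = [3, 9, 8, 4, 5, 7]
cert = search(nums, 15)        # exponential-time find
print(verify(nums, 15, cert))  # polynomial-time check -> True
```

The gap between these two running times is exactly the complexity hierarchy the abstract invokes: a solution may be cheap to check yet expensive to discover.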

Ikegami, Takashi, et al., eds. ALIFE 2018 Conference Proceedings. Cambridge: MIT Press, 2018. The edition is from a combined meeting of the European Artificial Life and the Synthesis and Simulation of Living Systems groups held in Tokyo in July. It is available online as abstracts and full papers at www.mitpressjournals.org/toc/isal/30. Typical sessions are Hybrid Life: Approaches to Integrate Biological, Artificial and Cognitive Systems, and Evolutionary Dynamics and Information Theory and Flow. Amongst the titles are The Self-Assembling Brain by Robin Hiesinger, Integrated Information and Autonomy in the Thermodynamic Limit by Miguel Aguilera and Ezequiel Di Paolo, Interfacing Synthetic Cells with Biological Cells by Giordano Rampioni, Socio-Technical Evolution by Martin Rosenlyst, et al., Holonomic Cellular Automata by Ada Diaconescu, et al., and An Iterated Learning Approach to the Origins of the Standard Genetic Code by Tom Froese, et al. In addition, we review (search) Major Transitions in Planetary Evolution by Hikaru Furukawa and Sara Walker, and Critical Learning vs. Evolution by Sina Khajehabdollahi and Olaf Witkowski.

Karig, David, et al. Stochastic Turing Patterns in a Synthetic Bacterial Population. Proceedings of the National Academy of Sciences. 115/6572, 2018. Johns Hopkins University, MIT, and University of Illinois researchers including Nigel Goldenfeld report evidence across all manner of living phenomena of an inherent, computationally programmatic source. While natural selection is often invoked and stretched to explain such forms, a worldwide quantification is proceeding to articulate this missing natural, genotype-like mathematical dimension. Circa 2018, a 21st century genesis evolutionary synthesis by way of this major expansion gains a robust credence. See also, for example, Robust Stochastic Turing Patterns in the Development of a One-Dimensional Cyanobacterial Organism by Francesca Di Patti, et al. in PLoS Biology (May 2018) and Post-Turing Tissue Pattern Formation by Felix Brinkmann, et al. in PLoS Computational Biology (July 2018).

The origin of biological morphology and form is one of the deepest problems in science, underlying our understanding of development and the functioning of living systems. In 1952, Alan Turing showed that chemical morphogenesis could arise from a linear instability of a spatially uniform state, giving rise to periodic pattern formation in reaction–diffusion systems, but only in those with a rapidly diffusing inhibitor and a slowly diffusing activator. These conditions are disappointingly hard to achieve in nature, and the role of Turing instabilities in biological pattern formation has been called into question. Recently, the theory was extended to include noisy activator–inhibitor birth and death processes. Surprisingly, this stochastic Turing theory predicts the existence of patterns over a wide range of parameters, in particular with no severe requirement on the ratio of activator–inhibitor diffusion coefficients. (Abstract excerpt)
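The diffusion-ratio requirement the abstract refers to can be checked directly from classical linear stability analysis. A minimal sketch (the standard textbook deterministic conditions, not the paper's stochastic extension; the example Jacobian values are illustrative): a steady state stable without diffusion goes unstable to spatial perturbations only when the inhibitor diffuses sufficiently faster than the activator.

```python
def turing_unstable(fu, fv, gu, gv, Du, Dv):
    """Classical Turing conditions for a two-species reaction-diffusion
    system linearized about a homogeneous steady state.
    fu..gv: Jacobian entries; Du, Dv: activator/inhibitor diffusivities."""
    stable_without_diffusion = (fu + gv < 0) and (fu * gv - fv * gu > 0)
    diffusion_driven = (Dv * fu + Du * gv > 0) and \
        (Dv * fu + Du * gv) ** 2 > 4 * Du * Dv * (fu * gv - fv * gu)
    return stable_without_diffusion and diffusion_driven

# Example Jacobian: self-activating u, cross-inhibiting v
J = dict(fu=1.0, fv=-2.0, gu=2.0, gv=-3.0)
print(turing_unstable(**J, Du=1.0, Dv=10.0))  # fast inhibitor -> True
print(turing_unstable(**J, Du=1.0, Dv=1.0))   # equal diffusion -> False
```

The stochastic theory's point is precisely that noisy birth-death dynamics relax the last inequality, so patterns appear for diffusivity ratios this deterministic test rejects.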

Kasabov, Nikola, ed. Springer Handbook of Bio-/Neuroinformatics. Berlin: Springer, 2014. The editor of this 62 chapter, 1200 page volume is the Auckland University of Technology chair of knowledge engineering. Its unique synthesis of bio/genetic and neural phases by way of informational qualities covers cellular, genomic, proteomic, morphogenetic, and bacterial aspects, machine learning and computational analysis, regulatory networks in systems biology, and ontologies and databases for medicine and health. Dynamic cerebral phenomena involve information processing by synapses, spiking neural networks, fMRI imaging, nonlinear signaling, and onto perception, sensation, and cognition. With all this in place, a final section engages Nature Inspired Integrated Information Technologies for brain, gene, quantum intelligence and creativity futures. Might we thus consider an astroinformatics and cosmoinformatics, a naturomics, quantomics, and anthropomics, as our human epitome begins to self-sequence a genesis uniVerse?

The Springer Handbook of Bio-/Neuro-Informatics is the first published book in one volume that explains together the basics and the state-of-the-art of two major science disciplines in their interaction and mutual relationship, namely bioinformatics and neuroinformatics. Bioinformatics is the area of science which is concerned with the information processes in biology and the development and applications of methods, tools and systems for storing and processing of biological information, thus facilitating new knowledge discovery. Neuroinformatics is the area of science which is concerned with the information processes in the brain and nervous system, and the development and applications of methods, tools and systems for storing and processing of neural information, thus facilitating new knowledge discovery.

Kaznatcheev, Artem. Evolution is Exponentially More Powerful with Frequency-Dependent Selection. An Oxford University computer scientist posts a latest appreciation of how living systems appear to evolve and develop as stochastic explore-and-educate optimization processes. In any event, the insight admits a deep mathematical presence of operative programs, which are modified along the way. See also Computational Complexity as an Ultimate Constraint on Evolution by A. Kaznatcheev in Genetics (212/245, 2019).

In 2009 (Leslie) Valiant (search) proposed to treat Darwinian evolution as a special kind of computational learning by way of statistical queries which represent a genotype’s fitness over a distribution of challenges. His model distinguished various environments that are “adaptable-to” from those that are not, but omits vital ecological interactions between different evolving agents. Here I extend an algorithmic Darwinism to include the ecological exigencies of frequency-dependent selection as a population-dependent bias and develop a game landscape view of evolution so as to generalize the popular fitness landscape. The evolutionary game dynamics of finite populations are essential for finding a short adaptive path to the global fitness peak during the second stage of the adaptation process. This highlights the rich interface between computational learning theory, evolutionary games, and long-term evolution. (Abstract excerpt)
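Frequency-dependent selection of the kind the abstract invokes can be illustrated with the textbook Hawk-Dove replicator dynamic (a generic sketch, not Kaznatcheev's construction): each strategy's fitness depends on the current population mix, and the dynamic settles at the mixed equilibrium x* = V/C rather than at either pure strategy.

```python
def replicator_hawk_dove(V=2.0, C=4.0, x0=0.1, dt=0.1, steps=2000):
    """Euler-integrate the replicator equation for the Hawk-Dove game.
    x is the hawk frequency; each strategy's payoff depends on x."""
    x = x0
    for _ in range(steps):
        f_hawk = x * (V - C) / 2 + (1 - x) * V     # payoff vs. current mix
        f_dove = (1 - x) * V / 2
        x += dt * x * (1 - x) * (f_hawk - f_dove)  # replicator update
    return x

print(round(replicator_hawk_dove(), 3))  # -> 0.5, i.e. x* = V/C
```

Because the payoffs change as the population composition changes, the "fitness landscape" here is itself a moving target, which is the game-landscape generalization the abstract describes.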

Artem Kaznatcheev website: Prior to Oxford, I was at the Department of Integrated Mathematical Oncology at the Moffitt Cancer Center, and at McGill University, where I developed my interest in evolutionary dynamics, computer science, mathematical oncology and computational learning theory. In regard, I marvel at the world through algorithmic lenses. My theoretical work aims to ground the extended evolutionary synthesis in algorithmic game theory, computational learning theory, and combinatorial optimization.

Krause, Andrew, et al. Introduction to “Recent Progress and Open Frontiers for Turing’s Theory of Morphogenesis.” Philosophical Transactions of the Royal Society A. November, 2021. Oxford University editors including Philip Maini post a latest survey of Alan Turing’s visionary 1950s insights into such natural generative computations, which by now are ubiquitously evident. Some entries are Modern Perspectives on Near-Equilibrium Analysis of Turing Systems, Turing Pattern Design Principles, and Insights from Chemical Systems. Altogether into the 2020s an evidential case ever builds for a revolutionary procreative ecosmic uniVerse.

How patterns naturally form is an ongoing question for the physical, chemical and biological sciences. Alan Turing's contribution went on to spawn many mathematical projects to understand the ways that patterning effects can emerge from homogeneous chemical mixtures due to the reaction and diffusion of chemical species. This theme issue discusses the latest foundational work in chemical and synthetic settings, and the ways that Turing's method serves as a developmental basis for morphogenesis. (Abstract excerpt)

Lamm, Ehud and Ron Unger. Biological Computation. London: Chapman & Hall, 2011. A Tel Aviv University philosopher of science and Bar-Ilan University computational biologist provide a comprehensive text on this salient, growing 21st century synthesis. Its sections are Introduction and Biological Background, Cellular Automata, Evolutionary Computation, Artificial Neural Networks, Molecular Computation, and The Never-Ending Story: the Interface between Biology and Computation.
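As a taste of the book's Cellular Automata section, here is a minimal elementary cellular automaton (a generic sketch, not the authors' code): Rule 90 replaces each cell with the XOR of its two neighbors and, grown from a single seed, traces the Sierpinski triangle.

```python
def rule90(width=17, steps=8):
    """Elementary cellular automaton: each cell becomes the XOR of its
    two neighbors (Wolfram's Rule 90), with periodic boundaries.
    Returns the successive rows rendered as strings."""
    row = [0] * width
    row[width // 2] = 1  # single seed cell
    history = []
    for _ in range(steps):
        history.append("".join("#" if c else "." for c in row))
        row = [row[i - 1] ^ row[(i + 1) % width] for i in range(width)]
    return history

for line in rule90():
    print(line)
```

A one-line local rule producing a global fractal is the book's recurring theme in miniature: simple computation below, complex biological-looking form above.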

Zdeborova, Lenka and Florent Krzakala. Statistical Physics of Inference: Thresholds and Algorithms. arXiv:1511.02476. With an initial nod to Pierre-Simon Laplace (second quote), University of Paris-Saclay theorists engage a 21st century synthesis of these parallel methods. As one may peruse and imagine, a mathematical, genetic uniVerse trying to describe and realize itself out of an infinity of stochastic information may be implied.

Many questions of fundamental interest in today's science can be formulated as inference problems: Some partial, or noisy, observations are performed over a set of variables and the goal is to recover, or infer, the values of the variables based on the indirect information contained in the measurements. A growing body of work has shown that often we can understand these fundamental barriers by thinking of them as phase transitions in the sense of statistical physics. Moreover, it turned out that we can use the gained physical insight to develop new promising algorithms. The connection between inference and statistical physics is currently witnessing an impressive renaissance, and we review here the current state-of-the-art, with a pedagogical focus on the Ising model which, formulated as an inference problem, we call the planted spin glass. In terms of applications we review two classes of problems: (i) inference of clusters on graphs and networks, with community detection as a special case, and (ii) estimating a signal from its noisy linear measurements, with compressed sensing as a case of sparse estimation. (Abstract)

Our goal in this review is to describe recent developments in a rapidly evolving field at the interface between statistical inference and statistical physics. Statistical physics was developed to derive macroscopic properties of material from microscopic physical laws. Inference aims to discover structure in data. At first look, these goals seem quite different. Yet, it was the very same man, Pierre Simon, Marquis de Laplace (1749–1827), who did one of the first sophisticated derivations of the gas laws within the caloric theory, and who also created the field of statistical inference [208]. This suggests there may be a link after all. In fact, the methods and tools of statistical physics are designed to describe large assemblies of small elements, such as atoms or molecules. These atoms can quite readily be replaced by other elementary constituents: bits, nodes, agents, neurons, data points. This simple fact, and the richness and the power of the methods invented by physicists over the last century, is at the root of the connection, and the source of much fascinating work in recent decades. (First paragraph, 4)

Le Verge-Serandour, Mathieu and Karen Alim. Physarum polycephalum: Smart Network Adaptation. Annual Review of Condensed Matter Physics. Volume 15, 2024. Center for Protein Assemblies, Technical University of Munich biophysicists provide a network neuroscience review of this cellular invertebrate which is able to exhibit advanced behavioral responses. Once again, an early, deep insistence of intelligent agency is evident, as if a universal repertoire to access.

Life evolved organisms to adapt to their environment and autonomously exhibit behaviours. While complex behaviours are associated with the capability of neurons to process information, the unicellular organism Physarum polycephalum is able to solve complex tasks despite being a single cell shaped into a tubular network. In Physarum, smart behaviours arise as network tubes grow or shrink due to coupling, fluid flows and transport. From our physicist's perspective, we introduce the biology and active chemo-mechanics of this living matter entity. (Abstract)

Physarum polycephalum, a giant single-cell slime mould, fascinates researchers with its sophisticated behaviour despite its simple build. The network-shaped body of Physarum plasmodia tops typical cell size, reaching up to meters in length while enclosing thousands, even millions, of nuclei, which allows for complex behaviour similar to multi-cellular organisms. Here is a life form that beat the odds of 600 million years of evolution to thrive on earth today, and combines traits of what later on became animals, plants and fungi in itself.
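The grow-or-shrink tube adaptation described above is often formalized as a current-reinforcement model in the style of Tero and colleagues: tube conductances grow with the flow they carry and decay otherwise, so the network converges onto efficient paths. A minimal sketch on a three-edge toy graph (illustrative parameters and update rule, not the review's specific model):

```python
import numpy as np

def physarum_solver(n, edges, source, sink, steps=500, dt=0.05):
    """Current-reinforcement adaptation: each tube's conductance D grows
    with the absolute flow it carries and decays otherwise.
    edges: list of (node_i, node_j, length) tuples."""
    lengths = np.array([L for (_, _, L) in edges])
    D = np.ones(len(edges))
    for _ in range(steps):
        w = D / lengths                      # per-edge conductance
        lap = np.zeros((n, n))               # weighted graph Laplacian
        for k, (i, j, _L) in enumerate(edges):
            lap[i, i] += w[k]
            lap[j, j] += w[k]
            lap[i, j] -= w[k]
            lap[j, i] -= w[k]
        b = np.zeros(n)
        b[source], b[sink] = 1.0, -1.0       # unit inflow and outflow
        keep = [v for v in range(n) if v != sink]  # ground the sink node
        p = np.zeros(n)
        p[keep] = np.linalg.solve(lap[np.ix_(keep, keep)], b[keep])
        Q = np.array([w[k] * (p[i] - p[j])   # flow along each tube
                      for k, (i, j, _L) in enumerate(edges)])
        D += dt * (np.abs(Q) - D)            # reinforce used tubes
    return D

# Triangle: direct edge 0-2 (length 1) vs. detour 0-1-2 (total length 2)
D = physarum_solver(3, [(0, 1, 1.0), (1, 2, 1.0), (0, 2, 1.0)], 0, 2)
print(D.round(3))  # direct edge keeps conductance ~1, detour decays ~0
```

With no central controller, the purely local reinforcement rule prunes the longer detour and retains the shortest path, which is the "smart network adaptation" of the review's title in its simplest physical form.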

Levy, Pierre. The Philosophical Concept of Algorithmic Intelligence. Spanda Journal. Volume 2, 2014. The University of Ottawa Canada Research Chair in Collective Intelligence has been a pioneer theorist in this regard since the 1990s (search). Some twenty years on, as he notes, this paper can present a work-in-progress of an “over-language” that a global cerebral faculty needs. As an Information Economy MetaLanguage or IEML, it remains an algebraic, computational “semantics,” but could add a “reflexivity” so the world brain can know that it knows. See also his Innovation in Coding in a later Spanda Creativity & Collective Enlightenment issue (VI/2, 2015). All these entreaties about an emerging 21st century “sapiensphere” (my term) seem to be getting closer to the actual witness, articulation, and hopeful avail of a palliative worldwide wisdom.

Li, Hui, et al. Multi-Level Formation of Complex Software Systems. Entropy. Online May, 2016. We cite this paper by Dalian Maritime University, China, information scientists as an example of how common network topologies are equally being found in computer programs. By turns, this inherence could imply that other locales such as genomes, connectomes, and an intelligent evolution proceed and perform in an algorithmic manner.

Livnat, Adi. Simplification, Innateness, and the Absorption of Meaning from Context. Evolutionary Biology. Online March, 2017. Reviewed more in Systems Evolution, the University of Haifa theorist continues his project to achieve a better explanation of life’s evolution by way of algorithmic computations, innate network propensities, genome – language affinities, neural net deep learning, and more.
