Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: Independent, UniVersal, Complex Network Systems and a Genetic Code-Script Source

2. Biteracy: Natural Algorithmic Computation

Scalise, Dominic and Rebecca Shulman. Emulating Cellular Automata in Chemical Reaction-Diffusion Networks. Natural Computing. 15/2, 2016. In an issue based on the DNA Computing 2014 conference (Google), Johns Hopkins computational theorists contribute to frontier realizations that, along with genetic phenomena, chemical interactions can similarly be explicated by means of such complex, dynamic Turing-like programs, from which orderly structures result. Search also Rondelez above, and Doty in Systems Chemistry, for more. Altogether these advances seem to evince a material cosmic genesis which is brought into complex, vivifying emergence by virtue of an intrinsic, genome-like program.

Chemical reactions and diffusion can produce a wide variety of static or transient spatial patterns in the concentrations of chemical species. Here we show that given simple, periodic inputs, chemical reactions and diffusion can reliably emulate the dynamics of a deterministic cellular automaton, and can therefore be programmed to produce a wide range of complex, discrete dynamics. We describe a modular reaction–diffusion program that orchestrates each of the fundamental operations of a cellular automaton: storage of cell state, communication between neighboring cells, and calculation of cells’ subsequent states. Reaction–diffusion based cellular automata could potentially be built in vitro using networks of DNA molecules that interact via branch migration processes and could in principle perform universal computation, storing their state as a pattern of molecular concentrations, or deliver spatiotemporal instructions encoded in concentrations to direct the behavior of intelligent materials. (Abstract excerpts)
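The deterministic cellular automaton that such reaction-diffusion networks emulate can be sketched in a few lines of Python. The elementary one-dimensional automaton below (Rule 110, chosen here for illustration and not taken from the paper) exhibits the three fundamental operations the abstract names: stored cell state, communication with neighbors, and calculation of each cell's next state.

```python
def step(cells, rule=110):
    """One synchronous update of an elementary cellular automaton.
    Each cell reads its two neighbors (wrapping at the edges), forms a
    3-bit neighborhood index, and looks up its next state in the rule."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right  # neighborhood as a 3-bit number
        out.append((rule >> index) & 1)              # rule number is the lookup table
    return out

if __name__ == "__main__":
    row = [0] * 15 + [1] + [0] * 15                  # single live cell
    for _ in range(8):
        print("".join(".#"[c] for c in row))
        row = step(row)
```

Each printed row is one synchronous update; in the chemical emulation, the same storage, neighbor-communication, and next-state steps are carried out by patterns of molecular concentrations rather than bits.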

Siddique, Nazmul and Hojjat Adeli. Nature Inspired Computing. Cognitive Computation. 7/6, 2015. Akin to Xin-She Yang herein, Ulster University, Londonderry and Ohio State University informatics researchers gather and introduce an array of physics-based and biology-based algorithms that have arisen in the 21st century. In addition, the associated fields of neural networks, evolutionary computing and fuzzy logic are reviewed.

Siddique, Nazmul and Hojjat Adeli. Physics-Based Search and Optimization: Inspirations from Nature. Expert Systems. 33/6, 2016. In this Wiley journal, Ulster University, Londonderry and Ohio State University computer scientists gather for the first time a broad survey of dynamic iterative phenomena which occur even across cosmic and material realms, akin to many biological and behavioral examples cited herein (Yang). The novel insight is that nature’s universal (Darwinian) evolutionary process, whereby a relative optimum result or condition is winnowed and selected from myriad candidates, can also be seen at work across these physical domains. See, for example, Black Hole: A New Heuristic Optimization for Data Clustering by Abdolreza Hatamlou in Information Sciences (222/175, 2013).

This paper presents a review of recently developed physics‐based search and optimization algorithms that have been inspired by natural phenomena. They include Big Bang–Big Crunch, black hole search, galaxy‐based search, artificial physics optimization, electromagnetism optimization, charged system search, colliding bodies optimization, and particle collision algorithm. (Abstract)
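For a concrete sense of these physics-based heuristics, here is a rough Python sketch in the spirit of Hatamlou's black hole algorithm. The population size, event-horizon formula, and re-initialization scheme are simplified assumptions for illustration, not the published method.

```python
import random

def black_hole_search(f, dim, bounds, n_stars=20, iters=200, seed=0):
    """Minimize f over a box. Each iteration the best 'star' acts as the
    black hole; the others drift toward it, and any star that crosses the
    event-horizon radius is swallowed and re-initialized at random."""
    rng = random.Random(seed)
    lo, hi = bounds
    stars = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_stars)]
    for _ in range(iters):
        costs = [f(s) for s in stars]
        bh = stars[costs.index(min(costs))][:]           # current black hole
        radius = min(costs) / (sum(costs) + 1e-12)       # event horizon (assumed form)
        for s in stars:
            if s == bh:
                continue                                 # the black hole itself stays put
            for d in range(dim):
                s[d] += rng.random() * (bh[d] - s[d])    # drift toward the black hole
            if sum((a - b) ** 2 for a, b in zip(s, bh)) ** 0.5 < radius:
                s[:] = [rng.uniform(lo, hi) for _ in range(dim)]   # swallowed, reborn
    costs = [f(s) for s in stars]
    return stars[costs.index(min(costs))]

# Toy run on the sphere function, whose minimum is the origin.
best = black_hole_search(lambda x: sum(v * v for v in x), dim=3, bounds=(-5, 5))
```

The loop mirrors the astrophysical metaphor: the fittest candidate attracts the rest of the population, while absorption and rebirth of captured stars preserves exploration.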

Sloss, Andrew and Steven Gustafson. 2019 Evolutionary Algorithm Review. arXiv:1906.08870. Bellevue, WA software scientists post a thorough survey as the field of artificial intelligence, broadly conceived, becomes ever more biological in its basis. By turns, life’s genetically programmed development is broached as an “Idealized Darwinism.” Section 5.1 covers Auto-Constructive Evolution, 5.2 Deep Neuroevolution, and 5.3 Self-Replicating Neural Networks.

In this review, we explore a new taxonomy of evolutionary algorithms and classifications that look at five main areas: the ability to manage the control of the environment with limiters, how to explain and repeat the search process, understandings of input and output causality within a solution, the ability to manage algorithm bias due to data or user design, and lastly how to add corrective measures. As many reviews of evolutionary algorithms exist, after motivating this new taxonomy, we briefly classify a broad range of algorithms and identify areas of future research. (Abstract excerpt)
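As a baseline for the taxonomy above, a canonical evolutionary algorithm fits in a page. The sketch below solves the OneMax toy problem (maximize the number of 1s in a bit string) with tournament selection, one-point crossover, and bit-flip mutation; all parameter values are illustrative defaults, not drawn from the review.

```python
import random

def evolve_onemax(n_bits=20, pop_size=30, generations=60, seed=1):
    """A bare-bones evolutionary algorithm on OneMax: fitness is simply
    the number of 1 bits, so the known optimum is the all-ones string."""
    rng = random.Random(seed)
    fitness = sum
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        def tournament():
            a, b = rng.sample(pop, 2)                  # binary tournament selection
            return a if fitness(a) >= fitness(b) else b
        nxt = []
        while len(nxt) < pop_size:
            p1, p2 = tournament(), tournament()
            cut = rng.randrange(1, n_bits)
            child = p1[:cut] + p2[cut:]                # one-point crossover
            for i in range(n_bits):
                if rng.random() < 1.0 / n_bits:        # bit-flip mutation
                    child[i] ^= 1
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve_onemax()
```

Every taxonomy axis the abstract lists (environment limiters, repeatability, causality, bias, correction) is a question one can pose about even this minimal loop, which is why such reviews begin from it.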

Smith, Eric and Supriya Krishnamurthy. Symmetry and Collective Fluctuations in Evolutionary Games. Bristol, UK: IOP Publishing, 2015. Santa Fe Institute and Stockholm University physicists engage a technical survey of Darwinian selectivity by way of game theories, an approach defined as the mathematics of interactions, transmission, and information, which engender statistical adaptive, Bayesian responses. A closing chapter then sees the intent and result as an evolving individuality.

Sommaruga, Giovanni and Thomas Strahm, eds. Turing’s Revolution. Switzerland: Birkhauser, 2015. A follow-up volume to the 2012 Alan Turing centenary collections (search his name), which situates computational theories in a continuity from Al-Khwarizmi’s (c. 780-850) algorithms to Leibniz’s (1646-1716) universal calculus to our 21st century global Internet society. By this view, the historic inklings of a deep mathematical source that is programmatic in kind can at last reach a definitive articulation. A persistent incentive (also for the alchemical Great Work project) was that if a natural code could be discerned, human beings could avail themselves of it to make a better life and world. A theme running through the book is then a creative universe that somehow computes itself into being and becoming, wherein human beings have a central participatory role.

Stepney, Susan and Andrew Adamatzky, eds. Inspired by Nature. International: Springer, 2018. A 60th birthday tribute to Julian Miller, the University of York, UK computer applications pioneer with works such as Cartesian Genetic Programming (Springer, 2011, search). Seven years later, a further sense of universal computations occurs in chapters such as Evolution in Nanomaterio by Hajo Broersma, Code Evolution with Genetic Programming by Wolfgang Banzhaf, and Chemical Computing through Simulated Evolution by Larry Bull, et al. This concurrent edition adds to the similar content of Computational Matter (herein), also edited by S. Stepney. A tacit theme is a Big Code in pervasive effect beyond and before Big Data. As the sample quote conveys, within this scenario it seems that natural cosmic genesis may intend to pass off its procreative agency to our human intention for all futures, if we might altogether appreciate this.

Evolution has done a great job for many living organisms, and it has done so through a conceptually rather easy but smart, albeit very slow bottom up process of natural Darwinian evolution. There seems to be no design involved in this process: the computational instructions that underlie living creatures have not been designed but rather have (been) evolved. Two questions come to mind: Can we use something similar as evolution on ‘dead’ matter in order to get something useful? Can we do it quickly, via a rapidly converging evolutionary process? (H. Broersma, University of Twente, 95)

Stepney, Susan, et al, eds. Computational Matter. International: Springer, 2018. With coeditors Steen Rasmussen and Martyn Amos, the volume is a broad survey of endeavors under the umbrella name Unconventional Computing (UCOMP), drawn originally from Stanislaw Ulam, which considers every manifest phase from quantum to social realms to arise in an exemplary way from a mathematical source code. Some entries are Cellular Computing and Synthetic Biology (M. Amos, et al), Decoding Genetic Information (G. Franco and V. Manca), BIOMICS: A Theory of Interaction Computing (P. Dini, et al), and Nanoscale Molecular Automata (R. Rinaldi, et al). An inference is a natural “digitalization” process as a double reality: a software-like informative program and an overt material, creaturely result. This would readily suggest a “genotype and phenotype”; see also The Principles of Informational Genomics by V. Manca (search) in Theoretical Computer Science (701/190, 2017).

This book is concerned with computing in materio: that is, unconventional computing performed by directly harnessing the physical properties of materials. It offers an overview of the field, covering four main areas: theory, practice, applications and implications. Each chapter synthesizes current understanding by bringing together researchers across a collection of related research projects. The book is useful for graduate students, researchers in the field, and the general scientific reader who is interested in inherently interdisciplinary research at the intersections of computer science, biology, chemistry, physics, engineering and mathematics. (Springer)

Thompson, Sarah, et al. The Fractal Geometry of Fitness Landscapes at the Local Optima Level. Natural Computing. December, 2020. University of Stirling, UK and Universite du Littoral Côte d’Opale, Calais systems mathematicians show how an evolutionary dynamics meant to attain and select a fittest result can take on a self-similar topology.

A local optima network (LON) encodes local optima connectivity in the fitness landscape of a combinatorial optimisation problem. Recently, LONs have been studied for their fractal dimension which is a complexity index where a non-integer can be assigned to a pattern. We use visual analysis, correlation analysis, and machine learning to show that relationships exist and that fractal features of LONs can contribute to explaining and predicting algorithm performance. (Abstract excerpt)
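The fractal dimension invoked here, “a complexity index where a non-integer can be assigned to a pattern,” is commonly estimated by box counting. The generic estimator below works on plain 2-D point sets rather than the authors' local optima networks, which require specialized LON embeddings; it simply fits the slope of log N(s) against log s over a range of grid scales.

```python
import math

def box_counting_dimension(points, scales=(1, 2, 4, 8, 16)):
    """Estimate the fractal dimension of a 2-D point set in the unit square:
    count occupied grid boxes N(s) at each scale s, then return the
    least-squares slope of log N(s) versus log s."""
    logs, logN = [], []
    for s in scales:
        size = 1.0 / s
        boxes = {(int(x / size), int(y / size)) for x, y in points}
        logs.append(math.log(s))
        logN.append(math.log(len(boxes)))
    m = len(scales)
    xbar, ybar = sum(logs) / m, sum(logN) / m
    return (sum((a - xbar) * (b - ybar) for a, b in zip(logs, logN))
            / sum((a - xbar) ** 2 for a in logs))
```

A filled square yields a slope near 2, a straight line near 1, and genuinely fractal sets fall in between, which is the non-integer signature the paper correlates with algorithm performance.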

A heuristic is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation.
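The classic instance of such a heuristic is hill climbing, sketched below on an assumed double-well test function: it reaches a good-enough answer quickly, yet stalls at the nearest local optimum rather than the global one, exactly the trade-off the definition describes.

```python
import random

def hill_climb(f, x0, step=0.1, iters=500, seed=2):
    """Accept a random nearby candidate only when it lowers f.
    Simple and fast, but it halts at the first local optimum it reaches."""
    rng = random.Random(seed)
    x, fx = x0, f(x0)
    for _ in range(iters):
        cand = x + rng.uniform(-step, step)
        if f(cand) < fx:
            x, fx = cand, f(cand)
    return x

# A double-well landscape with a tilted floor: the global minimum sits
# near x = -1, but a climber started at x = 2.0 settles into the local
# minimum near x = 1 and cannot cross the barrier between the wells.
f = lambda x: (x * x - 1) ** 2 + 0.3 * x
x = hill_climb(f, x0=2.0)
```

The result is far better than the starting point yet provably not optimal, which is precisely what makes heuristics "sufficient for reaching an immediate, short-term goal or approximation."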

Valiant, Leslie. Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World. New York: Basic Books, 2013. A Harvard University mathematician broaches an innovative view of life’s advance as due to interactions between creatures and their environment. This novel evolution is a learning process that proceeds by way of algorithmic computations, which in this sense are dubbed “ecorithms” as they try to reach a sufficient optimum. In so doing, Valiant joins current perceptions from Bayesian, rationality (Okasha), Markov, and genetic approaches that view not only Darwinian evolution as a computational learning mechanism, but seem to imply that the entire universe is some optimization or maximization process.

Other LV works are a presentation in the Computation, Computational Efficiency, and Cognitive Science unit at the 2013 AAAS meeting entitled “Biological Evolution as a Form of Learning” (http://aaas.confex.com/aaas/2013/webprogram/Paper9186.html), and an earlier paper “Evolvability” in the Journal of the Association for Computing Machinery (56/1, 2009). Both the talk and paper Abstracts are below. And we might add that the author notes the modern word algorithm is attributed to the Persian mathematician and scholar Al-Khwarizmi (780-850, see Wikipedia), who worked at the House of Wisdom in Baghdad and was a founder of algebra. Once again here is the true heart of Islam that must be remembered and recovered anew today.

Algorithms are the step-by-step instructions used in computing for achieving desired results, much like recipes in cooking. In both cases the recipe designer has a certain controlled environment in mind for realizing the recipe, and foresees how the desired outcome will be achieved. The algorithms I discuss in this book are special. Unlike most algorithms, they can be run in environments unknown to the designer, and they learn by interacting with the environment how to act effectively in it. After sufficient interaction they will have expertise not provided by the designer but extracted from the environment. I call these algorithms ecorithms. The model of learning they follow, known as the probably approximately correct model, provides a quantitative framework in which designers can evaluate the expertise achieved and the cost of achieving it. These ecorithms are not merely a feature of computers. I argue in this book that such learning mechanisms impose and determine the character of life on Earth. The course of evolution is shaped entirely by organisms interacting with and adapting to their environments. This biological inheritance, as well as further learning from the environment after conception and birth, have a determining influence on the course of an individual’s life. The focus here will be the unified study of the mechanisms of evolution, learning, and intelligence using the methods of computer science. (Book Summary)

Living organisms function as protein circuits. We suggest that computational learning theory offers the framework for investigating the question of how such circuits can come into being adaptively from experience without a designer. We formulate Darwinian evolution as a form of learning from examples. The targets of the learning process are the functions of highest fitness. The examples are the experiences. The learning process is constrained so that the feedback from the experiences is Darwinian. We formulate a notion of evolvability that distinguishes function classes that are evolvable with polynomially bounded resources from those that are not. We suggest that the close technical connection this establishes between, on the one hand, learning by individuals, and on the other, biological evolution, has important ramifications for the fundamental nature of cognition. (Talk Abstract)

Living organisms function according to complex mechanisms that operate in different ways depending on conditions. Evolutionary theory suggests that such mechanisms evolved through random variation guided by selection. However, there has existed no theory that would explain quantitatively which mechanisms can so evolve in realistic population sizes within realistic time periods, and which are too complex. In this paper we suggest such a theory. Evolution is treated as a form of computational learning from examples in which the course of learning is influenced only by the fitness of the hypotheses on the examples, and not otherwise by the specific examples. We formulate a notion of evolvability that quantifies the evolvability of different classes of functions. It is shown that in any one phase of evolution where selection is for one beneficial behavior, monotone Boolean conjunctions and disjunctions are demonstrably evolvable over the uniform distribution, while Boolean parity functions are demonstrably not. The framework also allows a wider range of issues in evolution to be quantified. We suggest that the overall mechanism that underlies biological evolution is evolvable target pursuit, which consists of a series of evolutionary stages, each one pursuing an evolvable target in our technical sense, each target being rendered evolvable by the serendipitous combination of the environment and the outcome of previous evolutionary stages. (Evolvability Abstract)
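Valiant's central example, that monotone Boolean conjunctions are evolvable, can be caricatured in a few lines. The sketch below is a toy: it scores hypotheses exhaustively over all 2^n inputs rather than by the bounded random sampling the formal model requires, but it preserves the Darwinian constraint that feedback consists only of fitness, never the examples themselves.

```python
import itertools

def evolve_conjunction(n=6, target=frozenset({0, 2, 3}), rounds=20):
    """Toy sketch of evolving a monotone conjunction toward a hidden target.
    The hypothesis mutates by adding or dropping one variable, and a
    mutation survives only if it raises agreement with the target's labels."""
    inputs = list(itertools.product([0, 1], repeat=n))

    def label(conj, x):
        return all(x[i] for i in conj)          # conjunction of the chosen variables

    def fitness(conj):
        return sum(label(conj, x) == label(target, x) for x in inputs)

    hyp = frozenset(range(n))                   # start from the full conjunction
    for _ in range(rounds):
        neighbors = [hyp | {i} for i in range(n)] + [hyp - {i} for i in hyp]
        best = max(neighbors, key=fitness)
        if fitness(best) > fitness(hyp):        # keep only beneficial mutations
            hyp = best
    return hyp
```

Starting from the conjunction of all variables, each surviving mutation drops one spurious variable, and the hypothesis converges to the target; parity functions admit no such fitness gradient, which is the contrast the abstract draws.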

Vega-Rodriguez, Miguel and Jose Granado-Criado. Parallel Computing in Computational Biology. Journal of Computational Biology. 25/8, 2018. An introduction to a special issue about current technological methods by University of Extremadura, Spain computer scientists. For example, see Large-Scale Simulations of Bacterial Populations over Complex Networks, Signature Gene Identification of Cancer Occurrence and Pattern Recognition and Human Splice-Site Prediction with Deep Neural Networks.

Computational biology allows and encourages the application of many different parallel computing approaches. This special issue brings together high-quality state-of-the-art contributions about parallel computing in computational biology, from different technological points of view or perspectives. The special issue collects considerably extended and improved versions of the best articles, accepted and presented at the 2017 International Workshop on Parallelism in Bioinformatics. (Abstract)

Walker, Sara Imari. Top-Down Causation and the Rise of Information in the Emergence of Life. Information. 5/3, 2014. Reviewed more in Information Computation, the Arizona State University astrophysicist explains the presence and role of algorithmic processes that appear to originate and drive a complexifying evolution. See also the author's 2015 paper The Descent of Math at arXiv:1505.00312.
