Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: Independent, UniVersal, Complex Network Systems and a Genetic Code-Script Source

2. Biteracy: Natural Algorithmic Computation

Sloss, Andrew and Steven Gustafson. 2019 Evolutionary Algorithm Review. arXiv:1906.08870. Bellevue, WA software scientists post a thorough survey as the field of artificial intelligence, broadly conceived, becomes ever more biological in its basis. Along the way, life’s genetically programmed development is broached as an “Idealized Darwinism.” Section 5.1 covers Auto-Constructive Evolution, 5.2 Deep Neuroevolution, and 5.3 Self-Replicating Neural Networks.

In this review, we explore a new taxonomy of evolutionary algorithms and classifications that look at five main areas: the ability to manage the control of the environment with limiters, how to explain and repeat the search process, understandings of input and output causality within a solution, the ability to manage algorithm bias due to data or user design, and lastly, how to add corrective measures. As many reviews of evolutionary algorithms exist, after motivating this new taxonomy, we briefly classify a broad range of algorithms and identify areas of future research. (Abstract excerpt)
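
As a point of reference for the many algorithm families such a taxonomy surveys, here is a minimal Python sketch of a generic evolutionary algorithm: a generational loop of tournament selection, one-point crossover, and bit-flip mutation. The OneMax fitness function and every parameter value are illustrative assumptions, not drawn from the Sloss and Gustafson review.

    import random

    def one_max(bits):
        """Toy fitness: count of 1s. Stands in for any problem-specific score."""
        return sum(bits)

    def evolve(pop_size=50, genome_len=32, generations=100,
               mutation_rate=0.02, tournament_k=3, seed=0):
        """Generic generational EA: tournament selection, one-point crossover,
        bit-flip mutation. All parameters are illustrative."""
        rng = random.Random(seed)
        pop = [[rng.randint(0, 1) for _ in range(genome_len)] for _ in range(pop_size)]
        for _ in range(generations):
            def select():
                contenders = rng.sample(pop, tournament_k)
                return max(contenders, key=one_max)
            next_pop = []
            while len(next_pop) < pop_size:
                a, b = select(), select()
                cut = rng.randrange(1, genome_len)                # one-point crossover
                child = a[:cut] + b[cut:]
                child = [bit ^ (rng.random() < mutation_rate) for bit in child]  # bit-flip mutation
                next_pop.append(child)
            pop = next_pop
        return max(pop, key=one_max)

    if __name__ == "__main__":
        best = evolve()
        print(one_max(best), "of 32 bits set")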

Smith, Eric and Supriya Krishnamurthy. Symmetry and Collective Fluctuations in Evolutionary Games. Bristol, UK: IOP Publishing, 2015. Santa Fe Institute and Stockholm University physicists engage a technical survey of Darwinian selectivity by way of game theories, an approach defined as the mathematics of interactions, transmission, and information, which engenders statistical, adaptive, Bayesian responses. A closing chapter then sees the intent and result as an evolving individuality.

Sommaruga, Giovanni and Thomas Strahm, eds. Turing’s Revolution. Switzerland: Birkhauser, 2015. A follow-up volume to the 2012 Alan Turing centenary collections (search his name), which situates computational theories in a continuity from Al-Khwarizmi’s (c. 780-850) algorithms to Leibniz’s (1646-1716) universal calculus to our 21st century global Internet society. By this view, the historic inklings of a deep mathematical source that is programmatic in kind can at last reach a definitive articulation. A persistent incentive (as also for the alchemical great work project) was that if a natural code could be discerned, human beings could avail themselves of it to make a better life and world. A theme running through the book is then a creative universe that somehow computes itself into being and becoming, wherein human beings have a central participatory role.

Stepney, Susan and Andrew Adamatzky, eds. Inspired by Nature. International: Springer, 2018. A 60th birthday tribute to Julian Miller, the University of York, UK computer applications pioneer known for works such as Cartesian Genetic Programming (Springer, 2011, search). Seven years later, a further sense of universal computation appears in chapters such as Evolution in Nanomaterio by Hajo Broersma, Code Evolution with Genetic Programming by Wolfgang Banzhaf, and Chemical Computing through Simulated Evolution by Larry Bull, et al. This concurrent edition adds to the similar content of Computational Matter (herein), also edited by S. Stepney. A tacit theme is a Big Code in pervasive effect beyond and before Big Data. As the sample quote conveys, within this scenario it seems that natural cosmic genesis may intend to pass on its procreative role to human intention for all futures, if we might altogether appreciate this.

Evolution has done a great job for many living organisms, and it has done so through a conceptually rather easy but smart, albeit very slow bottom up process of natural Darwinian evolution. There seems to be no design involved in this process: the computational instructions that underlie living creatures have not been designed but rather have (been) evolved. Two questions come to mind: Can we use something similar as evolution on ‘dead’ matter in order to get something useful? Can we do it quickly, via a rapidly converging evolutionary process? (H. Broersma, University of Twente, 95)

Stepney, Susan, et al, eds. Computational Matter. International: Springer, 2018. With coeditors Steen Rasmussen and Martyn Amos, the volume is a broad survey of endeavors under the umbrella name Unconventional Computing (UCOMP), drawn originally from Stanislaw Ulam, which considers every manifest phase from quantum to social realms to arise in an exemplary way from a mathematical source code. Some entries are Cellular Computing and Synthetic Biology (M. Amos, et al), Decoding Genetic Information (G. Franco and V. Manca), BIOMICS: A Theory of Interaction Computing (P. Dini, et al), and Nanoscale Molecular Automata (R. Rinaldi, et al). An inference is a natural “digitalization” process as a double reality via a software-like informative program and an overt material, creaturely result. This would seem to readily imply “genotype and phenotype”; see also The Principles of Informational Genomics by V. Manca (search) in Theoretical Computer Science (701/190, 2017).

This book is concerned with computing in materio: that is, unconventional computing performed by directly harnessing the physical properties of materials. It offers an overview of the field, covering four main areas: theory, practice, applications and implications. Each chapter synthesizes current understanding by bringing together researchers across a collection of related research projects. The book is useful for graduate students, researchers in the field, and the general scientific reader who is interested in inherently interdisciplinary research at the intersections of computer science, biology, chemistry, physics, engineering and mathematics. (Springer)

Thompson, Sarah, et al. The Fractal Geometry of Fitness Landscapes at the Local Optima Level. Natural Computing. December, 2020. University of Stirling, UK and Université du Littoral Côte d’Opale, Calais systems mathematicians show how an evolutionary dynamics meant to attain and select a fittest result can take on a self-similar topology.

A local optima network (LON) encodes local optima connectivity in the fitness landscape of a combinatorial optimisation problem. Recently, LONs have been studied for their fractal dimension which is a complexity index where a non-integer can be assigned to a pattern. We use visual analysis, correlation analysis, and machine learning to show that relationships exist and that fractal features of LONs can contribute to explaining and predicting algorithm performance. (Abstract excerpt)
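
To make the notion of a local optima network concrete, the following Python sketch enumerates a tiny random fitness landscape over bit strings, assigns every configuration to the basin of its hill-climbing optimum, and links two optima whenever a single bit flip crosses between their basins. The random fitness table, landscape size, and best-improvement climber are illustrative assumptions; the paper’s LON construction and fractal-dimension analysis are considerably more developed.

    import itertools, random

    N = 10                      # bit-string length, kept tiny so the landscape is enumerable
    rng = random.Random(1)
    table = {}                  # random fitness table: illustrative stand-in for a real problem

    def fitness(s):
        if s not in table:
            table[s] = rng.random()
        return table[s]

    def neighbours(s):
        return [s[:i] + ('1' if s[i] == '0' else '0') + s[i+1:] for i in range(N)]

    def hill_climb(s):
        """Best-improvement hill climbing; returns the local optimum of s's basin."""
        while True:
            best = max(neighbours(s), key=fitness)
            if fitness(best) <= fitness(s):
                return s
            s = best

    # Map every configuration to its local optimum (its basin).
    basin_of = {}
    for bits in itertools.product('01', repeat=N):
        s = ''.join(bits)
        basin_of[s] = hill_climb(s)

    # LON edges: two optima are connected if a single bit flip crosses between their basins.
    edges = set()
    for s, opt in basin_of.items():
        for t in neighbours(s):
            other = basin_of[t]
            if other != opt:
                edges.add(frozenset((opt, other)))

    optima = set(basin_of.values())
    print(f"{len(optima)} local optima, {len(edges)} LON edges")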

A heuristic is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation.
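
As a concrete instance of that definition, the short Python sketch below applies a nearest-neighbour heuristic to a toy travelling-salesman problem: it returns a usable tour quickly, with no guarantee of optimality. The city coordinates and the choice of heuristic are hypothetical, for illustration only.

    import math

    # Hypothetical city coordinates, purely for illustration.
    cities = {'A': (0, 0), 'B': (3, 4), 'C': (6, 1), 'D': (2, 7), 'E': (8, 5)}

    def dist(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def nearest_neighbour_tour(start='A'):
        """Greedy heuristic: always visit the closest unvisited city next.
        Fast and usually reasonable, but not guaranteed to be the shortest tour."""
        unvisited = set(cities) - {start}
        tour = [start]
        while unvisited:
            here = tour[-1]
            nxt = min(unvisited, key=lambda c: dist(cities[here], cities[c]))
            tour.append(nxt)
            unvisited.remove(nxt)
        return tour

    def tour_length(tour):
        legs = zip(tour, tour[1:] + tour[:1])       # close the loop back to the start
        return sum(dist(cities[a], cities[b]) for a, b in legs)

    tour = nearest_neighbour_tour()
    print(tour, round(tour_length(tour), 2))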

Valiant, Leslie. Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World. New York: Basic Books, 2013. A Harvard University mathematician broaches an innovative view of life’s advance as due to interactions between creatures and their environment. This novel evolution is a learning process that proceeds by way of algorithmic computations, in this sense dubbed “ecorithms” as they try to reach a sufficient optimum. In so doing, Valiant joins current perceptions from Bayesian, rationality (Okasha), Markov, and genetic approaches that view not only Darwinian evolution as a computational learning mechanism, but seem to imply the entire universe is some optimization or maximization process.

Other works by Valiant include a presentation in the Computation, Computational Efficiency, and Cognitive Science unit at the 2013 AAAS meeting entitled “Biological Evolution as a Form of Learning” (http://aaas.confex.com/aaas/2013/webprogram/Paper9186.html), and an earlier paper “Evolvability” in the Journal of the Association for Computing Machinery (56/1, 2009). Both the talk and paper abstracts are below. We might add that the author notes the modern word algorithm is attributed to the Persian mathematician and scholar Al-Khwarizmi (780-850, see Wikipedia), who worked at the House of Wisdom in Baghdad as a founder of algebra. Once again here is the true heart of Islam that must be remembered and recovered anew today.

Algorithms are the step-by-step instructions used in computing for achieving desired results, much like recipes in cooking. In both cases the recipe designer has a certain controlled environment in mind for realizing the recipe, and foresees how the desired outcome will be achieved. The algorithms I discuss in this book are special. Unlike most algorithms, they can be run in environments unknown to the designer, and they learn by interacting with the environment how to act effectively in it. After sufficient interaction they will have expertise not provided by the designer but extracted from the environment. I call these algorithms ecorithms. The model of learning they follow, known as the probably approximately correct model, provides a quantitative framework in which designers can evaluate the expertise achieved and the cost of achieving it. These ecorithms are not merely a feature of computers. I argue in this book that such learning mechanisms impose and determine the character of life on Earth. The course of evolution is shaped entirely by organisms interacting with and adapting to their environments. This biological inheritance, as well as further learning from the environment after conception and birth, have a determining influence on the course of an individual’s life. The focus here will be the unified study of the mechanisms of evolution, learning, and intelligence using the methods of computer science. (Book Summary)
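
For a runnable flavour of the probably approximately correct model described above, this Python sketch learns a hidden monotone conjunction purely from labelled examples drawn from its environment: the hypothesis starts as the conjunction of all variables and drops any variable seen false in a positive example, so with enough samples it is approximately correct with high probability. The target concept, sample sizes, and uniform distribution are illustrative assumptions, not Valiant's own code.

    import random

    rng = random.Random(42)
    n = 10
    target = {1, 4, 7}          # hidden monotone conjunction x1 AND x4 AND x7 (hypothetical)

    def label(x):
        return all(x[i] for i in target)

    def draw_example():
        x = [rng.randint(0, 1) for _ in range(n)]
        return x, label(x)

    def pac_learn_conjunction(num_samples=500):
        """Start from the conjunction of all variables and delete any variable
        seen set to 0 in a positive example: the classic PAC learner for conjunctions."""
        hypothesis = set(range(n))
        for _ in range(num_samples):
            x, y = draw_example()
            if y:
                hypothesis -= {i for i in hypothesis if x[i] == 0}
        return hypothesis

    h = pac_learn_conjunction()
    test = [draw_example() for _ in range(2000)]
    errors = sum(all(x[i] for i in h) != y for x, y in test)
    print("learned:", sorted(h), "error rate:", errors / len(test))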

Living organisms function as protein circuits. We suggest that computational learning theory offers the framework for investigating the question of how such circuits can come into being adaptively from experience without a designer. We formulate Darwinian evolution as a form of learning from examples. The targets of the learning process are the functions of highest fitness. The examples are the experiences. The learning process is constrained so that the feedback from the experiences is Darwinian. We formulate a notion of evolvability that distinguishes function classes that are evolvable with polynomially bounded resources from those that are not. We suggest that the close technical connection this establishes between, on the one hand, learning by individuals, and on the other, biological evolution, has important ramifications for the fundamental nature of cognition. (Talk Abstract)

Living organisms function according to complex mechanisms that operate in different ways depending on conditions. Evolutionary theory suggests that such mechanisms evolved through random variation guided by selection. However, there has existed no theory that would explain quantitatively which mechanisms can so evolve in realistic population sizes within realistic time periods, and which are too complex. In this paper we suggest such a theory. Evolution is treated as a form of computational learning from examples in which the course of learning is influenced only by the fitness of the hypotheses on the examples, and not otherwise by the specific examples. We formulate a notion of evolvability that quantifies the evolvability of different classes of functions. It is shown that in any one phase of evolution where selection is for one beneficial behavior, monotone Boolean conjunctions and disjunctions are demonstrably evolvable over the uniform distribution, while Boolean parity functions are demonstrably not. The framework also allows a wider range of issues in evolution to be quantified. We suggest that the overall mechanism that underlies biological evolution is evolvable target pursuit, which consists of a series of evolutionary stages, each one pursuing an evolvable target in our technical sense, each target being rendered evolvable by the serendipitous combination of the environment and the outcome of previous evolutionary stages. (Evolvability Abstract)
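
To illustrate the evolvability framework sketched in this abstract, the Python fragment below evolves a conjunction hypothesis using only aggregate fitness feedback, its agreement with a hidden target over a fixed sample, keeping mutations that do not lower fitness. Against a conjunction target the fitness climbs toward 1; against a parity target it stalls near 0.5, since no conjunction tracks parity. The mutation scheme, sample size, and all numbers are illustrative assumptions rather than Valiant's formal construction.

    import random

    rng = random.Random(7)
    n = 8
    samples = [[rng.randint(0, 1) for _ in range(n)] for _ in range(2000)]

    def conj(varset, x):
        return all(x[i] for i in varset)

    def parity(varset, x):
        return sum(x[i] for i in varset) % 2 == 1

    def fitness(hyp, target_fn, target_vars):
        """Aggregate agreement with the target over the sample: the only feedback allowed."""
        agree = sum(conj(hyp, x) == target_fn(target_vars, x) for x in samples)
        return agree / len(samples)

    def evolve_conjunction(target_fn, target_vars, steps=300):
        hyp = set(range(n))                          # start from the all-variable conjunction
        best = fitness(hyp, target_fn, target_vars)
        for _ in range(steps):
            mutant = set(hyp)
            i = rng.randrange(n)
            mutant.symmetric_difference_update({i})  # flip one variable in or out
            f = fitness(mutant, target_fn, target_vars)
            if f >= best:                            # keep neutral or beneficial mutations
                hyp, best = mutant, f
        return hyp, best

    tvars = {0, 3, 5}
    print("conjunction target:", evolve_conjunction(conj, tvars))
    print("parity target:     ", evolve_conjunction(parity, tvars))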

Vega-Rodriguez, Miguel and Jose Granado-Criado. Parallel Computing in Computational Biology. Journal of Computational Biology. 25/8, 2018. An introduction to a special issue about current technological methods by University of Extremadura, Spain computer scientists. For example see Large-Scale Simulations of Bacterial Populations over Complex Networks, Signature Gene Identification of Cancer Occurrence and Pattern Recognition, and Human Splice-Site Prediction with Deep Neural Networks.

Computational biology allows and encourages the application of many different parallel computing approaches. This special issue brings together high-quality state-of-the-art contributions about parallel computing in computational biology, from different technological points of view or perspectives. The special issue collects considerably extended and improved versions of the best articles, accepted and presented at the 2017 International Workshop on Parallelism in Bioinformatics.
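
As a generic illustration of the kind of parallelism such contributions depend on (not drawn from any particular article in the issue), this Python sketch spreads pairwise sequence-identity calculations across worker processes with the standard multiprocessing module; the toy sequences and simple identity score are hypothetical stand-ins for real genomic workloads.

    from multiprocessing import Pool
    from itertools import combinations

    # Hypothetical toy sequences standing in for real genomic data.
    sequences = {
        "seq1": "ACGTACGTAC",
        "seq2": "ACGTTCGTAA",
        "seq3": "TCGTACGAAC",
        "seq4": "ACGAACGTAC",
    }

    def identity(pair):
        """Fraction of matching positions between two equal-length sequences."""
        (name_a, a), (name_b, b) = pair
        matches = sum(x == y for x, y in zip(a, b))
        return name_a, name_b, matches / len(a)

    if __name__ == "__main__":
        pairs = list(combinations(sequences.items(), 2))
        with Pool(processes=4) as pool:             # embarrassingly parallel over pairs
            for name_a, name_b, score in pool.map(identity, pairs):
                print(f"{name_a} vs {name_b}: {score:.2f}")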

Walker, Sara Imari. Top-Down Causation and the Rise of Information in the Emergence of Life. Information. 5/3, 2014. Reviewed more in Information Computation, the Arizona State University astrophysicist explains the presence and role of algorithmic processes that appear to originate and drive a complexifying evolution. See also the author's 2015 paper The Descent of Math at arXiv:1505.00312.

Walker, Sara Imari and Paul Davies. The Algorithmic Origins of Life. Journal of the Royal Society Interface. 10/Art. 79, 2012. Reviewed more in Origin of Life, Sara Walker, NASA Astrobiology Institute, and astrophysicist Paul Davies, presently at Arizona State University, articulate how an episodic evolutionary emergence seems to be driven and tracked by way of a native informational source.

Watson, Richard A. Compositional Evolution. Cambridge: MIT Press, 2006. Reviewed more in Quickening Evolution, a biologist and computer scientist at the University of Southampton applies algorithmic and complexity theory to show that much more is going on than a gradual drift of random mutations. Rather nature employs additional computations which generate a constant modularity engaged in symbiotic assemblies.

Wibral, Michael, et al. Bits from Biology for Computational Intelligence. arXiv:1412.0291. If one may translate and gloss this paper by Wibral, Goethe University, with Joseph Lizier, University of Sydney, and Viola Priesemann, MPI Dynamics and Self-Organization, a novel self-developing, quickening cosmos seems suffused by natural information processing via evolved, genetic algorithms and multiplex networks. As Priesemann’s own research conveys (summary below, search Danielle Bassett also), human brains are an archetypal microcosm of this macrocosmic genesis. The paper appeared in Frontiers in Robotics and AI in 2015, and became a chapter in From Matter to Life from Cambridge UP in 2017.

Computational intelligence is broadly defined as biologically-inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). We show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely, or redundantly or synergistically together with others. (Abstract excerpts)
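
The information-theoretic analysis the abstract describes rests on quantities such as the mutual information between a system variable and its environment. The Python sketch below gives a plug-in estimate of that core measure from discrete co-occurrence counts; the binary toy data are an assumption, and the unique, redundant, and synergistic decomposition mentioned above requires the more elaborate partial-information frameworks discussed in the paper.

    import math, random
    from collections import Counter

    def mutual_information(xs, ys):
        """Plug-in estimate of I(X;Y) in bits from paired discrete observations."""
        n = len(xs)
        p_xy = Counter(zip(xs, ys))
        p_x = Counter(xs)
        p_y = Counter(ys)
        mi = 0.0
        for (x, y), c in p_xy.items():
            pxy = c / n
            mi += pxy * math.log2(pxy / ((p_x[x] / n) * (p_y[y] / n)))
        return mi

    # Toy example: a neuron's state copies its binary environment 90% of the time.
    rng = random.Random(0)
    env = [rng.randint(0, 1) for _ in range(10000)]
    neuron = [e if rng.random() < 0.9 else 1 - e for e in env]

    print(f"I(neuron; environment) = {mutual_information(neuron, env):.3f} bits (estimate)")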

The human brain has amazing information processing capacities, which rely on the coordinated activity of 80 billion neurons, each of them interacting with thousands of other neurons. With the aim to understand how the collective neural dynamics ultimately gives rise to the brain’s information processing capacities, I work on a class of collective states, i.e. critical states, because these maximize information processing capacities in models. In this context I recently provided the first evidence that the human brain operates not in a critical state, but keeps a distance to criticality [search Priesemann], despite the computational advantages of criticality. This indicates that maximizing information processing is not the only goal function for the brain. Based on these results, I suggest that the brain only tunes itself closer to criticality when strictly necessary for information processing. Else the brain maintains a larger safety margin, thereby losing processing capacity, but avoiding instability. In the near future, I want to reveal how the brain should allocate its resources and tune its distance to criticality for a given task. In the far future, this will provide insight of how information processing emerges from the collective neural dynamics in nervous systems. (Priesemann website)
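
One common way to quantify the "distance to criticality" Priesemann describes is the branching parameter sigma: each active unit triggers on average sigma further activations, with sigma = 1 marking the critical point. The Python sketch below simulates a simple driven branching process and recovers sigma from the resulting activity via a naive regression estimator; the model, drive, and estimator are illustrative assumptions rather than the published methods.

    import random

    def simulate_branching(sigma, steps=10000, drive=5, seed=3):
        """Each active unit triggers one unit in the next time step with probability
        sigma (so sigma is the mean offspring count); a constant external drive keeps
        subcritical activity alive. Returns the activity time series A(t)."""
        rng = random.Random(seed)
        activity = [drive]
        for _ in range(steps):
            a = activity[-1]
            offspring = sum(rng.random() < sigma for _ in range(a))
            activity.append(offspring + drive)
        return activity

    def estimate_sigma(activity):
        """Naive estimator: regression slope of A(t+1) on A(t)."""
        xs, ys = activity[:-1], activity[1:]
        mx = sum(xs) / len(xs)
        my = sum(ys) / len(ys)
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        var = sum((x - mx) ** 2 for x in xs)
        return cov / var

    for true_sigma in (0.8, 0.95, 0.99):
        est = estimate_sigma(simulate_branching(true_sigma))
        print(f"true sigma {true_sigma}: estimated {est:.3f}, "
              f"distance to criticality {1 - est:.3f}")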
