Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: An Independent, UniVersal, Source Code-Script of Generative Complex Network Systems

2. Biteracy: Natural Algorithmic Computation

Stepney, Susan and Andrew Adamatzky, eds. Inspired by Nature. International: Springer, 2018. A 60th birthday tribute to Julian Miller, the University of York, UK computer applications pioneer known for works such as Cartesian Genetic Programming (Springer, 2011, search). Seven years on, a further sense of universal computation appears in chapters such as Evolution in Nanomaterio by Hajo Broersma, Code Evolution with Genetic Programming by Wolfgang Banzhaf, and Chemical Computing through Simulated Evolution by Larry Bull, et al. This concurrent edition adds to the similar content of Computational Matter (herein), also edited by S. Stepney. A tacit theme is a Big Code in pervasive effect beyond and before Big Data. As the sample quote conveys, within this scenario it seems that natural cosmic genesis may intend to pass on its procreative role to our human intention for all futures, if we might altogether appreciate this.

Evolution has done a great job for many living organisms, and it has done so through a conceptually rather easy but smart, albeit very slow bottom up process of natural Darwinian evolution. There seems to be no design involved in this process: the computational instructions that underlie living creatures have not been designed but rather have (been) evolved. Two questions come to mind: Can we use something similar as evolution on ‘dead’ matter in order to get something useful? Can we do it quickly, via a rapidly converging evolutionary process? (H. Broersma, University of Twente, 95)

Stepney, Susan, et al, eds. Computational Matter. International: Springer, 2018. With coeditors Steen Rasmussen and Martyn Amos, the volume is a broad survey of endeavors under the umbrella name Unconventional Computing (UCOMP), drawn originally from Stanislaw Ulam, which considers every manifest phase from quantum to social realms to arise in an exemplary way from a mathematical source code. Some entries are Cellular Computing and Synthetic Biology (M. Amos, et al), Decoding Genetic Information (G. Franco and V. Manca), BIOMICS: A Theory of Interaction Computing (P. Dini, et al), and Nanoscale Molecular Automata (R. Rinaldi, et al). An inference is a natural “digitalization” process as a double reality: a software-like informative program and an overt material, creaturely result. This would seem to readily imply “genotype and phenotype”; see also The Principles of Informational Genomics by V. Manca (search) in Theoretical Computer Science (701/190, 2017).

This book is concerned with computing in materio: that is, unconventional computing performed by directly harnessing the physical properties of materials. It offers an overview of the field, covering four main areas: theory, practice, applications and implications. Each chapter synthesizes current understanding by bringing together researchers across a collection of related research projects. The book is useful for graduate students, researchers in the field, and the general scientific reader who is interested in inherently interdisciplinary research at the intersections of computer science, biology, chemistry, physics, engineering and mathematics. (Springer)

Thompson, Sarah, et al. The Fractal Geometry of Fitness Landscapes at the Local Optima Level. Natural Computing. December, 2020. University of Stirling, UK and Université du Littoral Côte d’Opale, Calais systems mathematicians show how an evolutionary dynamics meant to attain and select a fittest result can take on a self-similar topology.

A local optima network (LON) encodes local optima connectivity in the fitness landscape of a combinatorial optimisation problem. Recently, LONs have been studied for their fractal dimension which is a complexity index where a non-integer can be assigned to a pattern. We use visual analysis, correlation analysis, and machine learning to show that relationships exist and that fractal features of LONs can contribute to explaining and predicting algorithm performance. (Abstract excerpt)
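The fractal dimension invoked here, a complexity index that can be non-integer, may be illustrated with a classic box-counting computation. The sketch below uses the middle-thirds Cantor set as a generic stand-in, not the paper's local optima networks: counting the boxes of shrinking size needed to cover the set yields the non-integer dimension ln 2 / ln 3 ≈ 0.63.

```python
import math

def cantor_boxes(level):
    """Count the boxes of side 3**-level needed to cover the
    middle-thirds Cantor set (each surviving interval is one box)."""
    intervals = [(0.0, 1.0)]
    for _ in range(level):
        nxt = []
        for a, b in intervals:
            third = (b - a) / 3
            nxt.append((a, a + third))       # keep the left third
            nxt.append((b - third, b))       # keep the right third
        intervals = nxt
    return len(intervals)

def box_dimension(level):
    """Box-counting estimate: log N(eps) / log(1/eps)."""
    eps = 3.0 ** -level
    return math.log(cantor_boxes(level)) / math.log(1 / eps)

# box_dimension(8) ≈ ln 2 / ln 3 ≈ 0.6309, a non-integer complexity index
```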

A heuristic is any approach to problem solving or self-discovery that employs a practical method that is not guaranteed to be optimal, perfect, or rational, but is nevertheless sufficient for reaching an immediate, short-term goal or approximation.
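As a concrete instance of this definition, here is a minimal greedy change-making sketch (the coin system and amount are illustrative): the heuristic is fast and usually adequate, yet demonstrably not optimal.

```python
def greedy_change(coins, amount):
    """Greedy heuristic for making change: repeatedly take the
    largest coin that still fits. Not guaranteed to be optimal."""
    result = []
    for coin in sorted(coins, reverse=True):
        while amount >= coin:
            amount -= coin
            result.append(coin)
    return result

# with coins {25, 10, 1} and amount 30, greedy returns six coins
# (25 + 1 + 1 + 1 + 1 + 1) while the true optimum is three (10 + 10 + 10)
```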

Valiant, Leslie. Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World. New York: Basic Books, 2013. A Harvard University mathematician broaches an innovative view of life’s advance as due to interactions between creatures and their environment. This novel evolution is a learning process that proceeds by way of algorithmic computations, dubbed “ecorithms” because they try to reach a sufficient optimum. In so doing, Valiant joins current Bayesian, rationality (Okasha), Markov, and genetic approaches that view not only Darwinian evolution as a computational learning mechanism, but seem to imply that the entire universe is some optimization or maximization process.

Other LV works are a presentation in the Computation, Computational Efficiency, and Cognitive Science unit at the 2013 AAAS meeting entitled “Biological Evolution as a Form of Learning” (http://aaas.confex.com/aaas/2013/webprogram/Paper9186.html), and an earlier paper “Evolvability” in the Journal of the Association for Computing Machinery (56/1, 2009). Both the talk and paper Abstracts are below. We might add that the author notes the modern word algorithm is attributed to the Persian mathematician and scholar Al-Khwarizmi (780-850, see Wikipedia), who worked at the House of Wisdom in Baghdad as a founder of algebra. Once again here is the true heart of Islam that must be remembered and recovered anew today.

Algorithms are the step-by-step instructions used in computing for achieving desired results, much like recipes in cooking. In both cases the recipe designer has a certain controlled environment in mind for realizing the recipe, and foresees how the desired outcome will be achieved. The algorithms I discuss in this book are special. Unlike most algorithms, they can be run in environments unknown to the designer, and they learn by interacting with the environment how to act effectively in it. After sufficient interaction they will have expertise not provided by the designer but extracted from the environment. I call these algorithms ecorithms. The model of learning they follow, known as the probably approximately correct model, provides a quantitative framework in which designers can evaluate the expertise achieved and the cost of achieving it. These ecorithms are not merely a feature of computers. I argue in this book that such learning mechanisms impose and determine the character of life on Earth. The course of evolution is shaped entirely by organisms interacting with and adapting to their environments. This biological inheritance, as well as further learning from the environment after conception and birth, have a determining influence on the course of an individual’s life. The focus here will be the unified study of the mechanisms of evolution, learning, and intelligence using the methods of computer science. (Book Summary)
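Valiant's probably approximately correct framework can be glimpsed through the textbook elimination learner for monotone conjunctions, one of the standard PAC examples. This sketch enumerates every input so the demonstration is deterministic; the PAC model proper draws random examples and bounds the error and confidence of the result.

```python
from itertools import product

def learn_conjunction(examples, n):
    """Elimination learner: start with all n variables, then drop
    any variable that is 0 in some positive example."""
    hyp = set(range(n))
    for x, label in examples:
        if label:
            hyp = {i for i in hyp if x[i] == 1}
    return hyp

relevant = {2, 5, 7}                     # hidden target: x2 AND x5 AND x7
n = 10
examples = [(x, all(x[i] for i in relevant))
            for x in product((0, 1), repeat=n)]

# learn_conjunction(examples, n) == {2, 5, 7}: the target is recovered
```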

Living organisms function as protein circuits. We suggest that computational learning theory offers the framework for investigating the question of how such circuits can come into being adaptively from experience without a designer. We formulate Darwinian evolution as a form of learning from examples. The targets of the learning process are the functions of highest fitness. The examples are the experiences. The learning process is constrained so that the feedback from the experiences is Darwinian. We formulate a notion of evolvability that distinguishes function classes that are evolvable with polynomially bounded resources from those that are not. We suggest that the close technical connection this establishes between, on the one hand, learning by individuals, and on the other, biological evolution, has important ramifications for the fundamental nature of cognition. (Talk Abstract)

Living organisms function according to complex mechanisms that operate in different ways depending on conditions. Evolutionary theory suggests that such mechanisms evolved through random variation guided by selection. However, there has existed no theory that would explain quantitatively which mechanisms can so evolve in realistic population sizes within realistic time periods, and which are too complex. In this paper we suggest such a theory. Evolution is treated as a form of computational learning from examples in which the course of learning is influenced only by the fitness of the hypotheses on the examples, and not otherwise by the specific examples. We formulate a notion of evolvability that quantifies the evolvability of different classes of functions. It is shown that in any one phase of evolution where selection is for one beneficial behavior, monotone Boolean conjunctions and disjunctions are demonstrably evolvable over the uniform distribution, while Boolean parity functions are demonstrably not. The framework also allows a wider range of issues in evolution to be quantified. We suggest that the overall mechanism that underlies biological evolution is evolvable target pursuit, which consists of a series of evolutionary stages, each one pursuing an evolvable target in our technical sense, each target being rendered evolvable by the serendipitous combination of the environment and the outcome of previous evolutionary stages. (Evolvability Abstract)
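In the spirit of the abstract's evolvable monotone conjunctions, a toy sketch (target and sizes are illustrative, not from the paper): the "organism" never sees labeled examples, only its aggregate fitness, the exact agreement with a hidden target conjunction over the uniform distribution, and climbs by single-variable mutations.

```python
from itertools import product

def fitness(hyp, target, n):
    """Exact agreement between hypothesis and target conjunctions
    over the uniform distribution on {0,1}^n."""
    pts = list(product((0, 1), repeat=n))
    agree = sum(all(x[i] for i in hyp) == all(x[i] for i in target)
                for x in pts)
    return agree / len(pts)

def evolve(target, n):
    """Fitness-guided search: toggle one variable at a time, keep
    the mutant only when aggregate fitness strictly improves."""
    hyp = set()
    while True:
        best, best_f = None, fitness(hyp, target, n)
        for i in range(n):
            mutant = hyp ^ {i}            # single-variable mutation
            f = fitness(mutant, target, n)
            if f > best_f:
                best, best_f = mutant, f
        if best is None:
            return hyp                    # no mutation improves fitness
        hyp = best

# evolve({1, 4, 6}, 8) returns {1, 4, 6}: the conjunction is reached
# from fitness feedback alone, with no access to individual examples
```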

Vega-Rodriguez, Miguel and Jose Granado-Criado. Parallel Computing in Computational Biology. Journal of Computational Biology. 25/8, 2018. An introduction to a special issue about current technological methods by University of Extremadura, Spain computer scientists. For example see Large-Scale Simulations of Bacterial Populations over Complex Networks, Signature Gene Identification of Cancer Occurrence and Pattern Recognition and Human Splice-Site Prediction with Deep Neural Networks.

Computational biology allows and encourages the application of many different parallel computing approaches. This special issue brings together high-quality state-of-the-art contributions about parallel computing in computational biology, from different technological points of view or perspectives. The special issue collects considerably extended and improved versions of the best articles, accepted and presented at the 2017 International Workshop on Parallelism in Bioinformatics.

Walker, Sara Imari. Top-Down Causation and the Rise of Information in the Emergence of Life. Information. 5/3, 2014. Reviewed more in Information Computation, the Arizona State University astrophysicist explains the presence and role of algorithmic processes that appear to originate and drive a complexifying evolution. See also the author's 2015 paper The Descent of Math at arXiv:1505.00312.

Walker, Sara Imari and Paul Davies. The Algorithmic Origins of Life. Journal of the Royal Society Interface. 10/Art. 79, 2012. Reviewed more in Origin of Life, Sara Walker, NASA Astrobiology Institute, and astrophysicist Paul Davies, presently at Arizona State University, articulate how an episodic evolutionary emergence seems to be driven and tracked by way of a native informational source.

Watson, Richard A. Compositional Evolution. Cambridge: MIT Press, 2006. Reviewed more in Quickening Evolution, a biologist and computer scientist at the University of Southampton applies algorithmic and complexity theory to show that much more is going on than a gradual drift of random mutations. Rather, nature employs additional computations which generate a constant modularity engaged in symbiotic assemblies.

Wibral, Michael, et al. Bits from Biology for Computational Intelligence. arXiv:1412.0291. If one may translate and gloss this paper by Wibral, Goethe University, with Joseph Lizier, University of Sydney, and Viola Priesemann, MPI Dynamics and Self-Organization, a novel self-developing, quickening cosmos seems suffused by natural information processing via evolved genetic algorithms and multiplex networks. As Priesemann’s own research conveys (summary below, search Danielle Bassett also), human brains are an archetypal microcosm of this macrocosmic genesis. The paper appeared in Frontiers in Robotics and AI in 2015, and will be a chapter in From Matter to Life from Cambridge UP in 2017.

Computational intelligence is broadly defined as biologically-inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). We show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely, or redundantly or synergistically together with others. (Abstract excerpts)
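The "information a system carries about its environment" is typically quantified as mutual information. A minimal plug-in estimator sketch follows; the toy copy example is illustrative, not drawn from the paper.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in estimate of the mutual information I(X;Y), in bits,
    from paired samples xs and ys."""
    n = len(xs)
    px, py = Counter(xs), Counter(ys)
    pxy = Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        p_joint = c / n
        mi += p_joint * math.log2(p_joint * n * n / (px[x] * py[y]))
    return mi

env = [0, 1, 0, 1, 0, 1, 0, 1]
state = list(env)     # a system that perfectly tracks its environment
# mutual_information(env, state) == 1.0 bit; a constant state carries 0
```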

The human brain has amazing information processing capacities, which rely on the coordinated activity of 80 billion neurons, each of them interacting with thousands of other neurons. With the aim to understand how the collective neural dynamics ultimately gives rise to the brain’s information processing capacities, I work on a class of collective states, i.e. critical states, because these maximize information processing capacities in models. In this context I recently provided the first evidence that the human brain operates not in a critical state, but keeps a distance to criticality [search Priesemann], despite the computational advantages of criticality. This indicates that maximizing information processing is not the only goal function for the brain. Based on these results, I suggest that the brain only tunes itself closer to criticality when strictly necessary for information processing. Else the brain maintains a larger safety margin, thereby losing processing capacity, but avoiding instability. In the near future, I want to reveal how the brain should allocate its resources and tune its distance to criticality for a given task. In the far future, this will provide insight of how information processing emerges from the collective neural dynamics in nervous systems. (Priesemann website)
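The "distance to criticality" above can be pictured with the expected activity of a branching process, a standard model in this literature: a branching ratio m = 1 is the critical state, while m < 1 builds in the safety margin Priesemann describes. A minimal deterministic sketch, with illustrative numbers:

```python
def expected_activity(a0, m, steps):
    """Expected activity of a branching process: each active unit
    triggers m others on average per time step."""
    trace = [float(a0)]
    for _ in range(steps):
        trace.append(trace[-1] * m)
    return trace

# m = 1.0 (critical): activity is sustained indefinitely;
# m = 0.9 (subcritical safety margin): activity decays geometrically
# toward extinction, trading processing capacity for stability
```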

Wolfram, Stephen. Logic, Explainability and the Future of Understanding. Complex Systems. 28/1, 2019. The polymath prodigy (bio below) here provides a 40-page technical survey of the history, present state, and prospects of philosophical knowledge by way of its computational basis. In this journal edited by Hector Zenil, see also On Patterns and Dynamics of Rule 22 Cellular Automaton by Genaro Martinez, et al (28/2).

Stephen Wolfram is the creator of Mathematica, Wolfram|Alpha and the Wolfram Language; the author of A New Kind of Science; and the founder CEO of Wolfram Research. Born in London in 1959, he was educated at Eton, Oxford and received his PhD in theoretical physics from Caltech at age 20. Wolfram's early scientific work was mainly in high-energy physics, quantum field theory and cosmology. Over the course of four decades, he has been responsible for many discoveries, inventions and innovations in computer science and beyond. (www.stephenwolfram.com)
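The Rule 22 cellular automaton cited above is one of Wolfram's 256 elementary rules, each a lookup table on three-cell neighborhoods whose eight output bits spell the rule number. A minimal sketch with wrap-around boundaries:

```python
def step(cells, rule):
    """One synchronous update of an elementary cellular automaton.
    `rule` is the Wolfram rule number (0-255); boundaries wrap."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2)
                      | (cells[i] << 1)
                      | cells[(i + 1) % n])) & 1
            for i in range(n)]

# rule 22 sets a cell iff exactly one of the three neighborhood bits
# is set; from a single seed it grows a nested, self-similar pattern
row = [0, 0, 0, 1, 0, 0, 0]
row = step(row, 22)      # -> [0, 0, 1, 1, 1, 0, 0]
```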

Wolpert, David, et al, eds. The Energetics of Computing in Life and Machines. Santa Fe: Santa Fe Institute Press, 2018. These highly technical proceedings from SFI seminars consider more efficient computational methods by way of a better, deeper integration with vital principles and procedures. For example see Overview of Information Theory and Stochastic Thermodynamics of Computation by Wolpert (search), Information Processing in Chemical Systems by Peter Stadler, et al, and Automatically Reducing Energy Consumption of Software by Stephanie Forrest, et al.

Why do computers use so much energy? What are the fundamental physical laws governing the relationship between the precise computation run by a system, whether artificial or natural, and how much energy that computation requires? Can we learn how to improve efficiency in computing by examining how biological computers manage to be so efficient? The time is ripe for a new synthesis of systems physics, computer science, and biochemistry. This volume integrates pure and applied concepts from these diverse fields, with the goal of cultivating a modern, nonequilibrium thermodynamics of computation.
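The fundamental physical law in question begins with Landauer's principle: erasing one bit of information must dissipate at least k_B·T·ln 2 of heat. A one-function sketch of that floor:

```python
import math

BOLTZMANN = 1.380649e-23   # J/K, exact in the 2019 SI definition

def landauer_limit_joules(temperature_kelvin):
    """Minimum energy required to erase one bit of information."""
    return BOLTZMANN * temperature_kelvin * math.log(2)

# at room temperature (300 K) the floor is about 2.87e-21 J per bit,
# many orders of magnitude below what conventional hardware dissipates
```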

Woods, Damien, et al. Diverse and Robust Molecular Algorithms Using Reprogrammable DNA Self-Assembly. Nature. 567/366, 2019. Seven Caltech and Harvard bioinformatic researchers including Erik Winfree and David Doty (search each) advance understandings of how nature’s helical nucleotides can be availed for many more chemical, structural, and data storage uses beyond replication. Who then are we cosmic curators to learn all about and intentionally take up life’s organic procreativity?

Molecular biology provides a proof-of-principle that chemical systems can store and process information to direct molecular activities such as the fabrication of complex structures from molecular components. Mathematical tiling and statistical–mechanical models of molecular crystallization have shown that algorithmic behaviour can be embedded within molecular self-assembly processes by DNA nanotechnology. Here we report the design and validation of a DNA tile set that contains 355 single-stranded tiles and can be reprogrammed to implement a wide variety of 6-bit algorithms. These findings suggest that molecular self-assembly could be a reliable algorithmic component within programmable chemical systems. (Abstract excerpt)
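A feel for algorithmic behaviour embedded in self-assembly (though not the paper's 355-tile, 6-bit system) comes from the classic XOR tile set of DNA nanotechnology, where each new tile's bit is the XOR of its two upper neighbours and a single seed grows a Sierpinski pattern:

```python
def xor_assembly(width, rows):
    """Classic XOR-tile self-assembly sketch: row by row, each tile's
    bit is the XOR of the tile above-left and the tile above."""
    row = [0] * width
    row[0] = 1                        # seed tile at the left edge
    pattern = [list(row)]
    for _ in range(rows - 1):
        row = [row[0]] + [row[i - 1] ^ row[i] for i in range(1, width)]
        pattern.append(row)
    return pattern

# the rows reproduce Pascal's triangle mod 2, i.e. a Sierpinski triangle
```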
