IV. Ecosmomics: An Independent, UniVersal, Source Code-Script of Generative Complex Network Systems
2. Biteracy: Natural Algorithmic Computation
Stepney, Susan and Andrew Adamatzky, eds. Inspired by Nature. International: Springer, 2018. A 60th birthday tribute to Julian Miller, the University of York, UK computing pioneer known for works such as Cartesian Genetic Programming (Springer, 2011, search). Seven years on, a further sense of universal computation appears in chapters such as Evolution in Nanomaterio by Hajo Broersma, Code Evolution with Genetic Programming by Wolfgang Banzhaf, and Chemical Computing through Simulated Evolution by Larry Bull, et al. This concurrent edition complements the similar content of Computational Matter (herein), also edited by S. Stepney. A tacit theme is a Big Code in pervasive effect beyond and before Big Data. As the sample quote conveys, within this scenario it seems that natural cosmic genesis may intend to pass off its procreative agency to our human intention for all futures, if we might altogether appreciate this.
Evolution has done a great job for many living organisms, and it has done so through a conceptually rather easy but smart, albeit very slow bottom up process of natural Darwinian evolution. There seems to be no design involved in this process: the computational instructions that underlie living creatures have not been designed but rather have (been) evolved. Two questions come to mind: Can we use something similar as evolution on ‘dead’ matter in order to get something useful? Can we do it quickly, via a rapidly converging evolutionary process? (H. Broersma, University of Twente, 95)
Stepney, Susan, et al, eds. Computational Matter. International: Springer, 2018. With coeditors Steen Rasmussen and Martyn Amos, the volume is a broad survey of endeavors under the umbrella name Unconventional Computing (UCOMP), drawn originally from Stanislaw Ulam, which considers every manifest phase from quantum to social realms to arise in an exemplary way from a mathematical source code. Some entries are Cellular Computing and Synthetic Biology (M. Amos, et al), Decoding Genetic Information (G. Franco and V. Manca), BIOMICS: A Theory of Interaction Computing (P. Dini, et al), and Nanoscale Molecular Automata (R. Rinaldi, et al). An inference is a natural “digitalization” process as a double reality: a software-like informative program and an overt material, creaturely result. This would seem to readily imply “genotype and phenotype;” see also The Principles of Informational Genomics by V. Manca (search) in Theoretical Computer Science (701/190, 2017).
This book is concerned with computing in materio: that is, unconventional computing performed by directly harnessing the physical properties of materials. It offers an overview of the field, covering four main areas: theory, practice, applications and implications. Each chapter synthesizes current understanding by bringing together researchers across a collection of related research projects. The book is useful for graduate students, researchers in the field, and the general scientific reader who is interested in inherently interdisciplinary research at the intersections of computer science, biology, chemistry, physics, engineering and mathematics. (Springer)
Thompson, Sarah, et al. The Fractal Geometry of Fitness Landscapes at the Local Optima Level. Natural Computing. December, 2020. University of Stirling, UK and Université du Littoral Côte d’Opale, Calais systems mathematicians show how an evolutionary dynamics, meant to attain and select a fittest result, can take on a self-similar topology.
A local optima network (LON) encodes local optima connectivity in the fitness landscape of a combinatorial optimisation problem. Recently, LONs have been studied for their fractal dimension which is a complexity index where a non-integer can be assigned to a pattern. We use visual analysis, correlation analysis, and machine learning to show that relationships exist and that fractal features of LONs can contribute to explaining and predicting algorithm performance. (Abstract excerpt)
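The fractal dimension the abstract refers to can be illustrated with a standard box-counting estimate. The sketch below is only an assumption for illustration, not the authors' LON pipeline: it generates a synthetic self-similar point set (the Sierpinski triangle, via the chaos game) and recovers a non-integer dimension near the true value of about 1.585.

```python
import numpy as np

def box_counting_dimension(points, scales):
    """Estimate the fractal dimension of a 2-D point set by box counting."""
    counts = []
    for s in scales:
        # Map each point to its grid box at scale s and count occupied boxes.
        boxes = set(map(tuple, np.floor(points / s).astype(int)))
        counts.append(len(boxes))
    # The slope of log(count) vs log(1/scale) approximates the dimension.
    slope, _ = np.polyfit(np.log(1.0 / np.array(scales)), np.log(counts), 1)
    return slope

def sierpinski_points(n=20000, seed=0):
    """Generate points on the Sierpinski triangle via the chaos game."""
    rng = np.random.default_rng(seed)
    vertices = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
    p = np.array([0.25, 0.25])
    pts = np.empty((n, 2))
    for i in range(n):
        p = (p + vertices[rng.integers(3)]) / 2.0  # jump halfway to a random vertex
        pts[i] = p
    return pts

pts = sierpinski_points()
dim = box_counting_dimension(pts, scales=[0.02, 0.04, 0.08, 0.16])
```

The same plug-in estimator, applied to the coordinates of local optima embeddings, is one simple way such a complexity index can be assigned to a network pattern.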
Valiant, Leslie. Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World. New York: Basic Books, 2013.
A Harvard University mathematician broaches an innovative view of life’s advance as due to interactions between creatures and their environment. This novel evolution is a learning process that proceeds by way of algorithmic computations, here dubbed “ecorithms” as they try to reach a sufficient optimum. In so doing, Valiant joins current perceptions from Bayesian, rationality (Okasha), Markov, and genetic approaches that view not only Darwinian evolution as a computational learning mechanism, but seem to imply the entire universe is some optimization or maximization process.
Algorithms are the step-by-step instructions used in computing for achieving desired results, much like recipes in cooking. In both cases the recipe designer has a certain controlled environment in mind for realizing the recipe, and foresees how the desired outcome will be achieved. The algorithms I discuss in this book are special. Unlike most algorithms, they can be run in environments unknown to the designer, and they learn by interacting with the environment how to act effectively in it. After sufficient interaction they will have expertise not provided by the designer but extracted from the environment. I call these algorithms ecorithms. The model of learning they follow, known as the probably approximately correct model, provides a quantitative framework in which designers can evaluate the expertise achieved and the cost of achieving it. These ecorithms are not merely a feature of computers. I argue in this book that such learning mechanisms impose and determine the character of life on Earth. The course of evolution is shaped entirely by organisms interacting with and adapting to their environments. This biological inheritance, as well as further learning from the environment after conception and birth, have a determining influence on the course of an individual’s life. The focus here will be the unified study of the mechanisms of evolution, learning, and intelligence using the methods of computer science. (Book Summary)
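Valiant's "probably approximately correct" framework can be made concrete with the textbook example of learning an unknown interval on [0,1] from random examples. The sketch below is a minimal illustration under that assumption, not code from the book: the learner never sees the target, yet after enough interactions its tightest-fit hypothesis is, with high probability, approximately correct.

```python
import random

def learn_interval(examples):
    """Tightest-fit hypothesis: smallest interval covering all positive examples."""
    positives = [x for x, label in examples if label]
    if not positives:
        return (0.0, 0.0)  # degenerate empty hypothesis
    return (min(positives), max(positives))

def true_error(hyp, target, trials=100000, seed=0):
    """Estimate disagreement between hypothesis and target under the uniform distribution."""
    rng = random.Random(seed)
    lo, hi = hyp
    a, b = target
    wrong = sum((lo <= x <= hi) != (a <= x <= b)
                for x in (rng.random() for _ in range(trials)))
    return wrong / trials

rng = random.Random(1)
target = (0.3, 0.7)           # unknown to the learner
m = 400                       # roughly (2/eps) * ln(2/delta) for eps = delta = 0.05
sample = [(x, target[0] <= x <= target[1]) for x in (rng.random() for _ in range(m))]
hyp = learn_interval(sample)
err = true_error(hyp, target)
```

The sample size m is chosen from the PAC bound so that, with probability at least 1 - delta, the hypothesis errs on at most an epsilon fraction of future draws; the expertise is extracted from the environment, not supplied by the designer.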
Vega-Rodriguez, Miguel and Jose Granado-Criado. Parallel Computing in Computational Biology. Journal of Computational Biology. 25/8, 2018. An introduction to a special issue about current technological methods by University of Extremadura, Spain computer scientists. For example, see Large-Scale Simulations of Bacterial Populations over Complex Networks, Signature Gene Identification of Cancer Occurrence and Pattern Recognition, and Human Splice-Site Prediction with Deep Neural Networks.
Computational biology allows and encourages the application of many different parallel computing approaches. This special issue brings together high-quality state-of-the-art contributions about parallel computing in computational biology, from different technological perspectives. The special issue collects considerably extended and improved versions of the best articles accepted and presented at the 2017 International Workshop on Parallelism in Bioinformatics.
Walker, Sara Imari. Top-Down Causation and the Rise of Information in the Emergence of Life. Information. 5/3, 2014. Reviewed more in Information Computation, the Arizona State University astrophysicist explains the presence and role of algorithmic processes that appear to originate and drive a complexifying evolution. See also the author's 2015 paper The Descent of Math at arXiv:1505.00312.
Walker, Sara Imari and Paul Davies. The Algorithmic Origins of Life. Journal of the Royal Society Interface. 10/Art. 79, 2012. Reviewed more in Origin of Life, Sara Walker, NASA Astrobiology Institute, and astrophysicist Paul Davies, presently at Arizona State University, articulate how an episodic evolutionary emergence seems to be driven and tracked by way of a native informational source.
Watson, Richard A. Compositional Evolution. Cambridge: MIT Press, 2006. Reviewed more in Quickening Evolution, a biologist and computer scientist at the University of Southampton applies algorithmic and complexity theory to show that much more is going on than a gradual drift of random mutations. Rather, nature employs additional computations which generate a constant modularity engaged in symbiotic assemblies.
Wibral, Michael, et al. Bits from Biology for Computational Intelligence. arXiv:1412.0291. If one may translate and gloss this paper by Wibral, Goethe University, with Joseph Lizier, University of Sydney, and Viola Priesemann, MPI Dynamics and Self-Organization, a novel self-developing, quickening cosmos seems suffused by natural information processing via evolved genetic algorithms and multiplex networks. As Priesemann’s own research conveys (summary below; search Danielle Bassett also), human brains are an archetypal microcosm of this macrocosmic genesis. The paper appeared in Frontiers in Robotics and AI in 2015, and will be a chapter in From Matter to Life from Cambridge UP in 2017.
Computational intelligence is broadly defined as biologically-inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). We show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely, or redundantly or synergistically together with others. (Abstract excerpts)
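The information-theoretic program the abstract describes rests on quantities such as the mutual information between a system and its environment. A plug-in estimate for discrete samples can be sketched as follows; the noisy "neuron" that copies a binary environment is a hypothetical toy for illustration, not an example from the paper.

```python
import random
from collections import Counter
from math import log2

def mutual_information(pairs):
    """Empirical I(X;Y) in bits from a list of (x, y) samples (plug-in estimate)."""
    n = len(pairs)
    joint = Counter(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    return sum((c / n) * log2((c / n) / ((px[x] / n) * (py[y] / n)))
               for (x, y), c in joint.items())

# Toy system: a binary unit that copies its environment input with 10% noise,
# i.e. a binary symmetric channel whose capacity is 1 - H(0.1) ~ 0.53 bits.
rng = random.Random(0)
env = [rng.randint(0, 1) for _ in range(20000)]
unit = [e if rng.random() > 0.1 else 1 - e for e in env]
mi = mutual_information(list(zip(env, unit)))
```

Such estimates of how much information a unit carries about its environment are the building blocks for the unique, redundant, and synergistic decompositions the authors discuss.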
Wolfram, Stephen. Logic, Explainability and the Future of Understanding. Complex Systems. 28/1, 2019. The polymath prodigy (bio below) here provides a 40-page technical survey of the history, present state, and future prospects of philosophical knowledge by way of its computational basis. In this journal edited by Hector Zenil, see also On Patterns and Dynamics of Rule 22 Cellular Automaton by Genaro Martinez, et al (28/2).
Stephen Wolfram is the creator of Mathematica, Wolfram|Alpha and the Wolfram Language; the author of A New Kind of Science; and the founder CEO of Wolfram Research. Born in London in 1959, he was educated at Eton, Oxford and received his PhD in theoretical physics from Caltech at age 20. Wolfram's early scientific work was mainly in high-energy physics, quantum field theory and cosmology. Over the course of four decades, he has been responsible for many discoveries, inventions and innovations in computer science and beyond. (www.stephenwolfram.com)
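The Rule 22 cellular automaton cited above is easy to reproduce. A minimal elementary-CA simulator can be sketched as below (the periodic boundary and the single-seed start are implementation choices for illustration); rule 22 turns a cell on exactly when one of the three cells in its neighborhood was on, yielding a Sierpinski-like pattern.

```python
def eca_step(cells, rule):
    """One synchronous update of an elementary cellular automaton (periodic boundary)."""
    n = len(cells)
    # Each neighborhood (left, center, right) indexes one bit of the 8-bit rule number.
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
            for i in range(n)]

def run_eca(rule, width=31, steps=15):
    """Run an elementary CA from a single seed cell, returning all rows."""
    cells = [0] * width
    cells[width // 2] = 1
    history = [cells]
    for _ in range(steps):
        cells = eca_step(cells, rule)
        history.append(cells)
    return history

rows = run_eca(22)
```

Printing each row with "#" for live cells displays the characteristic nested triangles that make such simple programs a touchstone for Wolfram's computational view of knowledge.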
Wolpert, David, et al, eds. The Energetics of Computing in Life and Machines. Santa Fe: Santa Fe Institute Press, 2018. These highly technical proceedings from SFI seminars consider more efficient computational methods by way of a better, deeper integration with vital principles and procedures. For example, see Overview of Information Theory and Stochastic Thermodynamics of Computation by Wolpert (search), Information Processing in Chemical Systems by Peter Stadler, et al, and Automatically Reducing Energy Consumption of Software by Stephanie Forrest, et al.
Why do computers use so much energy? What are the fundamental physical laws governing the relationship between the precise computation run by a system, whether artificial or natural, and how much energy that computation requires? Can we learn how to improve efficiency in computing by examining how biological computers manage to be so efficient? The time is ripe for a new synthesis of systems physics, computer science, and biochemistry. This volume integrates pure and applied concepts from these diverse fields, with the goal of cultivating a modern, nonequilibrium thermodynamics of computation.
Woods, Damian, et al. Diverse and Robust Molecular Algorithms Using Reprogrammable DNA Self-Assembly. Nature. 567/366, 2019. Seven Caltech and Harvard bioinformatic researchers including Erik Winfree and David Doty (search each) advance understandings of how nature’s helical nucleotides can be availed for many more chemical, structural, and data storage uses beyond replication. Who then are we cosmic curators to learn all about and intentionally take up life’s organic procreativity?
Molecular biology provides a proof-of-principle that chemical systems can store and process information to direct molecular activities such as the fabrication of complex structures from molecular components. Mathematical tiling and statistical–mechanical models of molecular crystallization have shown that algorithmic behaviour can be embedded within molecular self-assembly processes by DNA nanotechnology. Here we report the design and validation of a DNA tile set that contains 355 single-stranded tiles and can be reprogrammed to implement a wide variety of 6-bit algorithms. These findings suggest that molecular self-assembly could be a reliable algorithmic component within programmable chemical systems. (Abstract excerpt)
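The reprogrammable 6-bit behaviour can be caricatured in software: in algorithmic self-assembly, each new row of tiles computes the next state from the glues exposed by the row below. The sketch below is an invented stand-in, not the paper's tile set; it iterates a hypothetical 6-bit layer rule (binary increment) row by row, the way a growing DNA crystal carries a computation forward.

```python
def increment_layer(bits):
    """Hypothetical 6-bit layer rule: binary increment modulo 64, one 'tile' per bit.

    bits is least-significant-bit first; each position's output depends only on
    local state plus a carry, mimicking local glue-matching between tiles.
    """
    carry = 1
    out = []
    for b in bits:
        out.append(b ^ carry)
        carry = b & carry
    return out

def assemble(rule, seed, layers):
    """Grow the 'crystal' row by row from a seed row of bits."""
    rows = [seed]
    for _ in range(layers):
        rows.append(rule(rows[-1]))
    return rows

rows = assemble(increment_layer, [0] * 6, 10)
values = [sum(b << i for i, b in enumerate(r)) for r in rows]
```

Swapping in a different layer rule corresponds to reprogramming the tile set: the same assembly scaffold then executes a different 6-bit algorithm.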