Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: Independent Complex Network Systems, Computational Programs, Genetic Ecode Scripts

2. Biteracy: Natural Algorithmic Computation

Rozenberg, Grzegorz, ed. Handbook of Natural Computing. Berlin: Springer, 2014. The editor-in-chief is a Polish computer scientist who has inspired this theoretical movement for decades. An Introduction he wrote is available on the Springer site, quoted below. The main topical areas covered are Cellular Automata, Neural, Evolutionary, Molecular, Quantum Computation, and Broader Perspectives.

Natural Computing is the field of research that investigates human-designed computing inspired by nature as well as computing taking place in nature, that is, it investigates models and computational techniques inspired by nature, and also it investigates, in terms of information processing, phenomena taking place in nature. Examples of the first strand include neural computation inspired by the function of the brain, evolutionary computation inspired by Darwinian evolution, cellular automata inspired by intercellular communications, swarm intelligence inspired by the behavior of groups of organisms, artificial immune systems, membrane computing, and amorphous computing inspired by morphogenesis. The second strand is represented by investigations into, among others, the computational nature of self-assembly, the computational nature of developmental processes, the computational nature of biochemical reactions, bacterial communication, brain processes, and the systems biology approach to bionetworks where cellular processes are treated in terms of communication and interaction.
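Among the first-strand paradigms listed above, evolutionary computation is perhaps the easiest to make concrete. The following minimal (1+1) evolutionary algorithm on the standard OneMax problem is a generic textbook sketch, not an example from the Handbook itself; all names and parameter choices here are illustrative.

```python
import random

def one_plus_one_ea(n_bits=32, max_iters=2000, seed=1):
    """Minimal (1+1) evolutionary algorithm maximizing OneMax
    (the number of 1-bits): mutate a single parent and keep the
    child only if it is at least as fit."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n_bits)]
    fitness = sum(parent)
    for _ in range(max_iters):
        # Mutation: flip each bit independently with probability 1/n.
        child = [b ^ (rng.random() < 1.0 / n_bits) for b in parent]
        if sum(child) >= fitness:   # Selection: keep the better genome
            parent, fitness = child, sum(child)
    return fitness

print(one_plus_one_ea())  # typically reaches or nears the optimum of 32
```

Darwinian variation and selection in two lines of logic; the other paradigms (cellular automata, swarm intelligence, membrane computing) elaborate the same inspiration-from-nature pattern.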

Salcedo-Sanz, Sancho. Modern Meta-Heuristics Based on Nonlinear Physics Processes. Physics Reports. Vol. 655, 2016. A technical contribution by a Universidad de Alcala, Madrid, computer scientist which gives the project of distilling natural search and optimization algorithms a deeper physical basis. While previous efforts draw from biology and evolution, see Xin-She Yang herein, this study shows how fundamental self-organized complexities can be an equally valuable source. For example, Boltzmann distribution, fractal structure generation, vortex search, coral reefs (abstract below), simulated annealing, Big Bang–Big Crunch (see abstract), galactic swarm, electromagnetic algorithms, and more are described. These theories imply a Darwinian-like cosmos, a metaheuristic universe, which spawns prolific varieties so that good-enough candidates can be selected. One then wonders: does it extend to myriad orbital planets such as our own, which need to choose and affirm by their own volition?

Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail, or cannot even be applied. Many existing meta-heuristics approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on nonlinear physics processes modeling have been proposed and applied with success. Non-linear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from specific modeling of real phenomena, and also their novelty in terms of comparison with alternative existing algorithms for optimization. (Abstract excerpt)

In computer science and mathematical optimization, a metaheuristic is a higher-level procedure or heuristic designed to find, generate, or select a heuristic (partial search algorithm) that may provide a sufficiently good solution to an optimization problem, especially with incomplete or imperfect information or limited computation capacity. Metaheuristics sample a set of solutions which is too large to be completely sampled. (Wikipedia)
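The metaheuristic idea can be made concrete with simulated annealing, one of the physics-inspired methods named above. This is a generic textbook sketch; the objective function, neighbor step, and cooling rate are illustrative choices, not taken from the reviewed paper.

```python
import math
import random

def simulated_annealing(f, x0, t0=1.0, cooling=0.995, steps=5000, seed=0):
    """Minimal simulated annealing sketch (minimizing f): propose a
    random neighbor and accept worse moves with probability
    exp(-delta/T), while the temperature T cools geometrically."""
    rng = random.Random(seed)
    x, fx, t = x0, f(x0), t0
    best_x, best_f = x, fx
    for _ in range(steps):
        x_new = x + rng.gauss(0.0, 1.0)   # random neighbor
        f_new = f(x_new)
        delta = f_new - fx
        # Boltzmann acceptance: always take improvements, sometimes worse moves
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            x, fx = x_new, f_new
            if fx < best_f:
                best_x, best_f = x, fx
        t *= cooling                      # geometric cooling schedule
    return best_x, best_f

x, fx = simulated_annealing(lambda v: (v - 3.0) ** 2, x0=-10.0)
```

The "incomplete information, limited computation" character of a metaheuristic is visible here: nothing guarantees the global optimum, only a sampled, temperature-mediated search toward it.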

The Coral Reefs Optimization algorithm is a novel global-search method based on corals' biology and coral reefs formation. The original idea is by Sancho Salcedo-Sanz, who realized that corals and reef formation may be artificially simulated to solve optimization problems. The algorithm is based on a reef of solutions to a given optimization problem (corals), that reproduce in a similar way as corals do in Nature (Broadcast spawning, brooding and budding operators are implemented). In addition, the larvae (new individuals) fight for space in the reef against other existing corals, and even weak solutions have the possibility of surviving, if there is enough room in the reef. The final algorithm has similarities to evolutionary algorithms and simulated annealing, and it has been shown to improve both approaches in different problems. (Author’s website)
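The settlement contest just described can be sketched in a few lines. This is a simplified illustration of the coral-reef idea only: it implements broadcast spawning, brooding, and larval settlement on a partially free reef, and omits the budding and depredation operators of the published algorithm; all parameter values are illustrative.

```python
import random

def coral_reefs_opt(f, dim=2, reef_size=20, occupancy=0.6,
                    generations=200, attempts=3, seed=0):
    """Simplified Coral Reefs Optimization sketch (minimizing f):
    a reef with free space holds candidate solutions (corals); larvae
    from crossover ("broadcast spawning") and mutation ("brooding")
    try to settle, displacing weaker occupants."""
    rng = random.Random(seed)
    reef = [None] * reef_size
    for i in range(int(reef_size * occupancy)):   # partially filled reef
        reef[i] = [rng.uniform(-5, 5) for _ in range(dim)]
    for _ in range(generations):
        corals = [c for c in reef if c is not None]
        larvae = []
        for _ in range(len(corals) // 2):         # broadcast spawning
            a, b = rng.sample(corals, 2)
            larvae.append([ai if rng.random() < 0.5 else bi
                           for ai, bi in zip(a, b)])
        for c in corals[: max(1, len(corals) // 5)]:   # brooding (mutation)
            larvae.append([v + rng.gauss(0, 0.3) for v in c])
        for larva in larvae:                      # settlement attempts
            for _ in range(attempts):
                spot = rng.randrange(reef_size)
                if reef[spot] is None or f(larva) < f(reef[spot]):
                    reef[spot] = larva
                    break
    return min((c for c in reef if c is not None), key=f)

best = coral_reefs_opt(lambda v: sum(x * x for x in v))
```

Note how weak larvae can still survive by landing on an empty spot, the feature the author highlights as distinguishing this method from plain evolutionary selection.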

Nature is the principal source for proposing new optimization methods such as genetic algorithms (GA) and simulated annealing (SA) methods. All traditional evolutionary algorithms are heuristic population-based search procedures that incorporate random variation and selection. The main contribution of this study is that it proposes a novel optimization method that relies on one of the theories of the evolution of the universe; namely, the Big Bang and Big Crunch Theory. In the Big Bang phase, energy dissipation produces disorder and randomness is the main feature of this phase; whereas, in the Big Crunch phase, randomly distributed particles are drawn into an order. Inspired by this theory, an optimization algorithm is constructed, which will be called the Big Bang–Big Crunch (BB–BC) method that generates random points in the Big Bang phase and shrinks those points to a single representative point via a center of mass or minimal cost approach in the Big Crunch phase. (Abstract - A New Optimization Method in Advances in Engineering Software, 37/2, 2007)
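The two phases described in this abstract translate directly into code. The sketch below scatters random points (Big Bang) and contracts them to a fitness-weighted center of mass (Big Crunch); the shrinking radius and the 1/f weighting are illustrative choices, not the original paper's exact formulas.

```python
import random

def big_bang_big_crunch(f, dim=2, n_points=50, iterations=100, seed=0):
    """Big Bang-Big Crunch sketch (minimizing f): each Big Bang
    scatters random candidates around the current center, each Big
    Crunch contracts them to a fitness-weighted center of mass,
    and the scatter radius shrinks with the iteration count."""
    rng = random.Random(seed)
    center = [rng.uniform(-10, 10) for _ in range(dim)]
    for k in range(1, iterations + 1):
        radius = 10.0 / k   # disorder shrinks as the iterations proceed
        # Big Bang phase: random points around the representative center
        points = [[c + rng.uniform(-radius, radius) for c in center]
                  for _ in range(n_points)]
        # Big Crunch phase: weighted center of mass, weights ~ 1/f
        weights = [1.0 / (1e-12 + f(p)) for p in points]
        total = sum(weights)
        center = [sum(w * p[i] for w, p in zip(weights, points)) / total
                  for i in range(dim)]
    return center

center = big_bang_big_crunch(lambda v: sum(x * x for x in v))
```

Randomness dominates the Bang phase and order is recovered in the Crunch phase, exactly the alternation the abstract describes.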

Sarma, Gopal. Reductionism and the Universal Calculus. arXiv:1607.06725. A computer scientist and philosopher now in medical school (bio below) suggests that Leibniz’s prescient quest to seek and decipher a natural programmatic source was largely set aside by a later scientific emphasis upon a reductive method, which thereby missed this quality.

In the seminal essay, "On the unreasonable effectiveness of mathematics in the natural sciences," physicist Eugene Wigner poses a fundamental philosophical question concerning the relationship between a physical system and our capacity to model its behavior with the symbolic language of mathematics. In this essay, I examine an ambitious 16th- and 17th-century intellectual agenda from the perspective of Wigner's question, namely, what historian Paolo Rossi (Logic and the Art of Memory 2002) calls "the quest to create a universal language." While many elite thinkers pursued related ideas, the most inspiring and forceful was Gottfried Leibniz's effort to create a "universal calculus," a pictorial language which would transparently represent the entirety of human knowledge, as well as an associated symbolic calculus with which to model the behavior of physical systems and derive new truths. I suggest that a deeper understanding of why the efforts of Leibniz and others failed could shed light on Wigner's original question. (Abstract)

My background is in mathematics, physics, data science, and software engineering. After completing my PhD at Stanford and prior to joining the Emory University School of Medicine, I worked for several years for Wolfram Research. I am currently pursuing a medical education with the conviction that the degree of specialization in modern science has created an urgent need for broadly educated scientists with training in multiple subjects at the graduate level. Ultimately, my aim is to shape the ongoing evolution of scientific and intellectual culture in order to responsibly guide ethical and humane use of increasingly sophisticated science and technology. In addition, I am broadly interested in the history and philosophy of science, the origins of scientific thought, and the organizational and cultural issues confronting the biomedical sciences as a result of widespread automation and computational methods. (GS website)

Scalise, Dominic and Rebecca Schulman. Emulating Cellular Automata in Chemical Reaction-Diffusion Networks. Natural Computing. 15/2, 2016. In an issue based on the DNA Computing 2014 conference (Google), Johns Hopkins computational theorists contribute to frontier realizations that, along with genetic phenomena, chemical interactions can similarly be explicated by means of such complex, dynamic Turing-like programs, from which orderly structures result. Search also Rondelez above, and Doty in Systems Chemistry, for more. These advances seem to evince a material cosmic genesis that is brought into complex, vivifying emergence by virtue of an intrinsic, genome-like program.

Chemical reactions and diffusion can produce a wide variety of static or transient spatial patterns in the concentrations of chemical species. Here we show that given simple, periodic inputs, chemical reactions and diffusion can reliably emulate the dynamics of a deterministic cellular automaton, and can therefore be programmed to produce a wide range of complex, discrete dynamics. We describe a modular reaction–diffusion program that orchestrates each of the fundamental operations of a cellular automaton: storage of cell state, communication between neighboring cells, and calculation of cells’ subsequent states. Reaction–diffusion based cellular automata could potentially be built in vitro using networks of DNA molecules that interact via branch migration processes and could in principle perform universal computation, storing their state as a pattern of molecular concentrations, or deliver spatiotemporal instructions encoded in concentrations to direct the behavior of intelligent materials. (Abstract excerpts)
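The three cellular-automaton operations named in the abstract (storage of cell state, communication between neighbors, calculation of subsequent states) can be shown with an elementary one-dimensional automaton; here the Turing-universal Rule 110 is used as an example. This is the discrete model being emulated, not the chemical reaction-diffusion implementation itself.

```python
def step(cells, rule=110):
    """One synchronous update of an elementary cellular automaton:
    each cell stores one bit of state, reads its two neighbors
    (periodic boundary), and looks up its next state in the rule
    table encoded by the rule number's bits."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]

# A single 1 in a field of 0s; Rule 110 can perform universal computation.
row = [0] * 16 + [1] + [0] * 15
for _ in range(8):
    print("".join(".#"[c] for c in row))
    row = step(row)
```

Each update is purely local, which is why the authors can map it onto diffusion-limited chemical communication between neighboring patches of molecular concentration.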

Siddique, Nazmul and Hojjat Adeli. Nature Inspired Computing. Cognitive Computation. 7/6, 2015. Akin to Xin-She Yang herein, Ulster University, Londonderry and Ohio State University informatics researchers gather and introduce an array of physics-based and biology-based algorithms that have arisen in the 21st century. In addition, the associated fields of neural networks, evolutionary computing and fuzzy logic are reviewed.

Siddique, Nazmul and Hojjat Adeli. Physics-Based Search and Optimization: Inspirations from Nature. Expert Systems. 33/6, 2016. In this Wiley journal, Ulster University, Londonderry and Ohio State University computer scientists gather for the first time a broad survey of dynamic iterative phenomena which occur even across cosmic and material realms, akin to the many biological and behavioral examples cited herein (Yang). The novel insight is that nature’s universal (Darwinian) evolutionary process, whereby a relative optimum result or condition is winnowed and selected from myriad candidates, can also be seen in universal effect. See, for example, Black Hole: A New Heuristic Optimization for Data Clustering by Abdolreza Hatamlou in Information Sciences (222/175, 2013).

This paper presents a review of recently developed physics‐based search and optimization algorithms that have been inspired by natural phenomena. They include Big Bang–Big Crunch, black hole search, galaxy‐based search, artificial physics optimization, electromagnetism optimization, charged system search, colliding bodies optimization, and particle collision algorithm. (Abstract)

Sidl, Leonhard, et al. Computational Complexity, Algorithmic Scope, and Evolution. Journal of Physics: Complexity. 6/1, 2025. University of Vienna and University of Leipzig bioinformatic researchers, including Peter Stadler, consider better ways by which life’s metabolic processes can be perceived and understood in terms of programs and operating systems. See also Computation in Chemical Graph Rewriting Networks by Christoph Flamm, et al in the same issue.

Biological systems are widely regarded as performing computations. Here we explore the idea that evolution confines biological computation to subsets of instances that can be solved with algorithms that are 'hardcoded' in the system. We use RNA secondary structure prediction as a developmental program to show that the salient features of the genotype–phenotype map remain intact even if 'simpler' algorithms are employed that correctly compute the structures for small subsets of instances. (Abstract)
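For orientation, the RNA secondary structure prediction mentioned in the abstract can be illustrated by the classic Nussinov dynamic program, which maximizes nested base pairs. This is a standard textbook stand-in for such folding computations, not the specific algorithms analyzed in the paper.

```python
def nussinov_pairs(seq, min_loop=3):
    """Nussinov dynamic program: maximum number of nested
    Watson-Crick/wobble base pairs in an RNA sequence, with a
    minimum hairpin-loop length between paired bases."""
    pair = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"),
            ("G", "U"), ("U", "G")}
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):          # grow interval lengths
        for i in range(n - span):
            j = i + span
            best = dp[i + 1][j]                  # case: base i unpaired
            for k in range(i + min_loop + 1, j + 1):   # case: i pairs with k
                if (seq[i], seq[k]) in pair:
                    left = dp[i + 1][k - 1]
                    right = dp[k + 1][j] if k + 1 <= j else 0
                    best = max(best, 1 + left + right)
            dp[i][j] = best
    return dp[0][n - 1]

print(nussinov_pairs("GGGAAAUCC"))  # toy hairpin sequence
```

The quadratic table of subproblem optima is exactly the kind of "hardcoded" algorithmic shortcut whose restriction to evolutionarily relevant instances the authors examine.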

Biological systems are often perceived to perform computations that solve complex problems at low computational cost as an evolutionary adaptation such as the central ‘information metabolism’ of a cell, i.e. DNA replication, transcription, and translation of RNA to proteins. Beyond these transformations of encoded information processing in neural systems, it is often unclear what exactly a biological system is computing. (1)

Returning to the question whether the concepts developed in theoretical computer science to describe computational complexity can also apply to computation in biological systems, we arrive at an affirmative answer. (11)

Sloss, Andrew and Steven Gustafson. 2019 Evolutionary Algorithm Review. arXiv:1906.08870. Bellevue, WA software scientists post a thorough survey as the field of artificial intelligence, broadly conceived, becomes ever more biological in its basis. By turns, life’s genetically programmed development is broached as an “Idealized Darwinism.” Section 5.1 covers Auto-Constructive Evolution, 5.2 Deep Neuroevolution, and 5.3 Self-Replicating Neural Networks.

In this review, we explore a new taxonomy of evolutionary algorithms and classifications that look at five main areas: the ability to manage the control of the environment with limiters, how to explain and repeat the search process, understandings of input and output causality within a solution, the ability to manage algorithm bias due to data or user design, and lastly, how to add corrective measures. As many reviews of evolutionary algorithms exist, after motivating this new taxonomy, we briefly classify a broad range of algorithms and identify areas of future research. (Abstract excerpt)

Smith, Eric and Supriya Krishnamurthy. Symmetry and Collective Fluctuations in Evolutionary Games. Bristol, UK: IOP Publishing, 2015. Santa Fe Institute and Stockholm University physicists engage a technical survey of Darwinian selectivity by way of game theories, an approach defined as the mathematics of interactions, transmission, and information, which engender statistical adaptive, Bayesian responses. A closing chapter then sees the intent and result as an evolving individuality.

Sommaruga, Giovanni and Thomas Strahm, eds. Turing’s Revolution. Switzerland: Birkhauser, 2015. A follow-up volume to the 2012 Alan Turing centenary collections (search his name) which situates computational theories in a continuity from Al-Khwarizmi’s (c. 780-850) algorithms to Leibniz’s (1646-1716) universal calculus to our 21st century global Internet society. By this view, the historic inklings of a deep mathematical source that is programmatic in kind can at last reach a definitive articulation. A persistent incentive (also for the alchemical Great Work project) was that if a natural code could be discerned, human beings could avail themselves of it to make a better life and world. A theme running through the book is then a creative universe that somehow computes itself into being and becoming, wherein human beings have a central participatory role.

Stepney, Susan and Andrew Adamatzky, eds. Inspired by Nature. International: Springer, 2018. A 60th birthday tribute to Julian Miller, the University of York, UK computer applications pioneer with works such as Cartesian Genetic Programming (Springer, 2011, search). Seven years later, a further sense of universal computations occurs in chapters Evolution in Nanomaterio by Hajo Broersma, Code Evolution with Genetic Programming by Wolfgang Banzhaf, and Chemical Computing through Simulated Evolution by Larry Bull, et al, among others. This concurrent edition adds to the similar content of Computational Matter (herein), also edited by S. Stepney. A tacit theme is a Big Code in pervasive effect beyond and before Big Data. As the sample quote conveys, within this scenario it seems that natural cosmic genesis may intend to pass on its procreative role to our human intention for all futures, if we might altogether appreciate this.

Evolution has done a great job for many living organisms, and it has done so through a conceptually rather easy but smart, albeit very slow bottom up process of natural Darwinian evolution. There seems to be no design involved in this process: the computational instructions that underlie living creatures have not been designed but rather have (been) evolved. Two questions come to mind: Can we use something similar to evolution on ‘dead’ matter in order to get something useful? Can we do it quickly, via a rapidly converging evolutionary process? (H. Broersma, University of Twente, 95)

Stepney, Susan, et al, eds. Computational Matter. International: Springer, 2018. With coeditors Steen Rasmussen and Martyn Amos, the volume is a broad survey of endeavors under the umbrella name Unconventional Computing (UCOMP), drawn originally from Stanislaw Ulam, which considers every manifest phase from quantum to social realms to arise in an exemplary way from a mathematical source code. Some entries are Cellular Computing and Synthetic Biology (M. Amos, et al), Decoding Genetic Information (G. Franco and V. Manca), BIOMICS: A Theory of Interaction Computing (P. Dini, et al), and Nanoscale Molecular Automata (R. Rinaldi, et al). An inference is a natural “digitalization” process as a double reality via a software-like informative program and an overt material, creaturely result. And it would seem to readily imply a “genotype and phenotype,” see also The Principles of Informational Genomics by V. Manca (search) in Theoretical Computer Science (701/190, 2017).

This book is concerned with computing in materio: that is, unconventional computing performed by directly harnessing the physical properties of materials. It offers an overview of the field, covering four main areas: theory, practice, applications and implications. Each chapter synthesizes current understanding by bringing together researchers across a collection of related research projects. The book is useful for graduate students, researchers in the field, and the general scientific reader who is interested in inherently interdisciplinary research at the intersections of computer science, biology, chemistry, physics, engineering and mathematics. (Springer)
