Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: Independent, UniVersal, Complex Network Systems and a Genetic Code-Script Source

2. Biteracy: Natural Algorithmic Computation

Livnat, Adi and Christos Papadimitriou. Sex as an Algorithm: The Theory of Evolution Under the Lens of Computation. Communications of the ACM. November, 2016. A University of Haifa evolutionary theorist and a UC Berkeley computer scientist compose a popular update on this digital Darwin synthesis. Its fertile promise is reported in this section, Systems Evolution, Intelligent Evolution, and elsewhere. Search each author for more papers and links.

Sexual reproduction is nearly ubiquitous in nature. Recent research at the interface of evolution and computer science has revealed that evolution under sex possesses a multifaceted computational nature - it can be seen as a coordination game between genes played according to the powerful Multiplicative Weights Update Algorithm; or as a randomized algorithm for deciding whether genetic variants perform well across all possible combinations; it allows mutation to process and transmit information; and much more. Computational models and considerations are becoming an indispensable tool for unlocking the secrets of evolution. (Key Insights)
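
As a minimal sketch of the Multiplicative Weights Update rule the authors invoke (our illustration in Python, not their code; the payoff table and learning rate eta are assumed for the example), each gene variant keeps a weight that is multiplied up or down by its payoff every round:

def multiplicative_weights(payoffs, eta=0.1, rounds=100):
    """Multiplicative Weights Update: keep a weight per strategy (allele),
    scale each weight by (1 + eta * payoff) every round, then renormalize
    to a probability mix over the strategies."""
    weights = [1.0] * len(payoffs[0])
    for t in range(rounds):
        round_payoffs = payoffs[t % len(payoffs)]  # cycle a fixed payoff table
        weights = [w * (1.0 + eta * p) for w, p in zip(weights, round_payoffs)]
        total = sum(weights)
        weights = [w / total for w in weights]
    return weights

# Toy run: two alleles whose payoffs vary across three environments.
print(multiplicative_weights([[0.9, 0.2], [0.4, 0.6], [0.7, 0.5]]))

The mix shifts toward variants that perform well across the whole cycle of conditions, the "good across combinations" property the Key Insights describe.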

Manca, Vincenzo. The Principles of Informational Genomics. Theoretical Computer Science. 701/190, 2017. The University of Verona computer scientist complements his chapter Decoding Genetic Information, written with G. Franco, in Computational Matter (S. Stepney, 2017) with novel perspectives on genetic activity in terms of its algorithmic, semantic, and linguistic qualities.

The present paper investigates the properties of genomes directly related to their long linear structure. A systematic approach is introduced that is based on an integration of string analysis and information theory, applied and verified on real genomes. New concepts and results are given in connection with genome empirical entropies (and related indexes), genome dictionaries and distributions, word elongations, informational divergences, genome assemblies, and genome segmentations.
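
One quantity the paper builds on, the empirical entropy of a genome's word (k-mer) distribution, fits in a few lines; here is a minimal sketch in Python, with the sequence and word lengths assumed for illustration:

from collections import Counter
from math import log2

def kmer_entropy(genome, k):
    """Empirical Shannon entropy (bits) of the k-mer distribution obtained
    by sliding a window of length k along the genome string."""
    counts = Counter(genome[i:i + k] for i in range(len(genome) - k + 1))
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Entropy grows with k until the sequence's repeats are resolved.
seq = "ACGTACGTGGGTACGTAC"
for k in (1, 2, 3):
    print(k, round(kmer_entropy(seq, k), 3))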

Marletto, Chiara. Constructor Theory of Life. arXiv:1407.0681. 2014. As the Abstract alludes, and many scientists now admit, the claim that selection alone is all that is needed, or going on, is simply inadequate. The Oxford University mathematician and collaborator with physicist David Deutsch (search) articulates another take on something else and more, with affinities to and roots in statistical physics, as nature’s informative source. From disparate entries such as cellular automata, computationalism, algorithmic nature and more, these efforts converge on some manner of an implicate, program-like code from which evolution arises, iterates and exemplifies. As noted for Gordana Dodig-Crnkovic 2014, there is a need for cross-translation and synthesis of the title concept and its technical terms, along with an admission of a self-existing reality and procreation of which everything is an intended phenomenon.

Neo-Darwinian evolutionary theory explains how the appearance of purposive design in the sophisticated adaptations of living organisms can have come about without their intentionally being designed. The explanation relies crucially on the possibility of certain physical processes: mainly, gene replication and natural selection. In this paper I show that for those processes to be possible without the design of biological adaptations being encoded in the laws of physics, those laws must have certain other properties. The theory of what these properties are is not part of evolution theory proper, and has not been developed, yet without it the neo-Darwinian theory does not fully achieve its purpose of explaining the appearance of design.

To this end I apply Constructor Theory's new mode of explanation to provide an exact formulation of the appearance of design, of no-design laws, and of the logic of self-reproduction and natural selection, within fundamental physics. I conclude that self-reproduction, replication and natural selection are possible under no-design laws, the only non-trivial condition being that they allow digital information to be physically instantiated. This has an exact characterisation in the constructor theory of information. I also show that under no-design laws an accurate replicator requires the existence of a "vehicle" constituting, together with the replicator, a self-reproducer. (Abstract)

Conclusion: I have proved that the physical processes the theory of evolution relies upon are possible under no-design laws, provided that the latter allow for information media (and enough generic resources). Under such laws, accurate self-reproduction can occur, but only via von Neumann's replicator-vehicle logic; and a high-fidelity replicator requires an accurate self-reproducer. My argument also highlights that all accurate constructors, under such laws, must contain knowledge - a special abstract constructor - in the form of a recipe, instantiated in a replicator. I have also extended von Neumann's model of the logic of self-reproduction to quantum theory. This informs further investigations of quantum effects in natural and artificial self-reproducers. Constructor theory has also expressed exactly, within fundamental physics, the logic of self-reproduction, replication, and natural selection, and the appearance of design. This has promise for a deep unification in our understanding of life and physics. (25)
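
The replicator-vehicle logic invoked above can be caricatured in a few lines of Python (a toy illustration under our own assumptions, not Marletto's constructor-theoretic formalism): the recipe is inert information, while the vehicle both copies the recipe and constructs a new vehicle around the copy.

class SelfReproducer:
    """Toy von Neumann-style self-reproducer: a vehicle (constructor)
    carrying a recipe (replicator). Reproduction = copy the recipe,
    then construct a fresh vehicle around the copy."""
    def __init__(self, recipe):
        self.recipe = recipe                   # the replicator: inert information

    def reproduce(self):
        copied_recipe = str(self.recipe)       # replication step
        return SelfReproducer(copied_recipe)   # construction step

parent = SelfReproducer("build one vehicle; copy this recipe into it")
child = parent.reproduce()
print(child.recipe == parent.recipe)  # True: high-fidelity replication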

Mayfield, John. The Engine of Complexity: Evolution as Computation. New York: Columbia University Press, 2013. Reviewed more in Quickening Evolution, a book-length treatment of life’s temporal occasion, selection, and ramification seen as an algorithmic operation and optimization.

Miller, Julian, ed. Cartesian Genetic Programming. Berlin: Springer, 2011. The University of York editor, a cofounder of this method, has a doctorate in nonlinear mathematics. We cite this work to report an array of inherent natural computations in generative effect. A usage that drew our notice is Artificial Intelligence in Peer Review by Maciej Mrowinski, et al (arXiv:1712.01682), Abstract below, which finds that CGP can aid editorial processes.

Cartesian Genetic Programming (CGP) is a highly effective and increasingly popular form of genetic programming. It represents programs in the form of directed graphs, and a particular characteristic is that it has a highly redundant genotype–phenotype mapping, in that genes can be noncoding. It has spawned a number of new forms, each improving on the efficiency, among them modular, or embedded, CGP, and self-modifying CGP. It has been applied to many problems in both computer science and applied sciences. This book contains chapters written by the leading figures in the development and application of CGP, and it will be essential reading for researchers in genetic programming and for engineers and scientists solving applications using these techniques. It will also be useful for advanced undergraduates and postgraduates seeking to understand and utilize a highly efficient form of genetic programming. (Book)

With the volume of manuscripts submitted for publication growing every year, the deficiencies of peer review (e.g. long review times) are becoming more apparent. Editorial strategies, sets of guidelines designed to speed up the process and reduce editors' workloads, are treated as trade secrets by publishing houses and are not shared publicly. To improve the effectiveness of their strategies, editors in small publishing groups are faced with undertaking an iterative trial-and-error approach. We show that Cartesian Genetic Programming, a nature-inspired evolutionary algorithm, can dramatically improve editorial strategies. The artificially evolved strategy reduced the duration of the peer review process by 30%, without increasing the pool of reviewers (in comparison to a typical human-developed strategy). Evolutionary computation has typically been used in technological processes or biological ecosystems. Our results demonstrate that genetic programs can improve real-world social systems that are usually much harder to understand and control than physical systems. (Abstract)
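
A minimal sketch of the graph representation described in the book blurb above, assuming a single-row CGP layout and a small illustrative function set (our example in Python, not the book's code); nodes off the path to the output gene are the noncoding genes that make the genotype-phenotype mapping redundant:

import operator

FUNCS = [operator.add, operator.sub, operator.mul]  # illustrative function set

def cgp_eval(genome, output_gene, inputs):
    """Decode a single-row CGP genome: each node gene is a (func, in1, in2)
    triple addressing earlier nodes or the program inputs."""
    values = list(inputs)
    for func_idx, a, b in genome:
        values.append(FUNCS[func_idx](values[a], values[b]))
    return values[output_gene]

# Inputs x0, x1 sit at addresses 0-1; node 2 = x0 + x1, node 3 = x0 * x1
# (noncoding for this output), node 4 = node2 - x1.
genome = [(0, 0, 1), (2, 0, 1), (1, 2, 1)]
print(cgp_eval(genome, output_gene=4, inputs=[3.0, 2.0]))  # (3+2)-2 = 3.0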

Molina, Daniel, et al. Comprehensive Taxonomies of Nature- and Bio-inspired Optimization. Cognitive Computation. 12/897, 2020. Five University of Granada, University of the Basque Country, and King Abdulaziz University, Saudi Arabia informatics scientists achieve the most comprehensive survey to date of iterative search methods across categories such as Physical, Evolutionary, Organism Breeding, Plants, Social and Swarm Intelligence. Examples are Big Bang Big Crunch, Cuttlefish Algorithm, Moth Flame Optimization, Galaxy Based Search, Bus Transport Behavior onto Soccer League Games and many more. Altogether they imply how much this Ecosmos to Earthling scenario in which we find ourselves is involved with reaching and achieving a “good enough” result or resolve at every phase. Who then are we international inquirers, just now coming to learn this? What quality or feature is a genesis nature trying to optimize and select for?

In recent years, bio-inspired optimization approaches have grown so numerous that they compromise this vital field. This paper addresses the problem by proposing two comprehensive, principle-based taxonomies that help organize existing and future developments by two criteria: the source of inspiration and the behavior of each algorithm. In this regard, we review more than 300 publications dealing with nature-inspired and bio-inspired algorithms, leading to a critical summary of design trends and similarities between them. We show that more than one-third of the bio-inspired solvers are versions of classical algorithms. We close with recommendations for better methodological practices. (Abstract excerpt, edits)
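
For a concrete feel of the swarm-intelligence branch of the taxonomy, here is a bare-bones particle swarm optimizer in Python (our generic sketch; the inertia and attraction constants are conventional illustrative choices, not values from the paper). Each particle is pulled toward its own best-found position and the swarm's best, a "good enough" search of the kind the review categorizes:

import random

def pso(objective, dim=2, particles=20, iters=100,
        w=0.7, c1=1.4, c2=1.4, bound=5.0):
    """Bare-bones particle swarm optimization: velocities blend inertia,
    attraction to each particle's personal best, and to the global best."""
    pos = [[random.uniform(-bound, bound) for _ in range(dim)] for _ in range(particles)]
    vel = [[0.0] * dim for _ in range(particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * random.random() * (pbest[i][d] - pos[i][d])
                             + c2 * random.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy objective: the sphere function, minimized at the origin.
print(pso(lambda x: sum(v * v for v in x)))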

Moya, Andres. The Calculus of Life: Towards a Theory of Life. Berlin: Springer, 2015. An edition in the SpringerBriefs in Biology series by the University of Valencia evolutionary biologist and genomicist. At this late frontier, the endeavor is to move beyond an old view of chance and tinkering via systems biology to an innate natural logic and computation. A Turing-type algorithmic mathematics is then engaged from biochemicals to cellular dynamics.

This book explores the exciting world of theoretical biology and is divided into three sections. The first section examines the roles played by renowned scientists such as Jacob, Monod, Rosen, Turing, von Bertalanffy, Waddington and Woodger in developing the field of theoretical biology. The second section, aided with numerous examples, supports the idea that logic and computing are suitable formal languages to describe and understand biological phenomena. The third and final section is, without doubt, the most intellectually challenging and endeavors to show the possible paths we could take to compute a cell - the basic unit of life - or the conditions required for a predictive theory of biological evolution; ultimately, a theory of life in the light of modern Systems Biology. The work aims to show that modern biology is closer than ever to making Goethe's dream come true and that we have reached a point where synthetic and analytical traditions converge to shed light on the living being as a whole.

Navlakha, Saket and Ziv Bar-Joseph. Algorithms in Nature: The Convergence of Systems Biology and Computational Thinking. Molecular Systems Biology. 7/Art.546, 2011. Along with other entries in this new section, Carnegie Mellon University computer scientists advance a cross-fertilization whereby evolutionary and genetic programs inform their field, even as biological science is itself being changed by way of nonlinear dynamics. The authors have also posted a website, www.algorithmsinnature.org, with visuals and resources; see the quote below.

Computer science and biology have enjoyed a long and fruitful relationship for decades. Biologists rely on computational methods to analyze and integrate large data sets, while several computational methods were inspired by the high-level design principles of biological systems. Recently, these two directions have been converging. In this review, we argue that thinking computationally about biological processes may lead to more accurate models, which in turn can be used to improve the design of algorithms. We discuss the similar mechanisms and requirements shared by computational and biological processes and then present several recent studies that apply this joint analysis strategy to problems related to coordination, network analysis, and tracking and vision. We also discuss additional biological processes that can be studied in a similar manner and link them to potential computational problems. (Abstract)

The mutual relationship between biology and computer science goes back a long time. However, until recently, the application of computational methods to study biological processes and the influence of biological systems on the design of algorithms remained mostly separate. The former involved developing computational methods to solve specific biological problems, while the latter primarily viewed biological systems as motivation to derive new optimization methods. With the improvement in our ability to study detailed aspects of molecular, cellular, and organism-level biological systems, it is now possible to unite these two directions. This direction involves thinking about biological processes as algorithms that nature has designed to solve difficult real-world problems under a variety of constraints and conditions. (8)

Computer science and biology have shared a long history together. For many years, computer scientists have designed algorithms to process and analyze biological data (e.g. microarrays), and likewise, biologists have discovered several operating principles that have inspired new optimization methods (e.g. neural networks). Recently, these two directions have been converging based on the view that biological processes are inherently algorithms that nature has designed to solve computational problems. This website documents new studies that have taken a joint computational-biological approach to study the algorithmic properties of biological processes across all levels of life (molecular, cellular, and organism). (Website)
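
A study frequently highlighted in this program compares the fly's selection of sensory bristle precursors to a randomized distributed computation of a maximal independent set. Below is a generic Luby-style sketch of that scheme in Python (our illustration; the graph and volunteer probability are assumed): undecided nodes volunteer at random, win if no undecided neighbor volunteered, and silence their neighbors.

import random

def distributed_mis(neighbors, p=0.3, max_rounds=100):
    """Luby-style randomized maximal independent set: each round,
    undecided nodes volunteer with probability p; a volunteer joins the
    set if no undecided neighbor also volunteered, and its neighbors
    then drop out of contention."""
    undecided = set(neighbors)
    mis = set()
    for _ in range(max_rounds):
        if not undecided:
            break
        volunteers = {v for v in undecided if random.random() < p}
        winners = {v for v in volunteers
                   if not any(u in volunteers for u in neighbors[v] if u in undecided)}
        mis |= winners
        undecided -= winners
        undecided -= {u for v in winners for u in neighbors[v]}
    return mis

# Toy graph: a 5-cycle.
graph = {0: [1, 4], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 0]}
print(distributed_mis(graph))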

Neagu, Daniel. Special Issue on Computational Intelligence Algorithms and Applications. Soft Computing. 20/2921, 2016. The University of Bradford, UK editor introduces this edition. Typical entries are Handwritten Chinese Character Recognition, and Ant Colony Optimization. We note this journal, see the quotation below, to record the growing perception of algorithms everywhere, which increasingly appear as an evolutionary arrow.

In computer science, soft computing (sometimes referred to as computational intelligence, though CI does not have an agreed definition) is the use of inexact solutions to computationally hard tasks such as the solution of NP-complete problems, for which there is no known algorithm that can compute an exact solution in polynomial time. Soft computing differs from conventional (hard) computing in that, unlike hard computing, it is tolerant of imprecision, uncertainty, partial truth, and approximation. In effect, the role model for soft computing is the human mind. The principal constituents of Soft Computing (SC) are Fuzzy Logic (FL), Evolutionary Computation (EC), Machine Learning (ML) and Probabilistic Reasoning (PR), with the latter subsuming belief networks, chaos theory and parts of learning theory. (Wikipedia)
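
As a minimal sketch of the fuzzy logic constituent named above (our toy example in Python; the membership ranges and rule consequents are assumptions), imprecise categories like "warm" and "hot" overlap, and a decision is a weighted blend rather than a crisp branch:

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fan_speed(temp_c):
    """Two-rule fuzzy controller: IF temp is warm THEN fan is medium (50%);
    IF temp is hot THEN fan is high (100%). Defuzzify by weighted average."""
    warm = tri(temp_c, 15.0, 25.0, 35.0)
    hot = tri(temp_c, 25.0, 40.0, 55.0)
    num = warm * 50.0 + hot * 100.0
    den = warm + hot
    return num / den if den else 0.0

for t in (20, 30, 42):
    print(t, round(fan_speed(t), 1))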

Newman, Stuart. Form, Function, Mind: What Doesn't Compute (and What Might). arXiv:2310.13910. This latest paper (search) by the New York Medical College, Valhalla, NY biophilosopher has several notable aspects. It reviews his endeavors to discern further structural and physical influences involved in life’s cellular development. In 2023, he worries over shifting views of just how computational biological phenomena are, in confluence with genetic sources. But however this vital domain sorts out, our new ecosmos appears quite graced by some manner of procreative program.

The applicability of computational and dynamical systems models to organisms is scrutinized, using examples from developmental biology and cognition. Organic morphogenesis is dependent on the inherent material properties of tissues, a non-computational modality, but cell differentiation, which utilizes chromatin-based revisable memory banks and program-like function-calling, has a quasi-computational basis. Multi-attractor dynamical models are argued to be misapplied to global properties of development. Proposals are made for treating brains and other nervous tissues as novel forms of excitable matter with inherent properties which enable the intensification of cell-based basal cognition capabilities present throughout the tree of life. (Excerpt)

The foregoing text introduces this essay on the bases of biological form and function and their relation to computation. My main claim is that key aspects of living organisms, their morphogenesis (generation of form), and aspects of sensory and cognitive capacities, are expressions of the inherencies of the materials of which they are composed. Other features, including the differentiation of specialized cells and retrieval of stored patterns in the brain, utilize quasi-computational storage, data addressing and function calling processes. (3-4)
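
The "revisable memory banks and program-like function-calling" picture can be caricatured in a few lines of Python (a toy of our own devising, not Newman's model; all names are illustrative): chromatin marks act as a rewritable store that gates which gene-expression "subroutines" a cell may call.

class Cell:
    """Toy quasi-computational cell: chromatin marks form a revisable
    memory bank that gates which gene-expression 'subroutines' can run."""
    PROGRAMS = {"contract": "muscle", "fire": "neuron"}  # program -> required fate

    def __init__(self):
        self.chromatin = {}                # revisable memory bank

    def differentiate(self, fate):
        self.chromatin[fate] = "open"      # write a heritable, revisable mark

    def call(self, program):
        fate = self.PROGRAMS[program]
        if self.chromatin.get(fate) == "open":
            return f"{fate} cell runs '{program}'"
        return f"'{program}' silenced: {fate} locus closed"

c = Cell()
print(c.call("contract"))      # silenced before differentiation
c.differentiate("muscle")
print(c.call("contract"))      # runs after the mark is written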

Nichol, Daniel, et al. Model Genotype-Phenotype Mappings and the Algorithmic Structure of Evolution. Journal of the Royal Society Interface. 16/20190332, 2019. Oxford University and H. Lee Moffitt Cancer Center, FL computer scientists and mathematical oncologists including Peter Jeavons describe an advanced complex systems biology method which joins cellular components into a dynamic synthesis from genes to metabolism. A novel program-like factor is then brought into play to better quantify and express metastatic invasions. To reflect, from our late vantage, Earth life's contingent evolution seems at last to reach a global cumulative knowledge which can be fed back to contain, heal and prevent. We are given to perceive some manner of palliative, self-medicating procreation, which seems meant to pass on to our intentional continuance. What an incredible scenario is being revealed to us.

Cancers are complex dynamic systems that undergo evolution and selection. Personalized medicine in the clinic increasingly relies on predictions of tumour response to one or more therapies, which are complicated by the phenotypic evolution of the tumour. The emergence of resistant phenotypes is not predicted from genomic data, since the relationship between genotypes and phenotypes, termed genotype–phenotype (GP) mapping, is neither injective nor functional. We review mapping models within a generalized evolutionary framework that relates genotype, phenotype, environment and fitness. The GP-mapping provides a pathway for understanding the potential routes of evolution taken by cancers, which will be necessary knowledge for improving personalized therapies. (Abstract excerpt)
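
A minimal sketch of why such a mapping is "neither injective nor functional" (a toy model under our own assumptions in Python, not the authors' framework): many genotypes collapse onto one phenotype, and the environment co-determines the outcome.

import itertools

def phenotype(genotype, environment):
    """Toy GP map: only the count of '1' alleles matters (many-to-one, so
    not injective), and the same genotype yields different phenotypes in
    different environments (so not a function of genotype alone)."""
    dose = genotype.count("1")
    threshold = 2 if environment == "drug" else 3
    return "resistant" if dose >= threshold else "sensitive"

for g in ("".join(bits) for bits in itertools.product("01", repeat=3)):
    print(g, phenotype(g, "drug"), phenotype(g, "no-drug"))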

Olhede, Sofia and P. J. Wolfe. The Growing Ubiquity of Algorithms in Society. Philosophical Transactions of the Royal Society A. Vol.376/Iss.2128, 2018. University College London and Purdue University computer scientists introduce an issue of papers from a discussion meeting about the invisible but major role that computational methods are taking on in 21st century societies. Topics such as the governance of data, transparency of algorithms, legal and ethical frameworks for automated decision-making, and the public impacts of hidden programs are considered in both their pro and con aspects.
