Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe
Table of Contents
Introduction
Genesis Vision
Learning Planet
Organic Universe
Earth Life Emerge
Genesis Future
Glossary
Recent Additions

IV. Cosmomics: A Genomic Source Code in Procreative Effect

A. Biteracy: Natural (Genetic) Algorithms

The present Cosmic Code chapter reports a broad array of scientific encounters with nature’s generative propensity to form evolutionary self-organizations of complexity and cognition. From our humankind vista, a growing perception can be noticed of an independent, mathematical, program-like agency in procreative effect. As with Systems Evolution next, there is much evidence that selection alone is inadequate, hence our Natural Genesis title. Here we collect entries from a range of computational, algorithmic, information-based, cellular automata, meta-biology, evolutionary optimization, analog/digital software, DNA data, and other active schools and versions. These endeavors often hark back to Gottfried Leibniz, to Greece before him, and later to Alan Turing, Erwin Schrödinger, and others, tracing a heritage in search of an “alphabetic calculus, a Mathesis Universalis” with many finesses. In this regard, the perennial quest (aka magnum opus, great work) was in part to decipher and read a natural logos, a repetitive source code for peoples to avail as an edifying guidance. This reward may at last be in sight via our 21st century sapiensphere, which we here try to collate and synthesize.

This introduction is a 2018 update because recent contributions seem to be reaching a robust affirmation. We might list Gregory Chaitin, Sara Walker, Hector Zenil, Enrico Borriello, Anne Condon, Leroy Cronin, Agoston Eiben, Hyunju Kim, Doug Moore, Paul Davies. A prime achievement is a common code in the form of nature’s universal proclivity to become poised in a reciprocal self-organized criticality, see for example Dante Chialvo, Bosiljka Tadic, Bryan Daniels, Andrea Roli, and throughout the site.

Another aspect is in need of airing and review. The phrase “mindless mathematics” has recently been bandied about (Aguirre 2018), which implies that cosmic nature (or the lack thereof) is bereft of any prescriptive content and purpose. But in actuality a mathematical domain can take on two different forms. One is a pure or applied mode, such as arithmetic. Each day on the arXiv preprint site some 200 entries appear under “Mathematics.” The other is a computational (software) program, which does carry information. Each day some 300 new contributions of this kind are listed. So we may have an arithmetic to which “mindless” could apply, and an “algorithmetic” that does contain, or has a capacity for, a narrative message.

And finally, we introduce “Biteracy,” a word which does not yet come up in a Google search. As the section title alludes, natural algorithms have their roots in John Holland’s original 1970s genetic version, with many extensions since, see Xin-She Yang herein. But we also intend J. A. Wheeler’s familiar “it from bit,” which evokes an informative (quantum) origin that then courses through an oriented evolution to human observers and recorders. An inference could be that we regnant peoples are present as the way such a participatory uniVerse learns to decipher, read, and intentionally take forth its own genomic endowment.


A New Kind of Science: A 15-Year View. https://backchannel.com/a-new-kind-of-science-a-15-year-view-4f5668abe54f. A May 16, 2017 posting by Stephen Wolfram on this anniversary of his 1,000-page opus. It appears on his Backchannel blog, along with news from the frontiers of global computer technology and services. To wit, the 21st century has indeed become a “computational universe,” as algorithmic programs come to supersede physical mathematics. He then notes how his popular Mathematica software is lately involved with neural network and linguistic features and operations. In closing, the motivation remains to elucidate and fulfill Gottfried Leibniz’s quest for a common, natural symbolic discursive language.
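The “simple programs” at the heart of the book are elementary cellular automata. As our own minimal illustration (not code from the posting), rule 30, a Wolfram favorite, generates intricate patterns from a one-line lookup rule; Mathematica offers the same natively via its CellularAutomaton function.

```python
def step(cells, rule=30):
    """One synchronous update of an elementary cellular automaton.

    Each cell's next value is read from the 8-bit rule number,
    indexed by its three-cell neighborhood (wrap-around edges)."""
    n = len(cells)
    out = []
    for i in range(n):
        left, center, right = cells[(i - 1) % n], cells[i], cells[(i + 1) % n]
        index = (left << 2) | (center << 1) | right
        out.append((rule >> index) & 1)
    return out

# Evolve a single live cell for a few generations of rule 30.
row = [0] * 15
row[7] = 1
for _ in range(6):
    print("".join(".#"[c] for c in row))
    row = step(row)
```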

Evolutionary Biology and the Theory of Computing. simons.berkeley.edu/programs/evolution2014. A Simons Foundation program in the spring of 2014 whose organizers included Christos Papadimitriou and Leslie Valiant. Its seminars included an Evolutionary Biology Boot Camp (January tutorial) with Eugene Koonin; Computational Theories of Evolution, organized by Nick Barton, with speakers such as Richard Watson, Chrisantha Fernando, Adi Livnat, and Steven Frank; and New Directions in Probabilistic Models of Evolution. (RW and CF abstracts are noted separately.) The vital multifaceted content might be summed by a slide from Papadimitriou’s “Evolution and Algorithms”: The special affinity between computation and biology: there is “innate explicit code” in Life.

Evolutionary biology is an intellectually rich field which has advanced remarkably through a synergistic interplay between deep understanding of biology and mathematical techniques, especially from probability and statistics. Over the past several decades, the role of computer science in studying biology has grown enormously, and computation has now become an indispensable part of the intellectual mix. Many current problems in evolutionary biology push the limits of computation, and new algorithmic insights are needed to make progress. The objective of this program is to promote the interaction between theoretical computer scientists and researchers from evolutionary biology, physics, probability, and statistics. The participants of the program will collaborate to identify and tackle some of the most important theoretical and computational challenges arising from evolutionary biology. The major themes of the program will be sound mathematical modeling, rigorous methods for statistical estimation, and computational scalability. (Conference Summary)

During the past decade, models and theories of evolution have been articulated which were inspired by computational considerations: examples are Valiant's evolvability, and the theory of mixability for the role of sex. The purpose of this workshop is to showcase and advance this strand of research, and also to expose it to the feedback and criticism of biologists and mathematicians. A second goal of the workshop is to highlight research questions in evolutionary biology which might benefit from computational insights and methodology, such as intractability proofs and novel algorithmic paradigms. (Computational Theories of Evolution)

Aerts, Diederik, et al. Quantum Entanglement in Physical and Cognitive Systems. arXiv:1903.09103. A seven person team based at Brussels Free University, with coauthors in Switzerland, the UK, and Chile, enters its latest work-in-progress toward a theoretical and conceptual, quantum and classical, physical and biological, cosmic integrative whole. Into mid 2019, per the second quote, as new understandings join quantum and human phenomena, we can begin to glimpse a universal Copernican revolution. Physics and people are at last reunited, which in turn implies a lively, literate ecosmos. On this eprint site, over a hundred papers by D. Aerts, this group, and many colleagues can be accessed going back to 2008. An example is The Emergence and Evolution of Integrated Worldviews by DA with Liane Gabora at 1001.1399. Other current postings are Quantum-Theoretic Modeling in Computer Science at 1901.04299 and Quantum Entanglement in Corpuses of Documents at 1810.12114.

We provide a general description of the phenomenon of entanglement in bipartite systems, as it manifests in micro and macro physical systems, as well as in human cognitive processes. We do so by observing that when genuine coincidence measurements are considered, the violation of the 'marginal laws', in addition to the Bell-CHSH inequality, is also to be expected. The situation can be described in the quantum formalism by considering the presence of entanglement not only at the level of the states, but also at the level of the measurements. (Abstract excerpt)

But nowadays the predictions of quantum theory are no longer put into question, not only as regards entanglement, which has been shown to be preservable over distances of a thousand kilometers, but also with respect to many other effects such as the delocalization of large organic molecules. On the other hand, the debate about the profound meaning of the theory never stopped, and in fact has constantly renewed and expanded over the years, so much so that one can envisage this will produce in the end a Copernican-like revolution in the way we understand the nature of our physical reality. Such a debate, however, is not confined to physicists or philosophers of science, but also reached new fields of investigation, in particular that of psychology, due to the development of that research domain called ‘quantum cognition.’ (2)

Quantum entanglement is a physical phenomenon that occurs when pairs or groups of particles are generated, interact, or share spatial proximity in ways such that the quantum state of each particle cannot be described independently of the state of the others, even when the particles are separated by a large distance.

In physics, the CHSH inequality can be used in the proof of Bell's theorem, which states that certain consequences of entanglement in quantum mechanics cannot be reproduced by local hidden variable theories. Experimental verification of violation of the inequalities is seen as experimental confirmation that nature cannot be described by local hidden variables theories. CHSH stands for John Clauser, Michael Horne, Abner Shimony, and Richard Holt, who described it in 1969.
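The inequality can be made concrete with a short sketch of our own (not from the Aerts paper): for a singlet pair, quantum mechanics gives the correlation E(a, b) = -cos(a - b) between spin measurements at angles a and b, and at suitably chosen angles the CHSH combination S reaches 2√2 (Tsirelson's bound), beyond the classical limit of 2.

```python
import math

def E(a, b):
    """Quantum correlation of spin measurements along angles a and b
    on a singlet (maximally entangled) pair: E(a, b) = -cos(a - b)."""
    return -math.cos(a - b)

def chsh(a1, a2, b1, b2):
    """CHSH combination S = E(a1,b1) - E(a1,b2) + E(a2,b1) + E(a2,b2)."""
    return E(a1, b1) - E(a1, b2) + E(a2, b1) + E(a2, b2)

# At these angles |S| reaches 2*sqrt(2), exceeding the bound of 2
# that any local hidden variable theory must obey.
S = chsh(0, math.pi / 2, math.pi / 4, 3 * math.pi / 4)
print(abs(S))
```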

Al-Mehairi, Yaared, et al. Compositional Distributional Cognition. arXiv:1608.03785. Oxford University computer scientists including Bob Coecke and Martha Lewis contribute a technical paper which in translation expresses innate affinities between its mathematics and widely removed linguistic and quantum dimensions. The ICS reference below about cerebral computation is from The Harmonic Mind by Paul Smolensky and Geraldine Legendre. CatCo is a paper by Coecke, et al in Linguistic Analysis (36/1/345, 2011). See also companion August postings from this group such as Quantum Algorithms for Compositional Natural Language Processing (1608.01406).

We accommodate the Integrated Connectionist/Symbolic Architecture (ICS) of [32] within the categorical compositional semantics (CatCo) of [13], forming a model of categorical compositional cognition (CatCog). This resolves intrinsic problems with ICS such as the fact that representations inhabit an unbounded space and that sentences with differing tree structures cannot be directly compared. We do so in a way that makes the most of the grammatical structure available, in contrast to strategies like circular convolution. Using the CatCo model also allows us to make use of tools developed for CatCo such as the representation of ambiguity and logical reasoning via density matrices, structural meanings for words such as relative pronouns, and addressing over- and under-extension, all of which are present in cognitive processes. Moreover the CatCog framework is sufficiently flexible to allow for entirely different representations of meaning, such as conceptual spaces. Interestingly, since the CatCo model was largely inspired by categorical quantum mechanics, so is CatCog. (Abstract)

Barish, Robert, et al. An Information-Bearing Seed for Nucleating Algorithmic Self-Assembly. Proceedings of the National Academy of Sciences. 106/6054, 2009. Caltech biocomputer scientists working with co-author mentor Erik Winfree contend that “natural, mineral, chemical, and biological structures of great complexity” organize themselves from an initial seed source so infused with a genetic-like code. One might then wonder - did a genesis universe likewise originate in this way?

Thus, algorithmic self-assembly is universal both for computation and for construction. These features establish an analogy to developmental programs in biological organisms: ‘‘Genomic information’’ contained in a seed specifies a complex growth process guided by information processing. (6054) Thus, in addition to their technical relevance, the ability to study seeded growth processes using programmable DNA systems may open up new approaches for studying fundamental natural phenomena. (6059)

Beinhocker, Eric. Evolution as Computation: Integrating Self-Organization with Generalized Darwinism. Journal of Institutional Economics. 7/3, 2011. An economics theorist at the McKinsey Global Institute, London, presses his view of the physical, biological, and societal universe in terms of, and as due to, iterative, algorithmic, informative processes. By so doing, the work continues the major project of his 2006 The Origin of Wealth toward a vital synthesis of “self-organization with a generalized Darwinism.” This waxing school perceives complex adaptive systems as a natural ‘software’ that drives or generates the amplifying scales of life’s regnant intricacy.

This paper argues that information theory, rooted in modern thermodynamics, offers the potential to integrate these two perspectives in a common and rigorous framework. Both evolution and self-organization can be generalized as computational processes that can be applied to human social phenomena. Under this view, evolution is a process of algorithmic search through a combinatorial design space, while self-organization is the result of non-zero sum gains from information aggregation. Evolution depends on the existence of self-organizing forces, and evolution acts on designs for self-organizing structures. (Abstract, 1)

By now it should be clear that there is much self-organizing going on under the evolution as computation perspective. The algorithm captures free energy to search enormous combinatorial spaces in search of fit designs, creating novelty through changes in design modules and recombinations of modules to discover and realize previously unrealized designs. In this way, order and structure are (non-monotonically) created. (21) To summarize, from the point of view of information theory and computation, it is almost impossible to talk about evolution without referring to self-organization, and vice versa. Evolution needs self-organization to bootstrap the process of evolutionary search and order creation, while self-organization leads to conditions where the logic of differentiation, selection and retention can take hold. (23)

Beltran, Lester and Suzette Geriente. Quantum Entanglement in Corpuses of Documents. arXiv:1810.12114. Brussels Free University, Interdisciplinary Studies Group researchers led by Diederik Aerts explore how recent clarifications and integrative expansions of quantum theory can reveal how such deep phenomena are actively present even in human literary writings. As if a library of cosmos (taking license), in addition to its fractal network complexities, our textual linguistic corpora are found to possess a physical affinity and generative source. And we note, by turns, that this extant cosmos becomes graced by a natural narrative (more license). See also Quantum-Theoretic Modeling in Computer Science by these authors and group at 1901.04299 for a later finesse. A parallel effort goes on in Bob Coecke’s Oxford University group, such as The Mathematics of Text Structure at 1904.03478.

Berges, Jurgen. Scaling Up Quantum Simulations. Nature. 569/339, 2019. A Heidelberg University physicist lauds the paper Self-Verifying Variational Quantum Simulation of Lattice Models by eleven University of Innsbruck researchers in the same issue (Kokail, 569/355), about a composite digital-analog computational method which can span and join quantum and classical phases. Once again this complementarity is found to work best.

It is difficult to carry out and verify digital quantum simulations that use many quantum bits. A hybrid device based on a digital classical computer and an analog quantum processor suggests a way forward.

Bernini, Andrea, et al. Process Calculi for Biological Processes. Natural Computing. 17/2, 2018. Some 17 years after the 2001 human genome sequence opened a new integrative phase, University of Siena, Sassari, and Pisa, Italy and Pontifical Xavierian University, Cali, Colombia biomathematicians propose that a deeper affinity between systems biology and computational operations exists and should be fostered. By turns, life naturally seems to compute and iterate her/his self into developmental and organismic evolution. An Algorithmic Systems Biology is then scoped out going forward. See also the Springer conference series Computational Methods in Systems Biology.

Systems biology is a research area devoted to developing computational frameworks for modeling biological systems in a holistic fashion. This paper surveys a specific computational approach to systems biology, based on the so-called process calculi, a formalism for describing concurrent systems. We start from a basic process calculus that is then extended with increasingly expressive features to better reflect the biological aspects of interest. We then compare the expressive power of the resulting calculi, mentioning if they are supported by software tools. From this comparison we derive some suggestions on the most suitable frameworks for dealing with specific cases of interest. (Abstract)

There is a substantial analogy between the above view of biological systems and the concurrent systems studied in computer science. These are made of a possibly huge number of independent processing units that perform concurrent computations, exchange information with each other via communications, have a discrete nature and are often endowed with stochastic features. This analogy makes process calculi, a formalism for specifying concurrent systems, a natural candidate for modeling biological entities. (346)

In computer science, the process calculi (or process algebras) are a diverse family of related approaches for formally modelling concurrent systems. Process calculi provide a tool for the high-level description of interactions, communications, and synchronizations between a collection of independent agents or processes. They also provide algebraic laws that allow process descriptions to be manipulated and analyzed, and permit formal reasoning about equivalences between processes. (Wikipedia)
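As a toy illustration of the idea (our own sketch, not the authors' calculi, which add communication, synchronization, and stochastic rates), the trace semantics of a parallel composition P | Q can be rendered as the set of order-preserving interleavings of the two processes' action sequences:

```python
def interleavings(p, q):
    """All order-preserving interleavings of two action sequences --
    a toy trace semantics for parallel composition P | Q."""
    if not p:
        return {tuple(q)}
    if not q:
        return {tuple(p)}
    return ({(p[0],) + t for t in interleavings(p[1:], q)} |
            {(q[0],) + t for t in interleavings(p, q[1:])})

# Two toy biological processes running concurrently.
gene = ("bind", "transcribe")
ribosome = ("translate",)
for trace in sorted(interleavings(gene, ribosome)):
    print(" -> ".join(trace))
```

Each printed trace is one possible global schedule of the concurrent system, the combinatorial space that process-algebraic laws let one reason about without exhaustive enumeration.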

Bonnici, Vincenzo and Vincenzo Manca. Informational Laws of Genome Structures. Nature Scientific Reports. 6/28840, 2016. University of Verona computational biologists exemplify the mid 2010s frontiers of an integral organic cosmos, written as complex systems from physical to genetic phases, which can be parsed by a linguistic analysis. It is said that a genome is essentially a text, a descriptive document amenable to information theory. For more examples, we cite papers in Mind Over Matter, and throughout, that speak in terms of grammars, dictionaries, encyclopedias, libraries, and so on. See also Recurrence Distance Distributions in Computational Genomics in the American Journal of Bioinformatics and Computational Biology (3/1, 2015), and Infogenomics Tools: A Computational Suite for Informational Analyses of Genomes in the Journal of Bioinformatics and Proteomics Review (1/1, 2015), by the authors.

We think that our informational indexes, and the laws relating them, confirm a very simple and general intuition. If life is information represented and elaborated by means of (organic) molecules, then the laws of information necessarily have to reveal the deep logic of genome structures. (5) Our investigation can be compared to the astronomical observations measuring positions and times in the orbits of celestial objects. Kepler’s laws arose from the regularities found in planetary motions, and from Kepler’s laws, the laws of mechanics emerged. This astronomical comparison, which was an inspiring analogy, revealed a surprising coincidence when ellipses were introduced in the representation of entropic and anti-entropic components. Kepler’s laws were explained by Newton’s dynamical and gravitational principles. Continuing our analogy, probably deeper informational principles are the ultimate reason for the laws that we found. (7)
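The authors' informational indexes are more elaborate (entropic and anti-entropic components over genome dictionaries), but the basic idea can be sketched with the Shannon entropy of a sequence's k-mer distribution; this mini-example is our own, not their code:

```python
import math
from collections import Counter

def kmer_entropy(genome, k):
    """Shannon entropy (in bits) of a sequence's k-mer distribution,
    one elementary 'informational index' of genome structure."""
    kmers = [genome[i:i + k] for i in range(len(genome) - k + 1)]
    counts = Counter(kmers)
    total = len(kmers)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A repetitive sequence carries less k-mer information than a mixed one.
for seq in ("ACACACACACAC", "ACGTTGCAAGCT"):
    print(seq, round(kmer_entropy(seq, 2), 3))
```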

Borriello, Enrico, et al. A Unified, Mechanistic Framework for Developmental and Evolutionary Change. arXiv:1809.02331. This frontier team of Arizona State University, Center for Biosocial Complex Systems polyscientists EB, Sara Walker, and Manfred Laubichler, with colleagues in other entries, continues to seek and express an implied mathematical source code for life’s oriented emergence. (If every organism has its own ontogenetic code, why should not the whole of phylogenetic evolution have a similar evonomic endowment?) Akin to concurrent groups (see Hyunju Kim), attempts are made to finesse new approaches and methods; here the emphasis is on gene regulatory networks, so as to build a fuller case.

The two most fundamental processes describing change in biology - development and evolution - occur over widely different timescales, making them difficult to reconcile within a single frame. Development involves a temporal sequence of cell states controlled by a hierarchy of regulatory structures. It occurs over the lifetime of a single individual, and is associated to the gene expression level change of a given genotype. Evolution, by contrast entails genotypic change through the acquisition or loss of genes, and the emergence of new, environmentally selected phenotypes over the lifetimes of many individuals.

Here we present a model of regulatory network evolution that accounts for both timescales. We extend Boolean models of gene regulatory networks from only describing development to evolutionary processes so as to identify the phenotypes of the cells as the relevant macrostates of the GRN. A phenotype may now correspond to multiple attractors, and its formal definition no longer requires a fixed size for the genotype. This opens a quantitative study of the phenotypic change of a genotype, which is itself changing over evolutionary timescales. We show how specific phenotypes can be controlled by gene duplication events, and how gene duplication events lead to new regulatory structures via selection. It is these structures that enable control of macroscale patterning, as in development. (Abstract edits)
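A Boolean model of a gene regulatory network, the kind the paper extends, can be sketched in a few lines. The three-gene wiring below is our own illustrative example, not the paper's network; under synchronous updating every initial expression pattern settles into an attractor, which such models identify with a cell phenotype.

```python
from itertools import product

def update(state):
    """Synchronous update of a toy three-gene Boolean network."""
    a, b, c = state
    return (b and not c,   # gene A: activated by B, repressed by C
            a,             # gene B: copies A
            a or b)        # gene C: activated by A or B

def attractor(state):
    """Iterate the dynamics until a repeating cycle (attractor) is reached."""
    seen = []
    while state not in seen:
        seen.append(state)
        state = update(state)
    return tuple(seen[seen.index(state):])

# Map every initial state to the attractor it falls into.
for s in product((False, True), repeat=3):
    print(s, "->", attractor(s))
```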

Bossard, Jeremy, et al. Evolving Random Fractal Cantor Superlattices for the Infrared Using a Genetic Algorithm. Journal of the Royal Society Interface. 13/114, 2016. As the technical quotes describe, Penn State electrical engineers achieve deep insights into a previously entangled, intractable nature. By novel finesses of fractal geometries, reliable self-similar patterns do indeed appear. Moreover, this observation results from the employ of a genetic algorithm, which can select an optimum solution in an evolutionary manner from a population of options. Once again in the 2010s, such arcane mysteries become at last amenable to humankind’s discernment. Nature is truly graced by an ordained form and function which we peoples are meant to read and carry forth.

Ordered and chaotic superlattices have been identified in Nature that give rise to a variety of colours reflected by the skin of various organisms. In particular, organisms such as silvery fish possess superlattices that reflect a broad range of light from the visible to the UV. Such superlattices have previously been identified as ‘chaotic’, but we propose that apparent ‘chaotic’ natural structures, which have been previously modelled as completely random structures, should have an underlying fractal geometry. Fractal geometry, often described as the geometry of Nature, can be used to mimic structures found in Nature, but deterministic fractals produce structures that are too ‘perfect’ to appear natural. Introducing variability into fractals produces structures that appear more natural. We suggest that the ‘chaotic’ (purely random) superlattices identified in Nature are more accurately modelled by multi-generator fractals. (Abstract)

In order to exploit the multi-generator Cantor bar fractal to generate superlattices with desired spectral properties, a GA was employed to evolve the superlattice structure. The GA is a robust stochastic optimizer that has been used to solve a variety of challenging electromagnetic design problems. The GA is a popular optimizer within the electromagnetics community, because it is simple to implement and capable of solving problems with many design parameters. The GA itself is inspired by Nature as it emulates the natural evolutionary process, so combining it with the multi-generator fractal model allows us to not only mimic the superlattice geometries identified in Nature, but also to evolve optimum designs as would happen in Nature. The operating principle of the GA comes from the Darwinian notion of natural selection, where a population of design candidates competes for survival at each iteration of the optimization process. (4)
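The GA loop the authors describe, a population of candidate designs competing under selection, crossover, and mutation, can be sketched generically. This is a minimal one-max example in the Holland tradition, not the superlattice-design code; all parameter values are illustrative.

```python
import random

def genetic_algorithm(fitness, length=12, pop_size=30,
                      generations=60, mutation_rate=0.02, seed=1):
    """A minimal genetic algorithm: tournament selection,
    one-point crossover, and bit-flip mutation over bit strings."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        nxt = []
        while len(nxt) < pop_size:
            # Tournament selection: fitter candidates reproduce.
            p1 = max(rng.sample(pop, 3), key=fitness)
            p2 = max(rng.sample(pop, 3), key=fitness)
            cut = rng.randrange(1, length)       # one-point crossover
            child = p1[:cut] + p2[cut:]
            # Bit-flip mutation with small per-bit probability.
            child = [bit ^ (rng.random() < mutation_rate) for bit in child]
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

# Toy objective ("one-max"): maximize the number of 1 bits.
best = genetic_algorithm(sum)
print(best, sum(best))
```

Swapping in a spectral-response objective for `sum` would turn the same loop toward design problems like the paper's superlattices.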
