Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: Independent, UniVersal, Complex Network Systems and a Genetic Code-Script Source

2. Biteracy: Natural Algorithmic Computation

Chu, Dominique, et al. Computation by Natural Systems. Interface Focus. 8/6, 2018. University of Kent, Sydney, and Kansas researchers including Mikhail Prokopenko introduce an issue with this title about the many ways that evolutionary, organic and cerebral life seems to be primarily engaged in mathematical processes. Some papers are Computational Modelling Unravels the Clockwork of Cyanobacteria, Haematopoietic Stem Cells, and Semantic Information, Autonomous Agency and Non-Equilibrium Statistical Physics (Artemy Kolchinsky and David Wolpert).

Computation is a useful concept far beyond the disciplinary boundaries of computer science. Perhaps the most important class of natural computers can be found in biological systems that perform computation on multiple levels. From molecular and cellular information processing networks to ecologies, economies and brains, life computes. Despite ubiquitous agreement on this fact going back as far as von Neumann automata and McCulloch–Pitts neural nets, we so far lack principles to understand rigorously how computation is done in living, or active, matter. What is the ultimate nature of natural computation that has evolved, and how can we use these principles to engineer intelligent technologies and biological tissues? (Abstract)

Churchill, Alexander, et al. Learning to Generate Genotypes with Neural Networks. arXiv: 1604.04153. With Siddharth Sigtia and Chrisantha Fernando, Queen Mary University of London theorists enter another way to understand how genomes and connectomes (neuromes) are basically similar in form and function.

Neural networks and evolutionary computation have a rich intertwined history. They most commonly appear together when an evolutionary algorithm optimises the parameters and topology of a neural network for reinforcement learning problems, or when a neural network is applied as a surrogate fitness function to aid the evolutionary optimisation of expensive fitness functions. In this paper we take a different approach, asking the question of whether a neural network can be used to provide a mutation distribution for an evolutionary algorithm, and what advantages this approach may offer? Two modern neural network models are investigated, a Denoising Autoencoder modified to produce stochastic outputs and the Neural Autoregressive Distribution Estimator. Results show that the neural network approach to learning genotypes is able to solve many difficult discrete problems, such as MaxSat and HIFF, and regularly outperforms other evolutionary techniques. (Abstract)
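One of the benchmark problems named in the abstract, HIFF (Hierarchical If-and-Only-If), can be sketched in a few lines of Python. This follows the standard recursive definition of the benchmark, not the authors' own code.

```python
def hiff(bits):
    """Hierarchical If-and-Only-If (HIFF) fitness.

    A block earns its length as score when all of its bits agree,
    and the scoring recurses over both halves, so credit accrues at
    every scale of the hierarchy. Length should be a power of two.
    """
    n = len(bits)
    if n == 1:
        return 1
    score = n if len(set(bits)) == 1 else 0
    half = n // 2
    return score + hiff(bits[:half]) + hiff(bits[half:])

print(hiff([0] * 8))      # 32: a global optimum (all zeros or all ones)
print(hiff([0, 1] * 4))   # 8: no block above the single bits agrees
```

Because agreement must build up level by level, HIFF rewards algorithms that assemble partial solutions hierarchically, which is why it is a hard case for simple hill climbers.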

Condon, Anne, et al. Will Biologists Become Computer Scientists? EMBO Reports. 19/9, 2018. A report from the I2CELL: From Information to Cells conference held in Oxford, UK in February 2018, funded by the Fourmentin-Guilbert Scientific Foundation. At the outset, it was noted that Francois Jacob, Jacques Monod and Alan Turing had earlier seen an affinity between cellular information processing and algorithmic computations. The international meeting went forth as an endeavor to clarify, fulfill and benefit from this actual essence. Among presenters at the forefront of this bioinformation revolution were Gilles Dowek, Antoine Danchin, Agoston Eiben, Helene Kirchner, Susan Stepney, Jordan Pollack, Tsvi Tlusty, and Albert Libchaber. See also concurrent entries A Unified, Mechanistic Framework for Developmental and Evolutionary Change (Borriello); Inform: Efficient Information-Theoretic Analysis of Collective Behaviors (D. Moore); Information and Noise: Chemistry, Biology and Evolution Creating Complex Systems (Beilstein); and Schrodinger at 75 – The Future of Biology. But the working premise of cells as “computational machines” remains flawed. The upside would be to realize that the biomathematical source which is abstractly described is actually a natural, universe to human, genetic code.

Currin, Andrew, et al. Computing Exponentially Faster: Implementing a Non-Deterministic Universal Turing Machine using DNA. Journal of the Royal Society Interface. Online March, 2017. A team of University of Manchester scientists across chemical, biological, informatics, and computer fields contribute to this fertile convergence of algorithmic computation with nucleotide structures. As entries here, Genomes and Languages, Mind Over Matter, and elsewhere are finding, DNA and RNA biomolecules seem to possess a wide array of intrinsic capabilities beyond their genetic phase.

D'Ariano, Giacomo Mauro and Paolo Perinotti. Quantum Cellular Automata and Free Quantum Field Theory. Frontiers of Physics. 12/1, 2017. We cite this paper by University of Pavia, QUIT group, physical information theorists as another instance, along with Gerard ‘t Hooft and others, of a 2010s shift to better view this deepest phenomenal realm by the principles of complex, computational dynamics.

De Palo, Giovanna, et al. A Critical-Like Collective State Leads to Long-Range Cell Communication. PLoS Biology. April, 2017. At a critical value of the cell-cell coupling strength, correlations may establish among cells that span the whole cell collective independent of its size, leading to a maximally connected collective similar to neurons in the brain. Imperial College London system biophysicists including Robert Endres (search) find that, just as in cerebral architectures, a beneficial criticality likewise serves cellular viability. As the 2010s unfold toward 2020, we find an increasing notice of a robust, repetitive commonality well in place from microbes to minds.

The transition from single-cell to multicellular behavior is important in early development but rarely studied. The starvation-induced aggregation of the social amoeba Dictyostelium discoideum into a multicellular slug is known to result from single-cell chemotaxis towards emitted pulses of cyclic adenosine monophosphate (cAMP). Here, we developed a multiscale model verified by quantitative microscopy to describe behaviors ranging widely from chemotaxis and excitability of individual cells to aggregation of thousands of cells. To better understand the mechanism of long-range cell-cell communication and hence aggregation, we analyzed cell-cell correlations, showing evidence of self-organization at the onset of aggregation. Surprisingly, cell collectives, despite their finite size, show features of criticality known from phase transitions in physical systems. By comparing wild-type and mutant cells with impaired aggregation, we found the longest cell-cell communication distance in wild-type cells, suggesting that criticality provides an adaptive advantage and optimally sized aggregates for the dispersal of spores. (Abstract)

Denning, Peter. Computational Thinking in Science. American Scientist. January, 2017. The veteran computer philosopher provides a cogent history and tutorial from Alan Turing, Kenneth Wilson, John Holland, and others, of perceptions and methods to this day that cosmic to cultural evolution proceeds by a proliferation of candidates which interact within environments so as to stress test, winnow, and select. The optimizing process then repeats and iterates in search of an optimal entity or solution. In regard, genetic algorithms are explained in a clever diagram. If one might now extend to bioplanets, out of (suffering) creaturely, and human multitudes, many called but few chosen writ large, what feature or quality seems trying to emerge?
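The proliferate, winnow, select, and repeat cycle Denning recounts can be sketched as a minimal evolutionary loop. The OneMax toy task and every parameter below are illustrative assumptions, not taken from the article.

```python
import random

def evolve(fitness, length=20, generations=200, size=30, seed=1):
    """Proliferate, winnow, select, iterate: the cycle as a bare loop."""
    rng = random.Random(seed)
    population = [[rng.randint(0, 1) for _ in range(length)]
                  for _ in range(size)]
    for _ in range(generations):
        # proliferate: every current candidate spawns a mutated offspring
        offspring = [[b ^ (rng.random() < 1.0 / length) for b in parent]
                     for parent in population]
        # winnow and select: only the fittest half survives to iterate again
        population = sorted(population + offspring,
                            key=fitness, reverse=True)[:size]
    return max(population, key=fitness)

best = evolve(sum)   # OneMax: fitness is simply the number of 1 bits
print(sum(best))     # typically reaches the optimum, 20
```

The environment here is just the fitness function; swapping in a harder landscape changes nothing about the loop itself, which is the point of Denning's generalization.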

Domingos, Pedro. The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World. New York: Basic Books, 2015. A University of Washington computer scientist writes an insider’s account of the current intense effort to achieve better ways to retrieve, recognize, store, process, and avail the vast flow of information generated by the prolific Internet. As the quote lists, five main approaches are cited, which generally take on a cerebral cast as if trying to access a global brain. Each method is explained in a chapter, so the book is a good entry to the multifaceted endeavor. The final section cites a working synthesis available at alchemy.cs.washington.edu, with applications such as cancer research. The whole project is viewed as a cumulative knowledge-gaining venture, as are the deep neural nets also entered herein. Many references in this section offer further entries, whence general surmises might be gleaned. With referrals to evolution as nature’s learning algorithm, life’s long developmental course to our phenomenal sapience appears, in retrospect, as an oriented education toward its own self-discovery.

Symbolists view learning as the inverse of deduction and take ideas from philosophy, psychology, and logic. Connectionists reverse engineer the brain and are inspired by neuroscience and physics. Evolutionaries draw on genetics and evolutionary biology. Bayesians believe learning is a form of probabilistic inference and have their roots in statistics. Analogizers learn by extrapolating from similarity judgments and are influenced by psychology and mathematical optimization. (xvii) Each… has its own master algorithm. The symbolists’ is inverse deduction, the connectionists’ is backpropagation (iterative neural nets), the evolutionaries’ is genetic programming, the Bayesians’ is inference, and the analogizers’ is the support vector machine (pattern recognition). (xvii)

Here, then, is the central hypothesis of this book: All knowledge – past, present, and future – can be derived from data by a single, universal learning algorithm. (25)

Dowek, Gilles. Computation, Proof, Machine: Mathematics Enters a New Age. Cambridge: Cambridge University Press, 2015. This 2007 winner of the Grand Prix de Philosophie de l'Académie Française, by a mathematician at the French Institute for Research in Computer Science and Automation (INRIA), is here published in English as a 21st century natural philosophy. From this vantage, the text first traces “two thousand years of computation” back to Greek sages. The early 20th century turn from axioms to algorithms, much due to Alan Turing (1912-1954) and Alonzo Church (1903-1995), led to an emphasis on computational programs. This reformulation is lately working its way through the sciences from physics to linguistics in the guise of a mathematical cosmos as, for example, Max Tegmark (2014) advocates. Of course it is noted that Galileo has a 17th century priority with his “great book of nature” written in this language.

This prompts us to distinguish between two variants of Church’s thesis: its psychological form and its physical form. According to the psychological form of Church’s thesis, all the algorithms that a human being is capable of executing in order to solve a specific problem can be expressed by a set of computation rules. The physical form of Church’s thesis, on the other hand, states that all the algorithms that a physical system – a machine – is capable of executing systematically in order to solve a specific problem can be expressed by a set of computation rules. (56)

However, if the physical form of Church’s thesis is true – in other words, if all the algorithms that a physical system is capable of computing really can be expressed by a set of computation rules – then whatever nature can compute, the human being can compute, too. This is tantamount to saying that the human being is among the best computers in nature: there is nothing in the whole natural order, animal or machine, that can compute better than a human being. This thesis, which we will call the “computational completeness of human beings,” can be seen, in a way, as the converse of materialism. (56-57)

Dyson, George. Analogia: The Emergence of Technology beyond Programmable Control. New York: Farrar, Straus and Giroux, 2020. The polycultural sage, at home in both tradition and science, writes another unique, insightful contribution. Its timely theme courses from Gottfried Leibniz to 1950s mathematical computations at the Institute for Advanced Study in Princeton, where his father, Freeman, was posted, and on to the current algorithmic Internet. By this vista, the innate presence of two discrete digital and integral analogue modes can be well discerned. So to say, this natural realm which our human intellect has long tried to comprehend is now found to be graced by such particle/wave, node/link, apart/together, serial/spatial, me/We archetypes. By this view, one could infer, as the Civilization section does, that indigenous peoples abided in an analog milieu.

On a personal note, in 2005 I was paired with Freeman Dyson as a speaker (slides on home page) so as to attempt a 21st century “natural philosophia,” which he says is vital to recover today (search FD). For this review, while George D. closes with a concern that a worldwide AI analogic phase bodes to take over, his 2020 witness of a universal reality which is deeply distinguished by an interactivity of these informational states could advise a salutary resolve by way of their complementary unity. (As I write this in late August, our American land is being devastated by their polar opposition.)

George Dyson is an independent historian of technology whose works have included the Aleut kayak (Baidarka, 1986), the evolution of artificial intelligence (Darwin Among the Machines, 1997) and the transition from numbers that mean things to numbers that do things (Turing’s Cathedral, 2012).

In 1716, the philosopher and mathematician Gottfried Wilhelm Leibniz spent eight days with Peter the Great in Saxony, seeking to initiate a digitally computed takeover of the world. In his Darwin Among the Machines (1997) and Turing’s Cathedral (2012), Dyson chronicled the mid 20th century realization of Leibniz’s dream by a series of iconoclasts who brought his ideas to life. In his pathbreaking new book, Analogia, he chronicles the people who fought for the other side (the Native American leader Geronimo and physicist Leo Szilard among them) by vignettes that will change our view not only of the past but also of the future. The convergence of this historical archaeology with Dyson’s personal story, in Princeton frontier physics and computer science and the rainforest of the Northwest Coast, leads to a prophetic vision of an analog revolution already under way.

Nature uses digital coding, embodied in strings of DNA, for the storage, replication, and modification of instructions conveyed from one generation to the next, but relies on analog coding and computing, embodied in brains and nervous systems, for real-time intelligence and control. Coded sequences of nucleotides store the instructions to grow a brain, but the brain itself does not operate like a digital computer by storing and processing digital code. (6) In a digital computer, one thing happens at a time. In an analog computer everything happens at once. Brains process three-dimensional maps continuously, instead of one-dimensional algorithms step by step. Information is pulse-frequency coded in the topology of what connects where, not digitally coded by precise sequences of logical events. (6)

There is a corollary to the continuum hypothesis concerning computation among living and non living things. Computers, like Cantor’s infinities, can be divided into two kinds. Digital computers are finite but unbounded discrete-state machines whose possible states can be mapped in one-to-one correspondence to the integers. Analog computers, lacking discrete states that can be mapped directly to the integers, belong instead to some subset of the continuum, with every such subset having the power of the whole. Digital computers deal with integers, binary sequences, deterministic logic and time that is idealized into discrete increments. Analog computers deal with real numbers, nondeterministic logic, and continuous functions, including time. (247-248)

Eiben, Agoston and Jim Smith. From Evolutionary Computation to the Evolution of Things. Nature. 521/476, 2015. We cite this entry by VU University Amsterdam and University of the West of England professors of “interactive artificial intelligence” as another expression that Earth life’s developmental course to our ascendant comprehension might be quantified in appearance as if a mathematical program was at work.

Evolution has provided a source of inspiration for algorithm designers since the birth of computers. The resulting field, evolutionary computation, has been successful in solving engineering tasks ranging in outlook from the molecular to the astronomical. Today, the field is entering a new phase as evolutionary algorithms that take place in hardware are developed, opening up new avenues towards autonomous machines that can adapt to their environment. We discuss how evolutionary computation compares with natural evolution and what its benefits are relative to other computing approaches, and we introduce the emerging area of artificial evolution in physical systems. (Abstract)

Analogous to natural evolution, an evolutionary algorithm can be thought of as working on two levels. At the higher level (the original problem context), phenotypes (candidate solutions) have their fitness measured. Selection mechanisms then use this measure to choose a pool of parents for each generation, and decide which parents and offspring go forward to the next generation. At the lower level, genotypes are objects that represent phenotypes in a form that can be manipulated to produce variations (Box 1). Genotype–phenotype mapping bridges the two levels. (476)
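The two levels in this excerpt can be sketched as a toy program, assuming an invented objective (maximize -(x - 3)^2 over [0, 4]) chosen only for illustration: fitness is measured on decoded phenotypes at the higher level, while crossover and mutation manipulate the bitstring genotypes at the lower level.

```python
import random

def decode(genotype):
    """Genotype -> phenotype mapping: a 16-bit string becomes a real in [0, 4]."""
    return 4.0 * int("".join(map(str, genotype)), 2) / (2 ** len(genotype) - 1)

def fitness(genotype):
    x = decode(genotype)        # fitness is measured on the phenotype
    return -(x - 3.0) ** 2      # toy objective: the best phenotype is x = 3

rng = random.Random(0)
population = [[rng.randint(0, 1) for _ in range(16)] for _ in range(40)]

def pick():
    # higher level: tournament selection chooses parents by phenotype fitness
    return max(rng.sample(population, 3), key=fitness)

for _ in range(100):
    # lower level: crossover and mutation manipulate the genotypes
    nxt = []
    for _ in range(len(population)):
        a, b = pick(), pick()
        cut = rng.randrange(1, 16)
        child = [g ^ (rng.random() < 0.02) for g in a[:cut] + b[cut:]]
        nxt.append(child)
    population = nxt

best = max(population, key=fitness)
print(round(decode(best), 2))   # typically converges near the optimum, 3.0
```

The `decode` function is the genotype-phenotype bridge the excerpt describes: selection never sees the bits directly, and variation never sees the fitness.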

Erwig, Martin. Once Upon an Algorithm: How Stories Explain Computing. Cambridge: MIT Press, 2017. An Oregon State University professor of computer science draws an extended analogy between familiar stories and songs as an effective way to convey an array of algorithmic principles. For example, Hansel and Gretel and Sherlock Holmes can illustrate problem solving, representation and data structures, while Over the Rainbow and Harry Potter express language and meaning, control loops, recursion and abstraction. A copious glossary for each chapter adds pertinent definitions. But for this website, another inference surely comes to mind. If a cross-comparison between literary narratives and computational practice can indeed be parsed, it could well imply that nature’s animate processes are truly textual in kind, a cosmic script and score made and meant for we peoples to decipher, read and write a new story and score.

In Once Upon an Algorithm, Martin Erwig explains computation as something that takes place beyond electronic computers, and computer science as the study of systematic problem solving. He points out that many daily activities involve problem solving. In computer science, such a routine is called an algorithm. Here Erwig deftly illustrates concepts in computing with examples from familiar stories. Hansel and Gretel, for example, execute an algorithm to get home from the forest. Sherlock Holmes handles data structures when solving a crime; and the magic in Harry Potter's world is understood through types and abstraction. He also discusses representations and ways to organize data; “intractable” problems; language, syntax, and ambiguity; control structures, loops, and the halting problem; different forms of recursion; and more.

Since recursion is a general control structure and a mechanism for organizing data, it is part of many software systems. In addition, there are several direct applications of recursion. The feedback loop is a recursive description of the repetitious effect. Fractals are self-similar geometric patterns that can be described through recursive equations. Fractals can be found in nature, for example, in snowflakes and crystals, and are also used in analyzing protein and DNA structures. (9)
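The recursive self-similarity of fractals noted above can be made concrete with a short sketch; the Koch curve, the classic snowflake fractal, is a standard example assumed here rather than one drawn from Erwig's text.

```python
def koch_length(depth, segment=1.0):
    """Length of a Koch curve after `depth` recursive refinements.

    Each segment is replaced by four segments a third as long, so the
    recursive call mirrors the self-similar pattern it describes and
    the total length grows by a factor of 4/3 at every level.
    """
    if depth == 0:
        return segment
    return 4 * koch_length(depth - 1, segment / 3)

print(koch_length(0))   # 1.0
print(koch_length(3))   # (4/3)**3 = 64/27, about 2.37
```

Because the length grows without bound as depth increases while the curve stays within a fixed region, the same recursion also demonstrates why fractal boundaries defy ordinary measurement.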
