Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: Independent, UniVersal, Complex Network Systems and a Genetic Code-Script Source

2. Biteracy: Natural Algorithmic Computation

Newman, Stuart. Form, function, mind: what doesn't compute (and what might). arXiv:2310.13910. This latest paper (search) by the New York Medical College, Valhalla, NY biophilosopher has several notable aspects. It reviews his endeavors to discern further structural and physical influences which are involved in life’s cellular development. In 2023, he worries over shifting views of just how computational biological phenomena actually are, in confluence with genetic sources. But however this vital domain sorts out, our new ecosmos seems quite graced by some manner of procreative program.

The applicability of computational and dynamical systems models to organisms is scrutinized, using examples from developmental biology and cognition. Organic morphogenesis is dependent on the inherent material properties of tissues, a non-computational modality, but cell differentiation, which utilizes chromatin-based revisable memory banks and program-like function-calling, has a quasi-computational basis. Multi-attractor dynamical models are argued to be misapplied to global properties of development. Proposals are made for treating brains and other nervous tissues as novel forms of excitable matter with inherent properties which enable the intensification of cell-based basal cognition capabilities present throughout the tree of life. (Excerpt)

The foregoing text introduces this essay on the bases of biological form and function and their relation to computation. My main claim is that key aspects of living organisms, their morphogenesis (generation of form), and aspects of sensory and cognitive capacities, are expressions of the inherencies of the materials of which they are composed. Other features, including the differentiation of specialized cells and retrieval of stored patterns in the brain, utilize quasi-computational storage, data addressing and function calling processes. (3-4)

Nichol, Daniel, et al. Model Genotype-Phenotype Mappings and the Algorithmic Structure of Evolution. Journal of the Royal Society Interface. 16/20190332, 2019. Oxford University and H. Lee Moffitt Cancer Center, FL computer scientists and mathematical oncologists including Peter Jeavons describe an advanced complex systems biology method which joins cellular components into a dynamic synthesis from genes to metabolism. A novel program-like factor is then brought into play to better quantify and express metastatic invasions. In reflection, from our late vantage, Earth life’s contingent evolution seems yet to reach a global cumulative knowledge which could be fed back to contain, heal and prevent. We are given to perceive some manner of palliative, self-medicating procreation, which seems meant to pass on to our intentional continuance. What an incredible scenario is being revealed to us.

Cancers are complex dynamic systems that undergo evolution and selection. Personalized medicine in the clinic increasingly relies on predictions of tumour response to one or more therapies, which are complicated by the phenotypic evolution of the tumour. The emergence of resistant phenotypes is not predicted from genomic data, since the relationship between genotypes and phenotypes, termed genotype–phenotype (GP) mapping, is neither injective nor functional. We review mapping models within a generalized evolutionary framework that relates genotype, phenotype, environment and fitness. The GP-mapping provides a pathway for understanding the potential routes of evolution taken by cancers, which will be necessary knowledge for improving personalized therapies. (Abstract excerpt)
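
To make the abstract's point concrete, here is a toy Python sketch of a mapping that is neither injective nor functional on genotype alone; the genotype labels, environments and phenotypes are invented for illustration and are not taken from the paper.

# Toy genotype-phenotype map, only to illustrate the two properties named
# in the abstract; the genotypes, phenotypes and environments are invented.
def phenotype(genotype: str, environment: str) -> str:
    """Map a genotype to a drug-response phenotype in a given environment."""
    # Not injective: two distinct genotypes yield the same phenotype.
    if genotype in ("g1", "g2") and environment == "no_drug":
        return "proliferative"
    # Not functional on genotype alone: g1's phenotype depends on the
    # environment, so genotype -> phenotype is ill-defined without context.
    if genotype == "g1" and environment == "drug":
        return "sensitive"
    if genotype == "g2" and environment == "drug":
        return "resistant"
    return "quiescent"

assert phenotype("g1", "no_drug") == phenotype("g2", "no_drug")   # many-to-one
assert phenotype("g1", "drug") != phenotype("g1", "no_drug")      # context-dependent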

Olhede, Sofia and P. J. Wolfe. The Growing Ubiquity of Algorithms in Society. Philosophical Transactions of the Royal Society A. Vol.376/Iss.2128, 2018. University College London and Purdue University computer scientists introduce an issue of papers from a discussion meeting about this invisible but major role that computational methods are taking on in 21st century societies. Topics such as the governance of data, transparency of algorithms, legal and ethical frameworks for automated decision-making and public impacts of hidden programs are considered for both pro and con aspects.

Palazzi, Maria, et al. Online Division of Labour: Emergent Structures in Open Source Software. Nature Scientific Reports. 9/13890, 2019. Five Internet Interdisciplinary Institute (IN3), Universitat Oberta de Catalunya computer scientists cleverly apply complex system phenomena to the multi-player process of software code development. By so doing they find this artificial field to exhibit the same features of self-organization, diverse task allotment, and a scalar, nested structure that grace every other natural and social realm. Once again, another life-like domain with this common vitality is revealed, which altogether strongly implies the presence of a natural, seed-like, generative program.

The development of Open Source Software depends on the participation and commitment of volunteer developers to progress on a particular task. Several strategies are in effect, but little is known about how these diverse groupings self-organise to work together: any number of contributors can join in a decentralised, distributed, and asynchronous manner. It is helpful then to see that some efficient hierarchy and division of labour must be in place within human biological and cognitive limits. We analyze popular GitHub open source projects with regard to three key properties: nestedness, modularity and in-block nestedness. These typify the emergence of heterogeneities among contributors, subgroups working on specific files, and the whole activity. From a complex network perspective, our conclusions create a link between bio-cognitive constraints, group formation and online working environments. (Abstract)
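
As a minimal sketch of this kind of analysis, and not the authors' actual pipeline, one can build a bipartite contributor-file network and score its community structure with networkx; the edit events below are invented, and nestedness or in-block nestedness would require dedicated measures (such as NODF) beyond this snippet.

# Toy contributor-file network with a modularity score via networkx.
import networkx as nx
from networkx.algorithms import bipartite, community

# (contributor, file) pairs, e.g. parsed from a project's commit history.
events = [("alice", "core.py"), ("alice", "utils.py"), ("bob", "core.py"),
          ("bob", "docs.md"), ("carol", "docs.md"), ("carol", "readme.md")]

B = nx.Graph()
contributors = {c for c, _ in events}
B.add_nodes_from(contributors, bipartite=0)
B.add_nodes_from({f for _, f in events}, bipartite=1)
B.add_edges_from(events)

# Project onto contributors: an edge links two people who touched the same file.
G = bipartite.weighted_projected_graph(B, contributors)

# Detect subgroups working on specific files and score the division of labour.
groups = community.greedy_modularity_communities(G, weight="weight")
Q = community.modularity(G, groups, weight="weight")
print(f"{len(groups)} contributor subgroups, modularity Q = {Q:.3f}")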

Papadimitriou, Christos. Algorithms, Complexity, and the Sciences. Proceedings of the National Academy of Sciences. 111/15881, 2014. The Simons Institute for the Theory of Computing, UC Berkeley researcher is a leading spokesperson for perceptions that nature is actually suffused by iterative programs from which physical matter and life’s quickening evolution result. By this general entry, one could discern two broad phenomenal realms: a mathematical program-like source and a manifest complex reality. We then wonder: might they be imagined as genotype and phenotype?

Raffaele, Giancarlo, et al. DNA Combinatorial Messages and Epigenomics. Theoretical Computer Science. Online July, 2018. With regard to chromatin organization and nucleosome occupancy in eukaryotic genomes, University of Palermo and IBM Watson Research computational biologists come to this on-going project of parsing nature’s cosmos-to-children genetic program by way of this algorithmic (algorithome) realm. They emphasize how a general computational approach is especially apt going forward. All told, our human phenomenon may be the way that a self-deciphering, sequencing, genesis uniVerse learns to read, realize and continue forth its own code.

Epigenomics is the study of modifications on the genetic material of a cell that do not depend on changes in the DNA sequence, since these modifications instead involve the specific proteins around which DNA wraps. The result is that epigenomic changes have a fundamental role in the proper working of each cell in eukaryotic organisms. A particularly important aspect is the study of chromatin, a fiber composed of a DNA-protein complex. In more than thirty years of research in this area, Mathematics and Theoretical Computer Science have gained a prominent role, in terms of modeling and mining. Starting from some very basic notions of Biology, we illustrate recent advances on the organization and dynamics of chromatin. Then, we review contributions by Combinatorial and Informational Methodologies to the understanding of mechanisms determining the 10 nm fiber. (Abstract excerpt)

Richardson, Alex, et al. Learning spatio-temporal patterns with Neural Cellular Automata. arXiv:2310.14809. University of Edinburgh biophysicists including Richard Blythe present the latest program iteration for more effective pattern recognition by way of these mathematical computations. The entry could as well be in Natural Algorithms or Earthifical Intelligence. And as one logs on, it may seem that we at work are forming an actual global brain facility.

Neural Cellular Automata (NCA) are a powerful combination of machine learning and mechanistic modelling. We train NCA to learn complex dynamics from time series of images and Partial Differential Equations (PDE) trajectories. Our method is designed to identify underlying local rules that govern large scale emergent behaviours. We extend NCA to view structures within the same system, as well as learning rules for Turing pattern formation in nonlinear PDEs. (Excerpt)
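
As a minimal sketch of the neural cellular automaton idea, assuming a PyTorch setting: each cell carries a small state vector, perceives its neighborhood through fixed convolution filters, and updates itself with one shared learned rule. The channel count, filters and layer sizes below are illustrative choices, not the architecture of Richardson et al.

import torch
import torch.nn as nn
import torch.nn.functional as F

class MinimalNCA(nn.Module):
    """Toy neural cellular automaton: a fixed local perception step followed
    by a small update network applied identically at every grid cell."""

    def __init__(self, channels: int = 16, hidden: int = 64):
        super().__init__()
        self.channels = channels
        # Fixed perception filters: identity plus x/y Sobel gradients.
        ident = torch.tensor([[0., 0., 0.], [0., 1., 0.], [0., 0., 0.]])
        sobel_x = torch.tensor([[-1., 0., 1.], [-2., 0., 2.], [-1., 0., 1.]]) / 8.0
        kernels = torch.stack([ident, sobel_x, sobel_x.t()])        # (3, 3, 3)
        self.register_buffer("kernels", kernels.repeat(channels, 1, 1).unsqueeze(1))
        # Learned per-cell update rule, shared by every cell on the grid.
        self.update = nn.Sequential(
            nn.Conv2d(channels * 3, hidden, 1), nn.ReLU(),
            nn.Conv2d(hidden, channels, 1),
        )

    def forward(self, state: torch.Tensor) -> torch.Tensor:
        # state: (batch, channels, height, width)
        perceived = F.conv2d(state, self.kernels, padding=1, groups=self.channels)
        return state + self.update(perceived)   # residual local update

# One update step on a random grid; training would fit these local rules to
# reproduce an observed time series of images or PDE snapshots.
nca = MinimalNCA()
grid = torch.randn(1, 16, 32, 32)
next_grid = nca(grid)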

Roberts, Siobhan. The Lasting Lessons of John Conway’s Game of Life. New York Times. December 28, 2020. A science journalist and author of the 2015 biography Genius at Play about the Princeton University mathematician reviews his prime computational innovation along with its on-going implications. Sadly, John Conway passed away at age 82 in April 2020 from COVID-19. While at Cambridge University in 1970 he sent several puzzles to Martin Gardner, who then wrote the Mathematical Games column for Scientific American. His Game of Life entry appeared in the October issue, and has grown in popularity ever since.

The first quote cites the simple automata rule sequence. By running these rules, one can generate all manner of complex life forms as they self-organize into individualities, divisions of labor, groupings, and more. For a wider survey, Roberts asked computer scholars such as Susan Stepney, Stephen Wolfram, Daniel Dennett and Rudy Rucker for their thoughts; we include other comments below. But an overall interpretation seems to pervade that such natural programs are ultimately unpredictable. In this mindset, a reliable, thematic guidance is not seen to be written in, nor thought to exist.

The game of life was simple: Place any configuration of cells on a grid, then watch what transpires according to three rules that dictate how the system plays out. Birth rule: An empty, or “dead,” cell with precisely three “live” neighbors (full cells) becomes live. Death rule: A live cell with zero or one neighbors dies of isolation; a live cell with four or more neighbors dies of overcrowding. Survival rule: A live cell with two or three neighbors remains alive. With each iteration, some cells live, some die and “Life-forms” evolve, one generation to the next. (S. Roberts)
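
As a minimal sketch of the three rules just quoted, one generation of the Game of Life takes only a few lines of Python; the wrap-around (toroidal) board and the glider seed are conveniences of this sketch, not part of the rules.

import numpy as np

def life_step(grid: np.ndarray) -> np.ndarray:
    """Apply one generation of Conway's Game of Life to a 0/1 grid.
    Neighbor counting wraps around the edges (a toroidal board)."""
    # Count the eight neighbors of every cell by summing shifted copies.
    neighbors = sum(
        np.roll(np.roll(grid, dy, axis=0), dx, axis=1)
        for dy in (-1, 0, 1) for dx in (-1, 0, 1)
        if (dy, dx) != (0, 0)
    )
    # Birth rule: an empty cell with exactly three live neighbors becomes live.
    born = (grid == 0) & (neighbors == 3)
    # Survival rule: a live cell with two or three neighbors stays alive;
    # every other live cell dies of isolation or overcrowding (death rule).
    survives = (grid == 1) & ((neighbors == 2) | (neighbors == 3))
    return (born | survives).astype(grid.dtype)

# A "glider", one of the simplest emergent Life-forms, walking across the board.
board = np.zeros((10, 10), dtype=int)
board[1:4, 1:4] = [[0, 1, 0], [0, 0, 1], [1, 1, 1]]
for _ in range(4):
    board = life_step(board)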

Complexity arises from simplicity! We are used to the idea that anything complex must arise out of something more complex. But the Game of Life shows us that complex virtual “organisms” arise out of the interaction of a few simple rules. (Brian Eno)

In this moment in time, it’s important to emphasize that inherent unpredictability is a feature of life in the real world as well as in the Game of Life. We have to figure out ways to flourish in spite of the vicarious uncertainty we constantly live with. (Melanie Mitchell)

The name Conway chose — the Game of Life — frames his invention as a metaphor. But I’m not sure that he anticipated how relevant Life would become, and that in 50 years we’d all be playing an emergent game of life and death. (William Poundstone)

In normal times, we can keep building stuff one component upon another, but in harder times like this pandemic, we need something that is more resilient and can prepare for the unpreparable. That would need changes in our “rules of life,” which we take for granted. (Bert Chan – see his Lenia and Expanded Universe paper above for extensive displays of this cellular method.)

Rondelez, Yannick and Damien Woods, eds. DNA Computing and Molecular Programming. Switzerland: Springer, 2016. The Lecture Notes in Computer Science 9818 proceedings of the 22nd International Conference on this title subject, held in Munich in September 2016, which is summed up as: Research in DNA computing and molecular programming draws together mathematics, computer science, physics, chemistry, biology, and nanotechnology to address the analysis, design, and synthesis of information-based molecular systems. For example, some entries are Hierarchical Self-Assembly of Fractals, A Scheme for Molecular Computation, and Chemical Reaction Network Implementations. Search also David Doty in Systems Chemistry for more.

Rozenberg, Grzegorz, ed. Handbook of Natural Computing. Berlin: Springer, 2014. The editor-in-chief is a Polish computer scientist who has inspired this theoretical movement for decades. An Introduction he wrote is available on the Springer site, quoted below. The main topical areas covered are Cellular Automata, Neural, Evolutionary, Molecular, Quantum Computation, and Broader Perspectives.

Natural Computing is the field of research that investigates human-designed computing inspired by nature as well as computing taking place in nature, that is, it investigates models and computational techniques inspired by nature, and also it investigates, in terms of information processing, phenomena taking place in nature. Examples of the first strand include neural computation inspired by the function of the brain, evolutionary computation inspired by Darwinian evolution, cellular automata inspired by intercellular communications, swarm intelligence inspired by the behavior of groups of organisms, artificial immune systems, membrane computing, and amorphous computing inspired by morphogenesis. The second strand is represented by investigations into, among others, the computational nature of self-assembly, the computational nature of developmental processes, the computational nature of biochemical reactions, bacterial communication, brain processes, and the systems biology approach to bionetworks where cellular processes are treated in terms of communication and interaction.

Salcedo-Sanz, Sancho. Modern Meta-Heuristics Based on Nonlinear Physics Processes. Physics Reports. Vol. 655, 2016. A technical contribution by a Universidad de Alcala, Madrid computer scientist which gives the project of distilling natural search and optimize algorithms a deeper physical basis. While previous efforts draw from biology and evolution, see Xin-She Yang herein, this study shows how the fundamental self-organized complexities can be an equally valuable source. For example, Boltzmann distribution, fractal structure generation, vortex search, coral reefs (abstract below), simulated annealing, Big Bang-Big Crunch (see abstract), galactic swarm, electromagnetic algorithms, and more are described. These theories imply a Darwinian-like cosmos, a metaheuristic universe, which spawns prolific varieties so that good-enough candidates can be selected. One then wonders, does it extend to myriad orbital planets such as our own, who need to choose and affirm by their own volition?

Meta-heuristic algorithms are problem-solving methods which try to find good-enough solutions to very hard optimization problems, at a reasonable computation time, where classical approaches fail or cannot even be applied. Many existing meta-heuristic approaches are nature-inspired techniques, which work by simulating or modeling different natural processes in a computer. Historically, many of the most successful meta-heuristic approaches have had a biological inspiration, such as evolutionary computation or swarm intelligence paradigms, but in the last few years new approaches based on nonlinear physics processes modeling have been proposed and applied with success. Non-linear physics processes, modeled as optimization algorithms, are able to produce completely new search procedures, with extremely effective exploration capabilities in many cases, which are able to outperform existing optimization approaches. In this paper we review the most important optimization algorithms based on nonlinear physics, how they have been constructed from specific modeling of real phenomena, and also their novelty in terms of comparison with alternative existing algorithms for optimization. (Abstract excerpt)

In computer science and mathematical optimization, a metaheuristic is a higher-level procedure or heuristic designed to find, generate, or select a heuristic (partial search algorithm) that may provide a sufficiently good solution to an optimization problem, especially with incomplete or imperfect information or limited computation capacity. Metaheuristics sample a set of solutions which is too large to be completely sampled. (Wikipedia)

The Coral Reefs Optimization algorithm is a novel global-search method based on corals' biology and coral reef formation. The original idea is by Sancho Salcedo-Sanz, who realized that corals and reef formation may be artificially simulated to solve optimization problems. The algorithm is based on a reef of solutions to a given optimization problem (corals), that reproduce in a similar way as corals do in Nature (broadcast spawning, brooding and budding operators are implemented). In addition, the larvae (new individuals) fight for space in the reef against other existing corals, and even weak solutions have the possibility of surviving, if there is enough room in the reef. The final algorithm has similarities to evolutionary algorithms and simulated annealing, and it has been shown to improve on both approaches in different problems. (Author’s website)
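
Under the description above, a stripped-down Coral Reefs Optimization loop might look as follows; the occupancy, spawning fraction, mutation scale and settlement attempts are illustrative guesses, and the budding and depredation steps are omitted for brevity.

import numpy as np

def coral_reefs_optimization(fitness, dim, bounds, reef_size=40, occupancy=0.6,
                             broadcast_frac=0.7, n_cycles=200, settle_tries=3,
                             rng=None):
    """Minimal Coral Reefs Optimization sketch: a reef of slots partly filled
    with candidate solutions (corals); each cycle, broadcast spawning
    (crossover) and brooding (mutation) produce larvae that fight for slots.
    fitness is minimized; parameters are illustrative, not the published ones."""
    rng = np.random.default_rng(rng)
    low, high = bounds
    reef = [None] * reef_size                          # empty slots are None
    for i in rng.choice(reef_size, int(occupancy * reef_size), replace=False):
        reef[i] = rng.uniform(low, high, dim)

    def try_settle(larva):
        # A larva tries a few random slots: it settles in an empty slot or
        # displaces a weaker (higher-cost) occupant; otherwise it is lost.
        for _ in range(settle_tries):
            slot = rng.integers(reef_size)
            if reef[slot] is None or fitness(larva) < fitness(reef[slot]):
                reef[slot] = larva
                return

    for _ in range(n_cycles):
        corals = [c for c in reef if c is not None]
        order = rng.permutation(len(corals))
        corals = [corals[i] for i in order]
        n_broadcast = int(broadcast_frac * len(corals)) & ~1   # even count
        larvae = []
        # Broadcast spawning: pairs of corals recombine (uniform crossover).
        for a, b in zip(corals[0:n_broadcast:2], corals[1:n_broadcast:2]):
            mask = rng.random(dim) < 0.5
            larvae.append(np.where(mask, a, b))
        # Brooding: each remaining coral mutates itself slightly.
        for c in corals[n_broadcast:]:
            larvae.append(np.clip(c + rng.normal(0, 0.1 * (high - low), dim),
                                  low, high))
        for larva in larvae:
            try_settle(larva)

    return min((c for c in reef if c is not None), key=fitness)

# Toy usage: minimize the sphere function on [-5, 5]^4.
best = coral_reefs_optimization(lambda x: float(np.sum(x**2)), dim=4,
                                bounds=(-5.0, 5.0))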

Nature is the principal source for proposing new optimization methods such as genetic algorithms (GA) and simulated annealing (SA) methods. All traditional evolutionary algorithms are heuristic population-based search procedures that incorporate random variation and selection. The main contribution of this study is that it proposes a novel optimization method that relies on one of the theories of the evolution of the universe; namely, the Big Bang and Big Crunch Theory. In the Big Bang phase, energy dissipation produces disorder and randomness is the main feature of this phase; whereas, in the Big Crunch phase, randomly distributed particles are drawn into an order. Inspired by this theory, an optimization algorithm is constructed, which will be called the Big Bang–Big Crunch (BB–BC) method that generates random points in the Big Bang phase and shrinks those points to a single representative point via a center of mass or minimal cost approach in the Big Crunch phase. (Abstract, A New Optimization Method: Big Bang–Big Crunch, Advances in Engineering Software, 37/2, 2006)
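
Following the quoted description, a minimal Big Bang-Big Crunch loop can be sketched as below; the fitness-weighted center of mass, the shrinking search radius and the toy objective are illustrative assumptions rather than the authors' exact formulation.

import numpy as np

def big_bang_big_crunch(fitness, dim, bounds, n_points=50, n_cycles=100, rng=None):
    """Minimal Big Bang-Big Crunch sketch. Big Bang phase: scatter random
    candidate points around a center. Big Crunch phase: contract them to a
    fitness-weighted center of mass. fitness is minimized."""
    rng = np.random.default_rng(rng)
    low, high = bounds
    center = rng.uniform(low, high, dim)          # initial representative point
    for k in range(1, n_cycles + 1):
        # Big Bang: random points around the current center, with a search
        # radius that shrinks as the cycles progress (illustrative schedule).
        spread = (high - low) / k
        points = np.clip(center + rng.normal(0.0, spread, size=(n_points, dim)),
                         low, high)
        # Big Crunch: collapse the cloud to a center of mass weighted by 1/cost.
        costs = np.array([fitness(p) for p in points])
        weights = 1.0 / (costs - costs.min() + 1e-12)
        center = weights @ points / weights.sum()
    return center

# Toy usage: minimize the sphere function on [-5, 5]^3.
best = big_bang_big_crunch(lambda x: float(np.sum(x**2)), dim=3, bounds=(-5.0, 5.0))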

Sarma, Gopal. Reductionism and the Universal Calculus. arXiv:1607.06725. A computer scientist and philosopher now in medical school (bio below) suggests that Leibniz’s prescient quest to seek and decipher a natural programmatic source was mostly set aside because of a later scientific emphasis upon a reductive method, which then missed this quality.

In the seminal essay, "On the unreasonable effectiveness of mathematics in the physical sciences," physicist Eugene Wigner poses a fundamental philosophical question concerning the relationship between a physical system and our capacity to model its behavior with the symbolic language of mathematics. In this essay, I examine an ambitious 16th and 17th-century intellectual agenda from the perspective of Wigner's question, namely, what historian Paolo Rossi (Logic and the Art of Memory 2002) calls "the quest to create a universal language." While many elite thinkers pursued related ideas, the most inspiring and forceful was Gottfried Leibniz's effort to create a "universal calculus," a pictorial language which would transparently represent the entirety of human knowledge, as well as an associated symbolic calculus with which to model the behavior of physical systems and derive new truths. I suggest that a deeper understanding of why the efforts of Leibniz and others failed could shed light on Wigner's original question. (Abstract)

My background is in mathematics, physics, data science, and software engineering. After completing my PhD at Stanford and prior to joining the Emory University School of Medicine, I worked for several years for Wolfram Research. I am currently pursuing a medical education with the conviction that the degree of specialization in modern science has created an urgent need for broadly educated scientists with training in multiple subjects at the graduate level. Ultimately, my aim is to shape the ongoing evolution of scientific and intellectual culture in order to responsibly guide ethical and humane use of increasingly sophisticated science and technology. In addition, I am broadly interested in the history and philosophy of science, the origins of scientific thought, and the organizational and cultural issues confronting the biomedical sciences as a result of widespread automation and computational methods. (GS website)
