Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: Independent, UniVersal, Complex Network Systems and a Genetic Code-Script Source

2. Biteracy: Natural Algorithmic Computation

Zelinka, Ivan. A Survey on Evolutionary Algorithms Dynamics and its Complexity. Swarm and Evolutionary Computation. Online August, 2015. The Technical University of Ostrava, Czech Republic, computer scientist also holds a posting at Ton Duc Thang University, Vietnam. He often organizes international conferences and edits volumes (search here and Springer) which consider and express a lively nature that employs a source code which evolves and emerges from uniVerse to humankinder. This paper, with 147 references, covers historical imaginations from Gregor Mendel in Brno, not far from Ostrava, to Alan Turing, Nils Barricelli, John Holland, and others. Lately, through journals such as this and voluminous book series, a worldwide research project proceeds apace. As I log this in along with Rachel Armstrong’s Vibrant Architecture (2015), phenomenal human peoples appear to be closing in on an innate genetic-like computation, which by its self-recognition can pass to a new genesis future creation.

Swarm and evolutionary based algorithms represent a class of search methods that can be used for solving optimization problems. They mimic natural principles of evolution and swarm-based societies such as those of ants and bees, by employing a population-based approach with mutual communication, information sharing and processing, and randomness. In this paper, the history of swarm and evolutionary algorithms is discussed in general, as well as their dynamics, structure and behavior. The core of this paper is an overview of an alternative way in which the dynamics of arbitrary swarm and evolutionary algorithms can be visualized, analyzed and controlled. Selected representative applications are also discussed at the end. Both subtopics are based on the interdisciplinary intersection of two interesting research areas: swarm and evolutionary algorithms, and the complex dynamics of nonlinear systems that usually exhibit very complex behavior. (Abstract)
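The population-based search the abstract describes can be sketched in a few lines. This is a minimal toy illustration of the general evolutionary-algorithm loop (population, mutation, selection), not Zelinka's own method; the bit-string encoding, truncation selection, and OneMax objective are illustrative choices.

```python
import random

def evolve(fitness, length=16, pop_size=20, generations=100, mut_rate=0.05):
    """Minimal evolutionary algorithm: a population of bit strings is
    repeatedly ranked, truncated to the fitter half, and refilled with
    mutated copies of the survivors."""
    pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
    for _ in range(generations):
        # Rank by fitness and keep the better half (truncation selection).
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]
        # Refill the population with mutated copies (bit-flip mutation).
        children = [[1 - g if random.random() < mut_rate else g for g in parent]
                    for parent in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

# Toy objective: maximize the number of 1-bits (the classic OneMax problem).
best = evolve(fitness=sum)
print(sum(best))
```

Because the unmutated survivors are retained each generation, the best individual is never lost, so fitness improves monotonically toward the optimum.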

Zelinka, Ivan, et al, eds. Evolutionary Algorithms and Chaotic Systems. Berlin: Springer, 2010. From five years ago, an early volume that begins to address the turn and synthesis of complexity science with this computational dimension. It could be seen as part of the broad movement to quantify this natural program at work, and, as the genesis uniVerse intends and requires, to have its dynamic utility pass to intelligent human application and continuance.

This book discusses the mutual intersection of two interesting fields of research, i.e. deterministic chaos and evolutionary computation. Evolutionary computation methods that are able to handle tasks such as the control of various chaotic systems and the synthesis of their structure are explored, while deterministic chaos is investigated as a behavioral part of evolutionary algorithms. (Springer) For stochastic algorithms, in general we have two types: heuristic and metaheuristic, though their difference is small. Loosely speaking, heuristic means ‘to find’ or ‘to discover by trial and error.’ (4)
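One way deterministic chaos enters evolutionary computation, as this volume's theme suggests, is by using a chaotic map in place of a pseudorandom number stream. A minimal sketch, assuming the standard logistic map; the seed and parameter values are illustrative, not taken from the book.

```python
def logistic_map(x, r=4.0):
    """One step of the logistic map x -> r*x*(1-x); at r = 4.0 the
    iteration is fully chaotic on the unit interval."""
    return r * x * (1.0 - x)

def chaotic_sequence(seed=0.3, n=10):
    """A deterministic-but-chaotic sequence of values in [0, 1], usable
    where an evolutionary algorithm would otherwise draw random numbers."""
    out, x = [], seed
    for _ in range(n):
        x = logistic_map(x)
        out.append(x)
    return out

print(chaotic_sequence())
```

Unlike a pseudorandom generator, the sequence is fully reproducible from its seed and its statistical structure can itself be analyzed as a dynamical system, which is the interplay the book investigates.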

Zenil, Hector, et al. Life as Thermodynamic Evidence of Algorithmic Structure in Natural Environments. Entropy. 14/11, 2012. Zenil and James Marshall, University of Sheffield, with Carlos Gershenson and David Rosenbluth, Universidad Nacional Autonoma de Mexico, a younger generation of systems scholars, newly engage the deepest issue and question of what kind of abiding cosmos must there be for regnant life to observe it. By way of a novel computational physics, they show that the evident presence of information-processing, cognitive organisms implies and indeed requires an equivalently viable nature with its own algorithmic essence. As a result, the old random, insensate, accidental scheme is set aside for an inherently textual, creatively responsive reality. With Stephen Wolfram and others, the dichotomy between physics and biology is thus resolved by the witness and inclusion of this informative affinity.

In evolutionary biology, attention to the relationship between stochastic organisms and their stochastic environments has leaned towards the adaptability and learning capabilities of the organisms rather than toward the properties of the environment. This article is devoted to the algorithmic aspects of the environment and its interaction with living organisms. We ask whether one may use the fact of the existence of life to establish how far nature is removed from algorithmic randomness. The paper uses a novel approach to behavioral evolutionary questions, using tools drawn from information theory, algorithmic complexity and the thermodynamics of computation to support an intuitive assumption about the near optimal structure of a physical environment that would prove conducive to the evolution and survival of organisms. We contribute to the discussion of the algorithmic structure of natural environments and provide statistical and computational arguments for the intuitive claim that living systems would not be able to survive in completely unpredictable environments, even if adaptable and equipped with storage and learning capabilities by natural selection (brain memory or DNA). (Abstract)

Information processing in organisms should be understood as the process whereby the organism compares its knowledge about the world with the observable state of the world at a given time; in other words, the process whereby an organism weighs possible future outcomes against its present condition. While the larger the number and the more accurate the processed observables from the environment, the better the predictions and decisions, organisms cannot spend more energy than the total return received from information. We will explain how we think organisms can be modeled using instantaneous descriptions written in bits, their states being updated over time as they interact with the environment. Then we will use this argument to show that thermodynamic trade-offs provide clues to the degree of structure and predictability versus randomness in a stochastic natural environment. The sense in which we use the term algorithmic structure throughout this paper is as opposed to randomness. (2176)

So what does the existence of living organisms tell us about structure versus randomness in nature? That one would need to attribute incredible power to living organisms, power to overcome a random environment having no structure, or alternatively, to accept that a stochastic environment in which living organisms can survive and reproduce needs a degree of structure. (2186)

Conclusions. It is interesting to note that some species enhance their fitness by modifying their environments, bringing about an increase in predictability, a clear evolutionary advantage. This seems particularly obvious in the human species’ ability to fashion stable and predictable environments for itself. Using some aspects of information theory, algorithmic complexity and computational thermodynamics, we have captured some of these properties and made explicit their importance for the organism from the standpoint of the environment. In evolutionary terms, it is difficult to justify how biological algorithmic complexity could significantly surpass that of the environment itself. If the organism’s representation of the environment E has algorithmic complexity x, E should have a complexity of no less than x. We have advanced the claim that the algorithmic complexity of biological systems follows the algorithmic complexity of the environment. (2188)
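The paper's central comparison, that an organism's algorithmic complexity cannot greatly exceed that of its environment, rests on Kolmogorov complexity, which is uncomputable. A common computable proxy (one assumption here; the authors' own toolkit also includes other estimators) is compressed length: a structured, predictable string compresses far more than a random one.

```python
import random
import zlib

def compressed_size(s: bytes) -> int:
    """zlib-compressed length in bytes, a rough computable stand-in for
    algorithmic (Kolmogorov) complexity, which is itself uncomputable."""
    return len(zlib.compress(s, 9))

structured = b"AB" * 500  # a highly regular, predictable "environment"
random.seed(1)
noisy = bytes(random.randrange(256) for _ in range(1000))  # near-incompressible

print(compressed_size(structured), compressed_size(noisy))
```

The structured string compresses to a small fraction of its length while the noisy one does not, illustrating the paper's sense of "algorithmic structure as opposed to randomness": low compressed size means high predictability, the property the authors argue a life-supporting environment must have to some degree.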

Zhang, Chuan, et al. DNA Computing for Combinational Logic. arXiv:1807.02010. A seven person team with postings at the Lab of Efficient Architectures for Digital-communication and Signal-processing (LEADS), National Mobile Communications Research Laboratory, and Quantum Information Center, Southeast University, China offer another entry on how these remarkable nucleotide helices are increasingly being found, in addition to genetic purposes, to have a natural affinity for all manner of computations. The paper is slated for publication in the Springer journal Science China Information Sciences.

Future Horizons: As a non-von Neumann architecture machine, the DNA computer tries to rethink the notion of computation. Operating in a parallel manner, it starts all the bio-operations from encoding information over the alphabet {A, T, G, C}. Just as the silicon-based computer performs complex calculations on a basic suite of arithmetic and logical operations, DNA has cutting, copying and many other operations. “Born” to solve a seven-point Hamiltonian path problem, DNA computing shows overwhelming advantages in solving hard problems, especially those problems that conventional computers cannot address. Seemingly the smallest computer, about a kilogram of its computational substrate DNA could meet the world’s storage requirements as long as the information could be packaged densely enough. (14)
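The excerpt's two touchstones, encoding over {A, T, G, C} and Adleman's seven-point Hamiltonian path experiment, can be illustrated with a toy sketch. The four-node graph and the vertex codewords below are invented for illustration, not the instance from the paper; a silicon brute force stands in for what DNA chemistry does massively in parallel.

```python
from itertools import permutations

# Hypothetical directed graph (Adleman's original instance had seven nodes).
edges = {(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)}
nodes = range(4)

def hamiltonian_paths(nodes, edges):
    """Check every ordering of the vertices, mirroring how DNA computing
    generates all candidate paths at once and then filters out the
    orderings whose consecutive pairs are not edges."""
    for perm in permutations(nodes):
        if all((a, b) in edges for a, b in zip(perm, perm[1:])):
            yield perm

# Illustrative 4-letter codewords per vertex; a valid path corresponds to
# one concatenated strand, as in Adleman's encoding scheme.
codes = {0: "ATGC", 1: "TTAC", 2: "GCAT", 3: "CAGT"}
path = next(hamiltonian_paths(nodes, edges))
print(path, "".join(codes[v] for v in path))
```

The exhaustive loop makes the complexity point plain: the number of candidate orderings grows factorially, which is why the parallelism of molecular encoding is attractive for exactly this class of problem.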
