Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: An Independent, UniVersal, Source Code-Script of Generative Complex Network Systems

2. Biteracy: Natural Algorithmic Computation

Wu, Jun. The Beauty of Mathematics in Computer Science. Boca Raton: CRC Press, 2019. This popular text, originally in Chinese by a Google Brain USA member (bio below), is here published in English. Its main message is to show the deep affinities between algorithmic code, scriptural languages, and innate mathematical principles.

The book covers many topics including natural language processing, speech recognition and machine translation, statistical language modeling, the quantitative measurement of information, PageRank for web search, matrix operations and document classification, the mathematical background of big data, and neural networks and Google’s deep learning.
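As a concrete aside, the PageRank method named above can be illustrated with a brief power-iteration sketch. This is not code from the book; the three-page link graph is hypothetical, and the damping factor and iteration count are illustrative choices.

```python
# Minimal PageRank by power iteration on a toy three-page web graph.
# Illustrative sketch only; the link structure below is hypothetical.
def pagerank(links, damping=0.85, iters=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        # every page keeps a (1 - damping) baseline share of rank
        new = {p: (1 - damping) / n for p in pages}
        for p, outs in links.items():
            share = rank[p] / len(outs)   # rank split across outlinks
            for q in outs:
                new[q] += damping * share
        rank = new
    return rank

# A links to B and C; B links to C; C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
```

Page C, which receives links from both A and B, ends with the highest rank, while the total rank mass stays normalized to one.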

Jun Wu was a research scientist who invented Google’s Chinese, Japanese, and Korean web search algorithms. From 2006 to 2010 he wrote official blogs introducing Google technologies in simple language for Chinese Internet users. Wu received his PhD in computer science from Johns Hopkins University and has worked on natural language processing for more than 20 years.

Xu, Lei. Further Advances on Bayesian Ying-Yang Harmony Learning. Applied Informatics. Online June, 2016. The Chinese University of Hong Kong chair of computer science provides a succinct update on his project, underway since the 1990s, to join traditional Chinese organic philosophy with Bayesian probabilities so as to achieve a preferred way, via an intricate mathematics, to gather and realize natural knowledge. On the author’s CUHK website is a steady list of publications which explore and refine these unique insights. If one may distill, sentences such as “Ying (feminine principle) is primary and comes first, while the Yang (male) is secondary and bases on the Ying” express and qualify a dynamic reality with both an inner, constant, code-like source and an overt, manifest animate structure. By whatever terms, its once and future essence is a universally evident gender complementarity, by creative turns Ying, Yang, and may we say Taome.

His 2010 paper Bayesian Ying-Yang System, Best Harmony Learning, and Five Action Circling (Frontiers of Electrical and Electronic Engineering in China, 5/3) presents a cogent statement of this creative cosmology, from which we quote next. We also note On Essential Topics of BYY Harmony Learning in the same journal (7/1, 2012; each in English by Springer). Its actual validity can be established because it comes closest to an independent phenomenal nature. By this perception, a prime, holistic feminine phase envisions knowledge from subsidiary masculine data reports, which as internal and external phases models well the dynamic learning function of the bicameral right and left brain hemispheres. There is great wisdom here which needs to be appreciated and availed.

Complementary composition of Ying-Yang system: A system that survives or interacts with its world is able to be functionally divided into two different but complement parts. One is called Yang that inputs from its external world called Yang domain and transforms what gathered via a Yang pathway into an inner domain; while the other is Ying that consists of this inner domain called Ying domain and a Ying pathway. The Ying domain accumulates, integrates, digests, and condenses whatever came from Yang, and the Ying pathway selects among the Ying domain the best ones to produce the reconstructions back to the Yang domain. (321, 2010) Instead of simply regarding Ying-Yang as two opposite parts, which was frequently misunderstood by westerns, the major natures of a Ying-Yang pair are described by the following propositions: (1) Ying is primary, while Yang is secondary and comes from Ying, (2) Ying and Yang are not exclusive each other, though they were sometimes misunderstood by ones from a logical perspective. (321, 2010)

Lei Xu is a chair professor of computer science at The Chinese University of Hong Kong. He graduated from Harbin Institute of Technology in 1981, and completed his master’s and PhD theses at Tsinghua University during 1982–1986. During 1989–1993 he worked at several universities in Finland, Canada, and the USA, including Harvard and MIT. He joined CUHK in 1993, became professor in 1996, and has held his current position since 2002. Prof. Xu has published dozens of journal papers and many more in conference proceedings and edited books, covering statistical learning, neural networks, and pattern recognition, with a number of well-cited papers; his work has received over 5500 citations according to Google Scholar.

Bayesian probability is one interpretation of the concept of probability. In contrast to interpreting probability as frequency or propensity of some phenomenon, Bayesian probability is a quantity that is assigned to represent a state of knowledge, or a state of belief. The Bayesian interpretation of probability can be seen as an extension of propositional logic that enables reasoning with hypotheses, i.e., the propositions whose truth or falsity is uncertain. In the Bayesian view, a probability is assigned to a hypothesis, whereas under frequentist inference, a hypothesis is typically tested without being assigned a probability. (Wikipedia)
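The assignment of belief to a hypothesis described above proceeds by Bayes’ rule. A minimal worked sketch, with illustrative numbers only (a hypothesis with a 1% prior, evidence with 99% likelihood under the hypothesis and a 5% false-positive rate otherwise):

```python
# Bayes' rule: posterior = P(E|H) * P(H) / P(E), where the evidence
# term P(E) sums over the hypothesis being true or false.
# All numbers below are illustrative, not from the cited text.
def bayes_update(prior, likelihood, false_positive_rate):
    evidence = likelihood * prior + false_positive_rate * (1 - prior)
    return likelihood * prior / evidence

posterior = bayes_update(prior=0.01, likelihood=0.99, false_positive_rate=0.05)
```

Even strong evidence lifts a 1% prior only to about a 17% posterior, a standard illustration of how the prior state of belief tempers new data.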

Applied Informatics covers the theory and application of informatics in various scientific, technological, engineering and social fields. Aiming to inspire new multidisciplinary research, the journal acts as an integrative venue that collects high-quality original research papers and reviews on various aspects of applied informatics, with the foundations of informatics (information theory, statistical modeling, machine learning, etc.) as the driving core and the interactions between essential realms as the promoting focuses; particularly important are the interactions between (a) life sciences (bioinformatics, medical informatics, bioengineering); (b) intelligence sciences (neural and cognitive informatics, multimedia, pattern recognition); and (c) community sciences (social networks, affective computing, big data analytics).

Yang, Shengxiang and Xin Yao, eds. Evolutionary Computation for Dynamic Optimization Problems. Berlin: Springer, 2015. An edition amongst a burst of books which engage and implement the growing notice that nature’s cosmic-to-culture development involves a program source code which seems on a course to achieve its own intelligent self-realization and intentional continuance. A typical chapter is Memetic Algorithms for Dynamic Optimization Problems. See also Nature-Inspired Metaheuristic Algorithms by Xin-She Yang (Luniver Press, 2010).

Yang, Xin-She. Nature-Inspired Optimization Algorithms. Amsterdam: Elsevier, 2014. Reviewed more fully in Universal Darwinism, the work is a recent survey of life’s evolutionary proclivity to search, test, and reach good-enough solutions by way of such operations.

Yang, Xin-She. Social Algorithms. arXiv:1805.05855. The Middlesex University, UK computational mathematician and author (search) posts a chapter for the 2020 edition of the Encyclopedia of Complexity and Systems Science (Meyers) which explains an array of features that distinguish this integral class of natural computations. See also Swarm Intelligence: Past, Present and Future by the author at 1804.07999.

This article concerns the review of a special class of swarm intelligence based algorithms for solving optimization problems and these algorithms can be referred to as social algorithms. Social algorithms use multiple agents and the social interactions to design rules for algorithms so as to mimic certain successful characteristics of the social/biological systems such as ants, bees, bats, birds and animals. (Abstract)

Social Algorithms: Social algorithms are a class of nature-inspired algorithms that use some of characteristics of social swarms such as social insects (e.g., ants, bees, bats and fireflies) and reproduction strategies such as cuckoo-host species co-evolution. These algorithms tend to be swarm intelligence based algorithms. Examples are ant colony optimization, particle swarm optimization and cuckoo search. (Glossary, 2)

Many optimization problems in science and engineering are challenging to solve, and the current trend is to use swarm intelligence (SI) and SI-based algorithms to tackle such challenging problems. Some significant developments have been made in recent years, though there are still many open problems in this area. This paper provides a short but timely analysis about SI-based algorithms and their links with self-organization. Different characteristics and properties are analyzed here from both mathematical and qualitative perspectives. (1804.07999).
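Of the social algorithms named in the glossary above, particle swarm optimization is perhaps the simplest to sketch. The following is a hedged, minimal illustration of the general scheme, not code from Yang’s article; the test function f(x) = x², swarm size, and coefficient values are all illustrative choices.

```python
import random

# Minimal 1-D particle swarm optimization on f(x) = x^2.
# Each particle blends inertia, memory of its own best position,
# and attraction to the swarm-wide best position.
def pso(f, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5, seed=1):
    rng = random.Random(seed)
    xs = [rng.uniform(-10, 10) for _ in range(n_particles)]
    vs = [0.0] * n_particles
    pbest = xs[:]                      # each particle's best-seen position
    gbest = min(pbest, key=f)          # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            r1, r2 = rng.random(), rng.random()
            vs[i] = (w * vs[i]
                     + c1 * r1 * (pbest[i] - xs[i])    # cognitive pull
                     + c2 * r2 * (gbest - xs[i]))      # social pull
            xs[i] += vs[i]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i]
        gbest = min(pbest, key=f)
    return gbest

best = pso(lambda x: x * x)
```

No particle is told where the minimum lies; the local update rules alone carry the swarm toward it, which is the self-organization the abstract emphasizes.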

Yang, Xin-She and Joao Paulo Papa, eds. Bio-Inspired Computation and Applications in Image Processing. Amsterdam: Elsevier, 2016. In this volume, the Middlesex University London mathematician and author is joined by a São Paulo State University computer scientist. As these biomimetic ways and means gain wide usage, a typical chapter is Fine-Tuning Deep Belief Networks Using Cuckoo Search.


Yang, Xin-She, et al, eds. Swarm Intelligence and Bio-Inspired Computation. Amsterdam: Elsevier, 2013. An introductory text on spontaneous optimizations due to natural, organic agencies of stochastic, evolutionary, dynamic self-emergence. As noted in this section, the way social insects and animal groupings achieve this can be a good source and guide.

Swarm Intelligence is the collective behavior of decentralized, self-organized systems, natural or artificial. SI systems consist typically of a population of simple agents or boids interacting locally with one another and with their environment. The inspiration often comes from nature, especially biological systems. The agents follow simple rules, and although there is no centralized control structure dictating how individual agents should behave, local, to a certain degree random interactions between such agents lead to the emergence of "intelligent" global behavior, unknown to the individual agents. Examples in natural systems of SI include ant colonies, bird flocking, animal herding, bacterial growth, fish schooling and microbial intelligence. (Wikipedia)

Zarco, Mario and Tom Froese. Self-Optimization in Continuous-Time Recurrent Neural Networks. Frontiers in Robotics and AI. Online August, 2018. In an entry edited by Claudius Gros and reviewed by Richard Watson, Universidad Nacional Autónoma de México mathematicians glimpse, even in cerebral cognitions, an evidential presence of nature’s universal process whereby many relatively random states, in this case neurons, iteratively evolve into viable “attractor configurations.”

A recent advance in complex adaptive systems has revealed a new unsupervised learning technique called self-modeling or self-optimization. Basically, a complex network that can form an associative memory of the state configurations of the attractors on which it converges will optimize its structure: it will spontaneously generalize over these typically suboptimal attractors and thereby also reinforce more optimal attractors. This technique has been applied to social networks, gene regulatory networks, and neural networks, but its application to less restricted neural controllers, as typically used in evolutionary robotics, has not yet been attempted. Here we show for the first time that the self-optimization process can be implemented in a continuous-time recurrent neural network with asymmetrical connections. (Abstract excerpts)
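The self-modeling loop the abstract describes can be sketched in its simplest Hopfield-network form: relax the network from a random state until it settles on an attractor, then Hebbian-reinforce that attractor in the weights, and repeat. This is a minimal illustration of the general technique, not the authors’ continuous-time recurrent implementation; the network size, learning rate, and epoch count are illustrative.

```python
import random

# Relax a Hopfield-style network (symmetric weights, +/-1 states)
# by asynchronous updates until it reaches a fixed point (attractor).
def relax(weights, state):
    n = len(state)
    changed = True
    while changed:
        changed = False
        for i in range(n):
            h = sum(weights[i][j] * state[j] for j in range(n) if j != i)
            s = 1 if h >= 0 else -1
            if s != state[i]:
                state[i] = s
                changed = True
    return state

# Self-modeling: each epoch converges to an attractor from a random
# start, then Hebbian learning reinforces that attractor, so the
# network forms an associative memory of its own attractor states.
def self_optimize(weights, epochs=50, alpha=0.01, seed=0):
    rng = random.Random(seed)
    n = len(weights)
    learned = [row[:] for row in weights]
    for _ in range(epochs):
        state = [rng.choice([-1, 1]) for _ in range(n)]
        state = relax(learned, state)
        for i in range(n):
            for j in range(n):
                if i != j:
                    learned[i][j] += alpha * state[i] * state[j]
    return learned
```

Because the Hebbian increment alpha·s_i·s_j is symmetric, the learned weights remain symmetric, so relaxation still converges; over epochs the reinforced attractors widen their basins, which is the spontaneous generalization the abstract reports.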

Zarcone, Ryan, et al. Analog Coding in Emerging Memory Systems. Nature Scientific Reports. 10/6831, 2020. We cite this entry by ten UC Berkeley, Stanford, IBM Research, Macronix International, and Google Brain researchers including Jesse Engel as another example, akin to Miguel Nicolelis’ neuroscience (search), of the optimum value of complementary digital byte and analog link pairings. These archetypal modes are now seen to hold across nature’s genesis, except for political elections where they remain pitted against each other, unaware of any scientific findings.

Exponential growth in data generation and big data science has created an imperative for low-power, high-density information storage. This need has motivated research into multi-level memory devices capable of storing multiple bits per device because their memory state is intrinsically analog. Furthermore, much of the data they will store, along with the subsequent operations, are analog-valued. However the current storage paradigm is quantized for use with digital systems. Here, we recast storage as a communication problem, which allows us to use ideas from analog coding and show that analog-valued emerging memory devices can achieve higher capacities. (Abstract excerpt)

Zelinka, Ivan. A Survey on Evolutionary Algorithms Dynamics and its Complexity. Swarm and Evolutionary Computation. Online August, 2015. The Technical University of Ostrava, Czech Republic computer scientist, who also holds a post at Ton Duc Thang University, Vietnam, often organizes international conferences and edits volumes (search here and Springer) which consider and express a lively nature that employs a source code which evolves and emerges from uniVerse to humankinder. This paper, with 147 references, covers historical imaginations from Gregor Mendel in Brno, not far from Ostrava, to Alan Turing, Nils Barricelli, John Holland, and others. Lately, through journals such as this and voluminous book series, a worldwide research project proceeds apace. As I log this in along with Rachel Armstrong’s Vibrant Architecture (2015), phenomenal human peoples appear to be closing in on an innate genetic-like computation, which by its self-recognition can pass to a new genesis future creation.

Swarm and evolutionary based algorithms represent a class of search methods that can be used for solving optimization problems. They mimic natural principles of evolution and swarm based societies like ants, bees, by employing a population-based approach in mutual communication and information sharing and processing, including randomness. In this paper, history of swarm and evolutionary algorithms are discussed in general as well as their dynamics, structure and behavior. The core of this paper is an overview of an alternative way how dynamics of arbitrary swarm and evolutionary algorithms can be visualized, analyzed and controlled. Also selected representative applications are discussed at the end. Both subtopics are based on interdisciplinary intersection of two interesting research areas: swarm and evolutionary algorithms and complex dynamics of nonlinear systems that usually exhibit very complex behavior. (Abstract)
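The mutation-plus-selection loop underlying the algorithms the abstract surveys can be shown at its barest as a (1+1) evolutionary algorithm on the classic OneMax problem (maximize the number of ones in a bit string). A hedged sketch only; the problem, string length, and mutation rate are textbook illustrative choices, not from Zelinka’s paper.

```python
import random

# (1+1) evolutionary algorithm on OneMax: flip each bit with
# probability 1/n, keep the child if it is at least as fit.
def one_plus_one_ea(n=20, generations=3000, seed=3):
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    for _ in range(generations):
        child = [b ^ 1 if rng.random() < 1.0 / n else b for b in parent]
        if sum(child) >= sum(parent):   # selection: elitist replacement
            parent = child
    return parent

best = one_plus_one_ea()
```

Random variation plus a simple survival rule suffices to drive the string to the all-ones optimum, the population-based communication and sharing of the full swarm algorithms being refinements of this core dynamic.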

Zelinka, Ivan, et al, eds. Evolutionary Algorithms and Chaotic Systems. Berlin: Springer, 2010. From five years ago, an early volume that begins to address the turn and synthesis of complexity science with this computational dimension. It could be seen as part of the broad movement to quantify this natural program at work, and, as the genesis uniVerse intends and requires, to have its dynamic utility pass to intelligent human application and continuance.

This book discusses the mutual intersection of two interesting fields of research, i.e. deterministic chaos and evolutionary computation. Evolutionary computation which are able to handle tasks such as control of various chaotic systems and synthesis of their structure are explored, while deterministic chaos is investigated as a behavioral part of evolutionary algorithms. (Springer) For stochastic algorithms, in general we have two types: heuristic and metaheuristic, though their difference is small. Loosely speaking, heuristic means ‘to find’ or ‘to discover by trial and error.’ (4)

Zenil, Hector, et al. Life as Thermodynamic Evidence of Algorithmic Structure in Natural Environments. Entropy. 14/11, 2012. Zenil and James Marshall, University of Sheffield, with Carlos Gershenson and David Rosenbluth, Universidad Nacional Autonoma de Mexico, a younger generation of systems scholars, newly engage the deepest issue and question of what kind of abiding cosmos must there be for regnant life to observe it. By way of a novel computational physics, they show that an evident presence of information-processing, cognitive organisms implies and indeed requires an equivalently viable nature with its own algorithmic essence. As a result, the old random, insensate, accidental scheme is set aside for an inherently textual, creatively responsive reality. With Stephen Wolfram and others, the dichotomy between physics and biology is thus resolved by the witness and inclusion of this informative affinity.

In evolutionary biology, attention to the relationship between stochastic organisms and their stochastic environments has leaned towards the adaptability and learning capabilities of the organisms rather than toward the properties of the environment. This article is devoted to the algorithmic aspects of the environment and its interaction with living organisms. We ask whether one may use the fact of the existence of life to establish how far nature is removed from algorithmic randomness. The paper uses a novel approach to behavioral evolutionary questions, using tools drawn from information theory, algorithmic complexity and the thermodynamics of computation to support an intuitive assumption about the near optimal structure of a physical environment that would prove conducive to the evolution and survival of organisms. We contribute to the discussion of the algorithmic structure of natural environments and provide statistical and computational arguments for the intuitive claim that living systems would not be able to survive in completely unpredictable environments, even if adaptable and equipped with storage and learning capabilities by natural selection (brain memory or DNA). (Abstract)

Information processing in organisms should be understood as the process whereby the organism compares its knowledge about the world with the observable state of the world at a given time; in other words, the process whereby an organism weighs possible future outcomes against its present condition. While the larger the number and the more accurate the processed observables from the environment, the better the predictions and decisions, organisms cannot spend more energy than the total return received from information. We will explain how we think organisms can be modeled using instantaneous descriptions written in bits, their states being updated over time as they interact with the environment. Then we will use this argument to show that thermodynamic trade-offs provide clues to the degree of structure and predictability versus randomness in a stochastic natural environment. The sense in which we use the term algorithmic structure throughout this paper is as opposed to randomness. (2176)

So what does the existence of living organisms tell us about structure versus randomness in nature? That one would need to attribute incredible power to living organisms, power to overcome a random environment having no structure, or alternatively, to accept that a stochastic environment in which living organisms can survive and reproduce needs a degree of structure. (2186)

Conclusions. It is interesting to note that some species enhance their fitness by modifying their environments, bringing about an increase in predictability, a clear evolutionary advantage. This seems particularly obvious in the human species’ ability to fashion stable and predictable environments for itself. Using some aspects of information theory, algorithmic complexity and computational thermodynamics, we have captured some of these properties and made explicit their importance for the organism from the standpoint of the environment. In evolutionary terms, it is difficult to justify how biological algorithmic complexity could significantly surpass that of the environment itself. If the organism’s representation of the environment E has algorithmic complexity x, E should have a complexity of no less than x. We have advanced the claim that the algorithmic complexity of biological systems follows the algorithmic complexity of the environment. (2188)
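The paper’s contrast between algorithmic structure and randomness can be probed with a standard practical proxy for algorithmic complexity: compressed length. A hedged sketch, using illustrative toy “environments” rather than the authors’ formal apparatus, shows that a periodic string compresses far better than a pseudorandom one of the same length.

```python
import random
import zlib

# Compressed length as a rough upper bound on algorithmic complexity:
# a structured sequence admits a short description, a random one does not.
def compressed_size(s: bytes) -> int:
    return len(zlib.compress(s, 9))

structured = b"AB" * 500                                   # periodic "environment"
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(1000))     # pseudorandom bytes

ratio_structured = compressed_size(structured) / 1000
ratio_noisy = compressed_size(noisy) / 1000
```

The structured string shrinks to a small fraction of its length while the noisy one barely compresses at all, echoing the claim that an environment hospitable to predictive organisms must sit well below algorithmic randomness.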
