IV. Ecosmomics: Independent Complex Network Systems, Computational Programs, Genetic Ecode Scripts

2. Biteracy: Natural Algorithmic Computation

Thompson, Sarah, et al. The Fractal Geometry of Fitness Landscapes at the Local Optima Level. Natural Computing. December, 2020.
University of Stirling, UK and Université du Littoral Côte d’Opale, Calais systems mathematicians show how an evolutionary dynamics meant to attain and select a fittest result can take on a self-similar topology. A local optima network (LON) encodes local optima connectivity in the fitness landscape of a combinatorial optimisation problem. Recently, LONs have been studied for their fractal dimension, a complexity index by which a non-integer dimension can be assigned to a pattern. We use visual analysis, correlation analysis, and machine learning to show that relationships exist and that fractal features of LONs can contribute to explaining and predicting algorithm performance. (Abstract excerpt)
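To make the local optima network idea concrete, the following minimal sketch (assembled for this note, not the authors' code) enumerates a small bit-string landscape, maps every solution to its local optimum by best-improvement hill climbing, and adds an escape edge whenever a two-bit perturbation of one optimum climbs into another optimum's basin. The toy fitness function, problem size N = 10, and perturbation radius are illustrative assumptions; fractal indices would then be measured on the resulting graph.

```python
# Minimal sketch of building a local optima network (LON) for a toy
# bit-string landscape; illustrative only, not the paper's pipeline.
from itertools import product

N = 10                                   # assumed problem size

def fitness(x):
    # Arbitrary rugged toy objective (an assumption, not from the paper).
    return sum((i + 1) * b for i, b in enumerate(x)) % 7 + sum(x)

def neighbors(x):
    return [x[:i] + (1 - x[i],) + x[i+1:] for i in range(len(x))]

def hill_climb(x):
    # Best-improvement local search: returns the local optimum reached from x.
    while True:
        best = max(neighbors(x), key=fitness)
        if fitness(best) <= fitness(x):
            return x
        x = best

# Map every solution to its local optimum (its basin representative).
basin_of = {x: hill_climb(x) for x in product((0, 1), repeat=N)}
optima = set(basin_of.values())

# Escape edges: flip 2 bits of an optimum, then hill climb; an edge records
# which optimum the perturbation escapes to.
edges = set()
for o in optima:
    for i in range(N):
        for j in range(i + 1, N):
            y = list(o); y[i] ^= 1; y[j] ^= 1
            target = basin_of[tuple(y)]
            if target != o:
                edges.add((o, target))

print(f"{len(optima)} local optima, {len(edges)} escape edges")
```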
Valiant, Leslie. Probably Approximately Correct: Nature's Algorithms for Learning and Prospering in a Complex World. New York: Basic Books, 2013.
A Harvard University mathematician broaches an innovative view of life’s advance as due to interactions between creatures and their environment. This novel evolution is a learning process that proceeds by way of algorithmic computations, which in this sense are dubbed “ecorithms” as they try to reach a sufficient optimum. In so doing, Valiant joins current perceptions from Bayesian, rationality (Okasha), Markov, and genetic approaches that not only view Darwinian evolution as a computational learning mechanism, but seem to imply that the entire universe is some optimization or maximization process. Algorithms are the step-by-step instructions used in computing for achieving desired results, much like recipes in cooking. In both cases the recipe designer has a certain controlled environment in mind for realizing the recipe, and foresees how the desired outcome will be achieved. The algorithms I discuss in this book are special. Unlike most algorithms, they can be run in environments unknown to the designer, and they learn by interacting with the environment how to act effectively in it. After sufficient interaction they will have expertise not provided by the designer but extracted from the environment. I call these algorithms ecorithms. The model of learning they follow, known as the probably approximately correct model, provides a quantitative framework in which designers can evaluate the expertise achieved and the cost of achieving it. These ecorithms are not merely a feature of computers. I argue in this book that such learning mechanisms impose and determine the character of life on Earth. The course of evolution is shaped entirely by organisms interacting with and adapting to their environments. This biological inheritance, as well as further learning from the environment after conception and birth, has a determining influence on the course of an individual’s life. The focus here will be the unified study of the mechanisms of evolution, learning, and intelligence using the methods of computer science. (Book Summary)
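Valiant's probably approximately correct model can be illustrated with the standard textbook case of learning a one-dimensional threshold concept: a sample of size m >= (1/ε) ln(1/δ) drawn from the environment suffices for the tightest consistent hypothesis to have error at most ε with probability at least 1 - δ. The sketch below renders that classic example; the target threshold, uniform distribution, and parameter values are assumptions for illustration, not material from the book.

```python
# Illustrative PAC-learning sketch: learning a 1-D threshold concept
# h(x) = 1 if x >= a else 0 from random samples. Not from Valiant's book;
# the sample bound m >= (1/eps) * ln(1/delta) is the standard textbook one
# for this hypothesis class.
import math
import random

def pac_learn_threshold(target_a, eps=0.05, delta=0.01, dist=random.random):
    """Draw enough samples for (eps, delta)-PAC learning of a threshold on [0, 1]."""
    m = math.ceil((1 / eps) * math.log(1 / delta))
    samples = [(x, int(x >= target_a)) for x in (dist() for _ in range(m))]
    # Consistent-hypothesis rule: smallest x ever labeled positive.
    positives = [x for x, y in samples if y == 1]
    return min(positives) if positives else 1.0

random.seed(0)
true_a = 0.37                       # hidden target concept (assumed)
learned_a = pac_learn_threshold(true_a)
# Under the uniform distribution the error mass is learned_a - true_a,
# which falls below eps with probability at least 1 - delta.
print(f"learned threshold {learned_a:.4f}, error {abs(learned_a - true_a):.4f}")
```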
Vega-Rodriguez, Miguel and Jose Granado-Criado. Parallel Computing in Computational Biology. Journal of Computational Biology. 25/8, 2018.
An introduction to a special issue about current technological methods by University of Extremadura, Spain computer scientists. For example, see Large-Scale Simulations of Bacterial Populations over Complex Networks, Signature Gene Identification of Cancer Occurrence and Pattern Recognition, and Human Splice-Site Prediction with Deep Neural Networks. Computational biology allows and encourages the application of many different parallel computing approaches. This special issue brings together high-quality state-of-the-art contributions about parallel computing in computational biology, from different technological points of view or perspectives. The special issue collects considerably extended and improved versions of the best articles accepted and presented at the 2017 International Workshop on Parallelism in Bioinformatics.

Walker, Sara Imari. Top-Down Causation and the Rise of Information in the Emergence of Life. Information. 5/3, 2014.
Reviewed more in Information Computation, the Arizona State University astrophysicist explains the presence and role of algorithmic processes that appear to originate and drive a complexifying evolution. See also the author's 2015 paper The Descent of Math at arXiv:1505.00312.

Walker, Sara Imari and Paul Davies. The Algorithmic Origins of Life. Journal of the Royal Society Interface. 10/Art. 79, 2012.
Reviewed more in Origin of Life, Sara Walker, NASA Astrobiology Institute, and astrophysicist Paul Davies, presently at Arizona State University, articulate how an episodic evolutionary emergence seems to be driven and tracked by way of a native informational source.

Watson, Richard A. Compositional Evolution. Cambridge: MIT Press, 2006.
Reviewed more in Quickening Evolution, a biologist and computer scientist at the University of Southampton applies algorithmic and complexity theory to show that much more is going on than a gradual drift of random mutations. Rather, nature employs additional computations which generate a constant modularity engaged in symbiotic assemblies.

Wibral, Michael, et al. Bits from Biology for Computational Intelligence. arXiv:1412.0291.
If one may translate and gloss this paper by Wibral, Goethe University, with Joseph Lizier, University of Sydney, and Viola Priesemann, MPI Dynamics and Self-Organization, a novel self-developing, quickening cosmos seems suffused by natural information processing via evolved genetic algorithms and multiplex networks. As Viola’s own research conveys (summary below, search Danielle Bassett also), human brains are an archetypal microcosm of this macrocosmic genesis. The paper appeared in Frontiers in Robotics and AI in 2015, and will be a chapter in From Matter to Life from Cambridge UP in 2017. Computational intelligence is broadly defined as biologically-inspired computing. Usually, inspiration is drawn from neural systems. This article shows how to analyze neural systems using information theory to obtain constraints that help identify the algorithms run by such systems and the information they represent. Algorithms and representations identified information-theoretically may then guide the design of biologically inspired computing systems (BICS). We show how to analyze the information encoded in a system about its environment, and also discuss recent methodological developments on the question of how much information each agent carries about the environment either uniquely, or redundantly or synergistically together with others. (Abstract excerpts)
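The unique / redundant / synergistic split mentioned in that abstract can be made concrete for the smallest case of two binary sources and one target, using the Williams-Beer I_min redundancy measure, one of several candidate measures debated in this literature. The XOR toy distribution below is an assumed example, chosen because its single bit of target information turns out to be entirely synergistic.

```python
# Illustrative partial information decomposition (PID) for two binary sources
# and a target, using the Williams-Beer I_min redundancy measure; shown only
# to make the unique / redundant / synergistic split concrete. The XOR
# "environment" distribution is an assumed toy example.
from collections import defaultdict
from math import log2

# Joint distribution p(e, x1, x2): E = X1 XOR X2, with X1, X2 uniform, i.i.d.
p = defaultdict(float)
for x1 in (0, 1):
    for x2 in (0, 1):
        p[(x1 ^ x2, x1, x2)] = 0.25

def mi(target_idx, source_idx):
    """Mutual information I(target; sources) in bits; indices into (e, x1, x2)."""
    pj, pt, ps = defaultdict(float), defaultdict(float), defaultdict(float)
    for key, pr in p.items():
        t = tuple(key[i] for i in target_idx)
        s = tuple(key[i] for i in source_idx)
        pj[(t, s)] += pr; pt[t] += pr; ps[s] += pr
    return sum(pr * log2(pr / (pt[t] * ps[s])) for (t, s), pr in pj.items() if pr > 0)

def specific_info(e, source_idx):
    """I(E=e; source): sum_x p(x|e) * log2( p(e|x) / p(e) )."""
    pe = sum(pr for key, pr in p.items() if key[0] == e)
    ps, pes = defaultdict(float), defaultdict(float)
    for key, pr in p.items():
        s = tuple(key[i] for i in source_idx)
        ps[s] += pr
        if key[0] == e:
            pes[s] += pr
    return sum((pes[s] / pe) * log2((pes[s] / ps[s]) / pe) for s in ps if pes[s] > 0)

# Williams-Beer redundancy: expected minimum specific information over sources.
redundancy = sum(
    sum(pr for key, pr in p.items() if key[0] == e)
    * min(specific_info(e, [1]), specific_info(e, [2]))
    for e in (0, 1)
)
i1, i2, i12 = mi([0], [1]), mi([0], [2]), mi([0], [1, 2])
unique1, unique2 = i1 - redundancy, i2 - redundancy
synergy = i12 - redundancy - unique1 - unique2
print(f"redundant={redundancy:.3f}  unique1={unique1:.3f}  "
      f"unique2={unique2:.3f}  synergistic={synergy:.3f} bits")  # XOR: all synergy
```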
Wolfram, Stephen. Logic, Explainability and the Future of Understanding. Complex Systems. 28/1, 2019.
The polymath prodigy (bio below) now provides a 40-page technical survey of the history, present state, and prospects of philosophical knowledge by way of its computational basis. In this journal edited by Hector Zenil, see also On Patterns and Dynamics of Rule 22 Cellular Automaton by Genaro Martinez, et al. (28/2). Stephen Wolfram is the creator of Mathematica, Wolfram|Alpha and the Wolfram Language; the author of A New Kind of Science; and the founder and CEO of Wolfram Research. Born in London in 1959, he was educated at Eton and Oxford and received his PhD in theoretical physics from Caltech at age 20. Wolfram's early scientific work was mainly in high-energy physics, quantum field theory and cosmology. Over the course of four decades, he has been responsible for many discoveries, inventions and innovations in computer science and beyond. (www.stephenwolfram.com)

Wolpert, David, et al., eds. The Energetics of Computing in Life and Machines. Santa Fe: Santa Fe Institute Press, 2018.
These highly technical proceedings from SFI seminars consider more efficient computational methods by way of a better, deeper integration with vital principles and procedures. For example, see Overview of Information Theory and Stochastic Thermodynamics of Computation by Wolpert (search), Information Processing in Chemical Systems by Peter Stadler, et al., and Automatically Reducing Energy Consumption of Software by Stephanie Forrest, et al. Why do computers use so much energy? What are the fundamental physical laws governing the relationship between the precise computation run by a system, whether artificial or natural, and how much energy that computation requires? Can we learn how to improve efficiency in computing by examining how biological computers manage to be so efficient? The time is ripe for a new synthesis of systems physics, computer science, and biochemistry. This volume integrates pure and applied concepts from these diverse fields, with the goal of cultivating a modern, nonequilibrium thermodynamics of computation.

Woods, Damien, et al. Diverse and Robust Molecular Algorithms Using Reprogrammable DNA Self-Assembly. Nature. 567/366, 2019.
Seven Caltech and Harvard bioinformatic researchers including Erik Winfree and David Doty (search each) advance understandings of how nature’s helical nucleotides can be availed for many more chemical, structural, and data storage uses beyond replication. Who then are we cosmic curators to learn all about and intentionally take up life’s organic procreativity? Molecular biology provides a proof-of-principle that chemical systems can store and process information to direct molecular activities such as the fabrication of complex structures from molecular components. Mathematical tiling and statistical-mechanical models of molecular crystallization have shown that algorithmic behaviour can be embedded within molecular self-assembly processes by DNA nanotechnology. Here we report the design and validation of a DNA tile set that contains 355 single-stranded tiles and can be reprogrammed to implement a wide variety of 6-bit algorithms. These findings suggest that molecular self-assembly could be a reliable algorithmic component within programmable chemical systems. (Abstract excerpt)

Wu, Jun. The Beauty of Mathematics in Computer Science. Boca Raton: CRC Press, 2019.
This popular text in Chinese by a Google Brain USA member (bio below) is here published in English. Its main message is to show the deep affinities between algorithmic code, scriptural languages and innate mathematical principles. The book covers many topics including natural language processing, speech recognition and machine translation, statistical language modeling, quantitative measurement of information, PageRank for web search, matrix operations and document classification, the mathematical background of big data, and neural networks and Google's deep learning.
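Of the topics Wu covers, PageRank is the most compact to demonstrate: a page's score is the stationary probability of a random surfer who follows links with damping factor d and otherwise jumps to a uniformly random page. The power-iteration sketch below uses a hypothetical four-page web and the customary d = 0.85; it shows the standard algorithm rather than anything specific to the book.

```python
# Minimal PageRank by power iteration on a toy link graph (assumed example).
# r[p] = (1 - d)/N + d * sum over in-links of r[src] / outdegree(src)
def pagerank(links, d=0.85, tol=1e-10, max_iter=200):
    pages = sorted(set(links) | {q for targets in links.values() for q in targets})
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(max_iter):
        new = {p: (1.0 - d) / n for p in pages}
        for src, targets in links.items():
            if targets:                      # distribute src's rank over its out-links
                share = d * rank[src] / len(targets)
                for dst in targets:
                    new[dst] += share
            else:                            # dangling page: spread its rank uniformly
                for p in pages:
                    new[p] += d * rank[src] / n
        if sum(abs(new[p] - rank[p]) for p in pages) < tol:
            return new
        rank = new
    return rank

# Hypothetical four-page web: A links to B and C, B to C, C to A, D to C.
toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.4f}")
```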
Xu, Lei. Further Advances on Bayesian Ying-Yang Harmony Learning. Applied Informatics. Online June, 2016.
The Chinese University of Hong Kong chair of computer science provides a succinct update on his project, underway since the 1990s, to join traditional Chinese organic philosophy with Bayesian probabilities so as to achieve a preferred way, via an intricate mathematics, to gather and realize natural knowledge. On the author’s CUHK website is a steady list of publications which explore and refine these unique insights. If one may distill, sentences such as “Ying (feminine principle) is primary and comes first, while the Yang (male) is secondary and bases on the Ying” express and qualify a dynamic reality with both an inner, constant, code-like source and an overt, manifest animate structure. By whatever terms, its once and future essence is a universally evident gender complementarity, by creative turns Ying, Yang, and may we say Taome.

Complementary composition of Ying-Yang system: A system that survives or interacts with its world is able to be functionally divided into two different but complement parts. One is called Yang that inputs from its external world called Yang domain and transforms what gathered via a Yang pathway into an inner domain; while the other is Ying that consists of this inner domain called Ying domain and a Ying pathway. The Ying domain accumulates, integrates, digests, and condenses whatever came from Yang, and the Ying pathway selects among the Ying domain the best ones to produce the reconstructions back to the Yang domain. (321, 2010) Instead of simply regarding Ying-Yang as two opposite parts, which was frequently misunderstood by westerns, the major natures of a Ying-Yang pair are described by the following propositions: (1) Ying is primary, while Yang is secondary and comes from Ying, (2) Ying and Yang are not exclusive each other, though they were sometimes misunderstood by ones from a logical perspective. (321, 2010)
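Xu's formulation pairs a Yang pathway, which maps observed data into an inner representation, with a Ying pathway, which selects the best inner codes and reconstructs the data from them, and learning maximizes a harmony between the two. The toy sketch below conveys only that two-pathway flavor for a one-dimensional Gaussian mixture, using a hard best-reconstruction assignment and pruning of components whose weight collapses; it is a simplified illustration written for this note, not Xu's BYY harmony algorithm, and the data, starting component count, and pruning threshold are assumptions.

```python
# Toy rendering of a two-pathway loop (Yang: data -> inner code; Ying: inner
# code -> reconstruction of data) for a 1-D Gaussian mixture. A simplified
# illustration, not Xu's BYY algorithm; data, K=4 start, and the 0.02 pruning
# threshold are assumptions.
import math
import random

random.seed(1)
data = [random.gauss(-2.0, 0.4) for _ in range(150)] + \
       [random.gauss(3.0, 0.6) for _ in range(150)]

K = 4                                             # deliberately too many components
weight = [1.0 / K] * K
mean = [min(data) + (i + 0.5) * (max(data) - min(data)) / K for i in range(K)]
var = [1.0] * K

def log_score(x, k):
    # ln[alpha_k * N(x; mu_k, var_k)]: how well component k "reconstructs" x.
    return math.log(weight[k]) - 0.5 * (math.log(2 * math.pi * var[k])
                                        + (x - mean[k]) ** 2 / var[k])

for _ in range(50):
    # Selection step: each point is assigned to its best-scoring component.
    assign = [max(range(K), key=lambda k: log_score(x, k)) for x in data]
    # Reconstruction step: refit each component from its assigned points.
    for k in range(K):
        pts = [x for x, a in zip(data, assign) if a == k]
        weight[k] = len(pts) / len(data)
        if pts:
            mean[k] = sum(pts) / len(pts)
            var[k] = max(sum((x - mean[k]) ** 2 for x in pts) / len(pts), 1e-3)
    # Prune components whose weight collapses (model-selection flavor).
    keep = [k for k in range(K) if weight[k] > 0.02]
    weight = [weight[k] for k in keep]
    mean = [mean[k] for k in keep]
    var = [var[k] for k in keep]
    K = len(keep)
    total = sum(weight)
    weight = [w / total for w in weight]

print(f"{K} components survive:",
      [(round(w, 2), round(m, 2), round(v, 2)) for w, m, v in zip(weight, mean, var)])
```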