Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

III. Ecosmos: A Revolutionary Fertile, Habitable, Solar-Bioplanet Lifescape

C. The Information Computation Turn

Horsman, Dominic, et al. The Natural Science of Computing. Communications of the ACM. August, 2017. As the lengthy editorial summary below conveys, computer scientists Horsman and Vivien Kendon, University of Durham, and Susan Stepney, University of York, UK, offer that our pervasive 21st century computational abilities have revolutionary implications, as they empower studies such as the recent astronomical detection of gravitational waves, and reach everywhere else from the quantum to the social realm.

Technology changes science. In 2016, the scientific community thrilled to news that the LIGO collaboration had detected gravitational waves for the first time. LIGO is the latest in a long line of revolutionary technologies in astronomy, from the ability to 'see' the universe from radio waves to gamma rays, or from detecting cosmic rays and neutrinos. The interplay of technological and fundamental theoretical advance is replicated across all the natural sciences—which include, we argue, computer science. Some early computing models were developed as abstract models of existing physical computing systems. Now, as novel computing devices—from quantum computers to DNA processors, and even vast networks of human 'social machines'—reach a critical stage of development, they reveal how computing technologies can drive the expansion of theoretical tools and models of computing.

Non-standard and unconventional computing technologies have come to prominence as Moore's Law, that previously relentless increase in computing power, runs out. While techniques such as multicore and parallel processing allow for some gains without further increase of transistor density, there is a growing consensus that the next big step will come from technologies outside the framework of silicon hardware and binary logic. Quantum computing is now being developed on an international scale, with active research and use from Google and NASA as well as numerous universities and national laboratories. Biological computing is also being developed, from data encoding and processing in DNA molecules, to neuro-silicon hybrid devices and bio-inspired neural networks, to harnessing the behavior of slime molds. The huge advance of the internet has enabled 'social machines'—Galaxy Zoo, protein FoldIt, Wikipedia, innumerable citizen science tools—all working by networking humans and computers, to perform computations not accessible on current silicon-based technology alone. (Editorial summary)

Igamberdiev, Abir. Semiokinesis - Semiotic Autopoiesis of the Universe. Semiotica. 135/1-4, 2001. A fractal universe proceeds in its organic development by means of a recursive “self-representation of its Logos.” Life generates and organizes itself through open, nonequilibrium systems characterized by internal semiotic definitions. This affirms an emergent Platonic, textual reality which awaits our collective recognition.

The Universe is a semiotic connection of the infinity of Logos (Word) and the finiteness of its representation in the spatial-temporal structure of Cosmos (World). (20)

Jaeger, Gregg. Information and the Reconstruction of Quantum Physics. Annalen der Physik. 531/3, 2019. In a lead paper for a Physics of Information issue, the Boston University physicist and philosopher first reviews precursor efforts by John Bell, Anton Zeilinger, Jeffrey Bub, Carlo Rovelli onto Lucien Hardy, Giulio Chiribella, and others. Into the 21st century, an informational component has conceptually become a prime, definitive quality. This expansive advance is then seen to augur for a wider synthesis toward a truly cosmic narrative reality.

The reconstruction of quantum physics has been connected with the interpretation of the quantum formalism, and by a deeper consideration of the relation of information to quantum states and processes. This recent form of reconstruction has provided new perspectives on physical correlations and entanglement that can be used to encode information. Here, a representative series of specifically information-based treatments, from partial reconstructions that make connections with information to rigorous axiomatizations, including those involving the theories of generalized probability and abstract systems, is reviewed. (Abstract excerpt)

The reconstruction of quantum mechanics has historically been intertwined with the interpretation of the quantum formalism and, more recently, with the relation of information to quantum state transformation. Given that quantum mechanics, like information theory, involves probability at a fundamental level, it is to be expected that the two can be related. The deeper exploration of the connection of quantum mechanics to information has led to the idea of reconstructing not only quantum mechanics and quantum field theory but to the seeking of connections with space–time theory in a more general sort of quantum theory based specifically on informational principles rather than more obviously physical principles known from previous forms of physics. (1)
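
To make the excerpt's "correlations and entanglement that can be used to encode information" concrete, here is a small numerical sketch, using plain numpy rather than any formalism from the paper itself: for a Bell pair, each qubit taken alone is maximally mixed, carrying a full bit of entropy, so the definite information resides entirely in the joint correlations.

```python
# Sketch: entanglement entropy of a Bell pair (illustrative, not from Jaeger).
import numpy as np

bell = np.array([1, 0, 0, 1]) / np.sqrt(2)      # (|00> + |11>)/sqrt(2)
rho = np.outer(bell, bell.conj())               # joint density matrix

# Partial trace over qubit B leaves qubit A's reduced state.
rho_A = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)

evals = np.linalg.eigvalsh(rho_A)
entropy = -sum(p * np.log2(p) for p in evals if p > 1e-12)
print(rho_A)          # 0.5 * identity: maximally mixed
print(entropy)        # 1.0 bit of entanglement entropy
```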

Ji, Sungchul. Language as a Model of Biocomplexity. International Conference on Complex Systems. May 23, 2000. In a paper presented at this conference, a cell biologist at Rutgers University describes a hierarchy of biological complexity where each level from biopolymers to societies and the biosphere is most defined by linguistic properties.

Johannsen, Wolfgang. On Semantic Information in Nature. Information. Online July, 2015. A Frankfurt School of Finance & Management theorist, by joining salient themes such as John Wheeler’s participatory ‘It from Bit,’ a semiotic, linguistic recurrence from universe to us, and an energetic, thermodynamic basis, reaches an integral synthesis of emergent degrees of meaningfulness. An Evolutionary Energetic Information Model with 15 tenets, such as organisms as knowledge processors, is proposed to contain and explain these strands. Energy/entropy and information/semantics become a continuum, such that genomes and languages are versions of a natural source code. For a companion view, see Elements of a Semantic Code by Bernd-Olaf Küppers (2013, search).
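
As a toy illustration of this "natural source code" framing, not drawn from Johannsen's paper itself, per-symbol Shannon entropy treats a genome fragment and an English phrase alike, as symbol streams with measurable information content; both strings below are made-up examples.

```python
# Sketch: Shannon entropy of two "source code" strings (illustrative only).
from collections import Counter
from math import log2

def entropy_per_symbol(s):
    """Empirical Shannon entropy H = -sum p*log2(p), in bits per symbol."""
    counts = Counter(s)
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

dna = "ATGGCGTACGATCGATTACGCGATCGTAGC"          # 4-letter alphabet, near the 2-bit maximum
text = "information is physical and semantic"  # larger alphabet, uneven usage
print(entropy_per_symbol(dna), entropy_per_symbol(text))
```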

Joosten, Joost. Complexity Fits the Fittest. Zelinka, Ivan, et al, eds. How Nature Works: Complexity in Interdisciplinary Research and Applications. Berlin: Springer, 2014. The University of Barcelona logician is affiliated with the Algorithmic Nature Group of the Paris-based Laboratory for Scientific Research for the Natural and Digital Sciences. By a general application of Stephen Wolfram’s cellular automata, the real presence of a generative computational source, in effect prior to selection, can now be theoretically explained. This chapter, and a companion paper “On the Necessity of Complexity,” are available on the arXiv website.

In this paper we shall relate computational complexity to the principle of natural selection. We shall do this by giving a philosophical account of complexity versus universality. It seems sustainable to equate universal systems to complex systems or at least to potentially complex systems. Post’s problem on the existence of (natural) intermediate degrees then finds its analog in the Principle of Computational Equivalence (PCE). In this paper we address possible driving forces—if any—behind PCE. Both the natural aspects as well as the cognitive ones are investigated. We postulate a principle GNS that we call the Generalized Natural Selection principle that together with the Church-Turing thesis is seen to be in close correspondence to a weak version of PCE. Next, we view our cognitive toolkit in an evolutionary light and postulate a principle in analogy with Fodor’s language principle. (Complexity Fits the Fittest)

Wolfram's Principle of Computational Equivalence (PCE) implies that universal complexity abounds in nature. This paper comprises three sections. In the first section we consider the question why there are so many universal phenomena around. So, in a sense, we seek a driving force behind the PCE if any. We postulate a principle GNS that we call the Generalized Natural Selection Principle that together with the Church-Turing Thesis is seen to be equivalent to a weak version of PCE. In the second section we ask the question why we do not observe any phenomena that are complex but not-universal. We choose a cognitive setting to embark on this question and make some analogies with formal logic. In the third and final section we report on a case study where we see rich structures arise everywhere. (On the Necessity of Complexity)
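
For readers unfamiliar with the elementary cellular automata behind Wolfram's PCE, a minimal sketch follows. Rule 110, used here, is the rule proven computationally universal (Cook 2004), so simple local updates already suffice for universal complexity; the grid size and single-cell seed are arbitrary display choices, not anything from Joosten's chapter.

```python
# Sketch: an elementary cellular automaton, Rule 110 (periodic boundaries).

def step(cells, rule=110):
    """Apply one synchronous update; each new cell is the rule bit
    selected by its (left, center, right) neighborhood."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and watch structure propagate.
cells = [0] * 40
cells[20] = 1
for _ in range(20):
    print("".join("#" if c else "." for c in cells))
    cells = step(cells)
```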

Kari, Lila and Grzegorz Rozenberg. The Many Facets of Natural Computing. Communications of the ACM. 51/10, 2008. Kari, Canada Research Chair in Biocomputing, University of Western Ontario, and Rozenberg, pioneer Leiden University information philosopher, offer an illustrated paean to the theoretical vista that “Nature is computation.” By so doing, a cross-fertilization accrues whence perceptions of a dynamic natural “software” can in turn inspire more viable computer capabilities. Prime instances via computational systems biology are “genomic computers,” gene regulatory and biochemical networks, transport and cellular computing, interactive organisms, and so on. These cases, for example, can inspire computational immune systems, particle swarm optimization, and onto membrane, molecular, or quantum computing. See also Rozenberg’s chapter “Computer Science, Informatics, and Natural Computing” in Cooper, Barry, et al, eds. New Computational Paradigms (Springer, 2008).
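
As one runnable instance of the nature-inspired methods the authors survey, here is a minimal particle swarm optimization sketch: candidate solutions ("particles") move through the search space, pulled toward their own best find and the swarm's best find. The objective function, swarm size, and coefficients are illustrative defaults, not anything prescribed by the article.

```python
# Sketch: minimal particle swarm optimization (illustrative parameters).
import random

def pso(f, dim=2, n_particles=20, iters=100, w=0.7, c1=1.5, c2=1.5):
    pos = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]        # each particle's best-seen position
    gbest = min(pbest, key=f)          # swarm-wide best position
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

# Example: minimize the sphere function; the optimum is the origin.
print(pso(lambda x: sum(v * v for v in x)))
```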

Keller, Evelyn Fox. Towards a Science of Informed Matter. Studies in History and Philosophy of Biological and Biomedical Sciences. 42/2, 2011. The MIT philosopher of science picks up on the insights of Chemistry laureate Jean-Marie Lehn (search) in support of a growing sense that nature’s materiality is not inert but suffused with prescriptive information. An inherent “non-equilibrium dynamics” is thus at work, so as to infer a “molecular informatics.” By these encounters, if one might allow a creative, organic universe, could these abstractions be actually trying to express a natural parent to child genetic code?

Over the last couple of decades, a call has begun to resound in a number of distinct fields of inquiry for a reattachment of form to matter, for an understanding of 'information' as inherently embodied, or, as Jean-Marie Lehn calls it, for a "science of informed matter." We hear this call most clearly in chemistry, in cognitive science, in molecular computation, and in robotics, all fields looking to biological processes to ground a new epistemology. (174)

Khrennikov, Andrei. Towards Information Lasers. Entropy. Online October, 2015. The prolific Linnaeus University physicist continues his project, with many colleagues, to reconceive quantum phenomena in terms of an intrinsic communicative quality. The article is also a good entry to this 21st century fundamental revolution.

In our modeling, the notion of information is a primary notion, which is not definable with the aid of more fundamental notions, such as probability. Such an approach matches various purely-informational approaches to physics celebrated in Wheeler’s statement: “It from bit. Otherwise put, every ‘it’, every particle, every field of force, even the space-time continuum itself derives its function, its meaning, its very existence entirely—even if in some contexts indirectly—from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. ‘It from bit’ symbolizes the idea that every item of the physical world has at bottom—a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes-no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe.” (6975)

Kitazono, Jun, et al. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory. Entropy. 20/3, 2018. As this IIT approach (Tononi), whereby consciousness is seen to rise in tandem with relative knowledge content, grows in acceptance, Kitazono, Araya, Inc., Tokyo, Ryota Kanai, Kobe University, and Masafumi Oizumi (search), RIKEN Brain Science Institute, Hiroshima, press its technical advance via algorithms and neural nets so as to better analyze and apply it. See also the concurrent papers Measuring Integrated Information by Pedro Mediano, et al at arXiv:1806.09373, and Integrated Information in the Thermodynamic Limit by Miguel Aguilera and Exequiel Di Paolo at 1806.07879. A good entry to IIT is What is Consciousness? by co-theorist Christof Koch in Scientific American for June 2018.

The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information in the brain is related to the level of consciousness. (Abstract)
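
The search problem the paper accelerates can be sketched in miniature: scan every bipartition of a small system and keep the one whose parts share the least information, the "minimum information partition." In this sketch, plain mutual information stands in for IIT's integrated-information measure (the normalization IIT applies is omitted), and the toy joint distribution is invented for illustration.

```python
# Sketch: naive brute-force MIP search that the paper's algorithms improve on.
from itertools import combinations
import numpy as np

def mutual_info(joint, axes_a):
    """I(A;B) for a joint probability array; A = the given axes, B = the rest."""
    axes_b = tuple(ax for ax in range(joint.ndim) if ax not in axes_a)
    p_a = joint.sum(axis=axes_b)
    p_b = joint.sum(axis=axes_a)
    mi = 0.0
    for idx, p in np.ndenumerate(joint):
        if p > 0:
            ia = tuple(idx[ax] for ax in axes_a)
            ib = tuple(idx[ax] for ax in axes_b)
            mi += p * np.log2(p / (p_a[ia] * p_b[ib]))
    return mi

def min_information_partition(joint):
    """Exhaustively scan bipartitions; return (mutual info, partition axes)."""
    n = joint.ndim
    best = None
    for k in range(1, n // 2 + 1):
        for part in combinations(range(n), k):
            mi = mutual_info(joint, part)
            if best is None or mi < best[0]:
                best = (mi, part)
    return best

# Toy joint distribution over three binary units (sums to 1).
joint = np.full((2, 2, 2), 1 / 8)                 # independent units: all MI = 0
joint[0, 0, 0] += 0.05; joint[1, 1, 1] -= 0.05    # add a little correlation
print(min_information_partition(joint))
```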

Knuth, Kevin. Information-Based Physics: An Observer-Centric Foundation. Contemporary Physics. Online January, 2014. The SUNY Albany professor of physics and informatics continues the inspiration of John Archibald Wheeler that this existent reality is in some way founded upon and most distinguished by a communicative source and conveyance. Such a self-visualizing and activating cosmos then requires at a later point the presence of sentient observers to recognize, acknowledge, and so bring into full being. Knuth has a series of prior papers on arXiv such as The Physics of Events: A Potential Foundation for Emergent Space-Time. While they, and most theoretical papers, are written in a technical parlance, the point of the message could be that human beings are in fact significantly empowered and entitled to learn, discover, witness and self-select.

It is generally believed that physical laws, reflecting an inherent order in the universe, are ordained by nature. However, in modern physics the observer plays a central role raising questions about how an observer-centric physics can result in laws apparently worthy of a universal nature-centric physics. Over the last decade, we have found that the consistent apt quantification of algebraic and order-theoretic structures results in calculi that possess constraint equations taking the form of what are often considered to be physical laws. The result is an approach to foundational physics where laws derive from both consistent descriptions and optimal information-based inferences made by embedded observers. (Abstract excerpt)

It is generally believed that physical laws reflect an inherent order in the universe. These laws, thought to apply everywhere, are typically considered to have been ordained by nature. In this sense they are universal and nature-centric. However, in the last century, modern physics has placed the observer in a central role resulting in an observer-based physics, which with a potential for subjectivity as well as quantum contextuality, poses conceptual difficulties for reconciliation with a nature-based physics. This raises questions as to precisely how an observer-based physics could give rise to consistent universal laws of nature as well as what role information plays in physics. Perhaps the potential implication of such questions has never been so clearly and concisely put as in Wheeler’s aphorism “It from Bit.” (1)

Landauer, Rolf. The Physical Nature of Information. Physics Letters A. 217/188, 1996. A historic paper by the German-American IBM Research Center physicist, which established the concept that something else and more is going on than just material in motion. It is here that early claims of “quantum information,” along with “quantum parallelism and analog computation,” are entered.
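
The dissipation bound most associated with Landauer's name, that erasing one bit of information costs at least kT ln 2 of heat, can be put in numbers; the room-temperature value below is simply a convenient illustrative choice.

```python
# Sketch: the Landauer limit, the minimum heat dissipated per erased bit.
import math

k_B = 1.380649e-23      # Boltzmann constant, J/K (exact, 2019 SI)
T = 300.0               # an assumed room temperature, K

e_bit = k_B * T * math.log(2)
print(f"Minimum erasure cost per bit at {T} K: {e_bit:.3e} J")  # ~2.87e-21 J
```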
