III. Ecosmos: A Revolutionary Fertile, Habitable, Solar-Bioplanet Lifescape
C. The Information Computation Turn
Horsman, Dominic, et al. Abstraction and Representation in Living Organisms: When does a Biological System Compute? Dodig-Crnkovic, Gordana and Raffaela Giovagnoli, eds. Representation and Reality in Humans, Other Living Organisms and Intelligent Machines. International: Springer, 2017. Within this endeavor to comprehend a greater nature which seems to run and evolve via generative algorithmic programs, a chapter by the computer scientist team of Horsman and Vivien Kendon, University of Durham, along with Susan Stepney and J. P. W. Young, University of York, UK, traces an iterative course by way of abstract information as it is manifestly represented. Photosynthesis, the process by which plants convert sunlight into chemical energy, is given as an example. They then conclude with allusions to a cosmos-to-consciousness evolutionary pathway of progressive self-representation. See also by the authors When does a Physical System Compute? at arXiv:1309.7979 for a technical basis and The Natural Science of Computing in ACM Communications (August 2017) for a popular review.
Even the simplest known living organisms are complex chemical processing systems. But how sophisticated is the behaviour that arises from this? We present a framework in which even bacteria can be identified as capable of representing information in arbitrary signal molecules, to facilitate altering their behaviour to optimise their food supplies, for example. Known as Abstraction/Representation theory (AR theory), this framework makes precise the relationship between physical systems and abstract concepts. Originally developed to answer the question of when a physical system is computing, AR theory naturally extends to the realm of biological systems to bring clarity to questions of computation at the cellular level. (Abstract)
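The core of AR theory is a commuting-diagram test: a physical system computes when representing its state and then applying the abstract operation yields the same answer as letting the physics run and representing the outcome. A minimal sketch of that idea follows; the voltage inverter, thresholds, and function names here are illustrative assumptions, not the authors' formalism.

```python
# Toy illustration of Abstraction/Representation (AR) theory's commuting
# diagram: a physical evolution H and an abstract computation C agree when
# representing-then-computing matches evolving-then-representing.
# The inverter "physics" and the 2.5 V threshold are invented for the sketch.

def represent(voltage):
    """Representation relation R: map a physical voltage to an abstract bit."""
    return 1 if voltage > 2.5 else 0

def physical_not_gate(voltage):
    """Toy physical dynamics H: an inverter swings the 0-5 V rail."""
    return 5.0 - voltage

def abstract_not(bit):
    """Abstract computation C: logical NOT."""
    return 1 - bit

def commutes(voltage):
    """Check R(H(p)) == C(R(p)) for one physical state p."""
    return represent(physical_not_gate(voltage)) == abstract_not(represent(voltage))

# The diagram commutes across sample states, so on this toy account
# the inverter can be said to compute NOT.
assert all(commutes(v) for v in [0.0, 1.0, 3.3, 5.0])
```

On this reading, the bacterium of the abstract "computes" insofar as its chemical dynamics and an abstract signal-processing description close the same diagram.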
Horsman, Dominic, et al. The Natural Science of Computing. Communications of the ACM. August, 2017. As the lengthy editorial summary next conveys, computer scientists Horsman and Vivien Kendon, University of Durham, and Susan Stepney, University of York, UK, offer that our pervasive 21st century computational abilities have revolutionary implications, as they empower studies such as the recent astronomical detection of gravitational waves, along with everywhere else from quantum to social realms.
Technology changes science. In 2016, the scientific community thrilled to news that the LIGO collaboration had detected gravitational waves for the first time. LIGO is the latest in a long line of revolutionary technologies in astronomy, from the ability to 'see' the universe from radio waves to gamma rays, or from detecting cosmic rays and neutrinos. The interplay of technological and fundamental theoretical advance is replicated across all the natural sciences—which include, we argue, computer science. Some early computing models were developed as abstract models of existing physical computing systems. Now, as novel computing devices—from quantum computers to DNA processors, and even vast networks of human 'social machines'—reach a critical stage of development, they reveal how computing technologies can drive the expansion of theoretical tools and models of computing.
Igamberdiev, Abir. Semiokinesis - Semiotic Autopoiesis of the Universe. Semiotica. 135/1-4, 2001. A fractal universe proceeds in its organic development by means of a recursive “self-representation of its Logos.” Life generates and organizes itself through open, nonequilibrium systems characterized by internal semiotic definitions. This affirms an emergent Platonic, textual reality which awaits our collective recognition.
The Universe is a semiotic connection of the infinity of Logos (Word) and the finiteness of its representation in the spatial-temporal structure of Cosmos (World). (20)
Jaeger, Gregg. Information and the Reconstruction of Quantum Physics. Annalen der Physik. 531/3, 2019. In a lead paper for a Physics of Information issue, the Boston University physicist philosopher first reviews precursor efforts by John Bell, Anton Zeilinger, Jeffrey Bub, Carlo Rovelli onto Lucien Hardy, Giulio Chiribella, and others. Into the 21st century an informational component has conceptually become a prime, definitive quality. This expansive advance is then seen to augur for a wider synthesis toward a truly cosmic narrative reality.
The reconstruction of quantum physics has been connected with the interpretation of the quantum formalism, and by a deeper consideration of the relation of information to quantum states and processes. This recent form of reconstruction has provided new perspectives on physical correlations and entanglement that can be used to encode information. Here, a representative series of specifically information‐based treatments from partial reconstructions that make connections with information to rigorous axiomatizations, including those involving the theories of generalized probability and abstract systems is reviewed. (Abstract excerpt)
Ji, Sungchul. Language as a Model of Biocomplexity. International Conference on Complex Systems. May 23, 2000. In a paper presented at this conference, a cell biologist at Rutgers University describes a hierarchy of biological complexity where each level from biopolymers to societies and the biosphere is most defined by linguistic properties.
Johannsen, Wolfgang. On Semantic Information in Nature. Information. Online July, 2015. A Frankfurt School of Finance & Management theorist, by virtue of joining salient themes such as John Wheeler’s participatory ‘It from Bit,’ a semiotic, linguistic recurrence from universe to us, and an energetic, thermodynamic basis, reaches an integral synthesis as emergent degrees of meaningfulness. An Evolutionary Energetic Information Model with 15 tenets such as organisms as knowledge processors is proposed to contain and explain. Energy/entropy and information/semantics become a continuum, such that genomes and languages are versions of a natural source code. For a companion view, see Elements of a Semantic Code by Bernd-Olaf Kuppers (2013, search).
Joosten, Joost. Complexity Fits the Fittest. Zelinka, Ivan, et al, eds. How Nature Works: Complexity in Interdisciplinary Research and Applications. Berlin: Springer, 2014. The University of Barcelona logician is affiliated with the Algorithmic Nature group of the Paris-based Laboratory for Scientific Research for the Natural and Digital Sciences. By a general application of Stephen Wolfram's cellular automata, the real presence of a generative computational source in effect prior to selection can now be theoretically explained. This chapter, and a companion paper "On the Necessity of Complexity," are available on the arXiv website.
In this paper we shall relate computational complexity to the principle of natural selection. We shall do this by giving a philosophical account of complexity versus universality. It seems sustainable to equate universal systems to complex systems or at least to potentially complex systems. Post’s problem on the existence of (natural) intermediate degrees then finds its analog in the Principle of Computational Equivalence (PCE). In this paper we address possible driving forces—if any—behind PCE. Both the natural aspects as well as the cognitive ones are investigated. We postulate a principle GNS that we call the Generalized Natural Selection principle that together with the Church-Turing thesis is seen to be in close correspondence to a weak version of PCE. Next, we view our cognitive toolkit in an evolutionary light and postulate a principle in analogy with Fodor’s language principle. (Complexity Fits the Fittest)
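Wolfram's Principle of Computational Equivalence, which the chapter builds on, holds that very simple rules already reach the maximal, universal level of computational sophistication. A minimal sketch of an elementary cellular automaton makes the point concrete; Rule 110, used here, is the standard example of a provably Turing-universal rule, though this fragment only runs it, not proves it.

```python
# Elementary cellular automaton sketch: Rule 110, a provably
# Turing-universal rule, is specified by nothing more than an 8-entry
# lookup table -- the kind of minimal generative program at issue in
# Wolfram's Principle of Computational Equivalence.

RULE = 110  # the rule number's binary digits give the output for each
            # of the 8 possible three-cell neighborhoods

def step(cells, rule=RULE):
    """One synchronous update of a row of 0/1 cells (circular boundary)."""
    n = len(cells)
    return [
        (rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2 + cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# A single live cell seeds the characteristically irregular growth that
# motivates equating such simple rules with (potentially) complex systems.
row = [0] * 30 + [1] + [0] * 30
history = [row]
for _ in range(30):
    row = step(row)
    history.append(row)
```

Printing `history` as blocks and blanks reproduces the familiar Rule 110 triangle patterns, complexity arising from a generative program far simpler than anything selection could have tuned.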
Kari, Lila and Grzegorz Rozenberg. The Many Facets of Natural Computing. Communications of the ACM. 51/10, 2008. Kari, Canada Research Chair in Biocomputing, University of Western Ontario, and Rozenberg, pioneer Leiden University information philosopher, offer an illustrated paean to the theoretical vista that "Nature is computation." By so doing, a cross-fertilization accrues whence perceptions of a dynamic natural "software" can in turn inspire more viable computer capabilities. Prime instances via computational systems biology are "genomic computers," gene regulatory and biochemical networks, transport and cellular computing, interactive organisms, and so on. These cases, for example, can infer computational immune systems, particle swarm optimization, and onto membrane, molecular, or quantum computing. See also Rozenberg's chapter "Computer Science, Informatics, and Natural Computing" in Cooper, Barry, et al, eds. New Computational Paradigms (Springer, 2008).
Keller, Evelyn Fox. Towards a Science of Informed Matter. Studies in History and Philosophy of Biological and Biomedical Sciences. The MIT philosopher of science picks up on the insights of Chemistry laureate Jean-Marie Lehn (search) in support of a growing sense that nature’s materiality is not inert but suffused with prescriptive information. An inherent “non-equilibrium dynamics” is thus at work, so as to infer a “molecular informatics.” By these encounters, if one might allow a creative, organic universe, could these abstractions be actually trying to express a natural parent to child genetic code?
Over the last couple of decades, a call has begun to resound in a number of distinct fields of inquiry for a reattachment of form to matter, for an understanding of 'information' as inherently embodied, or, as Jean-Marie Lehn calls it, for a "science of informed matter." We hear this call most clearly in chemistry, in cognitive science, in molecular computation, and in robotics-all fields looking to biological processes to ground a new epistemology. (174)
Khrennikov, Andrei. Towards Information Lasers. Entropy. Online October, 2015. The prolific Linnaeus University physicist continues his project, with many colleagues, to reconceive quantum phenomena in terms of an intrinsic communicative quality. The article is also a good entry to this 21st century fundamental revolution.
In our modeling, the notion of information is a primary notion, which is not definable with the aid of more fundamental notions, such as probability. Such an approach matches various purely-informational approaches to physics celebrated in Wheeler’s statement: “It from bit. Otherwise put, every ‘it’, every particle, every field of force, even the space-time continuum itself derives its function, its meaning, its very existence entirely - even if in some contexts indirectly from the apparatus-elicited answers to yes-or-no questions, binary choices, bits. ‘It from bit’ symbolizes the idea that every item of the physical world has at bottom—a very deep bottom, in most instances—an immaterial source and explanation; that which we call reality arises in the last analysis from the posing of yes no questions and the registering of equipment-evoked responses; in short, that all things physical are information-theoretic in origin and that this is a participatory universe.” (6975)
Kitazono, Jun, et al. Efficient Algorithms for Searching the Minimum Information Partition in Integrated Information Theory. Entropy. 20/3, 2018. As this IIT approach (Tononi), whereby consciousness is seen to rise in tandem with relative knowledge content, grows in acceptance, Kitazono, Araya, Inc., Tokyo, Ryota Kanai, Kobe University, and Masafumi Oizumi (search), RIKEN Brain Science Institute, press its technical advance via algorithms and neural nets by which to better analyze and apply it. See also concurrent papers Measuring Integrated Information by Pedro Mediano, et al at arXiv:1806.09373, and Integrated Information in the Thermodynamic Limit by Miguel Aguilera and Exequiel Di Paolo at 1806.07879. A good entry to IIT is What is Consciousness? by co-theorist Christof Koch in Scientific American for June 2018.
The ability to integrate information in the brain is considered to be an essential property for cognition and consciousness. Integrated Information Theory (IIT) hypothesizes that the amount of integrated information in the brain is related to the level of consciousness. (Abstract)
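The combinatorial task the paper's algorithms accelerate is the search for the minimum information partition: the bipartition of a system across which the least information flows. A brute-force toy version is sketched below; the integration measure used here is plain mutual information between the two parts, an illustrative stand-in for, not an implementation of, IIT's actual phi.

```python
# Toy brute-force search for a minimum information partition (MIP).
# The "integration" score is ordinary mutual information I(A;B) across
# each bipartition -- a simplified stand-in for IIT's phi measure.
from itertools import combinations, product
from math import log2

def marginal(joint, keep):
    """Marginalize a joint distribution over binary units down to `keep`."""
    out = {}
    for state, p in joint.items():
        key = tuple(state[i] for i in keep)
        out[key] = out.get(key, 0.0) + p
    return out

def mutual_information(joint, part_a, part_b):
    """I(A;B) across one bipartition of the units."""
    pa, pb = marginal(joint, part_a), marginal(joint, part_b)
    pab = marginal(joint, part_a + part_b)
    return sum(p * log2(p / (pa[k[:len(part_a)]] * pb[k[len(part_a):]]))
               for k, p in pab.items() if p > 0)

def min_information_partition(joint, n):
    """Exhaustively score every bipartition; return the weakest link."""
    units = list(range(n))
    best = None
    for r in range(1, n // 2 + 1):
        for part_a in combinations(units, r):
            part_b = tuple(u for u in units if u not in part_a)
            mi = mutual_information(joint, part_a, part_b)
            if best is None or mi < best[0]:
                best = (mi, part_a, part_b)
    return best

# Two perfectly correlated units plus one independent coin: the MIP cuts
# off the independent unit, since zero information crosses that partition.
joint = {s: (0.25 if s[0] == s[1] else 0.0) for s in product((0, 1), repeat=3)}
mi, a, b = min_information_partition(joint, 3)
```

Even this toy search is exponential in system size, which is exactly why the paper's efficient algorithms (exploiting submodularity of the objective) matter for applying IIT to real neural data.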
Knuth, Kevin. Information-Based Physics: An Observer-Centric Foundation. Contemporary Physics. Online January, 2014. The SUNY Albany professor of physics and informatics continues the inspiration of John Archibald Wheeler that this existent reality is in some way founded upon and most distinguished by a communicative source and conveyance. Such a self-visualizing and activating cosmos then requires at a later point the presence of sentient observers to recognize, acknowledge, and so bring into full being. Knuth has a series of prior papers on arXiv such as The Physics of Events: A Potential Foundation for Emergent Space-Time. While they, and most theoretical papers, are written in a technical parlance, the point of the message could be that human beings are in fact significantly empowered and entitled to learn, discover, witness and self-select.
It is generally believed that physical laws, reflecting an inherent order in the universe, are ordained by nature. However, in modern physics the observer plays a central role raising questions about how an observer-centric physics can result in laws apparently worthy of a universal nature-centric physics. Over the last decade, we have found that the consistent apt quantification of algebraic and order-theoretic structures results in calculi that possess constraint equations taking the form of what are often considered to be physical laws. The result is an approach to foundational physics where laws derive from both consistent descriptions and optimal information-based inferences made by embedded observers. (Abstract excerpt)