Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

III. Ecosmos: A Revolutionary Fertile, Habitable, Solar-Bioplanet, Incubator Lifescape

C. The Information Computation Turn

Into the 21st century, aided by computational advances, the concept of information, broadly conceived from algorithms to literate knowledge, has become recognized as a prime independent quality along with, and prior to, matter and energy, space and time. As many references such as James Gleick's The Information (2011) attest, a deep textual dimension provides a missing, codifying element in support of an organic genesis uniVerse. In several ways, the long temporal passage of an ecosmic nature to human culture gains a vital informational-computational empowerment. From Gottfried Leibniz and Alan Turing to Stephen Wolfram and Gregory Chaitin, and on to Gordana Dodig-Crnkovic, Luciano Floridi, Wolfgang Hofkirchner, Sara Walker and many others, the general perception is a universe that seems to independently program and iteratively compute itself into personal, emergent cognizance. By some analogous genotype/phenotype, software/hardware facility, a double generative mathematical domain reappears in our late midst. Another aspect is a similar informational essence within quantum phenomena, which is covered in the next section.

2020: By this date, another feature being found by our composite global genius is a theoretical and evidential presence, in addition to space, time, matter and energy, of a maybe more fundamental, mathematic, generative domain. In this regard, its intrinsic informative content would appear to program and prescribe the temporal development of an animate, informed, personified uniVerse. In accord with the large Ecosmomics chapter ahead, some 136 entries herein open windows upon a deeply textual source code which at once independently underlies an organic genesis while seeming to rise along with life’s exemplary evolution all the way to our sapient cocreative endowment.

Bawden, David and Lyn Robinson. Still Minding the Gap? Reflecting on Transitions between Concepts of Information. Information. 11/2, 2020.
Cuffaro, Michael and Samuel Fletcher. Physical Perspectives on Computation, Computational Perspectives on Physics. Cambridge: Cambridge University Press, 2018.
Davies, Paul. The Demon in the Machine: How Hidden Webs of Information are Solving the Mystery of Life. London: Allen Lane, 2019.
Dodig-Crnkovic, Gordana and Raffaela Giovagnoli, eds. Representation and Reality in Humans, Other Living Organisms and Intelligent Machines. International: Springer Praxis, 2017.
Ensslin, Torsten, et al. The Physics of Information. Annalen der Physik. 531/3, 2019.
Floridi, Luciano. The Logic of Information. Oxford: Oxford University Press, 2019.
Ghavasieh, Arsham and Manlio De Domenico. Statistical Physics of Network Structure and Information Dynamics. Journal of Physics: Complexity. February, 2022.
Mainzer, Klaus. The Digital and the Real Universe: Foundations of Natural Philosophy and Computational Physics. Philosophies. 4/1, 2019.
Mora, Thierry, et al. Special Issue on Information Processing in Living Systems. Journal of Statistical Physics. 162/5, 2017.
Stepney, Susan, et al, eds. Computational Matter. International: Springer, 2018.
Walker, Sara Imari, et al, eds. From Matter to Life: Information and Causality. Cambridge: Cambridge University Press, 2017.

2023:

Information Theoretic Foundations for Physics. www.perimeterinstitute.ca/conferences/information-theoretic-foundations-physics. A site for this Perimeter Institute, Ontario, week-long conference in May 2015. Its synopsis below could be a good summary of the physics paradigm, or lack thereof, which may be remedied by this Turing turn. Among the 22 men and no women speakers are Gerard ‘t Hooft, Lee Smolin, and Giulio Chiribella, so this is a big event.

In the last few decades, there has been a growing consensus that information theory is of fundamental importance for theoretical physics. The information paradigm has yielded successes in many different areas of the foundations of physics, including the development and application of quantum information theory, Jacobson’s thermodynamic derivation of the Einstein equations, the black-hole information paradox, Jaynes’ maximum entropy principle, or operational approaches to the foundations of quantum mechanics.

Parallel to this fruitful development, there is a growing number of physicists who endorse the general attitude that fundamental physics may need new foundations or at least a new perspective. Despite amazing recent successes, there remains the profound challenge that experiments have not confirmed many of the recent theoretical ideas: the universe seems simpler than our ideas of unification predicted. Information-theoretic approaches can provide a starting point, which is exceptionally careful in the assumptions that it makes, and broad in its applicability, since it does not rely on specific laws of motion or formal properties of underlying theories. (Synopsis)

The Science of Information: From Language to Black Holes. www.thegreatcourses.com/courses/the-science-of-information-from-language-to-black-holes. This Great Courses online offering is taught by Benjamin Schumacher, a Kenyon College physicist. The 24 lectures span The Transformability of Information to The Meaning of Information, with lectures 14 and 23 noted below, to cover the gamut from quantum computation to electronic circuits. Yet the description cites a universe-to-human reality suffused with this quality while still stuck in a mechanical insignificance. This is the current quandary and paradigm shift that confounds us and must be resolved.

The science of information is the most influential, yet perhaps least appreciated field in science today. Never before in history have we been able to acquire, record, communicate, and use information in so many different forms. Never before have we had access to such vast quantities of data of every kind. This revolution goes far beyond the limitless content that fills our lives, because information also underlies our understanding of ourselves, the natural world, and the universe. It is the key that unites fields as different as linguistics, cryptography, neuroscience, genetics, economics, and quantum mechanics. And the fact that information bears no necessary connection to meaning makes it a profound puzzle that people with a passion for philosophy have pondered for centuries.

14. DNA, RNA, and the protein molecules they assemble are so interdependent that it's hard to picture how life got started in the first place. Survey a selection of intriguing theories, including the view that genetic information in living cells results from eons of natural computation.

23. Physicist John A. Wheeler's phrase "It from bit" makes a profound point about the connection between reality and information. Follow this idea into a black hole to investigate the status of information in a place of unlimited density. Also explore the information content of the entire universe!
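As a point of reference for lecture 23 (an editorial note added here, not part of the course description), the standard Bekenstein-Hawking formula ties a black hole's entropy, and hence its information capacity in bits, to the area A of its event horizon:

S_{\mathrm{BH}} = \frac{k_B\, c^{3} A}{4\, G \hbar}, \qquad I_{\mathrm{bits}} = \frac{S_{\mathrm{BH}}}{k_B \ln 2} = \frac{A}{4\, \ell_P^{2} \ln 2}, \qquad \ell_P = \sqrt{G\hbar / c^{3}}.

For a solar-mass black hole this comes to roughly 10^{77} bits, which is the sense in which one can speak of the information content of a region of space or, by extension, of the entire universe.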

Adamatzky, Andrew, et al. East-West Paths to Unconventional Computing. Progress in Biophysics and Molecular Biology. Online August, 2017. A paper for a forthcoming issue on a whole world bicameral scientific wisdom wherein a notable array of computational philosophers offer personal views upon a natural reality which in some way arises from a mathematic informative program. Among the contributors, for example, we cite Causality in Complexity by H. Zenil, Persian Philosophy by M. M. Dehshibi, Three Steps by S. Stepney, and Cooperation in Rebellion by C. Calude. Some words by S. Akl, Queen’s University, Ontario, may form a capsule.

Adami, Christoph. The Evolution of Biological Information: How Evolution Creates Complexity, from Viruses to Brains. Princeton: Princeton University Press, 2024. A senior authority in computational biology for over 20 years, based at Michigan State University, contributes his unique, 600-page opus. Its comprehensive chapters flow from the subject theory to biotic precursors, its genetic prescription, digital lab experiments, physiologic robustness, RNA origins, all the way to ascendant intelligence and social cooperation. As these aspects unfold in thorough turn, they are braced by many equations.

An overall course becomes evident from the earliest replicants to metabolic biomolecules, cellular animals and relative linguistic knowledge. Although not a theme, it does trace a central trend that winds up with our sentient, observant selves. While Caleb Scharf and Yuval Harari write of algorithms that pass on from the human phase, for Adami the plot thickens and quickens as life’s genomic heredity proceeds from simple cells to neural cognizance. By 2024, a temporal progression may come into view of a procreative ecosmos now trying to decipher and read its own hereditary endowment.

In this final chapter I discussed three prime scales of life: how information is the key element that distinguishes life from nonlife, how the communication of information is the main factor that makes cooperation possible from cells to societies and how information is used to predict the future state of the world. Just as I wrote in the chapter introduction, once we look at biology in the light of information, it is hard not to see the fundamental importance of this concept. (519)

In this book, Christoph Adami adds a 21st century perspective to Darwinian evolution through the lens of an informational quality. This novel theoretical stance sheds light on how cells evolve to communicate, intelligence arises and viruses evolve drug resistance. By this account, information emerges as the central unifying principle which allows us to think about the origin of life on Earth and elsewhere. A leader in the field of computational biology, Adami especially considers the information theory of biomolecules and its content in genetic sequences and proteins. After viewing bacteria and digital organisms, he goes on to explain cooperation among cells, animals, and people. (Publisher)

Christoph Adami is professor of microbiology and molecular genetics at Michigan State University. A pioneer in the application of methods from information theory to the study of evolution, he designed the Avida system that launched the use of digital life as a tool for investigating basic questions in evolutionary biology.

Adami, Christoph. The Use of Information Theory in Evolutionary Biology. Annals of the New York Academy of Sciences. 1256/49, 2012. The Michigan State University computational scientist advances the perception that what life is about, what is going on from proteins to people, is most of all informational and communicative in kind. By this view the rise of animate complexity can be tracked by relative abilities to learn, process, and integrate knowledge. Might we then consider tracing the other way to a vital conducive cosmos which may take on the guise of a textual genomic-like source?

In this review, I focus on two uses of information theory in evolutionary biology: First, the quantification of the information content of genes and proteins and how this information may have evolved along the branches of the tree of life. Second, the evolution of information-processing structures (such as brains) that control animals, and how the functional complexity of these brains (and how they evolve) could be quantified using information theory. The latter approach reinforces a concept that has appeared in neuroscience repeatedly: the value of information for an adapted organism is fitness, and the complexity of an organism’s brain must be reflected in how it manages to process, integrate, and make use of information for its own advantage. (50)

Information is the central currency for organismal fitness, and appears to be that which increases when organisms adapt to their niche. Information about the niche is stored in genes, and used to make predictions about the future states of the environment. Because fitness is higher in well-predicted environments (simply because it is easier to take advantage of the environment’s features for reproduction if they are predictable), organisms with more information about their niche are expected to outcompete those with less information, suggesting a direct relationship between information content and fitness within a niche. (62)
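As a concrete, minimal illustration of the kind of quantification Adami describes, though not his own code, the following Python sketch estimates the per-site information content of a small, hypothetical nucleotide alignment: a fully conserved site carries the maximal two bits, while a fully random site carries about zero.

import math
from collections import Counter

def per_site_information(seqs):
    # Per-site information content (bits) of aligned nucleotide sequences:
    # information = log2(4) minus the Shannon entropy of the observed base
    # frequencies at that site.
    info = []
    for i in range(len(seqs[0])):
        counts = Counter(seq[i] for seq in seqs)
        total = sum(counts.values())
        entropy = -sum((n / total) * math.log2(n / total) for n in counts.values())
        info.append(math.log2(4) - entropy)
    return info

# Hypothetical toy alignment, invented for illustration only.
alignment = ["ACGT", "ACGA", "ACGC", "ACGG"]
print(per_site_information(alignment))   # [2.0, 2.0, 2.0, 0.0]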

Ananthaswamy, Anil. The Evolution of Reality. www.fqxi.org/community/articles. A posting on this Foundational Questions Institute site on November 10, 2009 about the quantum information speculations of LANL physicist Wojciech Zurek, which, as with Vlatko Vedral below, draw much from John Archibald Wheeler. If you go on to click Community and Articles, a wide array of views appear, such as “The Crystallizing Universe” by Kate Becker, that yet languish in a material, insensate cosmos knowable only from remote, arcane domains, rather than as an organic, universe-to-human emergence.

Ananthaswamy, Anil. Why Machines Learn: The Elegant Math Behind Modern AI. New York: Dutton, 2024. The veteran science expositor provides a most comprehensive discourse from its historic origins, such as Gottfried Leibniz’s alphabets and algebra/calculus, on to late 20th century digital/analog algorithmic computations, and into the 2020s with its radical AI neural net, large language model, ChatGPT information phase. By way of an ecosmos vista then, it could well seem that such a participatory cocreation may be trying to pass its whole genetic-like operating system to our Earthuman prodigy going forward.

Arrighi, Pablo and Jonathan Grattage. The Quantum Game of Life. Physics World. June, 2012. As a “digital cosmos” cover story, University of Grenoble “quantum information scientists” report that views of the physical universe as due to a computational process have grown and matured into a major theoretical approach. Precursors are John Archibald Wheeler’s famous It from Bit, John Conway’s Game of Life, and Stephen Wolfram’s cellular automata, and lately Seth Lloyd, Ed Fredkin, and Lee Smolin; search arXiv for an increasing number of advocates. As Vlatko Vedral and others contend, an informational reconception of quantum theory is much in order. A & G note that such a generative program implies an “intrinsic universality” since the same natural code would run everywhere. As the tangled string theory multiverse implodes, this 21st century doubleness, albeit still mechanical, is poised to take its place.

The idea that our universe can be modelled as a giant computer dates back to the 1970s. But as Pablo Arrighi and Jonathan Grattage describe, quantum-information theorists are now hoping to revitalize this idea by making the "digital physics" project compatible with quantum theory. (23) Digital physicists, for their part, are like characters in a video game who are desperately trying to understand the rules. (24)

In other words, if the rule of a particular quantum cellular automaton is complex enough, then it can simulate all other quantum cellular automata. A quantum cellular automaton that can perform such a simulation is said to have intrinsic universality, a concept we have developed in the quantum setting. Hence, if we can find the simplest intrinsically universal rule for a quantum cellular automaton, we can use it to find the simplest and most “natural” (in the sense of how nature does it) way of implementing or simulating physical phenomena. (26)
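The quantum cellular automata that Arrighi and Grattage develop are beyond a short excerpt, but the classical idea they build upon, that one simple local rule applied everywhere can carry out computation, can be shown in a few lines. Here is a minimal Python sketch, our own purely illustrative example, of Wolfram's elementary Rule 110, which is known to be computationally universal; the grid size and step count are arbitrary choices.

def rule110_step(cells):
    # One synchronous update of elementary cellular automaton Rule 110.
    # Each cell's next state depends only on itself and its two neighbors
    # (periodic boundary); the number 110 encodes the lookup table in binary.
    n = len(cells)
    return [
        (110 >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and print a short space-time diagram.
cells = [0] * 40
cells[-2] = 1
for _ in range(20):
    print("".join("#" if c else "." for c in cells))
    cells = rule110_step(cells)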

Auletta, Gennaro. Cognitive Biology: Dealing with Information from Bacteria to Minds. Oxford: Oxford University Press, 2011. Reviewed more in Current Vistas, a huge contribution to this whole information and intelligence revolution.

Bates, Marcia. Information and Knowledge: An Evolutionary Framework for Information Science. Information Research. 10/4, 2005. The UCLA professor of information studies is a main spokesperson for 21st century understandings of these cognitive and societal qualities, as they seem to grow in natural significance. She often contributes essays in this regard to reference works such as the Encyclopedia of Library and Information Sciences; check her copious Publications webpage. This article includes an extensive survey and bibliography to date.

Argument. Information 1 is defined as the pattern of organization of matter and energy. Information 2 is defined as some pattern of organization of matter and energy that has been given meaning by a living being. Knowledge is defined as information given meaning and integrated with other contents of understanding. Elaboration. The approach is rooted in an evolutionary framework; that is, modes of information perception, processing, transmission, and storage are seen to have developed as a part of the general evolution of members of the animal kingdom. Conclusion. Thus, rather than being reductionist, the approach taken demonstrates the fundamentally emergent nature of most of what higher animals and human beings, in particular, experience as information.

The Evolutionary Context. All these differentiations, clusterings and bunchings of matter and energy present in the universe can be the subject of animals' perceptions and interactions with the world, whether with animals, plants, rocks, i.e., matter, or with waves of sound or waves of the earth in an earthquake, i.e., energy. Animals are able to detect and process those differentiations in the universe and experience them as patterns of organization. (7) Surely, it enriches our understanding in information science to begin with this basic understanding of information, then see how this elemental definition contributes to ever-more-sophisticated understandings of patterns of organization, of information, in animal and human lives. (10)

Bawden, David. Organized Complexity, Meaning and Understanding. Aslib Proceedings: New Information Perspectives. 59/4-5, 2007. In this Association for Information Management journal, the City University London information scientist provides a comprehensive survey to date of the waxing reconception from physical cosmology to biology, evolution, and human societies in terms of the prime presence and influence of informative communication. For a 2013 update see “Deep Down Things: In What Ways is Information Physical” in Information Research (18/3), accessible from DB’s website, along with his Towards Quantum Information Science in JASIST, online June 2014 (search).

The purpose of this paper is to outline an approach to a unified framework for understanding the concept of “information” in the physical, biological and human domains, and to see what links and interactions may be found between them. This analysis is used to re-examine the information system discipline, with a view to locating it in a larger context. In turn, this picture is used to reflect on the possibility that information science may not only draw from these other disciplines, but that its insights may contribute to them. (307)

This paper has argued that information may be seen in the physical domain as patterns of organised complexity of matter and energy, with the added implication that information, and perhaps meaning and consciousness, may underlie and suffuse the physical universe to a remarkable extent. In the biological domain, meaning-in-context emerges from the self-organised complexity of biological organisms. In the human domain, understanding emerges from the complex interactions of World 2, the mental product of the human consciousness, with World 3, the social product of recorded human knowledge. The term “emerges” is used deliberately, for these are emergent properties, that is to say they appear appropriate to their level: physical, biological, conscious and social. The linking thread, and the unifying concept of information here, is organized complexity. The crucial events which allow the emergence of new properties are: (1) the origin of the universe, which spawned organised complexity itself; (2) the origin of life, which allowed meaning-in-context to emerge; and (3) the origin of consciousness, which allows self-reflection, and the emergence of understanding, at least partly occasioned when the self reflects on the recorded knowledge created by other selves. (318-319)

Bawden, David and Lyn Robinson. “Waiting for Carnot:” Information and Complexity. Journal of the Association for Information Science and Technology. Online May, 2015. City University London scholars and authors (search here and web pages) contribute to on-going understandings of these title qualities with regard to parallel, integral, evolutionary relations, set within entropic dimensions. A copious bibliography is appended, from which one might suggest Entropy, Complexity, and Spatial Information by Michael Batty, et al in the Journal of Geographical Systems (16/4, 2014), which finds a similar parallel and course.

The relationship between information and complexity is analyzed using a detailed literature analysis. Complexity is a multifaceted concept, with no single agreed definition. There are numerous approaches to defining and measuring complexity and organization, all involving the idea of information. Conceptions of complexity, order, organization, and “interesting order” are inextricably intertwined with those of information. Shannon's formalism captures information's unpredictable creative contributions to organized complexity; a full understanding of information's relation to structure and order is still lacking. Conceptual investigations of this topic should enrich the theoretical basis of the information science discipline, and create fruitful links with other disciplines that study the concepts of information and complexity. (Abstract)
