III. Ecosmos: A Revolutionary Fertile, Habitable, Solar-Bioplanet, Incubator Lifescape

C. The Information Computation Turn
Into the 21st century, aided by computational advances, the concept of information, broadly conceived from algorithms to literate knowledge, has become recognized as a prime independent quality along with, and prior to, matter and energy, space and time. As many references such as James Gleick's The Information (2011) attest, a deep textual dimension provides a missing, codifying element in support of an organic genesis uniVerse. In several ways, the long temporal passage from an ecosmic nature to human culture gains a vital informational-computational empowerment. From Gottfried Leibniz and Alan Turing to Stephen Wolfram and Gregory Chaitin, and on to Gordana Dodig-Crnkovic, Luciano Floridi, Wolfgang Hofkirchner, Sara Walker and many others, the general perception is of a universe that seems to independently program and iteratively compute itself into personal, emergent cognizance. By some analogous genotype/phenotype, software/hardware facility, a double generative mathematical domain reappears in our late midst. Another aspect is a similar informational essence within quantum phenomena, which is covered in the next section.

Information Theoretic Foundations for Physics. www.perimeterinstitute.ca/conferences/information-theoretic-foundations-physics.

A site for this Perimeter Institute, Ontario, week-long conference in May 2015. Its synopsis below could serve as a good summary of the physics paradigm, or lack thereof, which may be remedied by this Turing turn. Among the 22 speakers, all men and no women, are Gerard 't Hooft, Lee Smolin, and Giulio Chiribella, so this is a big event.

In the last few decades, there has been a growing consensus that information theory is of fundamental importance for theoretical physics. The information paradigm has yielded successes in many different areas of the foundations of physics, including the development and application of quantum information theory, Jacobson's thermodynamic derivation of the Einstein equations, the black-hole information paradox, Jaynes' maximum entropy principle, or operational approaches to the foundations of quantum mechanics.

The Science of Information: From Language to Black Holes. www.thegreatcourses.com/courses/the-science-of-information-from-language-to-black-holes.

This Great Courses online offering is taught by Benjamin Schumacher, a Kenyon College physicist. The 24 lectures span The Transformability of Information to The Meaning of Information, #14 and #23 noted below, and cover the gamut from quantum computation to electronic circuits. But the description cites a universe-to-human reality suffused with this quality yet still stuck in a mechanical insignificance. This is the current quandary and paradigm shift that confounds us and must be resolved.

The science of information is the most influential, yet perhaps least appreciated field in science today. Never before in history have we been able to acquire, record, communicate, and use information in so many different forms. Never before have we had access to such vast quantities of data of every kind. This revolution goes far beyond the limitless content that fills our lives, because information also underlies our understanding of ourselves, the natural world, and the universe. It is the key that unites fields as different as linguistics, cryptography, neuroscience, genetics, economics, and quantum mechanics. And the fact that information bears no necessary connection to meaning makes it a profound puzzle that people with a passion for philosophy have pondered for centuries.

Adamatzky, Andrew, et al.
East-West Paths to Unconventional Computing. Progress in Biophysics and Molecular Biology. Online August, 2017.

A paper for a forthcoming issue on a whole-world, bicameral scientific wisdom, wherein a notable array of computational philosophers offer personal views upon a natural reality which in some way arises from a mathematical, informative program. Among the contributions we cite Causality in Complexity by H. Zenil, Persian Philosophy by M. M. Dehshibi, Three Steps by S. Stepney, and Cooperation in Rebellion by C. Calude. Some words by S. Akl, Queen's University, Ontario, may form a capsule.
Adami, Christoph. The Evolution of Biological Information: How Evolution Creates Complexity, from Viruses to Brains. Princeton: Princeton University Press, 2024.
A senior authority in computational biology for over 20 years, based at Michigan State University, contributes his unique 600-page opus. Its comprehensive chapters flow from the subject theory to biotic precursors, its genetic prescription, digital lab experiments, physiologic robustness, RNA origins, all the way to ascendant intelligence and social cooperation. As these aspects unfold in thorough turn, they are braced by many equations.

In this final chapter I discussed three prime scales of life: how information is the key element that distinguishes life from nonlife, how the communication of information is the main factor that makes cooperation possible from cells to societies, and how information is used to predict the future state of the world. Just as I wrote in the chapter introduction, once we look at biology in the light of information, it is hard not to see the fundamental importance of this concept. (519)

Adami, Christoph. The Use of Information Theory in Evolutionary Biology. Annals of the New York Academy of Sciences. Vol. 1256/49, 2012.

The Michigan State University computational scientist advances the perception that what life is about, what is going on from proteins to people, is most of all informational and communicative in kind. By this view the rise of animate complexity can be tracked by relative abilities to learn, process, and integrate knowledge. Might we then consider tracing the other way, toward a vital, conducive cosmos which may take on the guise of a textual, genomic-like source? (A sketch of the sequence-based information measure noted in the abstract appears after the next entry.)

In this review, I focus on two uses of information theory in evolutionary biology: First, the quantification of the information content of genes and proteins and how this information may have evolved along the branches of the tree of life. Second, the evolution of information-processing structures (such as brains) that control animals, and how the functional complexity of these brains (and how they evolve) could be quantified using information theory. The latter approach reinforces a concept that has appeared in neuroscience repeatedly: the value of information for an adapted organism is fitness, and the complexity of an organism's brain must be reflected in how it manages to process, integrate, and make use of information for its own advantage. (50)

Ananthaswamy, Anil. The Evolution of Reality. www.fqxi.org/community/articles.

A posting on this Foundational Questions Institute site on November 10, 2009 about the quantum information speculations of LANL physicist Wojciech Zurek, which, like those of Vedral below, draw much from John Archibald Wheeler. If you go on to click Community and Articles, a wide array of views appear, such as "The Crystallizing Universe" by Kate Becker, that yet languish in a material, insensate cosmos knowable only from remote, arcane domains, rather than as an organic, universe-to-human emergence.
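The first use Adami cites, quantifying the information content of genes, is commonly operationalized as the gap between the maximum entropy of a sequence site (two bits for DNA) and the entropy actually observed across an aligned population, summed over sites. The Python sketch below only illustrates that calculation in outline; it is not code from Adami's work, and the toy alignment and function names are invented for the example.

```python
# A minimal sketch, not Adami's own method or data: estimate the information
# content of a short gene locus from a toy alignment across several individuals.
from collections import Counter
from math import log2

def site_entropy(column):
    """Shannon entropy (bits) of the nucleotide distribution at one aligned site."""
    counts = Counter(column)
    total = sum(counts.values())
    return -sum((n / total) * log2(n / total) for n in counts.values())

def information_content(alignment):
    """Sum over sites of log2(4) minus the observed entropy.

    A fully conserved site contributes 2 bits; a site where all four
    nucleotides appear equally often contributes 0 bits."""
    return sum(2.0 - site_entropy(column) for column in zip(*alignment))

# Hypothetical alignment, invented for illustration.
alignment = [
    "ACGTACGT",
    "ACGTACGA",
    "ACGTTCGT",
    "ACGTACGT",
]

print(f"Estimated information content: {information_content(alignment):.2f} bits")
```

In practice the observed entropies would also be corrected for small sample sizes, but the contrast between conserved and variable sites is the point of the measure.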
Ananthaswamy, Anil. Why Machines Learn: The Elegant Math Behind Modern AI. New York: Dutton, 2024.
The veteran science expositor provides a most comprehensive discourse from historic origins such as Gottfried Leibniz's alphabets, algebra and calculus, onto late 20th century digital/analog algorithmic computations, and into the 2020s with its radical AI neural net, large language model, ChatGPT information phase. By way of an ecosmos vista, it could well seem that such a participatory cocreation may be trying to pass its whole genetic-like operating system on to our Earthuman prodigy going forward.

Arrighi, Pablo and Jonathan Grattage. The Quantum Game of Life. Physics World. June, 2012.

As a "digital cosmos" cover story, University of Grenoble quantum information scientists report that views of the physical universe as due to a computational process have grown and matured into a major theoretical approach. Precursors are John Archibald Wheeler's famous It from Bit, John Conway's Game of Life, Stephen Wolfram's cellular automata, and lately Seth Lloyd, Ed Fredkin, and Lee Smolin; search arXiv for an increasing number of advocates. As Vlatko Vedral and others contend, an informational reconception of quantum theory is much in order. A & G note that such a generative program implies an "intrinsic universality," since the same natural code would run everywhere. As the tangled string theory multiverse implodes, this 21st century doubleness, albeit still mechanical, may rise to take its place.

The idea that our universe can be modelled as a giant computer dates back to the 1970s. But as Pablo Arrighi and Jonathan Grattage describe, quantum-information theorists are now hoping to revitalize this idea by making the "digital physics" project compatible with quantum theory. (23) Digital physicists, for their part, are like characters in a video game who are desperately trying to understand the rules. (24)

Auletta, Gennaro. Cognitive Biology: Dealing with Information from Bacteria to Minds. Oxford: Oxford University Press, 2011.

Reviewed more in Current Vistas, a huge contribution to this whole information and intelligence revolution.

Bates, Marcia. Information and Knowledge: An Evolutionary Framework for Information Science. Information Research. 10/4, 2005.

The UCLA professor of information studies is a main spokesperson for 21st century understandings of these cognitive and societal qualities, as they seem to grow in natural significance. She often contributes essays in this regard for reference works such as the Encyclopedia of Library and Information Sciences; check her copious Publications webpage. This article includes an extensive survey and bibliography to date.

Argument. Information 1 is defined as the pattern of organization of matter and energy. Information 2 is defined as some pattern of organization of matter and energy that has been given meaning by a living being. Knowledge is defined as information given meaning and integrated with other contents of understanding. Elaboration. The approach is rooted in an evolutionary framework; that is, modes of information perception, processing, transmission, and storage are seen to have developed as a part of the general evolution of members of the animal kingdom. Conclusion. Thus, rather than being reductionist, the approach taken demonstrates the fundamentally emergent nature of most of what higher animals and human beings, in particular, experience as information.

Bawden, David. Organized Complexity, Meaning and Understanding. Aslib Proceedings: New Information Perspectives. 59/4-5, 2007.
In this Association for Information Management journal, the City University London information scientist provides a comprehensive survey to date of the waxing reconception, from physical cosmology to biology, evolution, and human societies, in terms of the prime presence and influence of informative communication. For a 2013 update see "Deep Down Things: In What Ways is Information Physical" in Information Research (18/3), accessible from DB's website, along with his Towards Quantum Information Science in JASIST, online June 2014 (search).

The purpose of this paper is to outline an approach to a unified framework for understanding the concept of "information" in the physical, biological and human domains, and to see what links and interactions may be found between them. This analysis is used to re-examine the information system discipline, with a view to locating it in a larger context. In turn, this picture is used to reflect on the possibility that information science may not only draw from these other disciplines, but that its insights may contribute to them. (307)

Bawden, David and Lyn Robinson. "Waiting for Carnot:" Information and Complexity. Journal of the Association for Information Science and Technology. Online May, 2015.

City University London scholars and authors (search here and web pages) contribute to ongoing understandings of these title qualities with regard to their parallel, integral, evolutionary relations, set within entropic dimensions. A copious bibliography is appended, from which one might suggest Entropy, Complexity, and Spatial Information by Michael Batty, et al in the Journal of Geographical Systems (16/4, 2014), which finds a similar parallel course.

The relationship between information and complexity is analyzed using a detailed literature analysis. Complexity is a multifaceted concept, with no single agreed definition. There are numerous approaches to defining and measuring complexity and organization, all involving the idea of information. Conceptions of complexity, order, organization, and "interesting order" are inextricably intertwined with those of information. Shannon's formalism captures information's unpredictable creative contributions to organized complexity; a full understanding of information's relation to structure and order is still lacking. Conceptual investigations of this topic should enrich the theoretical basis of the information science discipline, and create fruitful links with other disciplines that study the concepts of information and complexity. (Abstract)
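As a small illustration of the abstract's point that there are numerous approaches to measuring complexity and organization, all involving the idea of information, the Python sketch below contrasts two elementary proxies: the Shannon entropy of a string and its compressibility. This is only an assumed, illustrative pairing, not a measure proposed by Bawden and Robinson; the sample strings are invented.

```python
# A minimal sketch, standard library only: two rough, information-based proxies
# for the "complexity" of a string. Neither is a definitive metric.
import random
import zlib
from collections import Counter
from math import log2

def shannon_entropy(text):
    """Average bits per character under the observed character frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * log2(n / total) for n in counts.values())

def compressed_ratio(text):
    """Compressed size over original size: a crude stand-in for algorithmic complexity."""
    raw = text.encode("utf-8")
    return len(zlib.compress(raw)) / len(raw)

samples = {
    "ordered": "ab" * 200,                                              # highly regular
    "random":  "".join(random.choice("abcdefgh") for _ in range(400)),  # irregular
}
for name, text in samples.items():
    print(f"{name:8s} entropy = {shannon_entropy(text):.2f} bits/char, "
          f"compression ratio = {compressed_ratio(text):.2f}")
```

Both proxies score the regular string low and the random one high, which is precisely why, as the abstract notes, neither Shannon's formalism nor compressibility alone captures the "interesting order" of organized complexity.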