Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

III. Ecosmos: A Revolutionary Fertile, Habitable, Solar-Bioplanet Lifescape

5. Universal Evolution: A Celestial Expanse

Fraix-Burnet, Didier, et al. The Phylogeny of Quasars and the Ontogeny of Their Central Black Holes. Frontiers in Astronomy and Space Sciences. February, 2017. A recent posting by the Institute of Planetology and Astrophysics of Grenoble natural philosopher about this project, here with Paola Marziani, Padova Astronomical Observatory, Mauro D’Onofrio, University of Padova, and Deborah Dultzin, UNAM Astronomical Institute, to sketch out an evolutionary astrocladistics (Google) for diverse galaxies akin to systematic groupings of organisms. See also their 2017 paper Phylogenetic Analyses of Quasars and Galaxies in this journal, along with Phylogenetic Tools in Astrophysics (1703.00286), The Phylogeny of Quasars (1702.02468), and Concepts of Phylogenetic Classification and Taxonomy (1606.01631) by Fraix-Burnet. For another perception see Cosmic Phylogeny by Paula Jofre, et al. A whole-scale Cosmic Cladistics then seems a thought away, so as to complete a universe-to-us developmental genesis.

Friston, Karl. The History of the Future of the Bayesian Brain. NeuroImage. 62/1230, 2012. After being immersed for two decades in British and American neuroscience, the now Scientific Director of the Wellcome Trust Center for Neuroimaging surveys the discovery in those years of a cerebral dynamic self-organization, along with a cognitive faculty distinguished by an interactive responsiveness via hierarchical scales in congruence with its greater environment. Such a “Bayesian brain” is busy with optimizing its “beliefs” about any input or reply, so as to minimize any expense of “free energy.” As Friston speaks for the field, the approach, via “statistical physics and information theory,” can be seen to reveal another means to join human and universe.

Thomas Bayes (1701-1761) was a British mathematician and Presbyterian minister. Bayesian Statistics is a subset of the field of statistics in which the evidence about the true state of the world is expressed in terms of degrees of belief or, more specifically, Bayesian probabilities. Such an interpretation is only one of a number of interpretations of probability and there are many other statistical techniques that are not based on "degrees of belief". (Wikipedia)

The future of the Bayesian brain is clear: it is the application of dynamic causal modeling to understand how the brain conforms to the free energy principle. In this context, the Bayesian brain is a corollary of the free energy principle, which says that any self-organizing system (like a brain or neuroimaging community) must maximize the evidence for its own existence, which means it must minimize its free energy using a model of its world. Dynamic causal modeling involves finding models of the brain that have the greatest evidence or the lowest free energy. In short, the future of imaging neuroscience is to refine models of the brain to minimize free energy, where the brain refines models of the world to minimize free energy. This endeavor itself minimizes free energy because our community is itself a self organizing system. (Abstract, 1230)

This means that a Bayesian brain that tries to maximize its evidence is implicitly trying to minimize its entropy. In other words, it resists the second law of thermodynamics and provides a principled explanation for self organization in the face of a natural tendency to disorder. This means the Bayesian brain gracefully accommodates ensemble or population dynamics in evolutionary thinking within a statistical framework. In functionalist terms, such a self organizing system that minimizes its entropy would appear to be making Bayesian inferences about its sensory exchanges with the environment, which, of course, is just the Bayesian brain hypothesis. (1233)
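The free-energy idea in the excerpts above can be sketched in miniature. The toy below is not Friston's dynamic causal modeling; the one-variable Gaussian generative model, learning rate, and variable names are illustrative assumptions. A "belief" descends the free-energy gradient and settles on the precision-weighted compromise between prior expectation and sensory evidence, i.e. the Bayesian posterior mean:

```python
def minimize_free_energy(observation, prior_mean,
                         obs_var=1.0, prior_var=1.0,
                         lr=0.1, steps=200):
    """Gradient descent on variational free energy for a Gaussian model:
    F(mu) = (x - mu)^2 / (2*obs_var) + (mu - prior_mean)^2 / (2*prior_var).
    The belief mu converges to the precision-weighted average of the
    prior mean and the observation (the posterior mean)."""
    mu = prior_mean
    for _ in range(steps):
        sensory_error = (observation - mu) / obs_var   # precision-weighted
        prior_error = (mu - prior_mean) / prior_var    # prediction errors
        mu += lr * (sensory_error - prior_error)       # -dF/dmu
    return mu

# equal precisions: the belief lands halfway between prior (0) and data (2)
belief = minimize_free_energy(observation=2.0, prior_mean=0.0)
```

With unequal variances the same routine tilts the belief toward whichever source is more precise, which is the sense in which minimizing free energy "resists" disorder in the quoted passage.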

Goeree, Jacob, et al. Quantal Response Equilibrium: A Stochastic Theory of Games. Princeton: Princeton University Press, 2016. As the book summary notes, Goeree, a University of Technology Sydney physicist and economist, with Charles Holt, a University of Virginia economist, and Thomas Palfrey, a Caltech political economist, propose a more realistic theory to match actual individual and societal behaviors.

Quantal Response Equilibrium presents a stochastic theory of games that unites probabilistic choice models developed in psychology and statistics with the Nash equilibrium approach of classical game theory. Nash equilibrium assumes precise and perfect decision making in games, but human behavior is inherently stochastic and people realize that the behavior of others is not perfectly predictable. In contrast, QRE models choice behavior as probabilistic and extends classical game theory into a more realistic and useful framework with broad applications for economics, political science, management, and other social sciences.
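The contrast with Nash's precise best responses can be illustrated with the logit form of QRE, in which each player's mixed strategy is a softmax over expected payoffs. The sketch below is a hedged illustration, not the book's machinery: the matching-pennies payoffs, rationality parameter lam, and damping step are all assumptions made for the example.

```python
import math

def sigma(x):
    return 1.0 / (1.0 + math.exp(-x))

def logit_qre_2x2(row_payoff, col_payoff, lam=2.0, step=0.1, iters=500):
    """Damped fixed-point iteration toward the logit QRE of a 2x2 game.
    p = P(row plays action 0), q = P(column plays action 0); each player
    quantal-responds (softmax with rationality lam) to the other's mix."""
    p, q = 0.3, 0.7                       # arbitrary interior start
    for _ in range(iters):
        # row's expected-payoff advantage of action 0 over action 1, given q
        du = (q * row_payoff[0][0] + (1 - q) * row_payoff[0][1]) \
           - (q * row_payoff[1][0] + (1 - q) * row_payoff[1][1])
        # column's advantage of its action 0 over action 1, given p
        dv = (p * col_payoff[0][0] + (1 - p) * col_payoff[1][0]) \
           - (p * col_payoff[0][1] + (1 - p) * col_payoff[1][1])
        # damped logit (quantal) responses
        p += step * (sigma(lam * du) - p)
        q += step * (sigma(lam * dv) - q)
    return p, q

# matching pennies: row wants to match, column wants to mismatch
row = [[1, -1], [-1, 1]]
col = [[-1, 1], [1, -1]]
p, q = logit_qre_2x2(row, col)
```

Finite lam keeps every action played with positive probability, which is the stochastic realism the summary describes; as lam grows the QRE sharpens toward a Nash equilibrium.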

Grinin, Leonid, et al. Evolutionary Megaparadigms: Potential, Problems, Perspectives. Grinin, Leonid, et al, eds. Evolution: Cosmic, Biological, and Social. Volgograd: Uchitel Publishing, 2011. An introductory chapter for the first edition of an Almanac posted in full on the Sociostudies site (www.sociostudies.org) of its publisher. The editors and authors are Grinin, director of the Volgograd Center for Social Research, Andrey Korotayev, Russian State University for the Humanities social anthropologist, Robert Carneiro, curator at the American Museum of Natural History, and Fred Spier, University of Amsterdam cosmic historian. But as The Russian Cosmists by George Young (Historic Prescience 2012) records, in contrast to Western analyses of an insensate, accidental reality, the traditional Russian mindset holds more to an Eastern, holistic persuasion. As a result, while cognizant of the latest science, an animate universe-to-human evolutionary genesis is allowed, akin to Vladimir Vernadsky, Pierre Teilhard de Chardin, and a long heritage from Herbert Spencer back to ancient Grecian origins.

“Megaparadigm” then represents an abiding milieu of intrinsic, independent “general evolutionary laws, principles, dynamics, vectors” as they serve to generate life’s emergent development and knowing sentience wherever conducive. A Universal Evolutionism arrays into nested scales of self-organized complexities by way of exemplary recurrences. A quickening vitality and growth thus springs from active matter, energy and information, which lately arises from a biosphere phase to a (potentially) intelligent, reasonable noosphere. An intent of this frontier project is to get a clear, actual perception of a precious Earth and a genesis universe for a much better common future.

One of the clearest manifestations of the evolutionary approach is the form of universal evolutionism (Big History) that considers the process of evolution as a continuous and integral process – from the Big Bang all the way down to the current state of human affairs and beyond. Universal evolutionism implies that cosmic, chemical, geological, biological, and social types of macroevolution exhibit forms of structural continuity. The great importance of this approach (that has both the widest possible scope and a sound scientific basis) is evident. It strives to encompass within a single theoretical framework all the major phases of the universe, from the Big Bang down to forecasts for the entire foreseeable future, while showing that the present state of humankind is a result of the self-organization of matter. (10)

Guglielmo, Magda, et al. A Genetic Approach to the History of the Magellanic Clouds. Monthly Notices of the Royal Astronomical Society. Online August, 2014. With Geraint Lewis and Joss Bland-Hawthorn, University of Sydney astrophysicists propose this novel procedure, drawn from the field of bioinformatics, to aid their studies of interstellar phenomena. As a result, a unique application of evolutionary biology terms, techniques, and selective processes to these far celestial reaches is achieved. The method is akin to popular Bayesian statistics, Markov processes, and computational algorithms, whose inferences altogether treat the cosmos as some manner of universal Darwinism.


The Magellanic Clouds are a pair of irregular dwarf galaxies visible from the southern hemisphere. They are members of our Local Group and may be orbiting our Milky Way galaxy. (Wikipedia)

The history of the Magellanic Clouds is investigated using N-body hydrodynamic simulations where the initial conditions are set by a genetic algorithm. This technique allows us to identify possible orbits for the Magellanic Clouds around the Milky Way, by directly comparing the simulations with observational constraints. We explore the parameter space of the interaction between the Magellanic Clouds and the Milky Way, considering as free parameters the proper motions of the Magellanic Clouds, the virial mass and the concentration parameter (c) of the Galactic dark matter halo. In both orbital models presented here, the mutual interaction between the Magellanic Clouds is able to reproduce the observed features of the Magellanic System. (Abstract excerpts)

Why Do We Need a Genetic Algorithm? Emulating the biological concept of evolution, the genetic algorithm (GA) is a powerful tool to explore a complex parameter space. In biology, given a set of possible genetic sequences (“population of individuals”), the fittest organisms are those strong enough to survive and reproduce themselves in their environments: nature selects creatures with a high probability of survival (“survival of the fittest”). In optimisation problems, given a set of “possible solutions”, the best is the one which better adapts to the requirements imposed by the model. The genetic algorithm mimics the reproduction, mutation and selection to arrive at the fittest set of parameters. Keeping the same terminology from the biological world, a gene is the value of a particular parameter and the phenotype encodes the collection of all parameters which describe a possible solution. (16)

A simple genetic algorithm consists of the following steps: (i) Start by randomly generating an initial population of phenotypes, each representing a possible solution. (ii) Evaluate the fitness of each member of the current population. (iii) Select a pair of genotypes (“parents”) from the current population and breed them, based on their merit. In this way, two new solutions are generated (“offspring”). Repeat this step until the number of offspring produced equals the number of individuals in the current population. (iv) Replace the old population with the new one. (v) Repeat from step (ii) until the fitness criterion is satisfied. (16-17)
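The five steps above can be sketched as a minimal genetic algorithm. This toy is not the paper's N-body setup: the population size, mutation rate, crossover scheme, and fitness function are illustrative choices.

```python
import random

def genetic_algorithm(fitness, n_params, pop_size=20, generations=50,
                      mutation_rate=0.1):
    """Minimal GA following steps (i)-(v): random initialisation, fitness
    evaluation, merit-based parent selection, breeding, and replacement."""
    # (i) randomly generate an initial population of phenotypes in [0, 1]
    pop = [[random.random() for _ in range(n_params)]
           for _ in range(pop_size)]
    for _ in range(generations):
        # (ii) evaluate the fitness of each member
        ranked = sorted(pop, key=fitness, reverse=True)
        # (iii) select parent pairs from the fitter half and breed them
        parents = ranked[:pop_size // 2]
        offspring = []
        while len(offspring) < pop_size:
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n_params) if n_params > 1 else 0
            child = a[:cut] + b[cut:]                      # crossover
            child = [g if random.random() > mutation_rate  # mutation
                     else random.random() for g in child]
            offspring.append(child)
        # (iv) replace the old population with the new one
        pop = offspring
        # (v) repeat until the generation budget (fitness criterion) is met
    return max(pop, key=fitness)

# toy fitness: parameters close to (0.5, 0.5, 0.5) are "fittest"
best = genetic_algorithm(lambda p: -sum((g - 0.5) ** 2 for g in p),
                         n_params=3)
```

In the paper's usage each "gene" would instead be one of the orbital parameters (proper motions, virial mass, concentration c) and the fitness would compare an N-body simulation against the observational constraints.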

Harman, Willis and Elisabet Sahtouris. Biology Revisioned. Berkeley, CA: North Atlantic, 1998. Noted elsewhere, for this section Sahtouris perceives a process of natural selection on a planetary and universe scale.

My metaphor for the reproduction of cosmic life is that the Cosmos scatters planets as star seed, much as plants and animals here below scatter their seed. In both cases, only few seeds in this prolific venture of life actually “sprout” - those that land in the right conditions to support their continuing life.

Jackson, Holly, et al. Using Heritability of Stellar Chemistry to Reveal the History of the Milky Way. arXiv:2011.06453. For a paper to appear in the Monthly Notices of the Royal Astronomical Society, an international, interdisciplinary team from MIT, University of Diego Portales, Chile (Paula Jofre, search), Cambridge University, and the University of Surrey (Robert Foley) continue to perceive and advance evident comparisons between biological and astrophysical evolutionary patterns and processes.

Since chemical abundances are inherited between generations of stars, we use them to trace the evolutionary history of our Galaxy. We present a robust methodology for creating a phylogenetic tree, a biological tool often used to study heritability. Combining our phylogeny with information on stellar ages and dynamical properties, we reconstruct the shared history of 78 stars in the solar neighborhood. The branching pattern in our tree supports a scenario in which the thick disk is an ancestral population of the thin disk. In this paper, we demonstrate how a biological, phylogenetic perspective can help study key processes that have contributed to the evolution of the Milky Way. (Abstract excerpt)

In any evolutionary analysis, there are two components – the actual phylogeny of the constituent lineages, and the ecosystem context. The former is reconstructed from the heritable traits of the organisms, the latter from the environment shaping the phylogeny. In galaxy evolution, the traits or variables contributing to the heritable component are those encoding the chemical pattern of the stars. Variables that reflect the context include those that describe the dynamical situation of the star, such as interactions with the bar and spiral arms and/or the galaxy’s external setting. (2)

Jofre, Paula, et al. Cosmic Phylogeny: Reconstructing the Chemical History of the Solar Neighborhood with an Evolutionary Tree. Monthly Notices of the Royal Astronomical Society. 467/1, 2017. Paula Jofre, a Cambridge University astronomer, with Universidad Diego Portales, Chile, astronomer Payel Das, Universitat Pompeu Fabra, Spain, biologist Jaume Bertranpetit, and Cambridge University anthropologist Robert Foley contribute to a novel reconsideration of stellar and galactic dynamics by way of branching populations of organisms. With a recognition of prior work by Didier Fraix-Burnet (search) and others, phylogenetic trees are constructed for stars in the Milky Way, dubbed an astrocladistics, so as to extend Darwinian evolution across the celestial raiment. See also Galactic Phylogenetics by Paula Jofre and Payel Das at arXiv:1709.09338.

Using 17 chemical elements as a proxy for stellar DNA, we present a full phylogenetic study of stars in the solar neighbourhood. This entails applying a clustering technique that is widely used in molecular biology to construct an evolutionary tree from which three branches emerge. These are interpreted as stellar populations that separate in age and kinematics and can be thus attributed to the thin disc, the thick disc and an intermediate population of probable distinct origin. Combining the ages of the stars with their position on the tree, we are able to quantify the mean rate of chemical enrichment of each of the populations, and thus show in a purely empirical way that the star formation rate in the thick disc is much higher than that in the thin disc. Our method offers an alternative approach to chemical tagging methods with the advantage of visualizing the behaviour of chemical elements in evolutionary trees. (Abstract)
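The clustering idea, with chemical abundances standing in for heritable DNA, can be sketched with a simple average-linkage agglomeration. This is a schematic, not the authors' pipeline (they build neighbour-joining trees from 17 abundance ratios); the four stars and three "elements" below are invented for illustration.

```python
# Toy stars with invented abundance vectors (three "elements" each);
# the actual study uses 17 abundance ratios per star.
stars = {
    "s1": [0.10, 0.05, 0.02],
    "s2": [0.12, 0.06, 0.01],
    "s3": [-0.40, -0.30, -0.25],
    "s4": [-0.38, -0.28, -0.27],
}

def distance(a, b):
    """Chemical distance: Euclidean separation of abundance vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def build_tree(leaves):
    """Average-linkage agglomeration: repeatedly join the two closest
    clusters, yielding a nested-tuple tree (a rooted dendrogram)."""
    clusters = [(name, [vec]) for name, vec in leaves.items()]
    while len(clusters) > 1:
        best = None
        for i in range(len(clusters)):
            for j in range(i + 1, len(clusters)):
                d = sum(distance(a, b) for a in clusters[i][1]
                        for b in clusters[j][1])
                d /= len(clusters[i][1]) * len(clusters[j][1])
                if best is None or d < best[0]:
                    best = (d, i, j)
        _, i, j = best
        merged = ((clusters[i][0], clusters[j][0]),
                  clusters[i][1] + clusters[j][1])
        clusters = [c for k, c in enumerate(clusters)
                    if k not in (i, j)] + [merged]
    return clusters[0][0]

tree = build_tree(stars)
# the two chemically similar pairs branch together:
# (('s1', 's2'), ('s3', 's4'))
```

Interpreting such branches as stellar populations of shared chemical heritage is the move that lets the authors read off thin-disc versus thick-disc histories.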

Joosten, Joost. Complexity Fits the Fittest. Zelinka, Ivan, et al, eds. How Nature Works: Complexity in Interdisciplinary Research and Applications. Berlin: Springer, 2014. The University of Barcelona logician is affiliated with the Algorithmic Nature Group of the Paris-based Laboratory for Scientific Research for the Natural and Digital Sciences. By a general application of Stephen Wolfram’s cellular automata, the real presence of a generative computational source, in effect prior to selection, can now be theoretically explained. This chapter, and a companion paper “On the Necessity of Complexity,” are available on the arXiv website.

In this paper we shall relate computational complexity to the principle of natural selection. We shall do this by giving a philosophical account of complexity versus universality. It seems sustainable to equate universal systems to complex systems or at least to potentially complex systems. Post’s problem on the existence of (natural) intermediate degrees then finds its analog in the Principle of Computational Equivalence (PCE). In this paper we address possible driving forces—if any—behind PCE. Both the natural aspects as well as the cognitive ones are investigated. We postulate a principle GNS that we call the Generalized Natural Selection principle that together with the Church-Turing thesis is seen to be in close correspondence to a weak version of PCE. Next, we view our cognitive toolkit in an evolutionary light and postulate a principle in analogy with Fodor’s language principle. (Complexity Fits the Fittest)

Wolfram's Principle of Computational Equivalence (PCE) implies that universal complexity abounds in nature. This paper comprises three sections. In the first section we consider the question why there are so many universal phenomena around. So, in a sense, we seek a driving force behind the PCE if any. We postulate a principle GNS that we call the Generalized Natural Selection Principle that together with the Church-Turing Thesis is seen to be equivalent to a weak version of PCE. In the second section we ask the question why we do not observe any phenomena that are complex but not-universal. We choose a cognitive setting to embark on this question and make some analogies with formal logic. In the third and final section we report on a case study where we see rich structures arise everywhere. (On the Necessity of Complexity)
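Wolfram's elementary cellular automata, the setting for the PCE, are easy to run. Rule 110 is his canonical example of a universal, hence maximally complex, rule; the sketch below (grid size, boundary choice, and seeding are arbitrary assumptions for the demo) evolves a single seeded cell for a few steps.

```python
def ca_step(cells, rule=110):
    """One step of an elementary cellular automaton with periodic
    boundaries. Each cell's next state is the bit of `rule` indexed
    by its 3-cell (left, center, right) neighbourhood."""
    n = len(cells)
    return [(rule >> ((cells[(i - 1) % n] << 2)
                      | (cells[i] << 1)
                      | cells[(i + 1) % n])) & 1
            for i in range(n)]

# single seeded cell on a 15-cell ring, evolved for five steps
row = [0] * 15
row[7] = 1
history = [row]
for _ in range(5):
    history.append(ca_step(history[-1]))
```

Even from this minimal seed the rows quickly develop the irregular, non-repeating texture that motivates equating universal with complex systems.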

Knott, Paul. Decoherence, Quantum Darwinism, and the Generic Emergence of Our Objective Classical Reality. arXiv:1811.09062. A University of Nottingham, Center for the Theoretical Physics of Quantum Non-Equilibrium Systems mathematician continues to make better sense of this theoretical frontier as it becomes more amenable and familiar. Visit the author’s website at knottquantum.weebly.com for publications, a blog and an illustrated book Our Quantum Reality with engaging entries to many concepts. An earlier version of his work with colleagues is Generic Emergence of Objectivity of Observables in Infinite Dimensions in Physical Review Letters (121/160401, 2018). Also check the UN Center for QS site for examples of how this arcane phase is lately seen to have multifractal, informative, gravity, algorithmic (1812.01032) qualities.

Knuth, Kevin. Information-Based Physics: An Observer-Centric Foundation. Contemporary Physics. Online January, 2014. The SUNY Albany professor of physics and informatics continues the inspiration of John Archibald Wheeler that this existent reality is in some way founded upon, and most distinguished by, a communicative source and conveyance. Such a self-visualizing and activating cosmos then requires at a later point the presence of sentient observers to recognize, acknowledge, and so bring it into full being. Knuth has a series of prior papers on arXiv such as The Physics of Events: A Potential Foundation for Emergent Space-Time. While they, and most theoretical papers, are written in a technical parlance, the point of the message could be that human beings are in fact significantly empowered and entitled to learn, discover, witness and self-select.

It is generally believed that physical laws, reflecting an inherent order in the universe, are ordained by nature. However, in modern physics the observer plays a central role raising questions about how an observer-centric physics can result in laws apparently worthy of a universal nature-centric physics. Over the last decade, we have found that the consistent apt quantification of algebraic and order-theoretic structures results in calculi that possess constraint equations taking the form of what are often considered to be physical laws. The result is an approach to foundational physics where laws derive from both consistent descriptions and optimal information-based inferences made by embedded observers. (Abstract excerpt)

It is generally believed that physical laws reflect an inherent order in the universe. These laws, thought to apply everywhere, are typically considered to have been ordained by nature. In this sense they are universal and nature-centric. However, in the last century, modern physics has placed the observer in a central role resulting in an observer-based physics, which with a potential for subjectivity as well as quantum contextuality, poses conceptual difficulties for reconciliation with a nature-based physics. This raises questions as to precisely how an observer-based physics could give rise to consistent universal laws of nature as well as what role information plays in physics. Perhaps the potential implication of such questions has never been so clearly and concisely put as in Wheeler’s aphorism “It from Bit.” (1)

Kording, Konrad. Bayesian Statistics: Relevant for the Brain? Current Opinion in Neurobiology. 25/130, 2014. In a special issue on Theoretical and Computational Neuroscience, a Northwestern University biophysicist advocates this approach which is lately coming into use across the sciences for optimal choices from a population of options. A best or sufficient bet is achieved by according new experience and/or responses with prior learned memory. For example, Richard Watson, et al (search 2014) proposes life’s evolution as proceeding this way. See also Automatic Discovery of Cell Types and Microcircuitry from Neural Connectomics by Kording and Eric Jonas at arXiv:1407.4137. The whole issue of some 32 articles, e.g. by Adrienne Fairhall, Stanislav Dehaene, and Leslie Valiant, is a significant entry to an endeavor by worldwise humanity to reveal the creaturely cerebration that brought me and We to be. With “connectome” often cited, the papers seem as if they could equally apply to genomes. Might a better term be a “neurome” equivalent?

Bayesian statistics can be seen as a model of the way we understand things. Our sensors are noisy and ambiguous as several worlds could give rise to the same sensor readings. We therefore have uncertainty in our data and cannot be certain which model or hypothesis we should believe in. However, we can considerably reduce uncertainty about the world using previously acquired knowledge and by interpreting data across sensors and time. As new data comes in, we update our hypotheses. Bayesian statistics is the rigorous way of calculating the probability of a given hypothesis in the presence of such kinds of uncertainty. With Bayesian statistics, previously acquired knowledge is called prior, while newly acquired sensory information is called likelihood. (130)
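The prior-likelihood update described in this excerpt is compact enough to state in code. A minimal sketch, in which the coin hypotheses and their numbers are invented purely for illustration:

```python
def bayes_update(prior, likelihood):
    """Posterior over hypotheses: P(h|d) is proportional to P(d|h) P(h),
    normalised so the posterior probabilities sum to one."""
    unnorm = {h: prior[h] * likelihood[h] for h in prior}
    z = sum(unnorm.values())
    return {h: p / z for h, p in unnorm.items()}

# Hypothetical example: is a coin fair, or biased toward heads?
prior = {"fair": 0.5, "biased": 0.5}          # previously acquired knowledge
likelihood = {"fair": 0.5, "biased": 0.9}     # P(observed head | hypothesis)
posterior = bayes_update(prior, likelihood)
# belief in "biased" rises to 0.45 / 0.70, about 0.643
```

Feeding each posterior back in as the next prior is the "update as new data comes in" loop the passage describes.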

Neural connectomics has begun producing massive amounts of data, necessitating new analysis methods to discover the biological and computational structure. It has long been assumed that discovering neuron types and their relation to microcircuitry is crucial to understanding neural function. Here we developed a nonparametric Bayesian technique that identifies neuron types and microcircuitry patterns in connectomics data. It combines the information traditionally used by biologists, including connectivity, cell body location and the spatial distribution of synapses, in a principled and probabilistically-coherent manner. (arXiv Abstract)
