Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

IV. Ecosmomics: Independent, UniVersal, Complex Network Systems and a Genetic Code-Script Source

3. Iteracy: A Rosetta Ecosmos Textuality

Cardenas, Juan, et al. Does Network Complexity Help Organize Babel’s Library? arXiv:1409.7336. Chilean and Spanish systems linguists begin with Jorge Luis Borges’ short story The Library of Babel, whose infinite number of books (also known as the universe) lack any catalogue by which to find them. In this paper, it is proposed that in addition to all the textual volumes, the same scale-free networks as found everywhere else serve to organize and interconnect words, paragraphs, and content. This perception is then used to parse works such as The Universal Declaration of Human Rights and Man of La Mancha, so the answer is yes. See also Scaling-Laws in Emotion-Associated Words and Corresponding Network Topology by Takuma Takehara, et al in Cognitive Processing (online November 2014). So might we appreciate that even our whole written corpus is similarly graced by nature’s complex webworks, which might extend to a literal cosmos, as Borges sought to imply.

In this work, we study properties of texts from the perspective of complex network theory. Words in given texts are linked by co-occurrence and transformed into networks, and we observe that these display topological properties common to other complex systems. However, there are some properties that seem to be exclusive to texts; many of these properties depend on the frequency of words in the text, while others seem to be strictly determined by the grammar. Precisely, these properties allow for a categorization of texts into those with sense and others that are encoded or senseless. (Abstract)
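
For readers who want to see the basic construction, here is a minimal sketch (not the authors’ code) of the approach the abstract describes: link words that co-occur within a small sliding window, then inspect the degree distribution of the resulting network. It uses Python with networkx, and the input file name is a placeholder.

# Minimal word co-occurrence network sketch; illustrative only.
from collections import Counter
import networkx as nx

def cooccurrence_network(text, window=2):
    words = text.lower().split()
    g = nx.Graph()
    for i, w in enumerate(words):
        for j in range(i + 1, min(i + 1 + window, len(words))):
            if w != words[j]:
                g.add_edge(w, words[j])
    return g

text = open("sample_text.txt").read()       # placeholder input document
g = cooccurrence_network(text)
degree_counts = Counter(dict(g.degree()).values())
for k in sorted(degree_counts):             # rough degree histogram to eyeball a heavy tail
    print(k, degree_counts[k])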

Castellini, Alberto, et al. A Dictionary Based Informational Genome Analysis. BMC Genomics. 13/485, 2012. In a typical paper for an online Bioinformatics journal, University of Verona computer scientists proceed to study and parse life’s molecular program by way of literate analogies.

Background. In the post-genomic era several methods of computational genomics are emerging to understand how the whole information is structured within genomes. The literature of the last five years accounts for several alignment-free methods, which have arisen as alternative metrics for the dissimilarity of biological sequences. Among others, recent approaches are based on empirical frequencies of DNA k-mers in whole genomes.

Results. Any set of words (factors) occurring in a genome provides a genomic dictionary. About sixty genomes were analyzed by means of informational indexes based on genomic dictionaries, where a systemic view replaces a local sequence analysis. A software prototype applying the methodology outlined here carried out computations on genomic data. We computed informational indexes and built the genomic dictionaries of different sizes, along with frequency distributions. The software performed three main tasks: computation of informational indexes, storage of these in a database, and index analysis and visualization. The validation was done by investigating genomes of various organisms. A systematic analysis of genomic repeats of several lengths, which is of vivid interest in biology (for example, to identify over-represented functional sequences such as promoters), was discussed, and suggested a method to define synthetic genetic networks.
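
As a rough illustration of the “genomic dictionary” idea, one can count the k-mers (factors) of a sequence and compute the Shannon entropy of their frequencies as one simple kind of informational index; the paper’s own indexes may be defined differently, and the toy sequence below is only for demonstration.

# Hedged sketch: k-mer dictionary and an entropy-style informational index.
from collections import Counter
from math import log2

def kmer_dictionary(genome, k):
    return Counter(genome[i:i + k] for i in range(len(genome) - k + 1))

def kmer_entropy(counts):
    total = sum(counts.values())
    return -sum((c / total) * log2(c / total) for c in counts.values())

genome = "ACGTACGTGGAACGT"   # toy sequence; a real genome would be read from a FASTA file
for k in (2, 3, 4):
    d = kmer_dictionary(genome, k)
    print(k, len(d), round(kmer_entropy(d), 3))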

Chai, Lucy and Danielle Bassett. Evolution of Semantic Networks in Biomedical Texts. arXiv:1810.10534. Cambridge University and University of Pennsylvania bio/neuro/info engineers post a frontier paper which limns an affinity between complex cerebral and textual domains. With theoretical depth, both brains and books are rooted in, arise from, and well exemplify a fertile natural milieu. A cumulative advance from 2010 rudiments to 2018 sophisticated analyses can now evince computational similarities across a similar scale of “information transmission networks.” As a result, the presence of universal, invariant “fractal hierarchical modular architectures” is clearly evident. The key insight is an organizational connectivity or computational scaling, as the quotes note, which guides both phases. So once more, a window upon nature’s literate essence is opened. See also A Mathematical Theory of Semantic Development in Deep Neural Networks at 1810.10531, and Evidence of Rentian Scaling of Functional Modules in Diverse Biological Networks in Neural Computation (30/8, 2018).

Language is hierarchically organized: words are built into phrases, sentences, and paragraphs to represent complex ideas. Here we ask whether the organization of language in written text displays the fractal hierarchical architecture common in systems optimized for efficient information transmission. We test the hypothesis that the expositional structure of scientific research articles displays Rentian scaling, and that the exponent of the scaling law changes as the article's information transmission capacity changes. Using 32 scientific manuscripts - each containing between three and 26 iterations of revision - we construct semantic networks which display clear Rentian scaling from the first draft to the final revision. This reflects the evolution in semantic network structure over the manuscript revision process, highlighting a balance between network complexity and efficiency. Taken together, our results suggest that semantic networks reflecting the structure of exposition in scientific research articles display striking hierarchical architecture that arbitrates tradeoffs between competing constraints on network organization. (Abstract excerpt)

A modular hierarchical architecture is observed across various real-life networks. Here, we quantify this fractal-like structure in semantic networks composed of introductions of scientific articles, in particular throughout the drafting and revision process of scientific manuscripts. We observe that the Rentian scaling exponent describing hierarchical network structure varies throughout the publication life cycle and clusters into three main trends among our collection of manuscripts. This evolution of the scaling exponent over the manuscript revision process suggests a balance of network complexity and efficiency, a trade-off which similarly governs other natural and man-made networked systems. (18)

Rent's rule pertains to the organization of computing logic, specifically the relationship between the number of external signal connections to a logic block (i.e., the number of "pins") with the number of logic gates in the logic block, and has been applied to circuits ranging from small digital circuits to mainframe computers. (Wikipedia)
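
In this form, the number of external connections T of a block of G elements scales as T ≈ t·G^p, with p the Rent exponent. Below is a hedged sketch of estimating such an exponent for an arbitrary network by sampling connected “modules,” counting their boundary edges, and fitting a slope on log-log axes; the sampling scheme and the stand-in graph are illustrative assumptions, not the authors’ partitioning procedure.

# Hedged sketch of a Rentian scaling estimate for a network.
import random
import numpy as np
import networkx as nx

def rent_exponent(g, samples=200):
    nodes = list(g.nodes())
    sizes, boundaries = [], []
    for _ in range(samples):
        target = random.randint(4, max(5, len(nodes) // 2))
        seed = random.choice(nodes)
        module = set()
        for v in nx.bfs_tree(g, seed):      # grow a connected module of roughly target size
            module.add(v)
            if len(module) >= target:
                break
        t = sum(1 for u, v in g.edges(module) if (u in module) != (v in module))
        if t > 0:
            sizes.append(len(module))
            boundaries.append(t)
    slope, _ = np.polyfit(np.log(sizes), np.log(boundaries), 1)
    return slope

g = nx.barabasi_albert_graph(500, 3)        # stand-in network for illustration
print("estimated Rent exponent:", round(rent_exponent(g), 2))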

Chebanov, S. The Role of Hermeneutics in Biology. Koslowski, Peter, ed. Sociobiology and Bioeconomics. Berlin: Springer, 1999. From Russia, an attempt to retrieve a “Hermeneutica Sacra,” the art of interpreting the world as “sacred text,” by applying semiotic principles to modern biological and economic realms.

Chen, Heng, et al. How Does Language Change as a Lexical Network? An Investigation Based on Written Chinese Word Co-occurrence Networks. PLoS One. February 28, 2018. Center for Linguistics and Applied Linguistics, Guangdong University, and School of Foreign Studies, Xi’an Jiaotong University, China scholars turn to complexity and network structures and dynamics to reveal that their own language is quite similarly distinguished by nested scales, modules, communities, and so on. In regard, ever again nature’s independent, mathematic source is universally exemplified in each and every realm, which then well implies an innately textual, genomic milieu.

Language is a complex adaptive system, but how does it change? To investigate this process, four diachronic Chinese word co-occurrence networks have been built based on texts written during the last 2,000 years. By comparing the network indicators associated with the hierarchical features in language networks, we learn that the hierarchy of Chinese lexical networks has evolved over time at three different levels. The connections of words at the micro level are weakening; the number of words in meso-level communities has increased; and the network is expanding at the macro level. This means that more words tend to be connected to medium-central words and form different communities. Understanding this process can help us understand the increasing structural complexity of the language system. (Abstract)
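
To make the three levels concrete, a hedged sketch might compute one indicator per level for a series of period-specific co-occurrence networks: mean edge weight (micro), community count and mean size (meso), and node count (macro). The period labels and file names below are placeholders, and the study’s actual indicators may be defined otherwise.

# Hedged sketch of micro/meso/macro indicators for diachronic networks.
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

def level_indicators(g):
    weights = [d.get("weight", 1.0) for _, _, d in g.edges(data=True)]
    micro = sum(weights) / len(weights)                     # mean connection strength
    communities = greedy_modularity_communities(g)
    meso = (len(communities),
            sum(len(c) for c in communities) / len(communities))
    macro = g.number_of_nodes()                             # overall network size
    return micro, meso, macro

for period in ("ancient", "medieval", "modern", "contemporary"):   # placeholder periods
    g = nx.read_weighted_edgelist(f"cooccurrence_{period}.edgelist")
    print(period, level_indicators(g))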

Chen, Hongjia, et al. Scaling Laws and Dynamics of Hashtags on Twitter. arXiv:2004.12707. University of Sydney systems linguists including Eduardo Altmann discern yet another case of nature’s universal mathematic patterns and dynamics at formative presence even in these hyperactive Internet communications.

In this paper we quantify the statistical properties and dynamics of the frequency of hashtag use on Twitter. Hashtags are special words used in social media to attract attention and to organize content. Looking at the collection of hashtags used in a period of time, we identify the scaling laws for their frequency distribution (Zipf's law), the number of unique hashtags as a function of sample size (Heaps' law), and the fluctuations around expected values (Taylor's law). While these scaling laws appear to be universal, their volume and nature depend strongly on time, with bursts at the minute scale, fat-tailed noise, and long-range correlations. Here we view hashtags as memes and quantify emerging properties of their collective interaction including scaling laws and time scales. (Excerpt)
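
A minimal sketch of the first two checks, Zipf’s rank-frequency law and Heaps’ vocabulary-growth law, applied to a flat list of hashtag occurrences could look like this; the input file is a placeholder, and Taylor’s law would additionally require grouping counts into time windows and comparing variance with mean.

# Hedged sketch of Zipf and Heaps scaling estimates for hashtags.
from collections import Counter
import numpy as np

tags = open("hashtags.txt").read().split()        # placeholder: one hashtag per token

# Zipf's law: slope of log(frequency) vs. log(rank)
freqs = sorted(Counter(tags).values(), reverse=True)
ranks = np.arange(1, len(freqs) + 1)
zipf_slope, _ = np.polyfit(np.log(ranks), np.log(freqs), 1)

# Heaps' law: slope of log(unique hashtags) vs. log(sample size)
seen, growth = set(), []
for i, t in enumerate(tags, 1):
    seen.add(t)
    growth.append((i, len(seen)))
ns, vs = zip(*growth[:: max(1, len(growth) // 100)])
heaps_slope, _ = np.polyfit(np.log(ns), np.log(vs), 1)

print("Zipf exponent ~", round(-zipf_slope, 2), " Heaps exponent ~", round(heaps_slope, 2))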

Chrisomalis, Stephen. Numerical Notation: A Comparative History. Cambridge: Cambridge University Press, 2010. A Wayne State University linguistic anthropologist achieves an encyclopedic survey of how Homo sapiens came to develop the alphabetic and numerical systems that so distinguish our literate, technical societies. Among other endeavors, the author is involved in a MathCorps project at WSU to aid the teaching of mathematics in Detroit middle schools.

This book is a cross-cultural reference volume of all attested numerical notation systems (graphic, non-phonetic systems for representing numbers), encompassing more than 100 such systems used over the past 5,500 years. Using a typology that defies progressive, unilinear evolutionary models of change, Stephen Chrisomalis identifies five basic types of numerical notation systems, using a cultural phylogenetic framework to show relationships between systems and to create a general theory of change in numerical systems. Numerical notation systems are primarily representational systems, not computational technologies. Cognitive factors that help explain how numerical systems change relate to general principles, such as conciseness or avoidance of ambiguity, which apply also to writing systems. The transformation and replacement of numerical notation systems relates to specific social, economic, and technological changes, such as the development of the printing press or the expansion of the global world-system. (Publisher)

Church, George. Rosetta Brain: The Multileveled Complexity of the Brain. Marcus, Gary and Jeremy Freeman, eds. The Future of the Brain. Princeton: Princeton University Press, 2014. Within the cerebral atlas project that this book surveys, the audacious Harvard biologist, with Adam Marblestone and Reza Kalhor, proposes a Rosetta Stone analog to convey how a neural network anatomy repeats in kind across many nested stages. In regard, it is a “gigantic matrix,” an “annotated connectome,” that can be sequenced like a genome. Its recurrent form, dialogue, and message can then be cross-translated between the levels. Once again, by a good metaphor, our very brain/mind becomes a microcosmic exemplar of macro reality. See also Rosetta Brains: A Strategy for Molecularly-Annotated Connectomics by this extended group at arXiv:1404.5103.

Going deeper, each cell (whether neuron, glial cell, or otherwise) is composed of a network of self-assembling molecular machines, the dynamics of which is used not only to construct the electrochemical computing elements (neurons) but also to dynamically store and manipulate information within genetic logic circuits and synaptic protein assemblies. On the other hand, the fully functional brain self-organizes from a less-structured precursor during development and learning. (50-51)

To imagine what the dataset would represent, an analogy to a very different field is useful. The Rosetta Stone is a 1,700-pound tablet bearing three inscriptions: a priestly decree in honor of King Ptolemy V in Egyptian hieroglyphics, in Greek script, and in demotic script. Because it presented the exact same statement in three different languages, two of which were known and the third unknown, it provided a key resource for cracking the then-unknown code of the hieroglyphics. Similarly, a Rosetta Brain would convey information about multiple interrelated phenomenological levels of the brain’s biology and allow these levels to be directly compared to one another. (52)

Coecke, Bob. Compositionality as We See it Everywhere Around Us. arXiv:2110.05327. The Oxford University quantum polymath continues his collegial insights into this neoclassical realm, showing how the latest understandings imply an innate natural narrative textuality. See also A Quantum Natural Language Processing Approach to Musical Intelligence by BC and friends at 2111.06741.

There are different meanings of the term "compositionality" within science: what one researcher would call compositional, another would not. The most established conception is characterised by a bottom-up flow of meanings: the meaning of the whole can be derived from the meanings of the parts, and how they are structured together. Inspired by work on compositionality in quantum theory and categorical quantum mechanics, we turn to notions of (Erwin) Schrodinger and A. N. Whitehead. Schrodinger compositionality accommodates quantum theory, and also meaning-as-context. All together, our proposal aims to include the fact that compositionality is at its best when it is `real', `non-trivial', and even more when it also is `complete'. (Abstract excerpt)

Coecke, Bob. From Quantum Foundations via Natural Language Meaning to a Theory of Everything. arXiv:1602.07618. The Oxford University professor of Quantum Foundations, Logics and Structures also leads the Quantum Group of the Oxford Computing Laboratory, so is well versed in both theoretical physics and algorithmic programs. With many colleagues, he has been trying for some years to get a bearing on an apparent correspondence between quantum phenomena and linguistic expression. Into these 2010s, nature’s deepest, basic realm can now be treated as a complex, dynamic system, with new commonalities to everywhere else. See for example his papers The Logic of Quantum Mechanics (1204.3458) and An Alternative Gospel of Structure (1307.4038). Here it is first put that an intrinsic relatedness is needed to complement a particle-only fixation. A tour of quantum and literary realms then leads to a “new compositional distributional model” whence “meaning is everything.” By these insights, a novel “theory of everything” is broached which can convey this holistic, textual, connective, semantic essence, rather than one equation as before.

In this paper we argue for a paradigmatic shift from `reductionism' to `togetherness'. In particular, we show how interaction between systems in quantum theory naturally carries over to modelling how word meanings interact in natural language. Since meaning in natural language, depending on the subject domain, encompasses discussions within any scientific discipline, we obtain a template for theories such as social interaction, animal behaviour, and many others. (Abstract)

The title of this section is a metaphor aimed at confronting the complete disregard that the concept of togetherness has suffered in the sciences, and especially in physics, where all of the effort has been on describing the individual, typically by breaking its description down to that of even smaller individuals. While, without any doubt, this has been a useful endeavour, it unfortunately has evolved into a rigid doctrine, leaving no space for anything else. The most extreme manifestation of this dogma is the use of the term ‘theory of everything’ in particle physics. We will provide an alternative conceptual template for a theory of everything, supported not only by scientific examples, but also by everyday ones. (2)

Of course, if we now consider the example of quantum theory from two sections ago, the analogues to grammatical types are system types i.e. a specification of the kinds (incl. quantity) of systems that are involved. So it makes sense to refine the grammatical types according to the subject domain. Just like nouns in physics would involve specification of the kinds of systems involved, in biology, for example, this could involve specification of species, population size, environment, availability of food etc. Correspondingly, the top part would not just be restricted to grammatical interaction, but also domain specific interaction, just like in the case of quantum theory. (15)

Coecke, Bob. The Logic of Quantum Mechanics. Jennifer Chubb, et al, eds. Logic and Algebraic Structures in Quantum Computing. Cambridge: Cambridge University Press, 2016. We cite this chapter by the Oxford computer theorist because it advances how this basic physical realm can be seen to possess a literate, compositional quality.

We put forward a new take on the logic of quantum mechanics, following Schrödinger's point of view that it is composition which makes quantum theory what it is, rather than its particular propositional structure due to the existence of superpositions. This gives rise to an intrinsically quantitative kind of logic, which truly deserves the name ‘logic’ in that it also models meaning in natural language, the latter being the origin of logic, that it supports automation, the most prominent practical use of logic, and that it supports probabilistic inference. (Abstract)

Coecke, Bob. The Mathematics of Text Structure. arXiv:1904.03478. The Oxford University computer scientist (search) leads a collaborative group there and beyond which studies in part how written literature is suffused by independent mathematical forms and narratives. This deep rooting can be extended even into physical quantum realms, as Diederik Aerts’ Brussels project (search) and Natural Language Processing studies are also finding. A prime basis for Coecke has been the lifetime work of the late McGill University mathematician Joachim (Jim) Lambek (1922-2014), who came to conceive “a compositional algebraic approach to grammar.” This is the subtitle of his 100+ page paper From Word to Sentence, available at math.mcgill.ca/barr/lambek/pdffiles/2008lambek.pdf. BC, JL, and others collaborated around this sense of a compositional cosmos that can in some way be considered and treated as having a meaningful, textual content.

In previous work we gave a mathematical foundation, referred to as DisCoCat, for how words interact in a sentence in order to produce the meaning of that sentence. To do so, we exploited the perfect structural match of grammar and categories of meaning spaces. Here, we give a mathematical foundation, referred to as DisCoCirc, for how sentences interact in texts in order to produce the meaning of that text. We revisit DisCoCat: while in the latter all meanings are states, in DisCoCirc word meanings are types of which the state can evolve, and sentences are gates within a circuit which update the meaning of words. While the developments in this paper are independent of a physical embodiment (cf. classical vs. quantum computing), both the compositional formalism and suggested meaning model are highly quantum-inspired, and implementation on a quantum computer would come with a range of benefits. (Abstract excerpt)
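
As a toy illustration of the compositional-distributional intuition behind DisCoCat, nouns as vectors and a transitive verb as a map that combines them, one might write the following; it collapses the sentence space to a single “plausibility” number and is only a caricature of the idea, not Coecke’s actual DisCoCat/DisCoCirc construction, and all vectors here are invented.

# Toy compositional-distributional sketch: sentence meaning by contracting
# a subject vector, a verb matrix, and an object vector.
import numpy as np

nouns = {                                   # invented 3-dimensional meaning space
    "alice": np.array([1.0, 0.0, 0.2]),
    "bob":   np.array([0.0, 1.0, 0.1]),
    "code":  np.array([0.2, 0.1, 1.0]),
}

likes = np.array([[0.9, 0.1, 0.3],          # invented verb matrix relating subject
                  [0.1, 0.8, 0.4],          # dimensions to object dimensions
                  [0.2, 0.3, 0.7]])

def sentence_meaning(subj, verb, obj):
    # contract subject and object vectors through the verb matrix
    return float(nouns[subj] @ verb @ nouns[obj])

print("alice likes code:", round(sentence_meaning("alice", likes, "code"), 3))
print("bob likes code:  ", round(sentence_meaning("bob", likes, "code"), 3))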
