Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

VII. Our Earthuman Ascent: A Major Evolutionary Transition in Individuality

1. A Cultural (Geonome) Code : Systems Linguistics

This is an eclectic section, largely about human language and communication across its many aspects and versions. A companion resource is Iteracy: Rosetta Ecosmos Textuality above. The significant import is that even such a far-removed domain can yet be seen to manifest the universal dynamic, self-organizing, genetic-like system. Its influential presence is noticed in the evolution of human language with regard to constructive grammar and syntax, along with informational content, and on to complementary script and speech. Personal conversation and even our historic literary corpus are found to have a congruent heritage. Within this website compass, one is prompted to add a “Systems Linguistics” phrase. We also seek to broach an emergent transition to a global “geonome” identity for our Earth-wide accumulated learning repository.

2020: As we continue to report how complex network systems are informing and being applied in every natural and social realm, this section shows that even literate human cultures and conversational dialogue can (necessarily) be seen as an exemplary instance. Many studies have found textual books and communicative speech to hold to and display fractal, self-organized similarities. Within our Earthomo compass as a major emergent transition, could a “geonome” lingua franca be considered, so as to identify a global accumulated knowledge repository and naturome narrative?

Brand, C. O., et al. Analogy as a Catalyst for Cumulative Cultural Evolution. Trends in Cognitive Sciences. March, 2021.
Diessel, Holger. The Grammar Network: How Linguistic Structure is Shaped by Language Use. Cambridge: Cambridge University Press, 2019.
Fabbro, Franco. The Nature and Function of Languages. Languages. 7/4, 2022.
Gontier, Nathalie, et al. Introduction: Language and Worldviews. Topoi. 41/3, 2022.
Heintz, Christophe and Thom Scott-Phillips. Expression Unleashed: The Evolutionary and Cognitive Foundations of Human Communication. Behavioral and Brain Sciences. January, 2022.

Kirby, Simon. Culture and Biology in the Origins of Linguistic Structure. Psychonomic Bulletin & Review. 24/1, 2017.
MacWhinney, Brian and William O’Grady, eds. The Handbook of Language Emergence. New York: Wiley, 2015.
Oudeyer, Pierre-Yves, et al. Computational and Robotic Models of Early Language Development. arXiv:1903.10246.
Raimondi, Vincenzo. The Bio-Logic of Languaging and its Epistemological Background. Language Sciences. Online March, 2018.
Wierzbicka, Anna. Innate Conceptual Primitives Manifested in the Languages of the World and in Infant Cognition. Eric Margolis and Stephen Laurence, eds. The Conceptual Mind. Cambridge: MIT Press, 2015.

2023:

Biosemiotics 2006. www.biosemiotics2006.org. Abstracts for this international conference in Salzburg, Austria are available online and augur new insight and synthesis. In the semiotic view, biological phenomena from protein modules to an oriented evolution most of all involve signification – the conveyance of information and meaning. From fungi to humans, constant modes of communication are thus expressed. Typical participants are Stacey Ake, Jesper Hoffmeyer, Guenther Witzany, Marcello Barbieri, Kalevi Kull, and John Collier. In a natural genesis, after the phenotype of complex bodies and brains, what is implied is an immaterial, genotype-like quality which characterizes and rises with life’s personifying gestation.

Only since exploration of space do we become increasingly aware how the entire architecture of the cosmos seems to be synchronized in a way to make life possible in the first place. (Abstract of “Biophotonics and Information, Matter and Energy - Nonlinear World-View” by Pierre Madl and Maricela Yip)

Aerts, Diederik and Lester Beltran. Human Language: A Boson Gas of Quantum Entangled Cognitions. arXiv:1909.06845. We want to notice this endeavor, albeit with many theoretical flourishes, by Brussels Free University interdisciplinary thinkers (search) so as to broach a synthesis across these spatial and temporal stretches of physical and linguistic essays. By turns, an implication could be that the natural cosmos is deeply textual in kind. In whatever “scriptome” language could it then be written?

We model a sample text of human language which tells a story by means of a quantum structure describing a Bose gas in a Bose-Einstein condensate near absolute zero. For this we introduce energy levels for the concepts (words) in the story and also introduce the new notion of “cognition” as the quantum of human language. Concepts are then “cognitions” in different energy states, as is the case for photons in different energy states of frequency radiation. We show that Bose-Einstein statistics delivers a good model for these pieces of texts telling stories, for short stories as well as for long ones (novels). From this new perspective we conjecture that the way similar concepts are identical and indistinguishable in human language is also how quantum particles are identical and indistinguishable in physical reality, providing new evidence for our conceptuality interpretation of quantum theory. (Abstract flavor)
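As an informal illustration of the kind of fit the abstract describes, the minimal Python sketch below ranks the words of a story by frequency, treats the ranks as energy levels, and fits the counts to a Bose-Einstein-style occupation formula. The exact functional form, the level indexing, and all names here are illustrative assumptions, not the authors’ code or data.

```python
# Minimal sketch of the kind of Bose-Einstein fit the abstract describes:
# words ranked by frequency are treated as "cognitions" occupying energy
# levels, and the observed counts are fitted to a Bose-Einstein occupation
# formula. Function and parameter names are illustrative, not from the paper.
from collections import Counter
import numpy as np
from scipy.optimize import curve_fit

def bose_einstein(energy, A, B):
    """Occupation number A / (exp(energy / B) - 1) for energy > 0."""
    return A / (np.exp(energy / B) - 1.0)

def fit_text(text):
    words = text.lower().split()
    counts = np.array(sorted(Counter(words).values(), reverse=True), dtype=float)
    # Energy levels start at 1 so the denominator never vanishes.
    energies = np.arange(1, len(counts) + 1, dtype=float)
    params, _ = curve_fit(bose_einstein, energies, counts, p0=(counts[0], 1.0))
    return params  # (A, B): amplitude and an effective "temperature"

if __name__ == "__main__":
    sample = open("story.txt").read()   # any plain-text story
    A, B = fit_text(sample)
    print(f"Bose-Einstein fit: A = {A:.2f}, B = {B:.2f}")
```

In this toy reading, the fitted parameter B plays the role of an effective “temperature” of the text.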

Arbib, Michael and Mihail Bota. Language Evolution: Neural Homologies and Neuroinformatics. Neural Networks. 16/9, 2003. The lead article in a special issue on neuroinformatics, a project similar to bioinformatics, which uses computers to organize vast amounts of data. A specific NeuroHomology Database (NHDB) is described as a way to construct an evolutionary continuum from macaque to human with regard to the readiness of the brain for language.

Atkinson, Quentin. The Descent of Words. Proceedings of the National Academy of Sciences. 110/4159, 2013. A commentary by the University of Auckland psychologist on a paper in the same issue, “Automated Reconstruction of Ancient Languages using Probabilistic Models of Sound Change,” by Alexandre Bouchard-Cote, David Hall, Thomas Griffiths, and Dan Klein. For background, circa 1970 Roman Jakobson, Jean Piaget, and others suggested that there is a deep affinity between genetics and linguistics. Some four decades on, this report describes how bioinformatic methods of “probabilistic string transducer algorithms developed for ancestral genome reconstruction” can similarly parse the historical course and structures of proto-languages. While the authors are careful to say their work complements prior linguistic analysis, it achieves a significant advance and verification of the invariant character, widely separated in evolutionary time and space, of life’s salient modes of organic prescription and welfare.

Indeed a recognition ought to be recorded of the epochal discovery that nature’s molecular regulatory and lexical syntax versions are one and the same. From our late vantage each appears, as this site attests, to be a manifest exemplar of an independent, iterative cosmic genetic code in the guise of self-organizing complex modular network system dynamics. Their prime complements of entity agents and emphatic relations, as masculine and feminine principles, along with constant communication, well take on a procreative parent-to-children essence. In this regard an implicate genome of a genesis uniVerse may then be seen to ascend with life’s nested evolutionary emergence, today breaking through via individual and collaborative knowledge unto fledgling self-recognition. By these views, human beings could gain a cognizant gene-like role as the way a family cosmos tries to read its own procreative code.

In this paper, we present an automated system capable of large-scale reconstruction of protolanguages directly from words that appear in modern languages. This system is based on a probabilistic model of sound change at the level of phonemes, building on work on the reconstruction of ancestral sequences and alignment in computational biology. Several groups have recently explored how methods from computational biology can be applied to problems in historical linguistics, but such work has focused on identifying the relationships between languages (as might be expressed in a phylogeny) rather than reconstructing the languages themselves. (4224) Using phonological representations allows us to perform reconstruction and does not require us to assume that cognate sets have been fully resolved as a preprocessing step. Representing the words at each point in a phylogeny and having a model of how they change give a way of comparing different hypothesized cognate sets and hence inferring cognate sets automatically. (4224)

We have developed an automated system capable of large-scale reconstruction of protolanguage word forms, cognate sets, and sound change histories. The analysis of the properties of hundreds of ancient languages performed by this system goes far beyond the capabilities of any previous automated system and would require significant amounts of manual effort by linguists. (4227) Our system is able to reconstruct the words that appear in ancient languages because it represents words as sequences of sounds and uses a rich probabilistic model of sound change. This is an important step forward from previous work applying computational ideas to historical linguistics. By leveraging the full sequence information available in the word forms in modern languages, we hope to see in historical linguistics a breakthrough similar to advances in evolutionary biology prompted by the transition from morphological characters to molecular sequences in phylogenetic analysis. (4228)
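To make the flavor of such probabilistic reconstruction concrete, here is a deliberately toy Python sketch, not the authors’ system: given pre-aligned cognate word forms from several daughter languages, it scores each candidate ancestral phoneme under a simple sound-change model in which a phoneme is retained with high probability and otherwise mutates uniformly. The alphabet, probabilities, and example forms are invented for illustration; the real model also handles insertions, deletions, phonetic context, and unresolved cognate sets.

```python
# A toy illustration (not the authors' system) of choosing ancestral phonemes
# under a simple probabilistic model of sound change: each daughter phoneme is
# assumed to descend from the ancestor unchanged with probability P_KEEP,
# or to mutate to any other phoneme with the remaining mass spread evenly.
from math import log

ALPHABET = "aeioubcdfghjklmnpqrstvwxyz"
P_KEEP = 0.8
P_CHANGE = (1.0 - P_KEEP) / (len(ALPHABET) - 1)

def log_p(daughter, ancestor):
    return log(P_KEEP) if daughter == ancestor else log(P_CHANGE)

def reconstruct(cognates):
    """cognates: list of equal-length, pre-aligned daughter word forms."""
    length = len(cognates[0])
    ancestor = []
    for i in range(length):
        column = [word[i] for word in cognates]
        # Pick the ancestral phoneme with maximum log-likelihood for this column.
        best = max(ALPHABET, key=lambda a: sum(log_p(d, a) for d in column))
        ancestor.append(best)
    return "".join(ancestor)

# Three hypothetical daughter forms of an invented cognate set.
print(reconstruct(["pater", "pader", "fater"]))   # -> "pater"
```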

Aunger, Robert, ed. Darwinizing Culture. Oxford: Oxford University Press, 2000. Considerations of the evolutionary and psychological aspects of Richard Dawkins’ theory of memetics, whereby cultural “memes” such as ideas, phrases, and paradigms may propagate similarly to molecular genes.

Baker, Mark. Linguistic Differences and Language Design. Trends in Cognitive Sciences. 7/8, 2003. Every language is based on universal principles while being distinguished by discrete superficial variations. By this theory, English, Japanese and Mohawk, seemingly unrelated, actually have deep similarities.

Barnard, Alan. Language in Prehistory. Cambridge: Harvard University Press, 2015. In his latest volume, the Edinburgh University professor of the anthropology of southern Africa offers the luminous insight that abilities for linguistic discourse arose in hominid bands for the main purpose of telling stories and conjuring myths about themselves and the fantastic natural realm they found themselves in.

Beckner, Clay, et al. Language as a Complex Adaptive System. http://www.santafe.edu/research/publications/workingpapers/08-12-047.pdf. A draft prospectus for a conference held at the University of Michigan, November 7 – 9, 2008, celebrating the 60th anniversary of the journal Language Learning. The conference website is: http://elicorpora.info/LLC. Its significance, together with a flurry of similar work in both this section and Emergent Genetic Information, is the evidential presence not only in linguistic discourse, but also in genomic networks, of these universal, independent dynamics which could then be significantly appreciated as a “cosmic genetic code.”

Language as a CAS involves the following key features: The system consists of multiple agents (the speakers in the speech community) interacting with one another. The system is adaptive, that is, speakers’ behavior is based on their past interactions, and current and past interactions together feed forward into future behavior. A speaker’s behavior is the consequence of competing factors ranging from perceptual constraints to social motivations. The structures of language emerge from interrelated patterns of experience, social interaction, and cognitive mechanisms. The CAS approach reveals commonalities in many areas of language research, including first and second language acquisition, historical linguistics, psycholinguistics, language evolution and computational modeling.

The advantage of viewing language as a complex adaptive system is that it allows us to provide a unified account of seemingly unrelated linguistic phenomena as properties of a single system. These phenomena include: variation at all levels of linguistic organization; the probabilistic nature of linguistic behavior; continuous change within agents and across speech communities; the emergence of grammatical regularities from the interaction of agents in language use; and stage-like transitions due to underlying nonlinear processes. We outline how the complex adaptive systems approach reveals commonalities in many areas of language research, including cognitive linguistics, sociolinguistics, first and second language acquisition, psycholinguistics, historical linguistics, and language evolution.

Beckner, Clay, et al. Language as a Complex Adaptive System: Position Paper. Language Learning. 59/Supp. 1, 2009. The issue collects papers from Santa Fe Institute and University of Michigan conferences. While inspired by UM’s John Holland, a founder of this theoretical concept, the ten authors, including Holland, Richard Blythe, Joan Bybee, Morten Christiansen, William Croft, Nick Ellis, Jinyun Ke, Diane Larsen-Freeman, and Tom Schoenemann (search), represent a cadre of leading linguists. In this CAS regard, there are several ways that multiple agents or elements interact – as speakers in dynamic conversation, and as words or sentences in a textual document. Salient characteristics of CAS are: Distributed Control and Collective Emergence, Intrinsic Diversity, Perpetual Dynamics, Nonlinearity and Phase Transitions, and Adaptive Network Structures.

The advantage of viewing language as a CAS is that it allows us to provide a unified account of seemingly unrelated linguistic phenomena. These phenomena include the following: variation at all levels of linguistic organization; the probabilistic nature of linguistic behavior; continuous change within agents and across speech communities; the emergence of grammatical regularities from the interaction of agents in language use; and stagelike transitions due to underlying nonlinear processes. We outline how the CAS approach reveals commonalities in many areas of language research, including cognitive linguistics, sociolinguistics, first and second language acquisition, psycholinguistics, historical linguistics, and language evolution. (2)

Cognition, consciousness, experience, embodiment, brain, self, human interaction, society, culture, and history are all inextricably intertwined in rich, complex, and dynamic ways in language. Everything is connected. Yet despite this complexity, despite its lack of overt government, instead of anarchy and chaos, there are patterns everywhere. Linguistic patterns are not preordained by God, genes, school curriculum, or other human policy. Instead, they are emergent—synchronic patterns of linguistic organization at numerous levels (phonology, lexis, syntax, semantics, pragmatics, discourse, genre, etc.), dynamic patterns of usage, diachronic patterns of language change (linguistic cycles of grammaticalization, pidginization, creolization, etc.), ontogenetic developmental patterns in child language acquisition, global geopolitical patterns of language growth and decline, dominance and loss, and so forth. We cannot understand these phenomena unless we understand their interplay. (18)
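A standard toy model of this language-as-CAS picture is the naming game introduced by Luc Steels (not a model from the position paper itself): agents with no central control, each holding a private inventory of names for one object, converge on a single shared word purely through repeated pairwise interactions. A minimal Python sketch, with all parameters chosen only for illustration:

```python
# A minimal naming-game simulation, a standard toy model of language as a
# complex adaptive system (the game is due to Luc Steels, not to the position
# paper): a shared convention emerges from local speaker-hearer exchanges.
import random

def naming_game(n_agents=50, n_rounds=20000, seed=1):
    random.seed(seed)
    inventories = [set() for _ in range(n_agents)]
    for step in range(n_rounds):
        speaker, hearer = random.sample(range(n_agents), 2)
        if not inventories[speaker]:
            inventories[speaker].add(f"word{step}")      # invent a new name
        word = random.choice(sorted(inventories[speaker]))
        if word in inventories[hearer]:
            # success: both agents drop every competing name
            inventories[speaker] = {word}
            inventories[hearer] = {word}
        else:
            inventories[hearer].add(word)                 # failure: hearer learns it
    distinct = {w for inv in inventories for w in inv}
    return len(distinct)

print("distinct words remaining:", naming_game())   # typically converges to 1
```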

Bel-Enguix, Gemma and Dolores Jimenez-Lopez. Biocomputing: An Insight from Linguistics. Natural Computing. 11/1, 2012. Universitat Rovira i Virgili, Tarragona, Spain, Mathematical Linguistics Group researchers, whose project is to integrate biology, computation and linguistics, present an overview of their fertile intersections. In this regard, with Luc Steels, natural language is to be known as a “complex adaptive system.” A growing number of contributions like this across biosemiotic, computational information, and genomic domains are quite defining an innately textual, scriptural nature, as long foretold. By these lights, an independent, universal narrative and message seems ever to rewrite, retell and expound itself. The authors, with Veronica Dahl, are also editors of Biology, Computation and Linguistics: New Interdisciplinary Paradigms (IOS Press, 2011). Their chapter therein, “Genetic Code and Verbal Language: Syntactic and Semantic Analogies,” is a historical survey of the commonalities and steady convergence of these premier programs.

This paper is placed in a formal framework in which the interdisciplinary study of natural language is conducted by integrating linguistics, computer science and biology. It provides an overview of the field of research, conveying the main biological ideas that have influenced research in linguistics. Our work highlights the main methods of molecular computing that have been applied to the processing and study of the structure of natural language: DNA computing, membrane computing and networks of evolutionary processors. Moreover, some new challenges and lines of research for the future are pointed out, that can provide important improvements in the understanding of natural language as a structure and a human capacity. (Abstract)


Surprisingly, and despite the fact that the genetic code has been considered a code from its discovery, linguistics has not attempted to construct a new paradigm taking advantage of the great developments in molecular biology. Nevertheless, it is an especially suggestive idea, because of the interesting similarities between natural language and the genetic code in different levels. Natural Language Processing (NLP) can take great advantage of the structural and “semantic” similarities between these codes. Specifically, taking the systemic code units and methods of combination of the genetic code, the methods of such entity can be translated to the study of natural language. (132)

Binder, Phillippe and Kenny Smith, eds. The Language Phenomenon: Human Communication from Milliseconds to Millennia. Berlin: Springer, 2013. As the subtitle cites, a synoptic edition that seeks an integration and rooting of words, grammar, syntax, and the content of script and speech with life’s long episodic, evolutionary trajectory. Might we even imagine a true cosmic and global gestation ever trying to represent, describe, say something, in an expected response: we read you? Certain chapters are “Learning: Statistical Mechanisms in Language Acquisition” by Elizabeth Wonnacott, and “Evolution: Language Use and the Evolution of Languages.” The entries by Pierre-Yves Oudeyer and Simon Kirby are reviewed separately for their significant advances.

Blackmore, Susan. The Power of Memes. Scientific American. October, 2000. A popular article to summarize the author’s book The Meme Machine (Oxford, 1999) which expands on Richard Dawkins’ proposal that ideas, concepts and so on are cultural equivalents to molecular genes. By this view, our lives are much taken with their influence and propagation.
