Natural Genesis
A Sourcebook for the Worldwide Discovery of a Creative Organic Universe

VII. Our Earthuman Ascent: A Major Evolutionary Transition in Twindividuality

1. A Cultural (Geonome) Code: Systems Linguistics

Steels, Luc. Language as a Complex Adaptive System. Marc Schoenauer, et al, eds. Parallel Problem Solving from Nature. Berlin: Springer, 2000. Communication arises among “autonomous distributed agents” engaged in learning, organization, selectionism, and co-evolution, which proceed by structural coupling and multi-level formation.

Steels, Luc, ed. Experiments in Cultural Language Evolution. Amsterdam: John Benjamins, 2013. A European effort to connect our linguistic attributes with the evolutionary origins they must have had by way of self-organizing propensities that are in place prior to selection. Two main parts are Emergence of Perceptually Grounded Vocabularies, and Emergence of Grammatical Systems. See also Luc Steels’ paper How Language Emerges in Situated Embodied Interactions in New Perspectives on the Origins of Language (Benjamins, 2013).

This (first) chapter outlines the main challenges a theory for the cultural evolution of language should address and proposes a particular theory which is worked out and explored in greater detail in the remaining chapters of this book. The theory rests on two biologically inspired mechanisms, namely selection and self-organization, mapped onto the cultural, more specifically, linguistic domain. Selectionism is an alternative to rational top-down design. It introduces a distinction between processes that generate possible linguistic variants in a population (for example, different ways to express tense and aspect) and processes that select some variants to survive and become dominant in a language, based on criteria that translate into increased communicative success, such as expressive adequacy, minimal cognitive effort, learnability and social conformity. Self-organization occurs when speakers and hearers align their communication systems based on the outcome of each interaction. It explains how convergence may arise without central coordination or direct telepathic meaning transfer. This chapter explains these basic hypotheses in more detail and introduces a methodology for exploring them based on the notion of a language game. (Introduction: Self-organization and Selection in Cultural Language Evolution)

Steyvers, Mark and Joshua Tenenbaum. The Large-Scale Structure of Semantic Networks. Cognitive Science. 29/1, 2005. As later cited by Markosova above, researchers at UC Irvine and MIT find that the same scale-free organization which recurs everywhere else in nature similarly graces “human semantic knowledge.”

We present statistical analyses of the large-scale structure of 3 types of semantic networks: word associations, WordNet, and Roget’s Thesaurus. We show that they have a small-world structure, characterized by sparse connectivity, short average path lengths between words, and strong local clustering. In addition, the distributions of the number of connections follow power laws that indicate a scale-free pattern of connectivity, with most nodes having relatively few connections joined together through a small number of hubs with many connections. These regularities have also been found in certain other complex natural networks, such as the World Wide Web. (41)
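The hub-dominated, scale-free pattern the abstract describes can be illustrated with a minimal growth model. The sketch below is not Steyvers and Tenenbaum’s analysis of WordNet; it is a generic preferential-attachment simulation (new nodes link to already well-connected nodes), a standard route to the heavy-tailed degree distributions they report. All parameter values are illustrative assumptions.

```python
import random
from collections import Counter

def preferential_attachment(n_nodes=2000, m=2, seed=1):
    """Grow a network in which each new node makes m links, choosing
    targets in proportion to their current degree -- the classic
    mechanism behind scale-free, hub-dominated connectivity."""
    rng = random.Random(seed)
    edges = [(1, 0)]                       # small seed graph
    repeated = [1, 0]                      # each node id appears once per link end
    for new in range(m, n_nodes):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(repeated))   # degree-proportional pick
        for t in chosen:
            edges.append((new, t))
            repeated += [new, t]
    return edges

edges = preferential_attachment()
degree = Counter()
for a, b in edges:
    degree[a] += 1
    degree[b] += 1

top = sum(sorted(degree.values(), reverse=True)[:20])
total = sum(degree.values())
print(f"share of all links held by the 20 largest hubs: {top / total:.0%}")
```

Most nodes end up with only a few links while a small set of hubs concentrates connectivity, the qualitative signature the paper measures in word-association networks.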

Stratford, Barney. The Topology of Knowledge. Journal of Mathematical Psychology. 53/6, 2009. An Oxford University computer scientist considers how linguistic forms might be rightly appreciated as exemplary manifestations of a self-similar natural geometry.

Based on ideas from information theory, in particular the method of arithmetic coding, we describe a metric on sets of strings. In fact, these spaces correspond exactly with compact ultrametric spaces, and we explore their properties in detail. In particular, we look at contraction mappings in these spaces, as they are of fundamental importance in building fractals. We introduce a connection between fractal geometry and formal language theory. There are a number of parallels that can be drawn between the operations that are carried out on fractals and those that occur in formal language theory, and this leads to speculation about how deep the connection goes. (Abstract, 502)
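The “metric on sets of strings” in the abstract can be pictured with a simple stand-in. The code below is not Stratford’s arithmetic-coding construction; it is the standard prefix ultrametric on strings (distance shrinks as the shared prefix grows), offered only to show what a compact ultrametric space of strings looks like and why the strong triangle inequality holds.

```python
def lcp(x: str, y: str) -> int:
    """Length of the longest common prefix of x and y."""
    n = 0
    for a, b in zip(x, y):
        if a != b:
            break
        n += 1
    return n

def d(x: str, y: str) -> float:
    """Prefix ultrametric: strings agreeing on a longer prefix are closer.
    Satisfies the strong triangle inequality d(x,z) <= max(d(x,y), d(y,z))."""
    return 0.0 if x == y else 2.0 ** (-lcp(x, y))

words = ["fractal", "fraction", "frame", "format", "metric"]
# Verify the ultrametric (strong triangle) inequality on all triples.
for x in words:
    for y in words:
        for z in words:
            assert d(x, z) <= max(d(x, y), d(y, z))

print(d("fractal", "fraction"))   # shared prefix "fract" -> 2**-5
```

In such a space every triangle is isosceles and balls nest rather than overlap, which is what makes contraction mappings and fractal constructions natural there.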

Stroik, Thomas and Michael Putnam. The Structural Design of Language. Cambridge: Cambridge University Press, 2013. University of Missouri and Penn State University systems linguists provide another take on “The Biolinguistic Turn” (Chapter 1), akin to Balari and Lorenzo above, but proceed, per the first quote, to seek the causal roots of human discourse within a cosmic material essence.

The importance of Turing’s Thesis to our work is that it brings physical constraints and structural design into biology. It calls, as Fodor and Piattelli-Palmarini note, for a constraint-based biology that questions “blind trial and error followed by natural selection” as a basis for biological growth and assumes instead that “It’s vastly more plausible to suppose that the causes of these (biological) forms are to be found in the elaborate self-organizing interactions between several components that are indeed coded for by genes (protein complexes, morphogenetic gradients, hormones, cell-cell interactions, and so on) and the strictures dictated by chemical and physical forces.” According to this line of argumentation, the evolution of organisms is not primarily governed by extrinsic forces, but rather is driven by the design properties internal to the organism in question. (6)

Although there have been numerous investigations of biolinguistics, many of which appeal to the importance of Turing’s Thesis (that the structural design of systems must obey physical and mathematical laws), these studies have by and large ignored the question of the structural design of language. They have paid significant attention to identifying the components of language – settling on a lexicon, a computational system, a sensorimotor performance system, and a conceptual-intentional performance system; however, they have not examined how these components must be inter-structured to meet thresholds of simplicity, generality, naturalness, and beauty, as well as of biological and conceptual necessity. In this book, Stroik and Putnam argue that the narrow syntax – the lexicon, the Numeration, and the computational system – must reside, for reasons of conceptual necessity, within the performance systems. As simple as this novel design is, it provides, as Stroik and Putnam demonstrate, radical new insights into what the human language faculty is, how language emerged in the species, and how language is acquired by children. (Publisher)

Studdert-Kennedy, Michael. How did Language go Discrete? Tallerman, Maggie, ed. Language Origins. Oxford: Oxford University Press, 2005. Amongst an updated collection from an April 2002 conference at Harvard on the biological roots of speech, morphology, syntax, and symbol, a consideration of a deep analogy between the linguistic and genetic codes.

Tallerman, Maggie and Kathleen Gibson, eds. Oxford Handbook of Language Evolution. Oxford: Oxford University Press, 2011. A 750 page compendium with five Parts: Insights from Comparative Animal Behavior; The Biology of Language Evolution: Anatomy, Genetics, and Neurology; The Prehistory of Language: When and Why Did Language Evolve?; Launching Language: The Development of a Linguistic Species; and Language Change, Creation, and Transmission in Modern Humans. Authorship of the 65 entries is luminary, such as Dorothy Cheney, Frans de Waal, Irene Pepperberg, Tecumseh Fitch, Eors Szathmary, Merlin Donald, Jacques Vauclair, Michael Arbib, Stephen Mithen, Rebecca Cann, Dean Falk, Robin Dunbar, Terrence Deacon, and Joan Bybee. A notable paper is “Self-Organization and Language Evolution” by Bart de Boer on the spontaneous emergence of speech phonetics and lexicons.

Tomasello, Michael. Constructing a Language. Cambridge: Harvard University Press, 2003. In contrast to the generative grammar school of Noam Chomsky, which views language as springing from an innate propensity, the Max Planck Institute anthropologist advocates a usage-based theory of its acquisition. Language is symbolic and syntax-governed, but of cultural-historical occasion, and does not require an innate source. The way the human species learned to speak is seen to be retraced by children.

Wierzbicka, Anna. Innate Conceptual Primitives Manifested in the Languages of the World and in Infant Cognition. Eric Margolis and Stephen Laurence, eds. The Conceptual Mind. Cambridge: MIT Press, 2015. The Australian National University linguist has been an eminent scholar for some five decades with many significant publications. Her main project has been to discern the presence of common, recurrent features across every language. The roots of this endeavor trace to the similar pursuit of Gottfried Leibniz (1646-1716) to find a constant “lingua naturae” or “alphabet of human thoughts.” With Cliff Goddard of Griffith University, she has conceived a modern version dubbed Natural Semantic Metalanguage (NSM, see below), whereby “semantic primes” are found in many world vernaculars. Just as Plotnik and Clayton in this volume describe a common animal acumen, so a basic continuity of expression graces human speech. It is suggested that children, in an ontogenetic fashion, innately tap into this crosslinguistic resource as they learn to speak. See also her Common Language of All People: The Innate Language of Thoughts in Problems of Information Transmission (47/4, 2011), which refers to a “genetic code of the human mind” by way of “universal semantic molecules.” For more sources, we note Semantic Primes and Universal Grammar (2006), and Cross-Linguistic Semantics (2008).

The Natural Semantic Metalanguage approach, originated by Anna Wierzbicka, can lay claim to being the most well-developed, comprehensive and practical approach to cross-linguistic and cross-cultural semantics on the contemporary scene. It has been applied to over 30 languages from many parts of the world. The NSM approach is based on evidence that there is a small core of basic, universal meanings, known as semantic primes, which can be expressed by words or other linguistic expressions in all languages. This common core of meaning can be used as a tool for linguistic and cultural analysis: to explain the meanings of complex and culture-specific words and grammatical constructions (using semantic explications), and to articulate culture-specific values and attitudes (using cultural scripts). The theory also provides a semantic foundation for universal grammar and for linguistic typology. (Goddard website)

Wray, Alison. Formulaic Language and the Lexicon. Cambridge: Cambridge University Press, 2002. Noted more in A Complementary Brain and Thought Process, the work is a significant contribution to understanding how human linguistic abilities manifest an archetypal complementarity in their function and content.

Yamazaki, Makoto, et al, eds. Quantitative Approaches to Universality and Individuality in Language. Berlin: de Gruyter, 2023. The editors are Makoto Yamazaki, National Institute for Japanese Language and Linguistics; Haruko Sanada, Rissho University, Tokyo; Reinhard Köhler, University of Trier, Germany; Sheila Embleton and Eric Wheeler, York University, Toronto; and Relja Vulanović, Kent State University. Seventeen chapters such as Revisiting Zipf’s law: A new indicator of lexical diversity and The Menzerath-Altmann law in the syntactic relations of the Chinese language broadly scope out how the latest linguistic studies are being informed by complex system and machine learning methods.

The series Quantitative Linguistics publishes books on all aspects of quantitative methods and models in linguistics, text analysis and related research fields. Specifically, the scope of the series covers the whole spectrum of theoretical and empirical research, ultimately striving for an exact mathematical formulation and empirical testing of hypotheses: observation and description of linguistic data, application of methods and models, discussion of methodological and epistemological issues, modelling of language and text phenomena.
Quantitative linguistic research reveals fascinating patterns in contemporary and historical linguistic data. The book offers insights from a broad range of languages, including Japanese, Slovene and Catalan. The reader is convinced that statistical empirical analysis – and increasingly also machine learning and big data – should be an essential part of any serious linguistic enquiry.
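The Zipf’s law analyses this volume revisits begin with a rank-frequency table: count every word, sort counts in descending order, and compare log frequency against log rank. The sketch below shows that measurement on a toy sentence; the corpus is invented for illustration and far too small to display a true Zipfian exponent, so only the sign of the fitted slope is meaningful here.

```python
import math
from collections import Counter

def rank_frequency(text: str):
    """Zipf-style table: (rank, word, count), counts sorted descending.
    Zipf's law predicts f(r) ~ C / r^a with exponent a near 1."""
    counts = Counter(text.lower().split())
    return [(r, w, c) for r, (w, c) in
            enumerate(counts.most_common(), start=1)]

def zipf_slope(table):
    """Least-squares slope of log f versus log r; near -1 for Zipfian data."""
    xs = [math.log(r) for r, _, _ in table]
    ys = [math.log(c) for _, _, c in table]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

text = ("the cat sat on the mat and the dog saw the cat "
        "and the dog ran to the mat")
table = rank_frequency(text)
print(table[0])                        # most frequent word and its count
print(round(zipf_slope(table), 2))     # negative slope: frequency falls with rank
```

On real corpora the same two functions, applied to millions of tokens, yield the near minus-one slopes that quantitative linguistics treats as a lexical-diversity baseline.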

Youn, Hyejin, et al. On the Universal Structure of Human Lexical Semantics. Proceedings of the National Academy of Sciences. 113/1766, 2016. Drawing on 21st century worldwide linguistics studies, a senior team from Oxford University, Indiana University, Santa Fe Institute, and the University of New Mexico, including Eric Smith and William Croft, describes a method by which a universality across some 6000 languages can at last be perceived. The lengthy abstract explains the approach. The main idea is to recognize that many words can have several, “polysemic,” meanings in different cultures, which, when teased out and factored, can reveal common themes everywhere.

How universal is human conceptual structure? The way concepts are organized in the human brain may reflect distinct features of cultural, historical, and environmental background in addition to properties universal to human cognition. Semantics, or meaning expressed through language, provides indirect access to the underlying conceptual structure, but meaning is notoriously difficult to measure, let alone parameterize. Here, we provide an empirical measure of semantic proximity between concepts using cross-linguistic dictionaries to translate words to and from languages carefully selected to be representative of worldwide diversity. These translations reveal cases where a particular language uses a single “polysemous” word to express multiple concepts that another language represents using distinct words.

We use the frequency of such polysemies linking two concepts as a measure of their semantic proximity and represent the pattern of these linkages by a weighted network. This network is highly structured: Certain concepts are far more prone to polysemy than others, and naturally interpretable clusters of closely related concepts emerge. Statistical analysis of the polysemies observed in a subset of the basic vocabulary shows that these structural properties are consistent across different language groups, and largely independent of geography, environment, and the presence or absence of a literary tradition. The methods developed here can be applied to any semantic domain to reveal the extent to which its conceptual structure is, similarly, a universal attribute of human cognition and language use. (Abstract)
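The weighted network the abstract describes can be sketched in a few lines: whenever one language expresses two concepts with a single word, that concept pair gains a unit of weight. The toy lexicons below are hypothetical (the study used real cross-linguistic dictionaries over dozens of languages); only the counting scheme follows the abstract.

```python
from collections import Counter
from itertools import combinations

# Toy cross-linguistic lexicons: concept -> word, per language.
# Invented data; MOON/MONTH colexification is a well-known real example.
lexicons = {
    "lang_A": {"MOON": "luna", "MONTH": "luna", "SUN": "sol", "DAY": "dia"},
    "lang_B": {"MOON": "tsuki", "MONTH": "tsuki", "SUN": "hi", "DAY": "hi"},
    "lang_C": {"MOON": "mond", "MONTH": "monat", "SUN": "sonne", "DAY": "tag"},
}

def polysemy_network(lexicons):
    """Weight each concept pair by how many languages express both
    concepts with the same word -- a semantic-proximity measure."""
    weights = Counter()
    for lex in lexicons.values():
        for c1, c2 in combinations(sorted(lex), 2):
            if lex[c1] == lex[c2]:
                weights[(c1, c2)] += 1
    return weights

net = polysemy_network(lexicons)
print(net[("MONTH", "MOON")])   # two of three languages merge MOON and MONTH
```

Run over representative languages worldwide, this count is the edge weight whose clustering the authors show to be largely independent of geography and environment.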
