VII. Our Earthuman Ascent: A Major Evolutionary Transition in Individuality

1. A Cultural (Geonome) Code: Systems Linguistics

Studdert-Kennedy, Michael. How Did Language Go Discrete? Tallerman, Maggie, ed. Language Origins. Oxford: Oxford University Press, 2005. Amongst an updated collection from an April 2002 conference at Harvard on the biological roots of speech, morphology, syntax, and symbol, a consideration of a deep analogy between the linguistic and genetic codes.

Tallerman, Maggie and Kathleen Gibson, eds. The Oxford Handbook of Language Evolution. Oxford: Oxford University Press, 2011. A 750-page compendium with five parts: Insights from Comparative Animal Behavior; The Biology of Language Evolution: Anatomy, Genetics, and Neurology; The Prehistory of Language: When and Why Did Language Evolve?; Launching Language: The Development of a Linguistic Species; and Language Change, Creation, and Transmission in Modern Humans. The authorship of the 65 entries is luminary, including Dorothy Cheney, Frans de Waal, Irene Pepperberg, Tecumseh Fitch, Eors Szathmary, Merlin Donald, Jacques Vauclair, Michael Arbib, Steven Mithen, Rebecca Cann, Dean Falk, Robin Dunbar, Terrence Deacon, and Joan Bybee. A notable paper is "Self-Organization and Language Evolution" by Bart de Boer, on the spontaneous emergence of speech phonetics and lexicons.

Tomasello, Michael. Constructing a Language. Cambridge: Harvard University Press, 2003. In contrast to Noam Chomsky's generative grammar school, which views language as springing from an innate propensity, the Max Planck Institute anthropologist advocates a usage-based theory of its acquisition. Language is symbolic and syntax-governed, of cultural-historic occasion, and does not require an innate source. The way the human species learned to speak is seen to be retraced by children.

Wierzbicka, Anna. Innate Conceptual Primitives Manifested in the Languages of the World and in Infant Cognition.
Eric Margolis and Stephen Laurence, eds. The Conceptual Mind. Cambridge: MIT Press, 2015. The Australian National University linguist has been an eminent scholar for some five decades, with many significant publications. Her main project has been to discern common, recurrent features present across every language. The roots of this endeavor trace to the similar pursuit of Gottfried Leibniz (1646-1716) to find a constant "lingua naturae" or "alphabet of human thoughts." With Cliff Goddard of Griffith University, she has conceived a modern version dubbed Natural Semantic Metalanguage (NSM, see below), whereby "semantic primes" are found in many world vernaculars. Just as Plotnik and Clayton in this volume describe a common animal acumen, so a basic continuity of expression graces human speech. It is suggested that children, in an ontogenetic fashion, innately tap into this crosslinguistic resource as they learn to speak. See also her Common Language of All People: The Innate Language of Thoughts in Problems of Information Transmission (47/4, 2011), which refers to a "genetic code of the human mind" by way of "universal semantic molecules." For further sources, we note Semantic Primes and Universal Grammar (2006) and Cross-Linguistic Semantics (2008). The Natural Semantic Metalanguage approach, originated by Anna Wierzbicka, can lay claim to being the most well-developed, comprehensive and practical approach to cross-linguistic and cross-cultural semantics on the contemporary scene. It has been applied to over 30 languages from many parts of the world. The NSM approach is based on evidence that there is a small core of basic, universal meanings, known as semantic primes, which can be expressed by words or other linguistic expressions in all languages.
This common core of meaning can be used as a tool for linguistic and cultural analysis: to explain the meanings of complex and culture-specific words and grammatical constructions (using semantic explications), and to articulate culture-specific values and attitudes (using cultural scripts). The theory also provides a semantic foundation for universal grammar and for linguistic typology. (Goddard website)
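The prime-based method just described can be pictured in miniature: a closed inventory of semantic primes, plus a check that an explication is composed only from that inventory. In the sketch below, the small prime subset follows published NSM tables, but the explication of "happy" is an invented, simplified paraphrase for illustration only, not an official NSM analysis.

```python
# Toy illustration of the NSM idea: complex word meanings are explicated
# using only a small closed set of universal semantic primes.
# The prime subset follows published NSM tables (heavily abridged);
# the "happy" explication is a simplified paraphrase, not official NSM.

SEMANTIC_PRIMES = {
    "i", "you", "someone", "something", "people", "this",
    "good", "bad", "big", "small",
    "think", "know", "want", "feel", "see", "hear", "say", "do", "happen",
    "not", "can", "because", "if", "like", "now", "before", "after",
}

def non_prime_words(explication: str) -> list:
    """Return any words in an explication that fall outside the prime set."""
    words = explication.lower().replace(",", " ").split()
    return [w for w in words if w not in SEMANTIC_PRIMES]

# A simplified, invented explication of "happy" in the spirit of NSM:
happy = "i feel something good now because something good happen before"
print(non_prime_words(happy))        # [] -- composed entirely of primes
print(non_prime_words("i feel joy")) # ['joy'] -- 'joy' is not a prime
```

The check makes the core claim concrete: an explication is well-formed only when every word in it reduces to the universal core.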
Wray, Alison. Formulaic Language and the Lexicon. Cambridge: Cambridge University Press, 2002.
Noted more in A Complementary Brain and Thought Process, the work is a significant contribution to understanding how human linguistic abilities manifest an archetypal complementarity in their function and content.

Yamazaki, Makoto, et al, eds. Quantitative Approaches to Universality and Individuality in Language. Berlin: de Gruyter, 2023. The editors are Makoto Yamazaki, National Institute for Japanese Language and Linguistics; Haruko Sanada, Rissho University, Tokyo; Reinhard Köhler, University of Trier, Germany; Sheila Embleton and Eric Wheeler, York University, Toronto; and Relja Vulanović, Kent State University. Seventeen chapters, such as Revisiting Zipf's Law: A New Indicator of Lexical Diversity and The Menzerath-Altmann Law in the Syntactic Relations of the Chinese Language, broadly scope out how the latest linguistic studies are being informed by complex-system and machine-learning methods. The series Quantitative Linguistics publishes books on all aspects of quantitative methods and models in linguistics, text analysis and related research fields. Specifically, the scope of the series covers the whole spectrum of theoretical and empirical research, ultimately striving for an exact mathematical formulation and empirical testing of hypotheses: observation and description of linguistic data, application of methods and models, discussion of methodological and epistemological issues, modelling of language and text phenomena.

Youn, Hyejin, et al. On the Universal Structure of Human Lexical Semantics. Proceedings of the National Academy of Sciences. 113/1766, 2016. Drawing on 21st-century worldwide linguistics studies, a senior team from Oxford University, Indiana University, the Santa Fe Institute, and the University of New Mexico, including Eric Smith and William Croft, describes a method by which a universality across some 6,000 languages can at last be perceived. The long abstract explains.
The main idea is to recognize that many words can have several "polysemous" meanings in different cultures, which, if teased out and factored, can reveal common themes everywhere. How universal is human conceptual structure? The way concepts are organized in the human brain may reflect distinct features of cultural, historical, and environmental background in addition to properties universal to human cognition. Semantics, or meaning expressed through language, provides indirect access to the underlying conceptual structure, but meaning is notoriously difficult to measure, let alone parameterize. Here, we provide an empirical measure of semantic proximity between concepts using cross-linguistic dictionaries to translate words to and from languages carefully selected to be representative of worldwide diversity. These translations reveal cases where a particular language uses a single "polysemous" word to express multiple concepts that another language represents using distinct words.
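The translation method described in the abstract can be sketched in miniature: two concepts are linked whenever some language covers both with a single polysemous word, and the link weight counts how many languages merge them. The mini-lexicon below is invented for illustration; the study itself draws on cross-linguistic dictionaries over a representative worldwide sample of languages.

```python
# Toy sketch of a polysemy network in the spirit of Youn et al. (2016):
# concepts sharing one word in some language get a weighted link.
# The three-language lexicon here is invented for illustration only.
from collections import Counter
from itertools import combinations

# language -> word -> set of concepts that word can express
lexicons = {
    "lang_A": {"wordsun": {"SUN", "DAY"}, "wordmoon": {"MOON"}},
    "lang_B": {"w1": {"SUN"}, "w2": {"MOON", "MONTH"}},
    "lang_C": {"x1": {"SUN", "DAY"}, "x2": {"MOON", "MONTH"}},
}

edges = Counter()
for words in lexicons.values():
    for concepts in words.values():
        for a, b in combinations(sorted(concepts), 2):
            edges[(a, b)] += 1  # one more language merges concepts a and b

print(edges.most_common())
# SUN-DAY and MOON-MONTH are each merged by two of the three toy languages
```

Clusters of heavily weighted links in such a network are what the authors read as universal conceptual structure shared across languages.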