Linguistic Attractors

HUMAN COGNITIVE PROCESSING is a forum for interdisciplinary research on the nature and organization of the cognitive systems and processes involved in speaking and understanding natural language (including sign language), and their relationship to other domains of human cognition, including general conceptual or knowledge systems and processes (the language and thought issue), and other perceptual or behavioral systems such as vision and non-verbal behavior (e.g. gesture). ‘Cognition’ should be taken broadly, not only including the domain of rationality, but also dimensions such as emotion and the unconscious. The series is open to any type of approach to the above questions (methodologically and theoretically) and to research from any discipline, including (but not restricted to) different branches of psychology, artificial intelligence and computer science, cognitive anthropology, linguistics, philosophy and neuroscience. It takes a special interest in research crossing the boundaries of these disciplines.

EDITORS
Marcelo Dascal (Tel Aviv University)
Raymond Gibbs (University of California at Santa Cruz)
Jan Nuyts (University of Antwerp)

Editorial address: Jan Nuyts, University of Antwerp, Dept. of Linguistics (GER), Universiteitsplein 1, B 2610 Wilrijk, Belgium, e-mail: [email protected]

EDITORIAL ADVISORY BOARD
Melissa Bowerman (Nijmegen); Wallace Chafe (Santa Barbara, CA)
Philip R. Cohen (Portland, OR); Antonio Damasio (Iowa City, IA)
Morton Ann Gernsbacher (Madison, WI); David McNeill (Chicago, IL)
Eric Pederson (Eugene, OR); François Recanati (Paris)
Sally Rice (Edmonton, Alberta); Benny Shanon (Jerusalem)
Lokendra Shastri (Berkeley, CA); Dan Slobin (Berkeley, CA)
Paul Thagard (Waterloo, Ontario)

Volume 2
David L. Cooper

Linguistic Attractors
The cognitive dynamics of language acquisition and change

DAVID L. COOPER
Fairfax, Virginia

JOHN BENJAMINS PUBLISHING COMPANY
AMSTERDAM/PHILADELPHIA

The paper used in this publication meets the minimum requirements of American National Standard for Information Sciences — Permanence of Paper for Printed Library Materials, ANSI Z39.48-1984.

Library of Congress Cataloging-in-Publication Data

Cooper, David L.
Linguistic attractors : the cognitive dynamics of language acquisition and change / David L. Cooper.
p. cm. -- (Human cognitive processing, ISSN 1387-6724 ; v. 2)
Includes bibliographical references and index.
1. Psycholinguistics. 2. Computational linguistics. 3. Linguistic change. I. Title. II. Series.
P37.C614 1999
401’.9--DC21 99-10761
ISBN 90 272 2354 8 (Eur.) / 1 55619 202 9 (US) (alk. paper)
CIP

© 1999 – John Benjamins B.V.
No part of this book may be reproduced in any form, by print, photoprint, microfilm, or any other means, without written permission from the publisher.

John Benjamins Publishing Co. • P.O. Box 75577 • 1070 AN Amsterdam • The Netherlands
John Benjamins North America • P.O. Box 27519 • Philadelphia PA 19118-0519 • USA
Table of contents

Preface
Introduction: Abstractions, Universals, Systems, and Attractors

Chapter 1
Human Architecture: Physical Constraints and Special Accommodations
1.1 Evolution
1.2 Accommodations for Production: The “Breath Group”
1.3 Accommodations for Comprehension and Processing: Fractal Patterns
1.4 Psycholinguistic Results
1.5 Neuropsychological Results
1.6 Parsing: Top-Down, Bottom-Up, and Rule-by-Rule Strategies
1.7 Conclusions

Chapter 2
Possible Neural Network Implementations: General Network Properties and Requirements for Computation
2.1 Rumelhart and McClelland: Emergent Rule-Like Behavior
2.2 Grossberg: Templates, Attention, and Memory
2.3 Turing Machines and Computation
2.4 Hopfield: Ising Glasses, Critical Phenomena, and Memory
2.5 Feedforward Systems: Fast Pattern Classifiers
2.6 Closer Approximations to Natural Neural Networks: The Semantic Attractor Memory
2.7 Computation With Fractal Sets

Chapter 3
Representations

Chapter 4
Attractor Dynamics on Semantic Fields
4.1 Meaning is Measurement
4.2 Ambiguity Exacts a Price for Precision
4.3 Entropy Completes the Dynamics

Chapter 5
Towards an Attractor Grammar
5.1 Recapitulation
5.2 Case
5.2.1 Focus
5.2.2 Equations
5.2.3 Target
5.2.4 Reference
5.2.5 Elaborative Roles
5.2.6 Attractor Measurements
5.2.7 Attractor Basins
5.2.8 The Ordering of Elements
5.3 Some Conclusions

Chapter 6
The Dynamics of Language Change: Beowulf, the Tatian, and German Biblical Texts
6.1 Attractor Dynamics in Beowulf
6.1.1 Ambiguity Indices
6.1.2 Morphological Cylinder Sets: Usage of Case and Mood
6.1.3 Templates
6.2 Sociodynamic Factors for German Mood Selection
6.2.1 Sources and Methodology
6.2.2 Catastrophes
6.2.3 Simple Changes: The Same Number or Fewer Attractors
6.2.3.1 Concessives
6.2.3.2 Negative Imperatives
6.2.3.3 Present Conditional Sentences
6.2.4 Complexifications: More Attractors
6.2.4.1 Past Tense Conditionals
6.2.4.2 Affirmative Wishes
6.2.4.3 Purpose Clauses
6.2.4.4 Result Clauses
6.2.4.5 Indirect Discourse
6.2.5 Sociodynamics
6.3 Some Conclusions: The “Meaning” of the Subjunctive

References
Index

Preface

This book is an attempt to understand language change by combining insights from several disciplines. The backbone is, of course, provided by linguistics, with a great deal gleaned from older structural approaches and more recent studies in dialectology, in addition to current mainstream approaches. To this, however, we must add information from psychological and neurological studies of humans and animals, as well as results from investigations of artificial neural networks and general computation. While the focus is on language change, we will also need to consider language acquisition and processing for the context, framework, and causative forces underlying this change.

The essential metaphor underlying my argument is that language is a statistical ensemble of elements interacting in a dynamic system. To give substance to this idea, we must analyze this information in light of techniques drawn from studies of critical phenomena in physics, as well as the more general techniques from game theory and the theory of “chaos.” I use quotation marks here because I do not wish to imply that language is chaotic itself. It is necessarily a stable system. The theory, however, provides very precise insights into what makes dynamical behavior stable, variable, or unpredictable. One of these insights is the idea of an ‘attractor.’
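The notion can be made concrete with the simplest possible dynamical system: a value fed repeatedly through the same rule. The short Python sketch below is purely illustrative, and is not drawn from the analyses in later chapters; it iterates the familiar logistic map, where one parameter setting pulls every nearby starting point onto a single stable value (an attractor), while another setting produces the variable, unpredictable behavior usually called chaos.

    # Illustrative sketch only: iterate the logistic map x -> r * x * (1 - x)
    # to contrast a stable attractor with chaotic behavior.

    def iterate_logistic(r, x0, steps=60):
        """Return the trajectory of the logistic map from x0 under parameter r."""
        trajectory = [x0]
        x = x0
        for _ in range(steps):
            x = r * x * (1.0 - x)
            trajectory.append(x)
        return trajectory

    # r = 2.8: trajectories settle onto the fixed point 1 - 1/r (about 0.643),
    # the attractor; behavior is stable and predictable.
    stable = iterate_logistic(r=2.8, x0=0.2)
    print("r = 2.8:", [round(v, 4) for v in stable[-3:]])

    # r = 3.9: trajectories stay bounded but never settle, and nearby starting
    # points diverge; this is the "chaotic" regime.
    chaotic = iterate_logistic(r=3.9, x0=0.2)
    print("r = 3.9:", [round(v, 4) for v in chaotic[-3:]])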
The book is consequently aimed at a relatively broad spectrum of “clienteles,” each of which is demanding in its own way: those interested in linguistics, of course; those with an interest in cognitive science as it is applied to language; those with an interest in dynamic systems in general, and social systems in particular; and, finally, those interested in problems of computation, both in general, and as applied to language processing. My hope is to offer something to each group, although practitioners in these specialties might not at first recognize what I have done with their particular formalism to make it fit the problem.

While drawing heavily on the work of others, I make a number of new points that may be of interest as well. Language, for instance, is examined from a variety of angles. When we try to adopt measure theory to create a semantic space and apply concepts derived from thermodynamics, the result is a vector space, structured by means of ‘cylinder sets’ whose dynamics are controlled by ‘precision’ and ‘ambiguity.’ An index is introduced which measures paradigmatic ambiguity, and is able to portray an ‘ambiguity landscape’ peculiar to each language. Language is also examined from the perspective of computation, where we find that the key concepts are ‘composition,’ ‘minimalization,’ and ‘recursion.’ We will see that language processing can be keyed to minimalization steps, and that we can possibly build up parallel-processing variants of more traditional analysis keyed to the ‘juxtaposition’ or ‘superposition’ of elements at each step. We can also track sociodynamic factors affecting language change when we look at language as a dynamic system supported by a speech community.

For cognitive scientists, perhaps the chief contribution of the book is the analysis of ‘universals’ in terms of physical processes and the constraints inherent in processing time-varying pressure waves into meaningful information. This leads to consideration of computation theory and the complications introduced into theoretical models by Universal Turing Machines, which are capable of imitating the behavior of other Turing machines (computers reduced to a kind of Platonic ideal). The distinction between simulating behavior in a model and simply emulating it as a Universal Turing Machine does is developed carefully, so that it should be possible to see that successful emulations might not provide very much insight into some process under study. That is, universal parameters embodied in an emulation might not faithfully represent the general behavior traits and interactions that occur in a dynamic system. However, we are really after the laws that govern how the various components of the system interact among themselves — the true universals — and not simply successful emulations.
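The flavor of emulation is easy to convey with a toy example. The following Python sketch, again purely illustrative and not the treatment given in Chapter 2, steps through whatever transition table it is handed; the host loop plays the part of a universal machine, faithfully reproducing a target machine's behavior while embodying no insight into why that behavior arises.

    # Illustrative sketch only: a minimal Turing machine emulator. The same loop
    # will reproduce the behavior of any machine encoded as a transition table.

    def run_tm(transitions, tape, state="start", blank="_", max_steps=1000):
        """Emulate a machine given as {(state, symbol): (new_state, write, move)}."""
        cells = dict(enumerate(tape))          # sparse tape, indexed by position
        head = 0
        for _ in range(max_steps):
            if state == "halt":
                break
            symbol = cells.get(head, blank)
            state, write, move = transitions[(state, symbol)]
            cells[head] = write
            head += 1 if move == "R" else -1
        return "".join(cells.get(i, blank) for i in range(min(cells), max(cells) + 1))

    # A sample machine: scan right over a block of 1s and append one more 1
    # (unary increment), then halt.
    increment = {
        ("start", "1"): ("start", "1", "R"),
        ("start", "_"): ("halt", "1", "R"),
    }

    print(run_tm(increment, "111"))            # prints 1111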
For computer science itself, the book introduces a new artificial network design: the Semantic Attractor Memory. This was developed precisely because of those concerns about the simulation/emulation mix in various models, as a way to demonstrate how extremely simple components might function in a neural architecture. One immediate result, of course, is that the architecture itself becomes critical to the function of the network, which helps explain why particular structures seem so important in the study of biological neural networks.
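For readers who want a concrete picture of what an ‘attractor memory’ is before reaching Chapter 2, the short Python sketch below implements a conventional Hopfield-style associative memory. It is emphatically not the Semantic Attractor Memory (that design is developed in Section 2.6); it only illustrates the generic phenomenon the name points to: stored patterns become fixed-point attractors, and a degraded input relaxes back onto the nearest one.

    # Illustrative sketch only: a standard Hopfield network, not the Semantic
    # Attractor Memory described in this book. Stored patterns become fixed-point
    # attractors; corrupted inputs are pulled back onto them.
    import numpy as np

    def train_hopfield(patterns):
        """Hebbian weight matrix for an array of +/-1 pattern rows."""
        n = patterns.shape[1]
        w = np.zeros((n, n))
        for p in patterns:
            w += np.outer(p, p)
        np.fill_diagonal(w, 0.0)
        return w / patterns.shape[0]

    def recall(w, state, steps=10):
        """Update synchronously until the state stops changing (an attractor)."""
        for _ in range(steps):
            new_state = np.where(w @ state >= 0, 1, -1)
            if np.array_equal(new_state, state):
                break
            state = new_state
        return state

    patterns = np.array([
        [1, -1, 1, -1, 1, -1, 1, -1],
        [1, 1, 1, 1, -1, -1, -1, -1],
    ])
    w = train_hopfield(patterns)

    noisy = patterns[0].copy()
    noisy[0] = -noisy[0]                       # corrupt one element
    print(recall(w, noisy))                    # recovers the first stored pattern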