The First Word: The Search for the Origins of Language


by Christine Kenneally


  Kirby, who completed undergraduate and graduate degrees in the study of language evolution, was appointed lecturer in language evolution at the University of Edinburgh at thirty-three. This was the first appointment of its kind in the world. Indeed, Kirby is still probably the only academic with language evolution in his job title. Each morning he heads off to his office in the linguistics department, and as he goes through his day, he talks to staff, other lecturers, and students. In lectures, tutorials, and simple hellos in the corridor, Kirby and his interlocutors exchange a certain number of words. If you could zoom out on the department, you would see Kirby and everyone he spoke to zipping around, stopping to connect with one another, and moving off again. Imagine these interactions in fast-forward, the days accelerating into weeks and then years, and all the while see how Kirby and his colleagues talk incessantly. Watch language bubble, build, and evaporate.

  Let’s assume that as Kirby and his interlocutors get older, they have children, and eventually the children replace them in all the running around and constant talking. Then their children have children. And their children follow in their footsteps. As the talk continues, the language starts to grow and change. Kirby himself may have disappeared relatively early in the process, but the people he spoke to live on, influenced by their conversations with him, and even though they, too, eventually die, the people they spoke to are influenced by them, and indirectly influenced by what Kirby said. Imagine if you could watch this process unfold from the dawn of humanity, watch the first speakers speak and the first listeners listen, and see how meaning and structure develop. Over time, words proliferate and begin to cluster in particular ways, regularities appear, and structural patterns begin to emerge. This grand view of the history of language is a little like what Kirby seeks in his research. His specialty is computer modeling of the evolution of language.

  Until the 1990s, changes within and between languages could be tracked only by using the comparative method of linguistic reconstruction. But that technique has limitations. No single language from which all the world’s languages are known to have descended has been reconstructed. The comparative method can unearth traces of language from as far back as six thousand years ago, but not much further than that. Computer modeling starts from the opposite end of the language chain. Instead of beginning with contemporary language and reconstructing past versions from it, Kirby creates populations of digital individuals called agents. He hands them some small amount of meaning, maybe a few rules, and then steps back and watches what they do with it.

  Jim Hurford, Kirby’s supervisor, kicked off the digital modeling of language in the late 1980s. “Jim had read The Selfish Gene by Richard Dawkins,” said Kirby,

  and in that Dawkins describes a computational model, where these things called biomorphs evolve, you know, bodies and things. Jim read that, and thought, Wow, I wonder if I could do that for language. So he started running these simulations on the VAX, an old-fashioned mainframe computer that we had back in the ’80s. He would tie up so much of the computing power, the whole department would be paralyzed, and they wouldn’t be able to read their e-mail or anything. It was groundbreaking stuff, and he did it really out of a vacuum.

  Jim modeled various things, like speech sounds. He built a model about vocabulary and numeral systems, and he did one on the critical period for language learning, which is this idea that we can learn language very easily when we’re young, but after a certain age we stop, and our language-learning ability kind of switches off. The question he was trying to understand was: Why on earth did something like that evolve? Why not have the ability to learn language all through your life? And his computational model showed that a critical period did evolve in his agents.

  As an undergraduate, Kirby had been deeply inspired by Hurford’s lectures. “His ideas about computational modeling really seemed fantastic, and it was just what I wanted to do.” So when Kirby finished his undergraduate degree, he enrolled as a Ph.D. student under Hurford.

  At around this time, Steven Pinker published The Language Instinct, in which he describes Hurford’s “critical period” model and refers to Hurford as the world’s only computational evolutionary linguist. Since then, the computer modeling of language has boomed. “Every year,” said Kirby, “there are more people using the computational approach to language evolution.” Today, less than twenty years after Hurford periodically paralyzed the University of Edinburgh’s linguistics department, the school is offering the first degree specifically in the subject, an M.S. in the evolution of language and cognition, and hundreds of researchers around the world are working on computer modeling.

  Even though science has been getting better and better at tracking the elusive clues to our biological language suite, we still don’t know how language itself got here in the first place. Computer modeling promises to be a most useful tool in this quest. Beyond the godlike allure of creating populations and then watching them evolve into different kinds of creatures, the technique became so popular so quickly because modeling proposes to answer such questions as: How did the wordlike items our ancestors used proliferate into today’s vocabularies of many tens of thousands of words, governed by rules for how they can be combined? Why does language have structure, and why does it have its particular structure? How is it that the meaning of a sentence arises from the way it’s put together, not just from the meaning of the words alone?

  In just a few years computer modeling of language evolution has produced a plethora of findings that are counterintuitive to a traditional view of language. The most fundamental idea driving this research is that there are at least two different kinds of evolution—biological and linguistic—meaning that as we evolved, language evolved along its own path.

  Kirby starts his models by building a single individual, and then creating a whole population of them. “I’ll have them communicating with each other and transmitting their knowledge culturally over thousands or tens of thousands of generations and very long periods of time. In some extensions of the model, I allow those agents to evolve biologically as well.” What he and other researchers in the field have found is that from little things, big things grow. In these accelerated models, from the smallest beginning—agents with the ability to make sound but not words, agents who start out not knowing what other speakers mean—comes incredible structural complexity that looks a lot like language.

  This cultural evolution, said Kirby, is simply the repeated learning by individuals of other individuals’ behavior:

  The idea is that you’ve got iterated learning whenever your behavior is the result of observing another agent’s particular behavior. Language is the perfect example of this. The reason I speak in the way I do is because when I was younger I was around people who spoke and I tried to speak like that. And what we’ve been finding in our models is, to some extent, that is all you need. It’s very surprising. But if you make some very, very simple assumptions like that, you can get linguistic structure to emerge out of nothing—just from the assumption that the agents basically learn to speak on the basis of having seen other populations speak before them.
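
  To make the mechanism concrete, here is a minimal sketch of an iterated-learning loop in Python. Everything in it is an illustrative assumption rather than Kirby’s actual code: a “language” is just a table mapping composite meanings to signal strings, and each learner observes only a limited sample of its teacher’s utterances (the bottleneck), inventing arbitrary signals for meanings it never saw. This bare version shows only the transmission chain itself; as the next paragraphs explain, structure emerges only when learners must generalize from limited data.

```python
import random
import string

# A minimal iterated-learning sketch (illustrative assumptions, not Kirby's
# actual model). A "language" is a table mapping composite meanings to signal
# strings; each generation learns from a limited sample of the previous
# generation's utterances and invents signals for the meanings it never saw.

MEANINGS = [(a, b) for a in range(5) for b in range(5)]  # 25 composite meanings

def random_signal(length=4):
    """Invent an arbitrary signal for an unobserved meaning."""
    return "".join(random.choices(string.ascii_lowercase, k=length))

def learn(teacher_language, bottleneck):
    """Acquire a language from `bottleneck` randomly observed utterances."""
    observed = random.sample(MEANINGS, bottleneck)
    learned = {m: teacher_language[m] for m in observed}
    for m in MEANINGS:  # unseen meanings get freshly invented signals
        learned.setdefault(m, random_signal())
    return learned

# Generation zero speaks a holistic language: one arbitrary signal per meaning.
language = {m: random_signal() for m in MEANINGS}
for generation in range(1000):  # cultural, not biological, generations
    language = learn(language, bottleneck=15)
```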

  Strangely enough, the most languagelike structures arise from beginnings that are constrained or information-poor. When Kirby built a model in which agents were allowed extensive exposure to one another’s behavior and could learn, all at once, pretty much anything they would ever want to say, he found that nothing would actually happen. No linguistic structure emerged from the primordial word soup. In fact, the resultant system of communication looked more like simple animal communication. Kirby discovered that if the agents had only limited access to one another’s utterances—either because he made the language so big that they could observe only a small part of it at any one time or because he made sure they listened to only a few sentences at a time—then a lot of syntactic structure would eventually arise over the generations of agents. “It’s a kind of irony that you get this complex and structured language precisely when you make it difficult for the agents to learn,” he said. “If you make it easy for them, then nothing interesting happens.”

  It would not be possible for Kirby, or anyone for that matter, to sit down and calculate the ways in which thousands of generations of different individuals may have interacted, and this is what makes digital modeling such a powerful tool. It offers a strong contrast to the armchair models that linguists have used for many years. For example, mainstream linguistics saw language as taking place between an idealized speaker and an idealized hearer. These two were representatives of a population of individuals who spoke pretty much the same language and were basically identical to one another. But this model blurs the distinction between the population and its constituent individuals. Digital modeling allows researchers to account for individuals within language communities. Modeling, then, can consist of at least two tiers of interactions—between individual agents within a population and between populations of these agents.

  “If you look at the lifetimes of individuals, you see massive changes in there, from nothing to a full language user,” explained Kirby. “It’s a hugely complex process that leads from one state to another.7 Then, on top of that, language changes in a community. So the new thing that’s emerging is this desire to link individuals with populations in the model directly, by saying, ‘Let’s put together lots of agents that are seriously individual, and see what happens when there is a population of these.’”

  Because Kirby is working on a vast biological timescale, his models usually involve very simple, idealized aspects of language, like the ordering of words. “They almost seem trivial,” he said. Eventually, the models will become much more complex, and ideally the particular models that show how language might have evolved from its earliest beginning will mesh with models that show how languages have changed in more recent times—as, for example, how Latin changed into Italian, French, and other Romance languages.

  Traditionally, linguists have carved up the long history of language into language evolution and more recent language change. Language evolution examined how the human species developed the capacity for human language. Studies of language change and growth focused on how that first language, once acquired, became thousands of different languages over tens of thousands of years. More and more computer modelers have come to believe that the process is more seamless than that, and that language change is to some degree the same as language evolution. The obvious model here is biological life—in the same way that species, once formed, can keep on speciating, the process by which sound and meaning ratchet themselves up into language in the first place leads inevitably to the process by which that language becomes a multitude of languages.

  “I would say,” Kirby explained, “that the same process or parts of the same process have to be going on. What’s tricky about modeling it is the timescales. They are so hugely different. To model biological evolution in a computer you obviously need thousands and thousands of generations, and currently the problem is getting a computer that has the resolution to look at very fine facts about language evolution or language change.”

  In attempting to incorporate linguistic change in both individuals and populations, Kirby and other modelers like him are actually trying to tease out three different timescales and three different evolutionary processes that contribute to language evolution: two types of linguistic evolution—in the individual and in the population—and biological evolution, tracking how one species becomes another. “That’s what is unique about language,” said Kirby. “That is what makes it really special in the natural world and probably one of the most complex systems we know of—it’s dynamic and adaptive at all three different timescales, the biological, the cultural, and the individual. They are all operating together, and that’s where language comes from—out of that interaction.”

  Kirby and a number of other researchers find one metaphor especially useful for thinking about language: imagine that it is a virus, a nonconscious life-form that evolves independently of the animals infected by it. Just as a standard virus adapts to survival in its physical environment, the language virus adapts to survival in its environment—a complicated landscape that includes the semi-linguistic mind of the infant, the individual mind of the speaking adult, and the collective mind of communicating humans.

  According to Terrence Deacon, language and its human host are parasitic upon each other. “Modern humans need the language parasite in order to flourish and reproduce just as much as it needs humans to reproduce.”8 It’s an analogy that goes straight to the heart of how much language means to us as a species. If some global disaster killed all humans, there would be no language left. If language suddenly became inaccessible to us, perhaps we would all die, too.

  The most exciting implication of the language-as-virus metaphor is the finding that some features of language have less to do with the need of individuals to communicate clearly with one another than with the need of the language virus to ensure its own survival. That is, in the same way that the traits of a particular animal reflect its evolutionary adjustments to survival in a particular environment, so, too, do the features of language structure reflect its struggle to survive in its environment—the human mind. Reproduction is still the driving force of the evolutionary process, but it’s not our reproduction: it’s the reproduction of language itself.

  If language is a virus and its properties are shaped by its drive to survive, then the traditional linguistic goal of reducing all language to a set of rules or parameters is misguided. As Deacon explained, “Languages are more like living organisms than mathematical proofs, so we should study them [in] the way we study organism structure, not as a set of rules.”9 By this light the quirky grammars of the world’s languages make about as much sense as a pelican does, and English syntax is as elegant as, say, a panda. You can view any animal purely as a formal system, and you can describe it to a great extent using mathematics, but ultimately living organisms cannot be distilled into rule sets, though each is beautiful, elegant, and perfect in its own way.

  If you accept the language-as-virus metaphor, you can’t backward-engineer a language-specific mental device simply by looking at the language we have now. If language structure is the result of cultural evolution and accretion, then it’s a historical process as well as a mental one. Accordingly, one of Kirby’s models showed that a language that has the basic property of compositionality—that is, the meaning of an utterance results from the meaning of its parts and the way they are structured—is going to be more successful at surviving than one that doesn’t.10 Languages that don’t develop compositionality are not robust, and they soon die.

  “In the model where we don’t allow the agents to see all of the language,” said Kirby, “structure evolves. The explanation for this is that a structured language can be learned even if you don’t see all of it, because you can generalize pieces of it. Whereas an unstructured language, well, you can imagine a big dictionary where every single thing you might ever want to say is listed with a different word. To learn that language, you’d have to see every single word and learn it. But a language that puts words together and allows you to combine them in different ways can be learned from a much smaller set of examples.”
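
  Kirby’s point lends itself to a small worked sketch, again under assumptions of my own rather than drawn from his models: meanings are (action, object) pairs, the compositional language builds each signal from an action syllable plus an object syllable, and each learner sees only ten of the twenty-five possible utterances. A rote learner of the holistic language can reproduce only what it memorized; a learner who decomposes the compositional signals can recombine the parts it has seen and recover nearly the whole language.

```python
import random
import string

# Illustrative comparison (my construction, not Kirby's code): how much of a
# 25-meaning language can a learner express after observing only 10 utterances?

ACTIONS, OBJECTS = range(5), range(5)
MEANINGS = [(a, o) for a in ACTIONS for o in OBJECTS]

def syllable():
    return "".join(random.choices(string.ascii_lowercase, k=2))

# Holistic: 25 unrelated four-letter signals, one per whole meaning.
holistic = {m: syllable() + syllable() for m in MEANINGS}

# Compositional: each signal is an action syllable plus an object syllable.
act_syl = {a: syllable() for a in ACTIONS}
obj_syl = {o: syllable() for o in OBJECTS}
compositional = {(a, o): act_syl[a] + obj_syl[o] for a, o in MEANINGS}

def coverage(language, generalize, sample_size=10):
    """Fraction of all meanings expressible after limited exposure."""
    seen = random.sample(MEANINGS, sample_size)
    if not generalize:
        return len(seen) / len(MEANINGS)  # rote learner: only what was memorized
    # Decomposing learner: recover part-signals from the sample, then recombine.
    acts = {a: language[(a, o)][:2] for a, o in seen}
    objs = {o: language[(a, o)][2:] for a, o in seen}
    return sum(a in acts and o in objs for a, o in MEANINGS) / len(MEANINGS)

print(coverage(holistic, generalize=False))      # always 0.4
print(coverage(compositional, generalize=True))  # typically close to 1.0
```

  The numbers are toy, but the asymmetry is the point: only a structured language fits through the learning bottleneck intact.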

  As with biological evolution, the road to survival is not straightforward. “What happens,” explained Kirby, “if you’re forced to learn from a small set of examples is that initially you do very badly, but the language itself adapts in such a way that it is more easily learned by you. We see it happening before our eyes in the simulations. The languages change, and eventually, somewhere along the line, a little pattern will emerge, and that will be learned much more easily than all the other ones. So over time you get this adaptation to the learner by the language. It makes total sense psychologically—the language can’t survive if it’s not learned.”

  In 1990 Steven Pinker proposed that our language ability derives from the fact that it is used for communication. Does the virus metaphor completely contradict this approach to language evolution? It doesn’t have to. Pinker argued that the appearance of design was evidence of the hand of evolution. This remains relevant for accounts that focus on the survival needs of language. The strong design constraints shown by language in Kirby’s model still result from evolution—but the object undergoing that particular evolution is language, not us.11

  Kirby, Deacon, and the computational modeler Morten Christiansen, a professor at Cornell University in New York State, are especially interested in why language is learned so readily by children. Their approach flips the old notion of the poverty of the stimulus on its head: if language is driven to survive, and the language learners of the world are children, language must be adapted to the quirks and traits of the child’s mind. As Deacon puts it, language is designed to be “particularly infective for the child brain.”

  So if language in its very structure has all or most of the clues that children require to learn it, then the need for some kind of language organ starts to look dubious. In its strongest version, this approach means there is no support for the argument that grammar is so complicated that children simply can’t learn it without a grammar-specific device.

 
