How Language Began

by Daniel L. Everett


  Although there are some who claim that we can only study I-languages, this is misleading. E-languages can also be studied. In fact, thinking about this more carefully, the only way to infer anything about a speaker’s internal languages is to examine the utterances of their E-language. E-languages are the gateway to I-languages.

  Moreover, our inferences about the analysis of any sentence or set of sentences are always based upon a particular theory. The interactions observed from speakers are the essential source of evidence of what speakers know, regardless of how we test them. Of course, it is an obvious fact that a given label for an E-language, say ‘English’, is an abstraction. After all, what is ‘English’? British, Australians, Jamaicans and Californians all speak it in one form or another, but which is or is closest to ‘real’ English? How is ‘real’ English pronounced? What are its grammatical rules? There is simply too much variation in English around the world for anyone to say definitively what English is. Moreover, the sentences, stories and conversations that form the database for discussing English do not themselves exhaust the language. There are always data from some variety of English spoken somewhere in the world that have not yet been collected. It is in this sense, therefore, that English is an abstraction. At the same time, the utterances one hears or the sentences one reads are not abstractions. They are the very concrete, empirical sources of what speakers know, what cultures produce and what people actually do. Asserting that one should ignore what people actually say – their ‘performance’, as some call it – in order to understand their ‘competence’ (what people know about their language as opposed to what they do) is like claiming that college exams show nothing because they only measure performance (the answers students give), not competence (what students really know). Exams, however, exist precisely because performance is the only way to gauge competence. Whether the competence one wants to understand is knowledge of how to engage in dialogue, how to tell stories or how to produce individual sentences, one can only figure out what speakers know by what they do.

  No one ever directly studies what people know. To assert that they do is a common error in thinking. Rather, one infers knowledge through behaviour. It should also be remembered that the quote on page 74 ignores the fossil record, which makes it clear not only that language, culture and communication were part of the same cluster of socially evolved traits of human cognition but also that there was a slow semiotic progression fuelled by natural selection.

  Moreover, seeing communication as the primary purpose of language facilitates the understanding of what is most interesting about language – its social applications. Thus, for many researchers, in the study of language grammar takes a back seat to things like conversational interactional patterns, discourse topic-tracking, metaphor, the usage-based accounts of grammatical forms, and cultural effects on words and how they are put together. Pursuing these ideas, and based on everything discussed so far about the evolution of Homo species, three hypotheses for the origin of human language come to the fore. Each of these takes a different view on the relative importance of when grammar arose in the evolution of human languages.

  The first hypothesis is known as ‘Grammar Came Last’. According to this idea, the most significant and first step in the evolution of language would be the development of symbols. Grammar is little more than an add-on. Language would have existed before grammar. In this idea, grammar required all of the rest of language to exist before it could become operational. In other words, language first needed symbols, utterances and conversations before it created a grammar to structure, and thereby enhance, our communication.

  The second idea, a very popular one, is that Grammar Came First. According to this proposal, the evolution of language is primarily about the origin of computational properties of language, such as syntax. Without these properties there is no language. Symbols, gestures and other components of language may have been around in some form previously, but patterns emerged that brought all of them together for the first time as a language. By this idea, there is no language without a very peculiar kind of computation. But a simpler idea is available, namely that the ability to ‘chunk’ words or symbols into ever-larger units – phrases, sentences, stories and conversations – is really the basis of all computation in language. This ‘combinatoriality’ informs our interpretation of words – without it, in fact, we cannot properly understand the individual constituents of sentences. Think of a string of words such as ‘if the girl is pretty then he will run up to her’ and compare this string to ‘run the up pretty her if then girl is will he to’. Structure guides the interpretation of the words and, over time, it more finely hones their meanings, bringing about nouns, verbs, prepositions and modifiers. Some take this chunking to mean that form is a very specific kind of recursive, hierarchical entity. By this view, language is essentially what can be diagrammed along the lines shown in Figure 8 below.

  This type of diagram is typically used by linguists to represent the constituent structures of sentences. Though it might appear complicated (it isn’t really), the tree structure does indeed seem to be necessary in order to understand how modern speakers of English construct their sentences. In this example the graph represents one sentence, ‘… Bill saw Irving’, as a constituent of the larger sentence beginning ‘John said that …’. Likewise, the verb phrase ‘… saw Irving’ is part of the larger chunk ‘Bill saw Irving’. Furthermore, psychologists, cognitive scientists, linguists and others have demonstrated convincingly that such structures are not mere artefacts. They seem to reflect what native speakers of English know in some way about their language. The grammatical structures every native speaker knows are more complicated than any that they are taught in their composition classes.

  Figure 8: John said that Bill saw Irving
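
  To make the constituency idea concrete, here is a minimal sketch in Python (mine, not the book’s) that encodes the tree of Figure 8 as nested tuples and prints it with indentation standing in for the branches. The category labels (S, NP, VP, V, COMP) are the usual textbook conventions, and treating ‘that’ as a sister of the embedded sentence is a simplification assumed purely for illustration.

    # A minimal sketch (not from the book): the constituent tree of
    # 'John said that Bill saw Irving' as nested tuples of
    # (category, children...). Labels and attachment of 'that' are
    # illustrative assumptions, not the book's analysis.
    tree = (
        "S",
        ("NP", "John"),
        ("VP",
            ("V", "said"),
            ("COMP", "that"),
            ("S",                      # the embedded sentence is itself an S
                ("NP", "Bill"),
                ("VP",
                    ("V", "saw"),
                    ("NP", "Irving")))))

    def show(node, depth=0):
        """Print the tree; deeper indentation marks smaller constituents."""
        if isinstance(node, str):      # a bare word
            print("  " * depth + node)
            return
        label, *children = node
        print("  " * depth + label)
        for child in children:
            show(child, depth + 1)

    show(tree)

  Printing the structure this way makes the ‘chunks within chunks’ point visible: ‘saw Irving’ sits inside ‘Bill saw Irving’, which in turn sits inside the whole sentence.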

  The third principal hypothesis falls between the first two. It is that Grammar Came Later. Though symbols came first, the evolution of language required a synergy between grammar, symbols and culture, each one affecting the other. In this view, structure, symbols and culture are co-dependent, together producing meaning, gestures, word structures and intonation to form each utterance of a language.

  The notion that the form of sentences followed the invention of symbols might be interpreted in various ways. To understand this idea better, it can be instructive to view it in the context of the other two hypotheses.

  Each of these possibilities accords a role to structure in language evolution. This is because linguistic form is of enormous importance in human communication and thinking. At the same time, claiming that the design of sentences is crucial for language raises questions. Perhaps the most important question is whether tree diagrams like Figure 8 are either necessary or sufficient for human language from an evolutionary perspective.

  To proponents of the Grammar Came First hypothesis, hierarchical structure is the most important aspect of human language. And, again, many who adopt this hypothesis believe that language appeared suddenly, as recently as 50,000 years ago. According to them, not only did language not exist prior to Homo sapiens, but not even all Homo sapiens would have had language (since the species is more than 200,000 years old). A sudden mutation, say 50,000 years ago, would not have affected all sapiens, but only the descendants of the ‘Prometheus’ who won the language-gene mutation lottery. The idea that a mutation had an effect that ultimately left most members of the species alive at the time at a disadvantage, or ‘less fit’ in Darwinian terms, is not unusual. DDT-resistance mutations have occurred in some insects, leaving them and their offspring able to thrive in a DDT-rich environment while their conspecifics die off. But anyone proposing such a hypothesis obligates themselves to explain the evolutionary implications of their idea. The grammar mutation, to be of relevance to the evolution of language, must have made its possessor and his or her family more ‘fit’ than other sapiens, that is, more likely to survive. Or perhaps it was favoured by sexual selection, with better talkers becoming more appealing to the opposite sex and thus getting more sex and having more offspring. A third alternative is that a family with the language gene left the area and, via a population bottleneck, became the founder population for subsequent Homo sapiens, guaranteeing that all sapiens would have the mysterious ‘language gene’.‡7

  This contrasts sharply with claims that language appeared very gradually over at least the past 3 million years and that all humans today have, and probably all Homo species in the past had, language.

  Very importantly, hierarchical grammars – those whose structure requires tree diagrams, the kinds of grammars touted as crucial to language in many approaches to linguistic analysis and language evolution – are simply by-products of information-processing tasks independent of grammar. Nobel Prize-winning scientist Herbert Simon addressed this in the early 1960s in one of the most famous papers of the twentieth century, ‘The Architecture of Complexity’, in which he wrote that

  the central theme that runs through my remarks is that complexity frequently takes the form of hierarchy, and that hierarchic systems have some common properties that are independent of their specific content. Hierarchy, I shall argue, is one of the central structural schemes that the architect of complexity uses.

  And also:

  I have already given an example of one kind of hierarchy that is frequently encountered in the social sciences: a formal organisation. Business firms, governments, universities all have a clearly visible parts-within-parts structure. But formal organisations are not the only, or even the most common, kind of social hierarchy. Almost all societies have elementary units called families, which may be grouped into villages or tribes, and these into larger groupings, and so on. If we make a chart of social interactions, of who talks to whom, the clusters of dense interaction in the chart will identify a rather well-defined hierarchic structure. The groupings in this structure may be defined operationally by some measure of frequency of interaction in this sociometric matrix.

  Hierarchy would have been useful to Homo species as a way of understanding and constructing social relationships, of organising tasks and even of structuring language. And we see such hierarchy in the organisation of the Homo erectus settlement of Gesher Benot Ya’aqov. But hierarchy is something that would be needed only in direct proportion to the growth of complexity in communication content – what is being talked about – as information flow grew faster and more complex. Information-rich communication, especially when it comes at the high rates of speed typical of human languages, will be aided, just as Simon predicted, by being structured in particular ways.

  For example, the first three utterances will in all probability be harder for the average hearer to understand than the second three:

  The moon is made of green cheese. Or so Peter says.

  The moon is made of green cheese. Or so Peter says. Says John.

  The moon is made of green cheese. Or so Peter says. Says John. Says Mary. Says Irving. Says Ralph.

  Peter says that the moon is made of green cheese.

  John says that Peter says that the moon is made of green cheese.

  Ralph said that Irving said that Mary said that John said that Peter said that the moon is made of green cheese.

  The reason is that the first set lacks recursion – the sentences are all independent, side by side – while the second set does have recursion – one sentence inside another. Because of the complexity of the multiple quotes, recursion helps us to process the sentences more effectively. Although such sentences sound a bit artificial out of context, they are observed in English. Certain languages, however, those that lack recursion, can only produce sentences like those in the first set. As the demands of social complexity increase – for example, as hearers and speakers interact with more and more people they do not know – so do the demands on grammar, though each society and language pair must be studied individually. It is possible to have complex grammars in simple societies, or the other way around.
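
  A small sketch in Python (again mine, not the book’s) makes the difference mechanical: the flat version merely places independent sentences side by side, while the recursive version must call itself, embedding each report inside the next speaker’s sentence. The function names and the simplified ‘says that’ template are assumptions for illustration.

    # Toy sketch (not from the book): juxtaposition vs. recursion.
    speakers = ["Peter", "John", "Mary", "Irving", "Ralph"]
    claim = "the moon is made of green cheese"

    def flat(speakers, claim):
        # Juxtaposition: no sentence contains another.
        parts = [claim.capitalize() + "."]
        parts.append(f"Or so {speakers[0]} says.")
        parts += [f"Says {s}." for s in speakers[1:]]
        return " ".join(parts)

    def recursive(speakers, claim):
        # Embedding: each call wraps the previous sentence inside a new one.
        if not speakers:
            return claim
        *rest, outermost = speakers
        return f"{outermost} says that {recursive(rest, claim)}"

    print(flat(speakers, claim))
    print(recursive(speakers, claim) + ".")

  The recursive function reproduces the nested sentence above from exactly the same ingredients as the flat one; the only new resource is the ability to put a sentence inside a sentence.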

  An advocate for a mutation for language would have to explain why there are no well-established cortical specialisations for language or speech, aside from the reuse of parts of the brain for a variety of tasks, as many have claimed.6 The growth of the prefrontal cortex, itself associated with toolmaking and sequential actions, helped to prepare the brain for language by providing the cognitive firepower necessary for actions where procedures or improvised sequences are required. This is a form of exaptation, evolution’s reuse of something that evolved for one task to perform another task, as in the use of the tongue, which evolved for food intake, in the articulation of speech sounds.

  Grammars cannot exist without symbols. This entails that, even though grammars refine the meaning of symbols, grammars must follow symbols in the historical evolution of language.

  Non-human creatures appear to use syntax. Therefore grammar is not exclusive to humans. Consider Alex the parrot, who, according to years of research by Irene Pepperberg, spoke (some) English and could even understand grammars with recursion and tree structures.

  Humans have evolved away from cognitive rigidity (a by-product of instincts), to cognitive flexibility and learning based on local cultural and even environmental constraints. Under these assumptions, any grammatical similarities found across the world’s languages would not be because grammar is innate. Rather they would indicate either functional pressures for effective communication that go beyond culture or simple efficiency in information transfer. An example of a functional pressure is the fact that in most languages, prepositions with less semantic content are shorter than prepositions with more content, as in the contrast between, say, ‘to’ or ‘at’ vs ‘about’ or ‘beyond’. An example of efficiency in information transfer is seen in the fact that less frequent words are more predictable in their shape than those speakers use more frequently. So the verb ‘bequeath’, as a less-frequent verb, has a simple conjugation: ‘I bequeath’, ‘you bequeath’, ‘she bequeaths’, ‘we bequeath’ and ‘everyone bequeaths’ (this general principle is known as Zipf’s Law). But the common verb ‘to be’ is irregular, as in ‘I am’, ‘you are’, ‘he is’, ‘we are’ and ‘they are’. Both functional pressure and information transfer requirements optimise language for better communication.
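
  Zipf’s Law is most familiar in its rank–frequency form: the r-th most frequent word of a text occurs roughly 1/r as often as the most frequent one, a statistical signature of the same least-effort pressure that compresses and irregularises common words. Here is a toy sketch in Python (not from the book; the snippet of text is invented purely to show the computation):

    # Toy illustration (not from the book) of Zipf's rank-frequency
    # relation. Any real corpus would do; this made-up snippet just
    # shows the mechanics of the calculation.
    from collections import Counter

    text = ("the cat saw the dog and the dog saw the cat and "
            "the bird saw the cat and the dog ran")
    counts = Counter(text.split())

    top = counts.most_common()
    f1 = top[0][1]                    # frequency of the most common word
    for rank, (word, freq) in enumerate(top, start=1):
        predicted = f1 / rank         # Zipf's prediction: f(r) = f(1) / r
        print(f"rank {rank}: {word!r} observed {freq}, Zipf predicts {predicted:.1f}")

  On any sizeable corpus the observed and predicted columns track each other surprisingly well, which is why the law is treated as a general property of language use rather than of any particular language.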

  Therefore, while grammar was neither first nor last in the evolution of human language, it necessarily came later than symbols. This conclusion is predicated on the evidence that in human interactions meaning is first and form second. Grammar does facilitate meaning transfer, but grammar is neither necessary nor sufficient for linguistic meanings.

  Again, though, if grammar came later, what came first? Basically, two foundational advances were required of the genus Homo to start it on the road to language, both preceding grammar. We know this through the fossil record. First, icons, indexes and symbols appear in the palaeontological record before evidence for grammar, just as the progression of signs would have it. Second, the prerequisite of culture is shown partially in the necessity of intentionality and conventionality in the appearance of symbols. Finally, languages without structure-dependent grammars exist. The evolution of language followed instead the path of the ‘semiotic progression’ shown in Figure 1 and repeated here.

  Figure 1: The semiotic progression

  Let’s consider the components of Peircean semiotics presented in the diagram that are crucial to the evolution of language. First, there are indexes. Indexes are ancient, far predating humans. Every animal species uses indexes, which are physical connections to what they represent, such as smells, footprints, broken branches and scat. Indexes are non-arbitrary, largely non-intentional linkages between form and meaning. If an animal could not interpret indexes, then lions would never find prey, hyenas would search in vain for carrion and monkeys would be hard-pressed to avoid snakes and Accipitriformes (birds of prey). One can even cultivate an ability to detect and recognise indexes, as Native Americans, trained trackers, hunters and others do.§

  It is advisable to develop such an ability. On my walks through the Amazonian and Mexican jungles with different indigenous peoples, it was clear that they used indexes to know where they were, what flora and fauna were in their non-visible surroundings, where water was located and what direction would be best for hunting. People sniff, listen, look, feel and taste their way through the forest. Those unfamiliar with the indexes common to the jungle are often oblivious to those they encounter, perceiving random smells, sights and so on, without recognising what they reference.

  The deep knowledge of local index meanings can be referred to as ‘emic’ – or insider – knowledge.8 Indexes are a vital rung in the ladder of human communicative evolution. And as they become enriched by culture, their significance in communication becomes even greater.

  In a sense, indexes are a form of metonymical communication with nature, that is, the use of parts of something to perceive the whole (such as deer scat as a stand-in for the whole deer, and the footprints of a horse for the horse itself). Even though the ability to recognise and interpret indexes can be culturally acquired, indexes are not enough to build a language. These indexes are inseparably connected physically to individual objects or creatures and therefore they lack arbitrariness and intentionality – two crucial components of symbolic language. Such primitive indexes have a limited role in human language because cultural, rather than necessary, linkage between form and meaning is essential to the latter.

  Cultural linkage in the absence of a direct physical connection or resemblance between the meaning and the form associated with it dramatically increases the number of forms that may be used to link to meanings. In English we refer to a canine as ‘dog’, in Spanish as ‘perro’ and in Portuguese as ‘cão’. These are just arbitrary symbols for canines selected by these particular languages. There is no deep connection between the sounds or letters in ‘dog’ and the pet canine. It is simply what it is called in English. Thus its form is not necessary, but culturally determined by convention. Lack of arbitrariness therefore means that indexes are unable to serve as the basis for language. But arbitrariness is a later step in the semiotic progression. It is preceded by intentionality. (Languages do have indexes where intentionality and arbitrariness have been added, going beyond the most primitive indexes shared by most species. These are words like ‘I’, ‘here’, ‘this’ and so on.)
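
  In programming terms, a symbol of this kind is nothing more than an entry in a lookup table: the pairing of form and meaning is stipulated by convention, not derived from anything about the referent. A trivial Python sketch (mine, not the book’s) of the canine example:

    # Minimal sketch (not from the book): a symbol is an arbitrary,
    # conventional pairing of form and meaning. Nothing about the
    # forms below follows from the animal itself; each is fixed only
    # by the convention of its language community.
    canine_words = {
        "English":    "dog",
        "Spanish":    "perro",
        "Portuguese": "cão",
    }

    for language, form in canine_words.items():
        print(f"In {language}, the conventional form for CANINE is '{form}'")

  Any of the three forms could be swapped for another string without loss, which is exactly what an index, physically tied to its object, cannot do.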
