by Alex Boese
The account he was referring to is one you might have read. It was the Iliad, the great epic poem of ancient Greece. It describes events during the Trojan War, which historians believe occurred around 3,200 years ago. We don’t know exactly when the Iliad was written down, though it was hundreds of years after the war itself. The narrative, however, seems to have been based upon oral tales that were passed down from the time of the war itself. For this reason, Jaynes argued, it could give us a glimpse into the mental world of those people who had lived over 3,000 years ago. It was an artwork, he said, produced by a bicameral consciousness.
When Jaynes read the poem with this goal of trying to reconstruct the mentality of its ancient authors, it quickly became apparent to him that they possessed a very different view of reality than we do. For a start, the characters in the poem displayed absolutely no self-awareness. They never paused for moments of introspection or contemplation. Jaynes noted that the Greek in which the Iliad was written didn’t even have words for concepts such as consciousness, mind or soul.
Even more intriguingly, whenever the human characters needed to make a decision, they didn’t. Unfailingly, a god would appear and tell them what to do. Jaynes detailed examples of this. When Agamemnon takes Briseis, the mistress of Achilles, a god grabs Achilles by his hair and warns him not to raise his sword against Agamemnon. Gods lead the armies when they go into battle and then urge on the soldiers at each turning point. It’s the gods who cause men to start quarrelling. In fact, it’s the gods who are responsible for the war in the first place. As Jaynes put it, ‘the gods take the place of consciousness.’
Classical scholars before Jaynes had noted this peculiar role played by the gods in the poem, but they had always interpreted it as a kind of literary device. Jaynes challenged this assumption. What if, he asked, the gods in the Iliad weren’t intended to be fictional, but were an actual description of reality as ancient people experienced it? What if they really did hear and see gods giving those commands? After all, he noted, the characters in the poem treat the gods matter-of-factly, as if they were genuinely present.
Of course, he conceded, the Greek and Trojan warriors weren’t really seeing gods; they were experiencing auditory and visual hallucinations, but they wouldn’t have been able to make this distinction. For them, with their bicameral consciousness, the gods would have seemed very real. Jaynes stated his case bluntly: ‘The Trojan War was directed by hallucinations, and the soldiers who were so directed were not at all like us. They were noble automatons who knew not what they did.’
In Jaynes’s chronology, the Trojan War took place towards the end of the era of the bicameral mind. The transition to modern consciousness began soon after.
The cause of this transition, as Jaynes argued, was that the bicameral mind was too rigid to respond well to truly novel situations. So, as communities grew ever larger and began to bump up against neighbouring groups, signs of tension emerged. A more flexible method of regulating behaviour was needed. In essence, the world of these ancient humans grew more complex, and they required a more sophisticated form of brain organization to deal with it. The end result was the development of modern self-awareness. Instead of waiting to hear the hallucinated commands of a god, people developed an internal ‘I’ that could make decisions.
Jaynes emphasized that this development didn’t involve a physical change to the brain. Modern consciousness was a learned adaptation, a socially acquired skill – one that was then taught to succeeding generations. Even today, Jaynes noted, consciousness is something we learn as children, achieving full self-awareness only around the age of seven. One of the unusual aspects of the brain, in fact, is that it’s an extremely malleable organ that develops in response to, and guided by, social contact. Without such contact, the brain atrophies, as seen in the rare cases of feral children raised without human interaction, who permanently lose the ability to learn language and, apparently, to achieve rational consciousness.
But, according to Jaynes, the bicameral mind didn’t disappear completely. He argued that it left its mark in many different ways, such as through religion, much of the history of which consists of people hearing voices in their head that they believe to be gods. The prophet Moses, for example, took directions from a voice that supposedly emanated from a burning bush.
Jaynes also proposed that remnants of the bicameral mind endure into the present in the form of schizophrenia. Those who suffer from this condition continue to hear voices in their head, which they often interpret to be those of gods, demons or angels. Whereas in the ancient world such voices conferred advantages, in the era of unicameral consciousness the voices have become an active hindrance, a medical condition in need of treatment.
Jaynes’s fellow scholars weren’t quite sure what to make of his theory. Some of them dismissed it as ridiculous. Others were more ambivalent. Writing in 2006, the biologist Richard Dawkins commented, ‘It is one of those books that is either complete rubbish or a work of consummate genius, nothing in between! Probably the former, but I’m hedging my bets.’
Scholars offered a variety of criticisms of Jaynes’s thesis. Classicists noted that, although most of the Iliad is consistent with his theory, not all of it is. When Hector decides to accept the challenge of Achilles, he does appear to engage in introspective contemplation. Psychologists, on the other hand, argued that his suggestion that schizophrenics might experience a kind of bicameral consciousness seemed unlikely, because their hallucinations are far more complex and varied than the type he described. Christopher Wills, a biologist at the University of California, San Diego, pointed out that modern-day hunter-gatherers who live isolated lifestyles appear to have the same form of consciousness as everyone else, but, if Jaynes was right, this shouldn’t be the case, given their lack of contact with the outside world.
For the most part, however, Jaynes’s hypothesis has simply been ignored by scholars. Due to its highly interdisciplinary nature, it seems to fall through the cracks, rarely getting cited. It occupies its own oddball academic niche.
Jaynes does, however, have a few ardent supporters who are convinced he may be onto something. As evidence of this, they point to some findings that have recently emerged from the study of prehistoric cave art.
Our Stone Age ancestors who lived in Europe 10,000 to 30,000 years ago left behind fantastic artwork in caves, mostly depicting the animals in their environment. The sophistication of this art has led most researchers to conclude that they must have possessed minds much like ours. But, in 1999, the cognitive scientist Nicholas Humphrey published an article in the Journal of Consciousness Studies in which he argued that this wasn’t necessarily the case. Just as Jaynes had found hints in the Iliad of a mentality very alien to our own, Humphrey discerned similar peculiar qualities in cave art.
In particular, Humphrey drew attention to the curious similarities between the prehistoric art in Chauvet Cave, in France, and that of an autistic girl named Nadia, born in 1967, in Nottingham. Nadia was an artistic savant. At the age of three, she began producing remarkably accomplished drawings, even though she had received no instruction at all and, indeed, almost entirely lacked language skills. Her drawings, however, displayed a highly distinctive style. Her subject of choice was animals, particularly horses, and she frequently mixed together parts of animals to produce chimera creatures. She had a marked preference for side-on views, and she emphasized faces and feet while largely ignoring much of the rest of the body, often drawing figures haphazardly, one on top of another. This style seemed to stem from her autistic tendency to focus obsessively on individual parts, while ignoring any larger context.
Humphrey noted that the cave art displayed a very similar style, such as the focus on animals, chimera creatures and the seemingly haphazard overlap of drawings. So, while cave art has frequently been offered as evidence of the emergence of the ‘modern’ mind, he argued that the exact opposite might be true. It might instead be revealing how strangely pre-modern the minds of those ancient painters were.
Humphrey didn’t cite Jaynes, but his conclusion strongly echoed the bicameral hypothesis. It offered another hint that our ancestors may have perceived reality in ways profoundly unlike the way we do. Their brains may have been biologically much the same as ours, but they could have been organized along very different principles.
If this was the case, it raises the intriguing thought that our mentality can change. The brain might be able to reorganize itself to navigate reality in new ways. This could have happened before, when it switched from a bicameral to a modern consciousness, and if it’s happened in the past, it might occur again in the future. If so, one has to wonder, what strange new form would it assume?
Weird became plausible: beer before bread
Around 12,000 years ago, our ancestors gave up their nomadic lifestyle as hunter-gatherers and settled down to become farmers. It was a pivotal moment in our history as a species, because agriculture led directly to civilization and so to the modern world. But why did our ancestors make this change? This question has long puzzled scientists. Research suggests that hunter-gatherers enjoyed a pretty good life. They had abundant leisure time, and their diet was healthy and varied. Being a farmer, on the other hand, was back-breaking work and the diet was monotonous, which led to problems of poor nutrition and disease. In other words, agriculture didn’t seem to improve the standard of living for most people. So, what inspired them to embrace it?
The obvious answer is food. Agriculture would have provided Neolithic humans with a steady supply of grain, which can easily be stored for long periods of time and later used to bake bread. The security of this was surely preferable to the constant threat of not being able to find any food. A less obvious answer, however, is beer. After all, grains can be used to make either bread or beer. Perhaps our ancestors were lured by the appeal of intoxication, and they started planting crops with the intent of becoming brewers rather than bakers.
This is known as the beer-before-bread hypothesis. When it was first proposed in the 1950s, scholars treated it as a bit of a joke. The idea seemed too silly to take seriously. In recent decades, however, it’s been steadily gaining respectability, to the point that the idea can no longer be dismissed as a humorous ‘what if?’ Many researchers now regard it as quite possible that beer created civilization.
The hypothesis debuted in 1953 following the discovery by archaeologist Robert Braidwood of evidence of Neolithic grain cultivation at a site in Iraq. Braidwood argued in a Scientific American article that changing climatic conditions had made it easier for people to grow grains in the region, and so, he concluded, the production of bread must have been the driving force behind their decision to give up hunting and live in sedentary farming villages. But University of Wisconsin botanist Jonathan Sauer promptly challenged this assumption. What if, Sauer asked, making beer was the purpose of cultivating the grain?
To Braidwood’s credit, he didn’t dismiss Sauer’s suggestion out of hand. In fact, he confessed to being quite intrigued by the idea, and he conceded that the evidence didn’t clearly support one hypothesis over the other. He had found the remains of cereal grains, as well as the tools for planting and reaping them, but there was no hint of what people were doing with the grains. So, Braidwood decided to put the question to a panel of experts from anthropology and archaeology. Does it seem more likely, he asked them, that our Neolithic ancestors adopted agriculture to make bread or beer? Their responses appeared in an issue of American Anthropologist.
Sauer was given the chance to make his case first. He argued that growing and gathering grain would have been an extremely time-consuming process for our ancestors, with the tools they had available. Would they have gone to all that trouble, he asked, for bread? Surely beer would have been more worth the effort. He also pointed out that archaeologists had consistently found wheat and barley grains in combination at Neolithic settlements. This seemed to him like the ingredients for beer rather than bread.
Most of the experts, however, were sceptical. The Danish archaeologist Hans Helbæk joked that it seemed a bit like proposing that early humans domesticated cows in order to make alcoholic beverages from their milk. The botanist Paul Mangelsdorf was even more doubtful of the idea. If people were spending all that time raising crops to make beer, he wondered, what were they eating? Or were they just drinking all the time? He asked contemptuously, ‘Are we to believe that the foundations of Western Civilization were laid by an ill-fed people living in a perpetual state of partial intoxication?’
The general conclusion was that the production of some kind of gruel or porridge had probably preceded both beer and bread, since this can be made simply by pouring water on grains. Eventually our ancestors would have figured out that, by cooking the gruel, they could transform it into bread. The brewing of beer, the experts decided, must have followed later.
And that seemed to settle the matter. After the symposium, the beer-before-bread hypothesis disappeared from sight. Scholars proceeded to assume that the transition to agriculture had been a sober affair.
This remained the consensus for thirty years, until the 1980s, when two researchers at the University of Pennsylvania, Solomon Katz and Mary Voigt, revived the case for beer. They noted that, during the intervening decades, archaeological evidence had weakened the case for the bread-first hypothesis. Studies were finding that, for several thousand years after the initial cultivation of grains, Neolithic people had continued to consume a wide variety of plants. This suggested that the decision to take up agriculture had been driven by a cultural desire, rather than by a biological need for food. These early societies were using the grain for something they wanted rather than required. That sounded more like a case for beer than bread.
Simultaneously, evidence from the field of human nutrition had strengthened the beer-first hypothesis. Research had revealed that fermentation is an excellent way to unlock the nutritional content of grains. It converts them from a relatively low- to a high-nutrition food by adding lysine, improving the B-vitamin content and allowing essential minerals to be absorbed more easily. Neolithic beer would also have been calorie dense and full of soluble fibre – completely unlike the thin, filtered lagers one finds in supermarkets today. Plus, the alcohol content would have killed bacteria, making it safer to consume than gruel. It would even have had medicinal value, because it naturally contains the antibiotic tetracycline, produced during the fermentation process. Overall, beer drinkers might actually have enjoyed a significant evolutionary advantage over those who chose to abstain. And this didn’t even factor in the pleasurable intoxicating effect provided by the beer.
Given these new findings, Katz and Voigt argued, it was quite plausible to imagine that the discovery of fermentation had been the trigger that prompted early people to start purposefully planting grains.
In the twenty-first century, more support for the beer-before-bread hypothesis has come from several lines of evidence. Anthropologist Patrick McGovern of the University of Pennsylvania has been using biomolecular analysis to examine the residue lining ancient pottery shards. This allows him to determine what was once stored in the pots. Sure enough, more often than not, it was fermented beverages. He’s been able to determine that a pot found at the Godin Tepe archaeological site, near the Iran–Iraq border, dating back 5,500 years, contained a barley-based beer.
And, coming at the issue from a completely different perspective, cultural anthropologist Brian Hayden of Simon Fraser University has argued that we shouldn’t underestimate or trivialize how much our ancestors liked to party. Gatherings, then as now, served very basic social needs: they bonded communities together, which would have been advantageous from an evolutionary perspective. And most might agree that parties are usually better with beer than without it.
Hayden notes that potlatches, or elaborate ceremonial feasts, are known to play an important role in the cultures of many tribal people, so he imagines that Neolithic people would have often held feasts to demonstrate their wealth and power to their neighbours. He refers to this as ‘competitive feasting’. In this context, beer might have been seen as a high-value food item that contributed greatly to the festivities. It would have been something that, for social reasons, our ancestors would have been very motivated to produce. Bread, on the other hand, simply wouldn’t have offered similar cultural rewards.
All these arguments for the beer-before-bread hypothesis have earned it intellectual respectability, although it hasn’t quite gained the status of academic orthodoxy. The evidence for it remains circumstantial. But then, so does that for bread first. We’ll probably never know for sure what the truth is, but it’s quite possible that our ancestors were brewers before they were bakers. Katz and Voigt summed up the case for it this way: imagine that you were a Neolithic person and you could have gruel, bread or beer with your meal. What do you think you would choose?
What if Homer was a woman?
Throughout history, society has been organized rigidly along lines of gender. The prevailing belief, virtually unchallenged until the twentieth century, was that men produced all things to do with high culture (the arts, politics and sciences), whereas women produced nature (i.e. babies). Men reigned supreme in the public sphere, while women governed the domestic sphere. This was regarded as the natural order of things.
This deep-rooted gender division was reflected in the Western literary canon, which is the list of authors held up as the greatest exemplars of European culture. Until well into the twentieth century, the list consisted entirely of men, including authors such as Milton, Shakespeare, Chaucer, Dante, Virgil, Sophocles and Homer. The genius of these writers, whose works countless generations of students had to learn, was supposed to offer reassuring proof that men really were the superior producers of culture.
For over 2,500 years, Homer was the one constant presence on the list. His two epic poems, the Iliad and the Odyssey, were widely believed to be not only the greatest, but also the earliest literary works ever written in a European language. They were the foundation upon which Western civilization was built. His cultural influence was immeasurable.