The Age of Absurdity: Why Modern Life Makes it Hard to Be Happy (2010)


by Michael Foley


  Hence a radical extension of an already radical idea – knowledge is not just the beginning of a solution but the entire solution. Understanding is itself transformation. But the transformation is neither immediate nor easy – nor even perceptible: ‘Just as the ocean slopes gradually, with no sudden incline, so in this method training, discipline and practice take effect by slow degrees, with no sudden perception of the ultimate truth.’23 The secret is to persist in the method until ‘reasoned, accurate, clear and beneficial’ behaviour becomes habitual. To be is to become – so the seeker of enlightenment must be ‘energetic, resolute and persevering’. Buddha’s last words were: ‘All accomplishment is transient. Strive unremittingly.’24

  Another key word is ‘method’. Buddhism is not a creed but a method, a set of procedures for dealing with the chain of consequences following from ignorance. But Buddha refused to speculate on the cause of ignorance itself. So there is no theory of the fall of man, no original sin. In fact he refused to answer any metaphysical questions, not because he himself did not speculate but because such speculation was unhelpful: ‘It is as if a man had been wounded by an arrow thickly smeared with poison, and his friends were to procure for him a physician, and the sick man were to say, ‘I will not have this arrow taken out until I have learnt the name of the man who wounded me’.’25

  This refusal to construct a Great Unified Theory of Everything was profoundly wise. For, if there is no dogma, there can be no doctrinal disputes, no heresies, no schisms – and so no inquisitions, no torturing, no burning at the stake. The two main Buddhist sects, the Theravada and Mahayana, have always coexisted in harmony – compare and contrast with the history of Catholicism and Protestantism. And in Buddhism there are no supernatural interventions, no gods, no miracles, no divine revelation, no divine grace or divine incarnation. So there is no need for faith. In fact, Buddha expressly rejected the idea of faith as an abdication of personal responsibility – no one should believe anything just because someone else says so. Each individual must work out a personal solution.

  It is ironic that Christianity, the religion of the rational West, is, in fact, completely irrational, inconsistent and even absurd, whereas Buddhism, the religion of the mystical East, is completely rational, consistent and even practical – not a creed requiring a leap of faith into absurdity, but a method that can be shown to work. And it is even more ironic that the attractive features of Buddhism make it unattractive to the modern age; while the other major religions are all gaining believers, Buddhism is losing ground.26

  Christian doctrine blamed the flaw in man on original sin, which could be redeemed only by the mysterious workings of divine grace. For over a thousand years this ruled out any investigation of the self or belief in terrestrial fulfilment. It was not until the Enlightenment that thinkers gave the individual hope and scope.

  The ideas of the seventeenth-century Dutch philosopher Baruch de Spinoza were startlingly similar to those of Buddha. The Enlightenment thinkers worshipped reason, but Spinoza realized that reason was riding a tiger, that human nature is driven by largely unconscious ‘appetites’ which enter consciousness as ‘desires’. His expression of this insight could have come from The Dhammapada or the writings of Freud: ‘Desire is man’s very essence.’27 And his views on consciousness could have come from a contemporary neurobiologist: ‘The human mind is the very idea or knowledge of the human body.’28 However, like Buddha, he believed that drives may be controlled by being understood: ‘An emotion ceases to be a passion as soon as we form a clear idea of it.’29

  And, like Buddha, Spinoza is often dismissed as a mere seeker of tranquillity – but what he valued most was joy, which he defined as a sense of empowerment created by the understanding mind. But, again as in the teachings of Buddha, understanding is not a passive, final state, but a process requiring ceaseless effort. In another insight prefiguring neurobiology, which defines living organisms as systems for optimizing life conditions, Spinoza suggested that our very nature is to strive. His Latin word for human nature, conatus, means ‘striving’ or ‘endeavour’: ‘The striving by which each thing attempts to persevere in its being is nothing other than the actual essence of the thing.’30 And the striving has to be difficult to be valuable: ‘If salvation were readily available and could be attained without great effort, how could it be neglected by almost everyone? All that is excellent is as difficult to attain as it is rare.’31

  But seventeenth-century Europe was not ready for this. Where Buddha was revered as a master, Spinoza was reviled as a heretic. His Jewish community in Holland first tried to bribe him (an annuity of a thousand florins) to shut up, then they tried to murder him (the attempted stabbing was foiled by the voluminousness of Spinoza’s cloak) and finally they declared him anathema in fine Old Testament style: ‘With the judgement of the angels and of the saints we excommunicate, cut off, curse and anathematize Baruch de Espinoza…with the anathema wherewith Joshua cursed Jericho, with the curse Elisha laid upon the children, and with all the curses which are written in the law. Cursed be he by day and cursed be he by night; cursed be he when he lieth down, and cursed be he when he riseth up; cursed be he when he goeth out and cursed be he when he cometh in…’ So it thunders and thunders before commanding that no one may read Spinoza’s writing, communicate with him or even venture within four cubits of him. Spinoza’s response: ‘This compels me to nothing that I should not otherwise have done.’32

  After Spinoza’s death his writings and ideas were ruthlessly suppressed and it was not until the nineteenth century that Schopenhauer expressed a similar set of insights. His term for the id was the ‘will’, which he defined as ‘a blind driving force’ that causes ‘man’ to be ruled by urges ‘which are unknown to him and of which he is scarcely aware’.33 And Schopenhauer expressed, with matchless eloquence, the insatiability of appetite: ‘the desires of the will are boundless, its claims inexhaustible, and every satisfied desire gives rise to a new one. No possible satisfaction in the world could be enough to subdue its longings, set a limit to its infinite cravings and fill the bottomless abyss of its heart.’34 Foremost among these appetites is the sex urge: ‘Man is deluded if he thinks he can deny the sex instinct. He may think that he can, but in reality the intellect is suborned by sexual urges and it is in this sense that the will is ‘the secret antagonist of the intellect’.’ Sex is ‘the ultimate goal of nearly all human effort’ – and sexual repression will cause neurosis. Schopenhauer was a remarkably insightful psychologist, but he did not believe in social progress or personal fulfilment: ‘In a world where no stability…is possible, where everything is restless change and confusion and keeps itself on the tightrope only by constantly striding forward – in such a world, happiness is not so much as to be thought of.’35

  Nietzsche too came up with similar ideas, which he imagined were thrillingly new when, in fact, many were several thousand years old. He too acknowledged an unconscious driving force, which he called the ‘Self’: ‘Your Self laughs at your Ego and its proud efforts. ‘What are these mental gymnastics to me?’ it says to itself. ‘Only a roundabout way to my goal. I am the Ego’s lead violin and I prompt all its ideas’.’36 And this lurking ‘Self’ is the most persistent and dangerous adversary: ‘But you yourself will always be the most dangerous enemy you can meet; you yourself lie in ambush for yourself in forests and caves.’37 Nietzsche also had the intuition that the drive to optimize is the essence of all living things: ‘Wherever I came upon a living creature, there I found will to power.’38 The ceaseless striving of the human organism he defined as ‘Self-Overcoming’: ‘Life revealed to me this secret: ‘Behold’, it said, ‘I am that which must overcome itself again and again’.’39 And the friction of self overcoming self would generate enough heat and light to make life fulfilling. Nietzsche welcomed difficulty with typical grandiloquence: ‘Whatever does not kill me makes me stronger.’40

  In the twentieth century Freud proposed a similar ego-and-id model of the self that he claimed was not only new but rigorously scientific. And to establish mastery of the id by the ego there was the ‘scientific’ method of psychoanalytic therapy, which sought to match the cunning of the id by catching it in unguarded moments, exposed in neurosis, free association or dreams (after a hard day’s manipulation of the ego, the id likes to party all night). But the therapist would have to be a special person: ‘The analyst must be in a superior position in some sense, if he is to serve as a model for the patient in certain analytical situations, and in others to act as his teacher.’41 In other words, the analyst would have to be as inspiring as a Buddhist Master. But there is an ongoing and acute worldwide shortage of Masters. Few analysts were willing or able to be models or teachers and many settled for being well paid to listen to wealthy neurotics for an hour a week – or, worse, became psychological cosmetic surgeons. I can remember being appalled when the theatre critic Kenneth Tynan revealed in an interview, with no sense of embarrassment or irony, that he had paid an analyst to remove his guilt at leaving his wife.

  And recent neuroscience research confirms the model of the self proposed by thinkers – except that the division between reason and emotion, the ego and the id, is not as clear as Freud and his predecessors believed. According to neuroscientists such as Joseph LeDoux, the brain’s emotional response is largely activated by the amygdala (in the limbic system, the old reptilian brain) and the rational response by the prefrontal cortex (directly behind the eyes).42 So, to put it very, very crudely, the ego is the prefrontal cortex and the id is the amygdala. But the emotional brain is capable of thought and the rational brain, with an autoroute directly to the amygdala, is hugely influenced by emotion. And many of the impulsive responses of the emotional brain (for example, intuition) are good, while many of the considered responses of the rational brain (for example, self-delusion) are bad. So it is not exactly true that the ego is the hero and the id is the villain. But, in general, the rational brain makes wiser decisions than the emotional brain. Take the case of Mary Jackson, an intelligent, highly motivated 19-year-old student who planned to go through medical school, marry her boyfriend and establish a paediatric clinic in her deprived inner-city area. Suddenly she stopped attending classes and began drinking, taking crack cocaine, sleeping around and flying into violent rages if criticized. When she was eventually referred to a neurologist, Kenneth Heilman, he discovered from a brain scan that a huge tumour had damaged the prefrontal cortex, making it unable to resist impulses and maintain long-term goals.43 And, in the mid-twentieth century, many surgeons actually caused similar effects by carrying out frontal lobotomies, a procedure supposed to cure many conditions from epilepsy to schizophrenia. This crude technique, used on thousands of people in prisons and mental homes, involved inserting a scalpel under the eyelid and hammering it through the bone to sever the connections between the prefrontal cortex and the rest of the brain. (Anyone in awe of the Nobel Prize should bear in mind that the 1949 prize for medicine was awarded in part to the pioneer of the lobotomy procedure.)

  The neuroscientist Jonathan Cohen has actually observed the conflict between the emotional and rational brains by putting subjects into a scanner and giving them the option of receiving a gift certificate immediately or a certificate for a larger amount in a few weeks’ time. The prospect of receiving a certificate right away activated the emotional brain, while the prospect of a larger certificate in the future activated the rational brain, the prefrontal cortex – and the area with the strongest activation decided the choice. So Cohen may be the first person to witness the oldest struggle in human history – the ego arm-wrestling the id. And it grieves my prefrontal cortex to reveal that the id mostly won.44

  3

  The Righteousness of Entitlement and the Glamour of Potential

  The limitation of much thinking about the self is that it considers the self as an isolated and immutable entity, independent of personal history and social circumstances. But, of course, there is no such self. Everyone is influenced by temperament and history and the prevailing social climate.

  Marx was the first to recognize the importance of cultural conditioning: ‘It is not the consciousness of men that determines their social being, but, on the contrary, their social being that determines their consciousness.’45 And Freud added to the id and the ego the concept of the superego, the internalized repository of society’s precepts that, like the id, operates below consciousness. But both these models were too simplistic. Conditioning is not a simple one-way transfer but a complex circular process fed by constant feedback loops. What often happens is that changing social attitudes cause a few people to develop a new need or a more urgent version of an old need and an astute entrepreneur notices the development and provides an appropriate product or service. This legitimizes, reinforces and spreads the new attitude, so more people express the need more openly and more entrepreneurs service it. Soon the phenomenon has become a new norm; everyone is doing it. Eventually it becomes the natural law and influences even those who do not have the need and wish never to have it.

  Marx was also too simplistic in assuming that conditioning always comes from the right. In recent times it has just as often come from the left. The 1970s was the decade of liberation, of anger at injustice and demands for recognition and rights. But, over time, the demand for specific rights degraded into a generalized sense of entitlement, the demand for specific recognitions into a generalized demand for attention and the anger at specific injustice into a generalized feeling of grievance and resentment. The result is a culture of entitlement, attention-seeking and complaint.

  The demand for attention is increasingly strong and various, a consequence of inner emptiness requiring identity conferred from without: I am seen, therefore I am. At the lowest level this is expressed as a need to be physically seen. So, in a typical example of feedback, social space is increasingly organized to provide visibility: open-plan design is now the norm in homes, offices, restaurants and bars; there is more and more year-round alfresco eating and drinking; and more and more public areas designed to facilitate ‘people watching’, where the pleasure is as much in being watched. If separation is unavoidable, the walls are transparent, as in the transparent manager’s office and the transparent elevator. The transparent home was an inevitable development – so, in Manhattan, the glass curtain wall is now as characteristic of the architecture as red brick and limestone in the 1920s. And, if all this visibility does not provide enough attention, those with sufficient disposable income can pay to be put under surveillance and/or stalked. This may seem an unlikely way to spend money but apparently such services are increasingly popular because they give their customers a unique sense of significance. As the founder of one such service puts it, ‘We’ve had clients who say that they wear nicer underwear or start taking better care of themselves simply knowing they’re being observed. Just knowing there’s attention on them can be enough.’46

  At the next level of attention-seeking, there is the need to be acknowledged as an individual. In its extreme form this becomes a craving for celebrity, the desire to be noticed not just now and then by a few, but to be bathed always in a universal warm glow of recognition, admiration, envy and desire. The contemporary prayer is: Let perpetual light shine upon us – spotlight. And this demand for celebrity is now so overwhelming (31 per cent of American teenagers sincerely believe they will be famous47) that the traditional means of supply – talent feted by the media – has become completely inadequate. It was inevitable that celebrity should become available to the talentless (by becoming stars of reality television, for example) and that it should have developed new channels open to everyone (such as self-promotion on the internet).

  And at the level above the individual is the demand for recognition of group identity. Here, attention-seeking, entitlement and complaint combine in the increasingly common phenomenon of taking offence, where some powerful group decides that its right to appropriately reverent recognition has been violated and that it is due retribution. The beauty of taking offence is that the threats of the bully can be presented as the protests of the victim so that the ego can bask in virtue while the id exults in aggression. The arbitrariness is also appealing. Anyone can decide to take offence at anything and this ever-present potential creates a climate of fear satisfying to bullies.

  Of course no one foresaw any of these developments back in the 1970s during the heady liberation of women, gays, black people, youth and sex. Liberation was exhilarating, an unqualified good. As soon as the yoke of oppression was lifted everyone would inevitably flourish.

  Many on the political right were appalled; others saw an opportunity. Money, too, demanded to be liberated – and had its wish in major financial deregulation. Free at last to express the gypsy in its soul, money became restless, promiscuous and irresponsible. It would lie with anyone attractive but rarely stay a full night. Investors were no longer prepared to wait for the long-term return of dividends; they demanded a quick return by reselling shares at a profit – so share price rather than performance became the measure of company success. And share price tended to rise when companies seemed to be doing something new and exciting. So financial sexiness came to depend on appearing dynamic, flexible and innovative. Stability, on the other hand, became a dowdy frump – ugh.

  The smart move was to attack one’s own organization with a chainsaw. So began the mania for restructuring, even in institutions without shareholders, such as government departments, universities and the BBC. But there are a limited number of structures, so anyone who remained in an organization long enough found that their first structure, long since discarded as hopelessly out of date, eventually came round again as the newest thing. It was only after many years of employment that I began to see something in Nietzsche’s concept of eternal recurrence.

 
