The Self Illusion


by Bruce Hood


  How is this online activity going to affect their development, if at all? A child’s social development progresses from being the focus of their parents’ attention as an infant and preschooler to stepping out and competing with other children in the playground and class. Initially, children’s role models are their parents, but as they move through childhood and develop into adolescents, they seek to distance themselves from the family so that they can establish an independent identity among their peers. As a parent, I have come to accept that what peers think can easily trump whatever a parent wants for their child. This may not be such a bad thing. I am not a therapist, but I can easily believe that overbearing parenting creates later problems if children are not allowed to establish their identity among their peers. These are the popularity contests that preoccupy most adolescents in Western culture. It is through this chaotic period of self-construction that the adolescent hopefully emerges out the other side as a young adult with the confidence to face the world.

  As every parent in the West knows, adolescence is typically a time of rebellion, bravado, showing-off, risk-taking, pushing boundaries, arguments, tears, strategic allegiances and Machiavellian negotiation. Some blame immature brains, which has perpetuated the ‘teen brain’ hypothesis – that the disruptive behaviour of this age group is the inevitable consequence of a lack of inhibitory control in the frontal lobes, which are some of the last neurological structures to reach adult levels of maturity. Teenagers are hypersensitive to social evaluation, but does that explain the increase in risky behaviour? Psychologist Robert Epstein believes the teen-brain account of delinquency is a myth – that adolescent turmoil is more to do with culture and the way we treat our children.15 He points out, for instance, that teenage misbehaviour is absent in most pre-industrial societies. He argues that teenage delinquency is an invention of modern societies and media stereotypes, and describes how, before the arrival of Western media and schooling, the Inuit of Canada did not have the problems of teenage rebellion. For him, the problems we see in Western teenagers are more to do with the way we isolate this age group and effectively let them establish their own social groups and hierarchies. These are the pecking orders of popularity, built through the processes of affiliation, competition and establishing one’s self-esteem in the eyes of one’s peers. In this situation, teenagers think that in order to gain a reputation among their peers, they have to be outsiders to the rest of society.

  Others argue that the data on teenage risk-taking is incontrovertible. It may be somewhat culturally influenced but, far from being erratic, teenagers are just as good as adults at reasoning about risk. They simply consider some risks to be worth taking to prove themselves. Developmental psychologist Lawrence Steinberg has shown that teenagers perform just as well as adults on simulated driving tasks when they are alone, but run more risks when friends are watching them.16 When friends were present, risk-taking by speeding and running red lights increased by 50% in teenagers, whereas there was no increase in adults. One neural account is that the reward centres in the teenage brain are highly active during this period of development. Rewards are thus super-charged when individuals succeed in front of their friends, which makes success all the more sweet and the risks taken to achieve it worthwhile. But it is not enough to succeed. One has to be seen to succeed.

  In the West, adolescents are faced with the paradox of wanting to be accepted by their peers but at the same time needing to be different. Music, fashion, films and, of course, sex are the things adolescents care about the most because these are the very things that help to create the unique identities they seek. It is no coincidence that these are the main topics of ‘likes’ and ‘dislikes’ in online social networks. Whenever you put something up on a social network, you are inviting a response from your peers. It is not for your own private viewing; rather, you are broadcasting your presence to the world. The number of comments and hits your activities generate tells you and, more importantly, others that you matter. Most of us want to be noticed, and social networking sites make this universal yearning the core of their activity. Having others validate your presence is the currency of popularity that individuals seek.

  What a Twit I Am

  One day earlier this year, the famous British actress, Dame Helen Mirren, started to follow me. I wasn’t walking down the road in London or Los Angeles where the Oscar-winner probably spends most of her time. Rather, I was seated at the kitchen table in my Somerset barn, taking a break to look at my Twitter account, when I saw that @HelenMirrenDBE was following me. Or at least I thought she was. My heart skipped a beat.

  For the uninitiated, Twitter is a site where you can post any message so long as it is no more than 140 characters. It’s like open-access texting to the world, where anyone who follows you can read your messages or link to images and other websites that you might put up. I currently have over 3,000 Twitter followers. I don’t even personally know that many people and, if you sat me down, I would be hard-pressed to name a couple of hundred individuals. Even though my following may sound impressive, I am way down the Twitter hierarchy. Individuals whom you would never imagine being icons of interest are huge on Twitter. Lance Armstrong, the top cyclist, has well over a million followers. So does the actor Brent Spiner. Who was Brent Spiner, I wondered? None other than the actor who played the android, ‘Data’, on Star Trek. There are a lot of Trekkies out there!

  What is it about Twitter that makes it so appealing? Why do we follow and want to be followed? It probably comes down to a number of related issues to do with the self. First, the human brain is a gossiping brain – we are nosey and want to know what others are up to even if that includes what they ate for breakfast that day. Second, we like our opinions to be validated by others. When someone responds positively to an opinion or shares it with others, we feel vindicated. Of course, if our opinion is rejected or ridiculed then our self-esteem is deflated. Having the option to follow or unfollow others means that individuals within a social network tend to share the same values and attitudes.

  We also like to be the first to know something or spread the word. This is something we did as children. Remember how important it was to be the first to break some piece of news in the playground? If you were the last to find out something, that was a reflection of how important you were in the pecking order. By being the first to know something, we cement our self-importance with others. However, one of the most powerful draws of social networking sites like Twitter is that they make you feel important if you have a large number of friends or followers. Your self-worth is validated and the more followers and friends you have, the more you value your self.

  Another reason why Twitter has taken off (it is the fastest-growing social network) is that celebrities happily post their thoughts and updates on a regular basis. These used to be off-limits to the general public. Suddenly we have access to the famous in a way that was never possible before. The public’s appetite for celebrity trivia has long been insatiable. There is a whole industry of paparazzi and tabloid press that has evolved out of the primeval slime to provide the gossip to feed the masses, but Twitter is far superior because it comes directly from the celebrities. Of course, celebrities need their followers because without the fans, they are out of the public eye, which usually also means out of work. So most now have a Twitter presence. In fact, many employ writers to compose their tweets so that the illusion of accessibility and visibility is sustained.

  The biggest boost to your self-esteem is if a celebrity such as Helen Mirren follows you. Whenever someone of a perceived higher status befriends us, we are raised in our standing by association. This is known as basking in reflected glory. Many of us take vicarious pleasure in associating ourselves with the success of others. This is why fans are so passionate about the individuals and the teams they support. Sports fans are probably the most common example. I have heard many a pub argument where fans lament the team manager’s decisions as if it were a family feud. Fans even talk as if they are members of the team by using the pronoun ‘we’.17 Twitter facilitates these distortions of reality by generating the illusion of easy accessibility to the famous. Anyone can follow a celebrity who is on Twitter, thus creating an interesting social phenomenon where we feel an intimacy with others whom we would never have the opportunity to meet in our normal lives. The relatively open access of Twitter also creates problems. Strangers feel that they are on a familiar basis with those they follow – which is not so very different from celebrity stalkers who are deluded in thinking that they share a life with their victims.

  Karl Quinn, an Australian journalist, pointed out that Twitter is perfect for mob nastiness. It enables individuals to make cruel comments and then pass them on: ‘Many of us are in two minds about whether celebrities are flesh-and-blood human beings or merely life-sized piñatas in need of a damned good whacking.’18 The trouble is that, as soon as a victim is identified, most of us are more willing to join in with the bullying than we imagine. Remember how that worked in the playground? It was easier than standing up to the mob. The same is true on Twitter – people join in with the mob. Also, with the problem of polarization (discussed shortly) that is endemic in social networking sites, attitudes and opinions naturally shift towards the extremes as those who seem to agree with us egg us on, or we feel the need to be more judgemental. With their combination of distorted opinions, rapid communication without time for reflection, and perceived distance as well as anonymity, social networks are a perfect platform for us to behave in ways that we would not in real life.

  This raises an important point with regard to the difference between online and offline selves. If we behave differently when we are online, then where is the true self, if the self really does exist? Can we draw a real difference between thoughts and actions? If our actions are virtual and anonymous, are they any less representative of our true self? One could argue that, because social rules and the pressure to conform in real life are so powerful for many, offline activities do not reflect our true attitudes and thoughts. But if those attitudes can only be expressed online, in the absence of any threat of repercussions or social rejection, what kind of true self is that? That’s one reason why we need to be reminded that the self is an illusion if we believe that it can exist independently of the different contexts and influences of others. One might counter that there is only one self that behaves differently depending on the situation, but that is the heart of the illusion. We are far more under the influence of contexts and others than we appreciate. Just like the alcoholic who thinks they can control their drinking, we are prevented by our self illusion from seeing just how far we are at the mercy of influences outside of our control.

  But I am sure you want to hear more about Helen Mirren. What’s she like? What does she eat for breakfast? Sadly, I was deluding myself with my own self-importance. When I looked at her profile, it was clear that, with only 216 followers, my Helen Mirren was most definitely a ‘troll’. Trolls are individuals who take delight in disrupting social networking sites by posting offensive comments or pretending to be someone else. I don’t even know if Helen Mirren is on Twitter but, if she is, I have no doubt she has thousands of followers. For one tantalizing moment that morning, I had thought that my adolescent crush was taking an interest in me. That would have been an enormous boost to my ego, but why would a great British actress like Helen bother with a lowly egghead like me? There again, even celebrity actresses are sometimes intrigued by the mundane lives of mere mortals. She is human, after all.

  The Human Borg?

  Some commentators have expressed anxiety over the rapid rise of social networks and have predicted a breakdown in human civilization. We have heard similar prophets of doom decrying all media from books to radio to television. One fear is that we are allowing the brains of our children to be destroyed forever as they lose the skills necessary to interact with others in real life during a critical period of psychological development that is essential for healthy socialization.19 As the plasticity of their frontal neural circuits hardens, we are told that they will be forever locked out of normal social development and grow up into retarded adults. The claim is that they may never acquire adequate attention spans, which are stimulated by real-life social interaction. Social networking sites and online activity in general are depriving them of normal social environments. More alarming is the suggestion that the rise in childhood autism may be linked to increased online activity.

  The scientific evidence for such claims is sparse, to say the least, and indeed the Internet is arguably beneficial for those who find normal social communication difficult.20 Also, critical-period effects are restricted to cases of total deprivation at a very early age. Remember the Romanian orphans and the critical first six months? There are very few children using the Web before their first birthday! Also, as developmental neuropsychologist Dorothy Bishop pointed out, the claim that online activity causes autism is ludicrous, as the condition appears well before school age and before children begin using computers.21 When it comes to social development, the human brain is incredibly resilient and resourceful. So long as there is some social interaction, all should be fine. Just as with language, humans are wired for social interaction but incredibly flexible in the way they actually do it. Yes, children may not learn the same Ps and Qs of social etiquette that their parents acquired during real interactions, but they will develop their own ways of interacting both online and offline. Consider one example of how children communicate using abbreviations in texting such as LOL (‘laugh out loud’), OMG (‘oh my God’), SNM (‘say no more’), BRB (‘be right back’), GTG (‘got to go’), or ROFL (‘rolling on the floor laughing’). This is a highly efficient way of transmitting common phrases. These abbreviations were not deliberately invented and handed down by the custodians of social networks but, like much of the etiquette on the Web, emerged in a bottom-up fashion. Left to their own devices, the kids will be all right.

  In fact, there are arguments that rather than threatening the future of human psychological development, the new social media is returning us to the situation before the old media of print, radio and television infiltrated all of our lives. One of the gifted prophets of this new social revolution, June Cohen from the TED organization, makes this counterintuitive point.22 For much of human civilization, she argues, media was what happened between people in the exchange of news, stories, myths, jokes, education and art. We mostly communicated with one another around the Serengeti campfires. Up to a few hundred years ago, very few of us could actually read. Then the old media of books, radio and television appeared. If all of human history were compressed into a single twenty-four-hour day, these old media only emerged in the last two minutes before midnight. But this media was different from the village gossip we used to spend our time engaged in. Unlike normal communication, which flows in both directions, the media that entered our homes was one directional. We read the news, listened to the radio and watched the television. We stopped communicating with each other. As Cohen puts it, ‘TV created a global audience, but destroyed the village in the process.’

  Then Tim Berners-Lee invented the Web, providing a different kind of social experience. This new media, which by the same analogy just appeared seconds ago on the clock of human history, is much more democratized, decentralized and interactive. Cohen believes that we are returning to a point in human development where we really can communicate with each other again, only this time we are not restricted to the physical size and location of our village.

  This may be true, but there are some cautionary tales that we must bear in mind. We are interacting once again, but the Web is very different to the campfire or the garden fence. We are unlikely to become socially retarded, but the way we construct our sense of self will be affected. The process won’t stop; only the way we go about it will change. This is because the Web is changing the way we live our lives. It is not just the amount and range of readily accessible information, or the way we do business or find entertainment. It is the very way we behave toward one another. After all, interaction with one another through a computer is not natural. Then again, neither are telephone conversations, and the telephone hardly reshaped our social development. The real difference is the power of each of us to communicate simultaneously with the group as a whole. That’s a first.

  Never in the history of humankind have we had the ability to communicate rich information with practically anyone on the planet instantaneously. Each new innovation, from the printing press to the telephone and eventually the computer, has been regarded as a milestone in human technological advancement, but the invention of the Web will outstrip them all in terms of its impact on the human race. Now we can potentially communicate with anyone. We can harness the collective power of multiple brains. Many of us are amazed by what computers and software can do. For example, there is more computing power in today’s programmable microwave oven than was needed to put a man on the moon. Moore’s Law tells us that computer power doubles approximately every two years, which is one of the reasons I always seem to delay replacing computers in anticipation of the more powerful model just around the corner. Eventually we will reach a technical limit to Moore’s Law and require a new type of computing. But the Web is different. The Web will not be so limited. This is because the Web is primarily a medium for sharing knowledge and ideas generated by brains.

  Every normal human brain is more powerful than any computer so far built. By connecting them together through the Web, we have the potential to harness the collective thinking of millions of individual brains that are constantly checking and rechecking material on the Web. In 2005, the premier science journal Nature declared that the online encyclopaedia ‘Wikipedia’, created entirely from the voluntary contributions of Web users, was nearly as accurate as the Encyclopaedia Britannica, the traditional multi-volume source of knowledge produced by teams of paid experts and first published in the second half of the eighteenth century. Web users were simply motivated to distribute their knowledge free of charge and, to this day, Wikipedia is funded almost entirely from public donations.

 
