The mesolimbic pathway has important neuronal connections and physical links to the hippocampus and the amygdala, allowing it to emphasize memories of certain occurrences it considers important and attach strong emotional resonance to them.27 It not only rewards or discourages behavior when it happens; it also makes sure that the memory of the event is particularly potent.
The heightened awareness, the intense rush, the vivid memories: all of this combined means that the experience of encountering something seriously scary can make someone feel more “alive” than at any other time. When every other experience seems muted and mundane in comparison, it can be a strong motivator to seek out similar “highs,” just as someone used to drinking double-strength espresso won’t find an extra-milky latte especially fulfilling.
And, quite often, it has to be a “genuine” thrill, rather than a synthetic one. The conscious, thinking parts of our brain might be easily fooled in many cases (many of them covered in this book), but they’re not that gullible. As such, a video game where you drive a high-speed vehicle, no matter how visually realistic, can’t hope to provide the same rush and sensation as actually doing it. The same goes for fighting zombies or piloting starships; the human brain recognizes what’s real and not real, and can cope with the distinction, despite what the old “video games lead to violence” arguments suggest.
But if realistic video games aren’t scary, how are totally abstract things like stories in books so terrifying? It may be to do with control. When playing a video game, you are in total control of the environment; you can pause the game, it responds to your actions in it, and so on. This isn’t the case for scary books or films, where the individual is a passive observer and, while caught up in the narrative, has no influence over what happens in it. (You can close a book, but that doesn’t alter the story.) Sometimes the impressions and experiences of the film or book can stay with us long after, unsettling us for quite some time. The vivid memories formed would explain this, as they keep being revisited and activated while they “bed in.” Overall, the more control the brain retains over events, the less scary they are. This is why some things that are “best left to the imagination” are actually more terrifying than the goriest effects.
The 1970s, long before CGI and advanced prosthetics, are widely regarded by connoisseurs of the genre as a golden age of horror films. All the scares had to come from suggestion, timing, atmosphere and other clever tricks. As a result, the brain’s tendency to look for and predict threats and dangers did most of the work, causing people literally to jump at shadows. The arrival of cutting-edge effects via big Hollywood studios meant the actual horror was far more blatant and direct, with buckets of blood and CGI replacing psychological suspense. There’s room for both approaches, and others, but when the horror is conveyed so directly, the brain isn’t as engaged, leaving it free to think and analyze, and remain aware that this is all a fictional scenario that could be avoided at any time, and as such the scares don’t have the same impact. Video game makers have learned this, with survival horror games being a genre that requires the character to avoid an overwhelming danger in a tense, uncertain environment, rather than blow it into countless wobbly pieces with an oversized laser cannon.28
It’s arguably the same with extreme sports and other thrill-seeking activities. The human brain is perfectly able to distinguish actual risk from artificial risk, and there usually needs to be the very real possibility of unpleasant consequences for the true thrill to be experienced. A complex set-up using screens, harnesses and giant fans could feasibly replicate the sensation of bungee jumping, but it would be unlikely to be authentic enough to convince your brain that you are falling from a great height, and thus the danger of actually hitting the ground is removed, and the experience is not the same. The perception of traveling up and down quickly through space is hard to replicate without actually doing it, hence the existence of rollercoasters.
The less control you have over the scary sensation, the more thrilling it is. But there’s a cut-off point, as there still has to be some influence over events in order to make it “fun” scary, rather than simply terrifying. Falling out of a plane with a parachute is considered exciting and fun. Falling out of a plane without a parachute on your back is not. For the brain to enjoy a thrilling activity, it seems there has to be some actual risk involved, but also some ability to influence the outcome, so the risks can be avoided. Most people who survive a car crash feel relieved to be alive, but there’s rarely any desire to go through it again.
Also, the brain has that weird habit, hinted at earlier, called counterfactual thinking: the tendency to dwell on the possible negative outcomes of events that never happened.29 This is going to be even more noticeable when the event itself was a scary one, as there’s the sense of actual danger. If you narrowly avoid being hit by a car while crossing the road, you might spend days afterwards thinking about how you could have been hit. But you weren’t; nothing has physically changed for you at all. But the brain really does like to focus on a potential threat, be it in the past, present or future.
People who enjoy this sort of thing are often labeled adrenalin junkies. “Sensation seeking” is a recognized personality trait,30 where individuals constantly strive for new, varied, complex and intense experiences, invariably at some physical/financial/legal risk (losing money and getting arrested are also dangers many people strongly wish to avoid). The previous paragraphs argued that a certain amount of control over events is required to enjoy thrills properly, but it’s possible that sensation-seeking tendencies cloud the ability to assess or recognize risk and control accurately. A psychological study from the late 1980s compared injured skiers with uninjured ones.31 It found the injured skiers were far more likely to be sensation seekers, suggesting that their drive for thrilling sensations caused them to make decisions or perform actions that pushed events beyond their ability to control, resulting in injury. It’s a cruel irony that a desire to seek out risk may also cloud your ability to recognize it.
Why some people end up with such extreme tendencies is uncertain. It could just happen gradually, a brief flirtation with a risky experience providing some enjoyable thrills, leading to seeking out more and more with ever increasing intensity. This is the traditional “slippery slope” argument. Quite an appropriate term for skiers, really.
Some studies have looked into more biological or neurological factors. There’s some evidence that certain genes, such as DRD4, which encodes a type of dopamine receptor, can be mutated in sensation-seeking individuals, suggesting that activity in the mesolimbic reward pathway is altered, resulting in changes in the way sensations are rewarded.32 If the mesolimbic pathway is more active, intense experiences may be even more powerful. But if it is less active, it may require more intense stimulation to achieve true enjoyment; the sort of thing most of us take for granted would require extra, life-risking effort. Either way, people could end up seeking more stimulation. Trying to figure out the role of a specific gene in the brain is always a long and complex process, so we don’t know this for certain yet.
Another study, from 2007, by Sarah B. Martin and her colleagues scanned the brains of dozens of subjects with varying scores on the experience-seeking personality scale; their paper claims that sensation-seeking behavior is correlated with an enlarged right anterior hippocampus.33 The evidence suggests that this is the part of the brain and memory system responsible for processing and recognizing novelty. Basically, the memory system runs information via this area and says, “Have a look at this. Have we seen this before?” and the right anterior hippocampus says yes or no. We don’t know exactly what the increased size of this area means. It could be that the individual has experienced so many novel things that the novelty-recognizing area has expanded to cope, or maybe the novelty-detecting region is overly developed and so requires something a lot more unusual to be truly recognized as novel. If this were the case, novel stimulations and experiences would potentially be more important and salient to these individuals.
Whatever the actual cause for this anterior hippocampal enlargement, for a neuroscientist it’s actually quite cool to see something as complex and subtle as a personality trait potentially reflected by visible physical differences in the brain. It doesn’t happen nearly as often as the media implies.
Overall, some people actually enjoy the experience of encountering something that causes fear. The fight-or-flight response triggered by this leads to a wealth of heightened experiences occurring in the brain (and the palpable relief that occurs when it ends), and this can be exploited for entertainment purposes within certain parameters. Some people may have subtle differences in brain structure or function that cause them to seek out these intense risk- and fear-related sensations, to sometimes alarming extents. But that’s nothing to pass judgement on; once you get past the overall structural consistencies, everyone’s brain is different, and those differences are nothing to be afraid of, even if you do enjoy being afraid of things.
You look great—it’s nice when people don’t worry about their weight
(Why criticism is more powerful than praise)
“Sticks and stones will break my bones, but names will never hurt me.” This claim doesn’t really stand up to much scrutiny, does it? Firstly, the hurt caused by a broken bone is obviously quite extreme, so shouldn’t be used as a casual baseline for pain. Secondly, if names and insults genuinely don’t hurt at all, why does this saying even exist? There’s no similar saying to point out that, “Knives and blades will slash you up but marshmallows are pretty harmless.” Praise is very nice but, let’s be honest, criticism stings.
Taken at face value, the title of this section is a compliment. If anything, it’s actually two compliments, as it flatters both appearance and attitude. But it is unlikely that the person it’s directed at will interpret it as such. The criticism is subtle and requires some working out, as it is mostly implied. Despite this, it is the criticism that becomes the stronger element. This is just one of countless examples of a phenomenon that arises from the workings of our brains: criticism typically carries more weight than praise.
If you’ve ever had a new haircut or a new outfit, or told a funny story to a group, you’ll know it doesn’t matter how many people praise your look or laugh at your jokes; it’s the ones who hesitate before saying something nice, or roll their eyes wearily at you, that will stick with you and make you feel bad.
What’s happening here? If it’s so unpleasant, why do our brains take criticism so seriously? Is there an actual neurological mechanism for it? Or is it just some morbid psychological fascination with unpleasantness, like the bizarre urge to pick at a scab or poke a loose tooth? There is, of course, more than one possible answer.
To the brain, bad things are typically more potent than good things.34 At the most fundamental neurological level, the potency of criticism may be due to the action of the hormone cortisol. Cortisol is released by the brain in response to stressful events; it is one of the chemical triggers of the fight-or-flight response, and is widely regarded as the cause of all the issues brought about by constant stress. Its release is controlled mainly by the hypothalamic–pituitary–adrenal (HPA) axis, a complex network of neurological and endocrine (meaning hormone-regulating) areas of the brain and body that coordinates the general response to stress. It was previously believed that the HPA axis was activated in response to a stressful event of any sort, such as a sudden loud noise. But later research found it was a bit more selective than that and was activated only under certain conditions. One theory today is that the HPA axis is activated only when a “goal” is threatened.35 For example, if you’re walking along and some bird droppings land on you, that’s annoying and arguably harmful for hygiene reasons, but it’s unlikely to activate the HPA-mediated response because “not being soiled by an errant bird” wasn’t really a conscious goal of yours. But if the same bird were to target you while you’re walking to a very important job interview, then it is very likely to trigger the HPA response, because you had a definite goal: go to the job interview, impress them, get the job. And now it’s been largely thwarted. There are many schools of thought about what to wear to a job interview, but “a generous layer of avian digestion by-product” doesn’t feature in any of them.
The most obvious “goal” is self-preservation, so if your goal is to stay alive and something occurs that might interfere with your goal by stopping you being alive, the HPA axis would activate the stress response. This is part of the reason it was believed the HPA axis responded to anything, because humans can and do see threats to the self everywhere.
However, humans are complex, and one result of this is that they rely on the opinions and feedback of other humans to a considerable degree. The social self-preservation theory states that humans have a deep-rooted motivation to preserve their social standing (to continue being liked by the people whose approval they value). This gives rise to social-evaluative threat: specifically, anything that threatens someone’s perceived social standing or image interferes with the goal of being liked, and therefore activates the HPA axis, releasing cortisol into the system.
Criticisms, insults, rejections, mockery: these attack and potentially damage our sense of self-worth, especially if done publicly, which interferes with our goal of being liked and accepted. The stress this causes releases cortisol, which has numerous physiological effects (such as increasing the release of glucose), but also has direct effects on our brain. We are aware of how the fight-or-flight response heightens our focus and makes our memories more vivid and prominent. Cortisol, along with the other hormones released, potentially causes this to happen (to varying degrees) when we’re criticized; it makes us experience an actual physical reaction that sensitizes us and emphasizes the memory of the event. This whole chapter is based on the brain’s tendency to go overboard when looking for threats, and there’s no real reason why this wouldn’t include criticism. And when something negative happens and we experience it first hand, producing all the relevant emotions and sensations, the hippocampus and amygdala spark into life again, emotionally enhancing the memory and storing it more prominently.
Nice things, such as receiving praise, also produce a neurological reaction via the release of oxytocin, which makes us experience pleasure, but in a less potent and more fleeting manner. The chemistry of oxytocin means it’s removed from the bloodstream in about five minutes; cortisol, by contrast, can linger for over an hour, maybe even two, so its effects are far more persistent.36 The fleeting nature of pleasure signals may seem a bit of a harsh move by nature, but when things cause us intense pleasure for long periods they tend to be quite incapacitating, as we’ll see later.
However, it’s easy but misleading to attribute everything that goes on in the brain to the actions of specific chemicals, something that more “mainstream” reports of neuroscience often do. Let’s look at some other possible explanations for this emphasis on criticism.
Novelty may also play a role. Despite what online comment sections might suggest, most people (with some cultural variations, admittedly) interact with others in a respectful manner due to social norms and etiquette; shouting abuse at someone in the street is not something that respectable people do, unless it’s directed at parking enforcement officers, who are apparently exempt from this rule. Consideration and low-level praise are the norm, like saying thank you to the cashier for handing you your change even though it’s your money and they’ve no right to keep it. When something becomes the norm, our novelty-preferring brains start to filter it out more often via the process of habituation.37 Something happens all the time, so why waste precious mental resources focusing on it when it’s safe to ignore?
Mild praise is the standard, so criticism is going to have more impact purely because it’s atypical. The single disapproving face in a laughing audience is going to stand out more because it’s so different. Our visual and attention systems have developed to focus on novelty, difference and “threat,” all of which are technically embodied by the grumpy-looking person. Similarly, if we’re used to hearing “well done” and “good job” as meaningless platitudes, then someone saying, “You were crap!” is going to be all the more jarring because it doesn’t happen as often. And we will dwell on an unpleasant experience all the more to figure out why it happened, so we can avoid it next time.
Chapter 2 discussed the fact that the workings of the brain tend to make us all somewhat egotistical, with a tendency to interpret events and remember things in such a way as to give us a better self-image. If this is our default state, praise is just telling us what we already “know,” whereas direct criticism is harder to misinterpret and a shock to the system.
If you put yourself “out there” in some form, via a performance, created material or just an opinion you think is worthy of sharing, you are essentially saying, “I think you will like this”; you’re visibly seeking people’s approval. Unless you’re alarmingly confident, there’s always an element of doubt, an awareness of the possibility that you are wrong. In this instance you are sensitive to the risk of rejection, primed to look for any signs of disapproval or criticism, especially regarding something you take great pride in or that required a lot of time and effort. When you’re primed to look for something you’re worried about, you’re more likely to find it, just as a hypochondriac is always able to find worrying symptoms of rare diseases in himself. This process is called confirmation bias: we seize on what we’re looking for and ignore anything that doesn’t match up to it.38