A second and less obvious benefit of a high score on conscientiousness emerged from Friedman and Martin's remarkable discovery that such participants are less prone to diseases that are unrelated to healthy habits. The causal mechanisms for this relationship are unclear, but Friedman and Martin speculate that conscientiousness scores may reflect levels of neurotransmitters that also affect health-related behaviours and processes such as sleeping and eating. In other words, conscientiousness may be a proxy measure of the healthiness of a person’s brain. Again, in the context of this book, this finding is bad news for nations with welfare systems that increase the number of children born to people with relatively low levels of conscientiousness: they not only have less healthy habits, they are also physiologically less healthy and thus will create more of a burden on the healthcare system. This finding has been confirmed by subsequent follow-up studies: in a study of thousands of contemporary US citizens, individuals with low scores on conscientiousness were more likely than average to smoke, suffer from affective disorders (anxiety/depression), have high blood pressure, or suffer from tuberculosis, diabetes, strokes or joint problems (Goodwin & Friedman, 2006).
Finally, Friedman and Martin found that conscientiousness influenced the tendency for a person to get into healthier job situations and social relationships: highly conscientious people tended to end up in happier marriages, more rewarding friendships and more suitable work than less conscientious people did. So, overall, highly conscientious people have healthier habits, healthier brains and healthier environments than their peers with low scores on conscientiousness.
If we allow ourselves to indulge in a thought experiment and extrapolate these findings by Friedman and Martin up to the national level, it seems plausible that a population containing predominantly highly conscientious and agreeable individuals would indeed be healthier, happier, more productive and longer lived than a population with the inverse personality profile. Crucially, its members would not only be likely to pay more tax than the low scorers, owing to their better employability, but would also be less of a drain on the public purse (for example, via lower expenditure on criminal justice or healthcare). This point strengthens the argument that a welfare system which proliferates the employment-resistant personality profile is a bad thing for both the economic and the social prospects of the nation.
Since, as we will see in later chapters, personality runs in families for both genetic and environmental reasons, the findings of Friedman and Martin mean that children born to parents relatively lacking in conscientiousness and agreeableness are likely to resemble those parents, as would their children, and so on. This cycle of proliferation of personality dysfunction will therefore place an ever greater burden on the more functional citizens. On the other side of the coin, would an imaginary utopia populated solely by highly conscientious and highly agreeable ‘solid citizens’ be boring and bland, full of colourless, corporate drones who work hard, raise their children responsibly, wear seatbelts, do what the doctor says and pay their taxes yet lack any spark of the creativity or dynamism that makes society flourish? Not according to the Terman data: Friedman and Martin concluded that the most conscientious and agreeable children tended also to end up with what they viewed as the most exciting, creative and rewarding life trajectories.
The Dunedin Multidisciplinary Health and Development Study (known as the Dunedin Study)
One criticism that could be levelled at the Terman Study is that the participants were highly intelligent individuals from mostly Caucasian families of relatively high socio-economic status (SES; middle/upper class). So it may be the case that, although the conclusions of researchers such as Friedman and Martin are rigorously supported and so are likely to be correct, they do not apply beyond that relatively narrow segment of society. Fortunately, this criticism does not apply to the Dunedin Study, which allows us to check the robustness of the findings of the Terman Study.
Perhaps the most carefully designed and controlled longitudinal investigation of child development ever conducted, this study has followed the lives of a complete birth cohort of children (totalling 1,037 individuals) born in Dunedin, New Zealand in 1972–1973. The participants are about 43 years old at the time of writing and, remarkably, 96 per cent of them have been retained in the study. Crucially in the present context, since all children born in Dunedin in that time period were enrolled into the study and almost all of them have remained in it, the participants’ qualities are not skewed towards high IQ or any other individual-difference or demographic variable. This has the benefit of providing results that are likely to be generally valid across the full spectrum of society, at least in countries with broadly westernised lifestyles and democratic cultural values roughly equivalent to those found in New Zealand, such as the USA, UK, Canada, Australia, Scandinavia and most of mainland Europe.
The key results from the Dunedin Study that relate to this book were those published in 2011 by Terrie Moffitt, Avshalom Caspi and colleagues, who investigated the effects of self-control measured in childhood on health, wealth and criminality measured at the age of 32 years. Interest in the effects of childhood self-control on adult life outcomes has deep roots: perhaps the most famous experiment on the topic is the celebrated Stanford Marshmallow Experiment begun in 1972 by Walter Mischel. This used a delay-of-gratification design in which 95 children (53 girls and 42 boys) with an average age of four years and five months were presented with a marshmallow. The children were allowed to eat the marshmallow immediately if they wished, but were told that if they waited for 15 minutes without eating it, they would receive a second marshmallow. The children were followed up ten years later as adolescents, and those who had been able to resist eating the marshmallow for 15 minutes were rated by their parents as significantly more competent than the non-resistors (Mischel, Shoda & Peake, 1988). More specifically, ‘their parents rated them as more academically and socially competent, verbally fluent, rational, attentive, planful, and able to deal well with frustration and stress’ (Shoda, Mischel & Peake, 1990, p. 978).
In the context of this book, these results are important because the children with high levels of self-control at age four go on to manifest, as young adults, precisely the kind of behaviour patterns that I would argue aid effectiveness in the workplace, as well as general solid citizenship. This latter impression is congruent with earlier work done by Mischel in Trinidad, in which he found that children whose fathers had abandoned the family showed poorer delay of gratification than children from intact families, as if the tendency for a father to abandon his children is linked to a lack of self-control in him that is then passed on genetically to those children (Mischel, 1958).
However, the studies by Mischel suffer from the same flaw as the Terman Study: we cannot be sure the participants were representative of the general population. Reassuringly for the Dunedin Study, though, they do suggest that self-control measured in childhood is an important personality characteristic that goes on to influence employability, and so is presumably related to conscientiousness and agreeableness.
Importantly for the generality of the Dunedin results, childhood self-control was not measured with a marshmallow. Instead, when the children were three and five years old, each of them was tested on a battery of cognitive and motor tasks. The participants were tested by examiners who did not know them. After the session, each examiner rated the child’s self-control using the following trait labels: lability, low frustration tolerance, hostility, resistance, restlessness, impulsivity, requires attention, fleeting attention and lacking persistence.
Subsequent assessments then followed at ages five, seven, nine and 11, with the self-control of participants rated by parents and teachers who completed the Rutter Child Scale (RCS), which includes items that measure impulsive aggression and hyperactivity. At ages nine and 11, the RCS was supplemented with further questions about the children’s lack of persistence, inattention and impulsivity. Finally, at age 11, the participants were interviewed by a psychiatrist who rated them on hyperactivity, inattention and impulsivity. The component scores for all these separate ratings were then averaged to produce a single score of childhood self-control. In terms of the standard Big Five personality dimensions, a high score on childhood self-control primarily reflects high conscientiousness but also, to a lesser extent, high agreeableness and low neuroticism.
Conversely, low childhood self-control primarily reflects low conscientiousness and, to a lesser extent, low agreeableness and high neuroticism. A low score on childhood self-control is therefore a plausible precursor of the adult employment-resistant personality profile that is proposed in this book. As found by Mischel and colleagues, it makes sense that a child with high self-control should go on to be an adult who is better suited to holding down most types of job than a person with low childhood self-control. They should also be better able to look after their health and more likely to stay out of trouble with the law. But again, as with the Terman Study, data are needed to back up these educated guesses.
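Before turning to those data, it may help to make the composite-scoring procedure described above more concrete. The sketch below shows one standard way of standardising several component ratings and averaging them into a single score; the data, column names and components are hypothetical stand-ins for illustration, not the Dunedin Study’s actual scoring code.

```python
# Illustrative sketch only: hypothetical data and column names, not the
# Dunedin Study's actual scoring procedure. Shows one common way to combine
# several ratings into a single composite score.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
n_children = 10

# Hypothetical component ratings (higher = poorer self-control), standing in
# for observer ratings at ages 3-5, parent/teacher Rutter Child Scale ratings
# at ages 5-11, and the psychiatrist interview ratings at age 11.
ratings = pd.DataFrame({
    "observer_3_5": rng.normal(0, 1, n_children),
    "parent_rutter_5_11": rng.normal(0, 1, n_children),
    "teacher_rutter_5_11": rng.normal(0, 1, n_children),
    "interview_11": rng.normal(0, 1, n_children),
})

# Standardise each component so that no single rater or scale dominates the
# composite, then average across components for each child.
z = (ratings - ratings.mean()) / ratings.std(ddof=0)
low_self_control = z.mean(axis=1)

# Reverse the sign so that higher composite values mean better self-control.
self_control_composite = -low_self_control
print(self_control_composite.round(2))
```

Standardising each component before averaging prevents any single rater or scale from dominating the composite simply because it happens to use a wider numeric range.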
The Dunedin Study data, as reported by Moffitt et al. (2011), support the validity of these guesses. Four hypotheses were tested by Moffitt et al. (2011): first, they asked whether self-control measured in the participants as children went on to predict their adult health, wealth and criminality similarly across the self-control range, from low to high. Second, they asked whether the Dunedin Study participants who moved up in the self-control range during the study showed improved health and wealth and reduced criminality. Third, they investigated whether children at the low end of the self-control range were more likely than average to make errors of judgment as teenagers that limited their subsequent success as adults (for example, dropping out of high school, smoking, abusing drugs, criminality, becoming a teenage parent). Moffitt et al. (2011) dub these mistakes ‘snares’ that trap the youngsters in lifestyles that damage their health and wealth as well as undermining the future safety and prosperity of society in general. Fourth, Moffitt et al. (2011) wanted to take advantage of the extremely comprehensive testing regime of the Dunedin Study and find out whether self-control scores measured as early as age three influenced important life outcomes.
One concern in longitudinal studies of human development is whether the variable of interest (in this case self-control) is having a real effect and is not merely a by-product of the effect of another underlying variable such as intelligence or social class. Fortunately, the Dunedin Study also measured the intelligence and social class of the participants, allowing the influence of self-control on life outcomes to be separated from the effects of these other factors. This turned out to be important because preliminary analyses showed that self-control was significantly higher in children from higher socio-economic classes and with higher IQ scores. Lastly, Moffitt et al. (2011) took advantage of sibling data gathered in a related British study (the Environmental-Risk Longitudinal Twin Study; E-Risk) that followed children up to the age of 12, including measures of self-control taken at age five and three important precursors of poor health, low wealth and criminality, namely smoking at age 12, poor school performance and antisocial behaviour. These data were used to determine whether individual differences in self-control predicted differences in life outcomes, even in siblings raised together. This meant the role of a participant’s self-control level in influencing life outcomes could be separated from other factors that differ between families.
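The logic of separating the effect of self-control from that of IQ and social class can be illustrated with a simple multiple regression on simulated data. The sketch below uses hypothetical variable names and toy numbers; it is not the analysis code of Moffitt et al. (2011).

```python
# Illustrative sketch only: simulated toy data with hypothetical variable
# names (self_control, iq, ses, income); not the analysis of Moffitt et al.
# (2011). Shows the general logic of estimating the association between
# self-control and an adult outcome while adjusting for IQ and childhood
# socio-economic status (SES).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 1000

iq = rng.normal(100, 15, n)
ses = rng.normal(0, 1, n)

# Simulate self-control as partly correlated with IQ and SES, mirroring the
# preliminary Dunedin analyses described above.
self_control = 0.3 * (iq - 100) / 15 + 0.3 * ses + rng.normal(0, 1, n)

# Simulate an adult outcome (here labelled 'income') influenced by all three.
income = 0.5 * self_control + 0.3 * (iq - 100) / 15 + 0.2 * ses + rng.normal(0, 1, n)

df = pd.DataFrame({"income": income, "self_control": self_control,
                   "iq": iq, "ses": ses})

# Multiple regression: the coefficient on self_control estimates its
# association with income while holding IQ and SES constant.
model = smf.ols("income ~ self_control + iq + ses", data=df).fit()
print(model.params.round(3))
```

In a model like this, the coefficient on self_control estimates its association with the outcome while IQ and SES are held constant, which is the sense in which the Dunedin analyses ‘removed’ the influence of those variables.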
To assess health in adulthood, the participants were given a battery of physical and psychiatric medical assessments at the age of 32. This process generated four health-related scores: physical health, recurrent depression, substance dependence and informant-reported substance problems. Low childhood self-control was generally associated with poor health. These associations persisted even when the influences of social class and IQ were removed. More specifically, participants with low childhood self-control were more likely to suffer from metabolic abnormalities, periodontal disease, sexually transmitted infections and inflammatory diseases. Childhood self-control did not affect the risk of respiratory disease or depression. Low childhood self-control was associated with a tendency to smoke cigarettes and consume illicit drugs but did not affect the risk of marijuana use or alcoholism.
With regard to wealth, Moffitt et al. (2011) found that low childhood self-control was associated in adulthood with lower income, greater probability of single parenthood, less financial planning and increased financial struggles such as credit card debts (whether self-reported or informant-reported). These effects of self-control on wealth were stronger than the effects of IQ or childhood social class. Finally, Moffitt et al. (2011) looked at criminality and found that 24 per cent of the participants had criminal convictions by the age of 32. They found that low childhood self-control was associated with a significantly higher risk of criminal offending, even when IQ and SES were taken into account.
Importantly, all three of these findings held true at all levels of the gradient of self-control in the participant cohort. Additionally, Moffitt and colleagues studied participants who moved up the self-control gradient by the time they had become young adults (as measured by self-report at 26 years old), finding that these individuals went on to show better life outcomes at the age of 32. This change implies that self-control has a learned component and so could be increased by interventions during childhood that aim to facilitate socialisation. This idea is congruent with the results of longitudinal studies of preschool training in conscientiousness and agreeableness which show that trained children went on to have better life outcomes than the control group who were not trained (for example, Heckman, 2006), as will be discussed in Chapter 5.
With regard to adolescent mistakes, Moffitt et al. (2011) found that children with low levels of self-control were more likely to make mistakes in their teenage years that harmed their prospects of a successful life trajectory. More specifically, children with poor self-control were more likely to start smoking by the age of 15, to leave school prematurely with no qualifications and to become teenage parents, even when taking into account the effects of IQ and socio-economic class. Interestingly, the effect of children falling into these traps did not fully account for the effects of self-control on adult health, wealth and criminality: even amongst the participants who did not smoke, completed high school and did not become teenage parents, self-control still significantly influenced their adult life outcomes. This finding underlines the importance of self-control in determining the success or otherwise of an individual’s life, suggesting that it would be more cost effective to introduce policies that increase self-control in the population rather than trying to target the surface symptoms of low self-control like smoking, teen parenthood and school failure. These results also add weight to the idea that welfare policies that increase the proportion of individuals with low levels of self-control are a bad thing for everyone – welfare claimants and workers alike. This is a crucial take-home point, so it is worth emphasising: welfare policies that increase the proportion of individuals with low levels of self-control are a bad thing for everyone – welfare claimants and workers alike.
The attempt by Moffitt et al. (2011) to test the effects of early measurements of childhood self-control (at age three) was successful: the scores predicted health, wealth and criminality at age 32, although the size of the effect was smaller than for the full composite score that incorporated all measurements of self-control up to age 11. Finally, by looking at a similar cohort study of siblings in the UK, Moffitt et al. (2011) aimed to test whether siblings with different scores on self-control showed different life outcomes. As predicted, the sibling with lower self-control at age five was significantly more likely to begin smoking by the age of 12, to behave in an antisocial manner and to perform poorly at school.
Conclusion
Putting the findings from the Terman Study together with the findings from the Dunedin Study, it is clear that children with personality characteristics reflecting a lack of self-control have an elevated risk of encountering occupational difficulties later in life. It is also clear that children with low self-control possess a personality profile that approximates to the employment-resistant personality profile (low conscientiousness plus low agreeableness). This means the employment-resistant personality profile is not a product of unemployment but increases the likelihood of becoming unemployed. Moreover, this effect of personality on employability is not a product of intelligence or social class, since the children in the Terman Study all possessed relatively high intelligence and social class, and the participants in the Dunedin Study were tested on IQ and social class and had the effects of those variables statistically controlled when examining the effects of personality on life outcomes. A secondary conclusion from these two studies is that the children with the employment-resistant personality profile not only went on to experience occupational difficulties but also difficulties in generally behaving like solid citizens outside the workplace. These findings indicate that relatively low levels of conscientiousness and agreeableness play a causal role in promoting an unhealthy, work-shy, impoverished and criminal life trajectory, and they back up the epidemiological data in the previous chapter which showed that individuals with the employment-resistant personality profile are over-represented amongst welfare claimants.
4
The Influence of Benefits on Claimant Reproduction
In Chapter 1, I raised the possibility that the welfare state can proliferate the employment-resistant personality profile by boosting the number of children born into disadvantaged households. This chapter is devoted to examining whether it is indeed the case that welfare claimants on average have more children than employed citizens and whether such differences could be driven partly by personality differences in reactions to welfare policy. To do this, I utilise the ecological theory that there are two opposing strategies for reproduction, the success of which depends upon the availability of resources. In conditions where resources are plentiful and competition for these resources is low, the optimal strategy is to produce as many offspring as possible but put little effort into their care. Conversely, when resources are scarce and have to be competed for, the optimal strategy is to produce fewer offspring but care for them conscientiously so that each offspring is itself capable of competing for resources.