Even our shoes are different. Today’s casual footwear is called tennis shoes because people once wore them only to play tennis or basketball. Not even kids wore these types of shoes on the street—their shoes were made of stiff leather, just like adults’.
Now that’s all but forgotten. Except in the most formal of workplaces, few men wear suits to work, and virtually no one wears them to baseball games. Women have (thankfully) abandoned wearing tight girdles and white gloves everywhere they go (and many young women don’t even know what a girdle is, though some are devoted to Spanx, the GenMe version). The trend toward more informal dress has accelerated in the past ten years, with many companies opting for “business casual” and others going for just plain casual. The trend reached all the way to the top in July 2005, when about half the members of the Northwestern University women’s lacrosse team wore flip-flops during their White House visit, resulting in a picture of the president of the United States standing next to several young women wearing shoes that were once reserved for walking on sand or showering in scuzzy gymnasiums. Although most people still want to look good, we are a much more informal and accepting society than we once were. This is a perfect illustration of generational trends in attitudes, as the entire point of dressing up is to make a good impression on others and elicit their approval. You don’t dress up to be relaxed, natural, and happy.
Holiday card, Minnesota, 1955. Not only are the clothes formal, but so is the posing and demeanor. The perfect family was proper and composed.
Holiday card, Massachusetts, mid-2000s. Formal clothing is no longer necessary to make a good impression. It is now more important to dress for yourself or for your comfort; if you really wanted to do things “your way” and just for yourself, you’d wear jeans to work. Many of us already do.
The strict rules of previous decades went far beyond appearance. Beneath the wool suits and tailored hats, yesterday’s men and women were bound by another type of conformity. Male or female, you were considered strange if you did not marry by age 25 and even stranger if you married outside your race or religion. It was expected that you would have children—it was not considered a choice. Your race and sex dictated your fate and behavior. When war came, you went to fight if you were male and able. Overall, duty and responsibility were valued above individual needs and wants. You did certain things, you said certain things, and you didn’t talk about certain things. End of story.
Today, few of these rules apply. We are driven instead by our individual needs and desires. We are told to follow our dreams, to pursue happiness above all else. It’s okay to be different, and you should do what’s right for you. The phrase “my needs” was four times as common in American books in the 2000s as it was in the 1960s. Young people today are only half as likely as those in the late 1980s to believe that children should learn obedience above all else. Baby boys in the 2010s were only one-third as likely as those in the 1950s to receive one of the ten most popular names. These changes are not clearly good or clearly bad, but they do indicate a strong shift toward individualism.
The choices of the individual are now held so paramount that the most common advice given to teenagers is “Just be yourself.” (Not that long ago, it was more likely to be “Be polite.”) This started with Generation X: Filmmaker Kevin Smith says, “My generation believes we can do almost anything. My characters are free: no social mores keep them in check.” Or take Melissa, 20, who says, “I couldn’t care less how I am viewed by society. I live my life according to the morals, views, and standards that I create.”
This is the social trend—so strong it’s a revolution—that ties all of the generational changes together in a neat, tight bundle: do what makes you happy, and don’t worry about what other people think. It is enormously different from the cultural ethos of previous decades, and it is a philosophy that GenMe takes entirely for granted. “As long as I believe in myself, I really do not care what others think,” says Rachel, 21.
GENERATIONS AT THE CINEMA
The ethos of self-belief appears frequently in popular movies; my favorite examples involve what I call “the apparent time traveler.” The main character in these films is supposed to be a real person in the 1950s, but he or she actually represents the enlightened voice of the 21st century, which makes him (or her) the hero of the film. These movies were ubiquitous in the 2000s, when much of GenMe was forming its view of the world. In 2003’s Mona Lisa Smile, Julia Roberts plays a professor at Wellesley College in 1953. Soon after arriving, she rallies her students against the restrictions of early marriage and training for motherhood. When she critiques sexist advertising during a class, the modern audience knows exactly what she is doing, but few people in the 1950s would have seen such a critique before—or even thought to make one. Roberts’s character has clearly taken the time-traveler shuttle to the future and absconded with a copy of the 1987 feminist antiadvertising film Still Killing Us Softly.
The Majestic, released in 2001, is an even worse movie. Jim Carrey’s character, a Hollywood screenwriter, gets blacklisted and takes refuge in a small town. After he is asked to testify, he convinces the entire town that McCarthyism is bad and that free speech is our most treasured right. The whole town unites behind the accused writer, and the main female character says, “It doesn’t really matter if you are a Communist or not—this is America and you can be one if you want to. It’s nobody’s business.” Uh, not really. Had this actually been the 1950s, an accused Communist would have been everybody’s business. This viewpoint was common even in the 1970s, when 48% of Americans believed Communists should not be allowed to give a speech, teach at a college, or have a book in a local library.
Movies that admit to time travel are somewhat more enjoyable. In Pleasantville, two modern teenagers help a 1950s town find passion and the freedom of ideas. Every character who discovers an individualistic freedom such as sex or intellectual questioning instantly turns from black and white into color. The film sinks into predictability once discrimination against the “colored” people begins. (Get it?)
Other movies travel across cultures rather than time, but they promote the same message. In 2002’s Bend It Like Beckham, an Indian girl living in London wants to play soccer. Her parents, already taken aback that their older daughter did not have an arranged marriage, want Jess to learn to cook and be a proper young lady. The plot comes to a head when Jess must shuttle back and forth between a game and her sister’s wedding. By the end of the movie, Jess wants to join a professional women’s soccer team and move to America. Her parents, finally convinced that it’s right for Jess to follow her dreams, reluctantly agree. The overall message of all of these movies—whether they travel in time or cultures—is to rebel against restrictive social mores. Don’t follow the rules; do whatever makes you happy.
And sometimes you don’t even need to travel. The biggest box-office draw in late 2004 and early 2005 was Meet the Fockers, the sequel to the highly successful comedy Meet the Parents. The movie revolves around the culture clash between the conservative Byrnes family and the hippie Focker family. The Fockers provide most of the comedy in the film, with their sex-therapy business, their leather sandals, and their display of their son’s ninth-place ribbons (because, they say, “It’s not about winning—it’s about what’s in your heart”). But by the end of the movie, the Fockers are not the ones who have been convinced to change—it’s the straitlaced Byrnes family who learns from them. Mr. Byrnes, played to crusty perfection by Robert De Niro, learns to loosen up and show emotion toward his daughter. He also decides that it might be good for him and his wife to enjoy more physical affection in their marriage, and he puts some of Mrs. Focker’s sex tips to good use. Hippies may be laughable, but they teach us how to live. No need to walk around all uptight like that—which you must be if you’re not a hippie. I’m exaggerating a bit, but the movie does make it clear which life philosophy is correct, and it’s definitely Let It All Hang Out.
These movies dramatize two interlocking changes: the fall of social rules and the rise of the individual. As the individualistic viewpoint became prominent, concern with the opinions of others plummeted. This chapter discusses the decline in the need for social approval, and the following two chapters document the ascendance of the individual self. Over the last few decades, the entire nation has experienced the transformation parodied in an episode of The Simpsons, when Springfield’s usual Do What We Say Festival (started, they say, in 1946 by German settlers) is replaced with the new Do What You Feel Festival.
DO YOUR OWN THING
Imagine you are seated at a table with six other people. Four lines are drawn on a chalkboard at the front of the room: a medium-length target line, along with line A (medium), line B (short), and line C (long). You’re to say which of the lines is the same length as the target. You’re ready with the obvious answer, A, but the six others go first and say line C. What do you do?
When Solomon Asch first performed this experiment in 1951, 74% of people gave the group’s incorrect answer on at least one trial, and 28% did on the majority of trials. People felt the need to conform to the group and not to stand out. The study became one of the most famous in social psychology, taught in every class as an example of the social nature of human beings. Yet some have pointed out that this was the essence of getting along in 1950s society, when no one wanted to be thought of as different. But when researchers tried to replicate the study in 1980, they got completely different results: few people conformed to the group anymore. Apparently, it was no longer fashionable to go along with the group even when it was wrong. The authors of the study concluded that the Asch study was “a child of its time.” A similar thing happened when a psychologist tried to replicate the Milgram study, an early 1960s experiment finding that people would shock someone else at dangerous levels when told to do so by an authority figure. In the 2009 replication, nearly twice as many men refused to obey the experimenter’s orders as had in the original study.
Throughout the 1970s, self-help books and therapists actively encouraged people to flout social rules, telling readers they should stop caring about what others think. A central chapter in the 1976 megabestseller Your Erroneous Zones, by Wayne Dyer, is called “You Don’t Need Their Approval.” The author argues that people can do anything they put their minds to, and that others’ opinions only get in the way. (It’s probably no coincidence that both the cover and back of the book feature oversize pictures of the author, complete with a 1970s, powder-blue, V-neck shirt and the resulting display of male chest hair.) Dyer rants on and on about how courteous acts such as giving a wedding gift or attending a funeral are “musterbation,” his double-entendre term for unnecessary social rules. Dyer argues that seeking approval from parents, teachers, and bosses undermines self-reliance and truth. “Needing approval is tantamount to saying ‘Your view of me is more important than my own opinion of myself,’ ” he writes. Another self-help book carries on the tradition with the title What You Think of Me Is None of My Business. Unlike the Baby Boomers, who learned these new standards as adults, GenMe takes these attitudes for granted and always has.
“Just be yourself” is the central ethos of modern parenting. In 1924, a group of sociologists did an extensive study of the citizens of a place they called Middletown (later revealed as Muncie, Indiana). When mothers were asked which traits they wanted their children to have, they named strict obedience, loyalty to church, and good manners. In 1988, when the first wave of GenMe were young children, few mothers named these traits; instead, they chose independence and tolerance. Modern mothers might be gratified to learn that these values sank in. In Growing Up Digital, an 11-year-old girl says, “I think the individual determines what is cool, and it is his or her opinion. What is cool to one person might not be to another. The days of conformity are over.” Danielle, 29, agrees: “I refuse to do something because it’s what everyone else is doing, or because it’s the socially acceptable thing to do at the time.” When I asked my undergraduate students to name the characteristics that best described their generation, the two most popular answers were “independent” and “open-minded.”
GenMe has been taught these values since birth—beginning with the unique names bestowed upon them. Like just about everyone else, I’d noticed that baby names seemed to be getting stranger every year. When my husband and I were naming our first child in 2006, I discovered the Social Security Administration’s database of 325 million Americans’ names going back to the 1880s. So I had to see if there was a generational change. Sure enough, the parents of GenMe’ers, and GenMe’ers themselves, were more likely than those in previous eras to give their children unique names (so they could stand out) instead of common names (so they could fit in). In 1950, 1 out of 3 boys received one of the top 10 names. By 2012, less than 1 out of 10 did. Girls receiving a common name dropped from 1 out of 4 to less than 1 out of 10. (We also controlled the analyses for immigration and looked within states with low Latino populations, such as North Dakota and Mississippi, to make sure that ethnic changes didn’t account for the effects, and they did not.) By 2012, new parents—the majority of whom were GenMe—took things a step further to proclaim their child’s greatness. The boys’ names that increased the most in popularity between 2011 and 2012 included Major, King, and Messiah. Somewhat high expectations to put on a newborn.
As Jaden, 25, puts it, “For my grandparents, questioning their religion, their country’s system of government, or what they ate was not acceptable. The fear of standing out or being judged by others for their beliefs was strong. My generation is much more independent. I pride myself on being a free and independent thinker. My wish is to break down the walls that humans have socially constructed.” A book on generations in the workplace notes that today’s young people were instructed to “Never just do what an adult asks. Always ask, ‘Why?’ ” Some people say this should be the label for the generation—not Generation Y, but Generation Why?
At times, this attitude can lead to the more questionable idea that there are no rules, so you might as well make up your own. In interviews of 18-to-23-year-olds conducted in 2008 for his books Souls in Transition and Lost in Transition, Christian Smith found that most young Americans espouse “moral individualism,” believing that morality is a personal choice. “I have no other way of knowing what to do morally but how I internally feel. That’s where my decisions come from. From me, from inside of me,” said one. So should people follow rules for what the society says is right or wrong? the researchers asked. “I think it’s your personal belief system,” said another young person. “I don’t think it’s anything like social norms or like that. I think it’s just . . . dependent on each person and their own beliefs and what they think is right or wrong.”
Thus, it follows that everyone has his or her own individual moral views, and it’s not right to question someone else’s view. “I guess what makes something right is how I feel about it, but different people feel different ways, so I couldn’t speak on behalf of anyone else as to what’s right and what’s wrong,” said one young man.
This moral individualism can easily become, as Smith puts it, a “live and let die” philosophy. When asked if people have any moral responsibility or duty to help others, one young person replied, “No, not really.” Would it be a problem if someone didn’t want to help others? asked the interviewer. “No. . . . They can help themselves. . . . Do they really need anyone else?” he replied. “So if someone asks for help, we don’t have an obligation to them?” prodded the interviewer. “Yeah, it’s up to each individual, of course,” the young adult asserted.
Smith concludes that most emerging adults seem unaware of any source of moral reasoning outside of themselves. “Instead . . . the world consists of so many individuals, and each individual decides for themselves what is and isn’t moral and immoral,” Smith writes. “Morality is ultimately a matter of personal opinion. Everyone should tolerate everyone else, take care of their own business, and hopefully get along.” This is the razor’s edge of modern individualism: tolerance is great, but perhaps not when each individual is free to decide for himself which rules to follow, and helping others is rarely one of those rules.
What about all of the GenMe’ers who are serving in the military, and who served in Afghanistan and Iraq when we were all sitting safe at home? Military service can certainly be an example of self-sacrifice, duty, and collectivism. However, the data suggest that GenMe service members are the exception, not the rule. According to the Pew Center, only 2% of GenMe has served in the military, compared to 6% of GenX and 13% of Boomers. Polls of 16-to-24-year-olds conducted by the Department of Defense show that fewer now say they are likely to join the military: 18% expressed interest in 2010, down from 26% in 1986. This is partially because many more young people automatically rule out military service. In a nationally representative sample of high school students, 2 out of 3 (67%) said they “definitely won’t” join the military in 2012, up from 57% in 1976. This does not diminish the contributions of the GenMe’ers who do serve, but it contrasts them with the majority of their generation.
One upside to the individualistic attitude is lessened prejudice and discrimination. Amanda, 22, says that one of the main lessons in her Girl Scout troop was “being different is good.” It’s a mantra GenMe has heard over and over. They absorbed the lesson of tolerance with their baby food—not just for race and religion, but for sexual orientation. It also extends to beliefs, feelings, and all kinds of other intangibles. Just about the only difference that wasn’t good? Someone who was prejudiced.
That’s exactly what appears in our recent analysis of data from the nationally representative General Social Survey. Boomers set in motion strong trends toward tolerance of groups such as Communists, gays and lesbians, and those who oppose religion. Generation Me continued those trends throughout the 2000s and 2010s, but diverged from Boomers in one major way: they were less tolerant than Boomers toward someone who claimed that blacks are genetically inferior. GenMe is thus the most tolerant generation in American history—the only people they will not tolerate are those who are intolerant themselves.