
  Aside from art and audio, which are fully replaced with each new technology cycle, Civ designers traditionally follow a rule of thirds. One-third of the previous version stays in place, one-third is updated, and one-third is completely new. These days, “updated” is a synonym for “scaled back to make room for the new things,” because we don’t want the game to become too complicated for someone who’s never played. On the other hand, we don’t want to alienate existing fans with cookie-cutter sequels, either—and our designers themselves are existing fans, so there’s always a strong impetus to add more features. Civilization III tried out a new espionage system, Civ IV added major mechanics for religion and culture, and Civ V overhauled the board itself, by implementing a “one unit per tile” rule and switching the terrain layout from squares to hexagons.

  Most of these were ideas that I had considered at some point for the first game in the series, but either the technology couldn’t handle it, or it wasn’t right for the audience of the era. Hex grids, for example, had been a mainstay of board gaming for decades, and were clearly superior to squares because they eliminated diagonal moves. Visually, diagonals seem fine, but mathematically, they cover about 41 percent more ground than straight moves (the diagonal of a unit square is √2, or roughly 1.414, times the length of its side), and throw off the balance of the game. Designers either have to accommodate the irregular speed, or else put directional constraints on the player that seem arbitrary and frustrating—no one likes being cut off from a square that’s physically touching the corner of their own. Unfortunately, math rarely triumphs over popularity in a head-to-head battle. Despite being better from a design standpoint, hexes were considered too nerdy for the average computer user when Civ first came out, so we had to fall back on the familiarity of squares in order to get a strategy game into their hands at all. Like I said, everything comes with a price.
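  For the mathematically inclined, here is a minimal sketch of the imbalance, and of why hexes sidestep it. The coordinate conventions are illustrative assumptions on my part, not anything taken from a shipped version of Civ:

```python
import math

# On a square grid with one move per turn, a diagonal step covers
# sqrt(2) times the distance of a straight step: about 41% more ground.
straight = math.dist((0, 0), (1, 0))   # 1.0
diagonal = math.dist((0, 0), (1, 1))   # ~1.414
print(f"diagonal advantage: {diagonal / straight:.3f}x")

# On a hex grid, every neighbor is the same distance away, so the
# problem disappears. Using standard "axial" coordinates (q, r) for
# flat-topped hexes, the six neighbor offsets are:
HEX_NEIGHBORS = [(1, 0), (1, -1), (0, -1), (-1, 0), (-1, 1), (0, 1)]

def hex_center(q, r, size=1.0):
    """Convert axial hex coordinates to a point in the plane."""
    x = size * 1.5 * q
    y = size * (math.sqrt(3) / 2 * q + math.sqrt(3) * r)
    return (x, y)

origin = hex_center(0, 0)
distances = [math.dist(origin, hex_center(dq, dr)) for dq, dr in HEX_NEIGHBORS]
print(distances)  # all six are identical: sqrt(3) for size=1
```

Every step on a hex board covers exactly the same ground, which is precisely why board gamers had sworn by them for decades.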

  Other new components, like slavery, were left out of the original because of their potential to offend. Here, again, I learned that we public figures are doomed no matter what we choose. Civilization’s popularity brought it to the attention of professional academics, and it wasn’t long before I was being hammered in peer-reviewed journals for “trafficking in tropes” and generally glossing over the sins of Western expansion. Yet, when Civilization IV tried to address the issue of slavery for the first time, the complaints were even louder. Shortly after that, we built a remake of Colonization that once again removed slavery, and it caused the biggest uproar of them all.

  Once the seal was broken, this philosophical analysis quickly spread to my older titles—or as one paper described them, my “Althusserian unconscious manifestations of cultural claims” with “hidden pedagogical aspirations.” Pirates! wasn’t about swashbuckling, it turned out, but rather “asymmetrical and illegal activities [that] seem to undermine the hierarchical status quo while ultimately underlining it.” Even C.P.U. Bach was accused of revealing “a darker side to the ideological forces at work behind ludic techniques.”

  Oddly enough, my military titles weren’t subjected to any real scrutiny, despite being chock-full of “hegemonic assumptions.” I suspect it had to do with their stated intent. F-15 Strike Eagle was never about anything but military dominance, while Civilization was clearly trying to accomplish something more. It’s only once you start aiming for a universal, apolitical theme that you begin to be judged by that metric, and inevitably fall short of the ideal.

  All I can say is that our motives were sincere, and maybe these guys have a little too much time on their hands. I don’t deny that the earliest version of Civilization had a predominantly Western perspective—it was a time of pervasive Cold War rhetoric, which tended to oversimplify all narratives into good guys, bad guys, and unfortunately no one else. Americans in the early nineties were brand new to the concept of international diversity in games, and at least we can claim that we were at the forefront of a movement that still had a long way to go. We’ve achieved a much better balance of South American, Asian, and African cultures as the series has matured, with each game striving to be more inclusive than the last. We worked so hard at it, in fact, that we eventually encountered the opposite extreme: due to their taboos about photography and idols, the All Pueblo Council of Governors in New Mexico objected to the inclusion of the ancient Pueblo leader Popé in Civilization V. Fortunately, this was discovered during development, and we were happy to respect their wishes and replace him with Chief Pocatello of the Northern Shoshone tribe instead. We might have been occasionally ignorant of other perspectives, but we were never dismissive once they were brought to our attention.

  The accusation that we embrace a “progress” model of civilization is also a fair one, and realistically, that’s not going to change. Games must involve accomplishment. It’s certainly not the only way to look at the world, but it’s the only way that makes sense in the context of what we’re trying to create. Likewise, the revelation that our historical figures and events have been caricatured to some degree is not an earth-shattering one. All games are inherently reductive. But we strive to be reductive in a balanced and polite way, and always with the goal of improving the overall experience for the player. As Dr. Tonio Andrade of Emory University once put it, “History’s not just about the past. History’s about the present reflecting on the past.” He was talking with Dr. John Harney on the History Respawned podcast, where guests dissect the cultural and historical implications of various videogames. Regarding the latest incarnation of Civ, Dr. Andrade said, “There’s a bunch of assumptions in it that maybe aren’t entirely realistic, but that’s exactly the point. As historians, no matter how many texts we look at, how careful we are, we’re still making models and assumptions . . . and this is just a sort of tangible and fun model.”

  In our line of work, everything must be in service to fun, and it happens that learning history often is fun. But sometimes, it’s also super depressing. We have to offer a moral clarity to our players and eliminate the painful quandaries, because unlike in other forms of storytelling, the player is personally standing in for our main character. Their ego is on the line, and we have to be gentle with it. Our version of Genghis Khan doesn’t beg for his life when he’s near defeat, because that puts the player in the uncomfortable position of questioning whether winning is worth it—which is effectively the same as asking whether the game itself is worth it. What we offer instead is the ability to play as Genghis Khan yourself the next time around. An engaging comparison of two positive yet opposing experiences is always going to be more effective than shaming the player until they walk away entirely.

  Generally speaking, though, I don’t mind philosophical hairsplitting and constructive feedback. Our critics have helped us find some legitimate blind spots, and the end result has been a better game. Even when they’re totally wrong, that’s good too, because it reminds us that we can’t make everyone happy all the time, and we have to answer to our own conscience above all. Not everybody appreciated the presence of global warming in the original Civilization, for example, and one early reviewer called our implementation of women’s suffrage “another brick in the wall of political correctness.” So I can confidently say that, at least on occasion, we’re only unpopular because we’re ahead of the curve.

  I’d even venture to claim that the whole conversation is a creature of our own making. The earliest academic commentary on videogames was sparse, and intellectually removed from the people who actually played the games. Nearly every discussion made some kind of reference to age: in 1997, one author coined the term “screenager” to describe our audience, while in 2002, an anthropologist scorned the (supposedly token) inclusion of nonmilitary victories in Civ by comparing them to “five members of a boy band” for naïve young girls to swoon over. Whether the industry’s youthfulness was seen as a flaw or a source of hope, the implication of immaturity was always there.

  What none of the critics seemed to realize was that teens were our least-established demographic. Gaming had begun as an adult nerd activity, with no connection to children at all. When I brought Hostage Rescue home with me to Michigan in 1980, my mother was the only one who tried to defeat the Ayatollah. My siblings Vicky and Bruce were about ten and eight years old at the time—what we would consider prime videogame age today—yet it didn’t occur to anyone to call them over. Computers, and therefore any activity that happened to take place on them, were for grownups.

  But by 1994, Disney had entered the market, and it was perfectly normal for four-year-old Ryan to be sitting on my lap playing Dick Tracy: The Crime-Solving Adventure. The Entertainment Software Rating Board was formed that same year, partly because of parents’ false assumption that all games were meant for young children. It wasn’t a complete demographic takeover, but in the ESRB’s inaugural round of ratings, games marked for “Early Childhood” and “Everyone” outnumbered “Teen” and “Mature” by roughly two to one. This ratio held steady until the year 2000, just as the generation that had grown up with games began to move away from home. The rebalancing was swift, and by 2003, kids’ games had lost their lead entirely.

  Since then, the split has remained roughly even, as it is for movies and books. But our new, late-teenage cohort didn’t just wander off. They kept playing games through college, and then during master’s degree programs, until finally, around 2010, the first lifelong gamers started earning PhDs—right about the same time that nuanced academic debate on the societal effects of gaming (and, yes, the specific ways in which we could do better) really exploded into the mainstream.

  Scholars talk about us, and critique us, because they know us. Gamers didn’t magically gain credibility with academics; they grew up and became academics. We created our own watchdogs, and when they complain, I know it’s only evidence of how much they care.

  A few years back, we held our own little gaming convention in Baltimore called Firaxicon, and I was truly unprepared for the number of parents who brought their children. These adults weren’t acting as reluctant chaperones, but as native guides. Mothers and sons, fathers and daughters, even a few grandparents and grandchildren—all wanted to express their love of games by passing on the traditions. They weren’t embarrassed, but deeply proud. What’s more, they were living proof that gamers aren’t just shut-in teenagers. They have careers, and relationships, and families. There is life after Civ! It was enough to bring a tear to your eye.

  These days, signs of our legitimacy can be found everywhere. The musical theme to Civilization IV, “Baba Yetu,” won a Grammy, and a concert series called Video Games Live currently travels the world playing fully orchestrated versions of game music. Their opening night sold 11,000 tickets, and they’ve played over 400 concerts since. I once received a call from the Wall Street Journal wanting to know how we so perfectly captured the essence of tax policy, and which parts of Adam Smith’s economic theories we found most relevant. (The answer was none in particular, because I’d never read his works, and I didn’t think Civ’s tax system was nearly as profound as the reporter was making it out to be.) In 2016, an article appeared on the AARP website extolling the virtues of gaming for senior citizens. And though there are still professors who dislike our simplification of history, they’ve been balanced by a not-insignificant number who assign Civilization to their students for academic purposes. Our game is an official part of the curriculum at universities in Wisconsin, Pennsylvania, Kentucky, Oregon, Massachusetts, Colorado, Georgia, and more.

  We’re in high schools, too. Back in 2007, a Canadian company made a Civ III mod called HistoriCanada, which included extra Civilopedia entries, accurate maps, and aboriginal art and music. It was distributed for free to 20,000 schools and another 80,000 individual students, to help them experience the birth of their country firsthand.

  Though the educational overlap is entirely logical, I’ve always been uncomfortable with the label “educational software,” preferring the word “learning” myself. Education is somebody else telling you what to think, while learning is opening yourself to new possibilities, and grasping a concept because you understand it on a personal level. To chastise us for our lack of historical accuracy is fair in the educational sense, but misses the point entirely when it comes to learning. Are Aesop’s fables meaningless because real mice can’t talk? What we encourage is knowledge-seeking in itself, and ownership of one’s beliefs. We want you to understand that choices have consequences, that a country’s fate can turn on a single act of diplomacy, and that historical figures were not black-and-white paragons of good and evil—not because we’ve told you, but because you’ve faced those complex dilemmas for yourself.

  When games are done right, players don’t even realize they’re learning. Of course one could also argue that when teaching is done right, students don’t realize how much fun they’re having, either. As Marshall McLuhan famously quipped, “Anyone who tries to make a distinction between education and entertainment doesn’t know the first thing about either.” But technology gives us an undeniable advantage over traditional teaching methods, because we’re able to reach more students, and offer a broader range of topics, than could ever be contained in a single class.

  One married couple reported that their monthly finances had been brought under control with a household budget based on Civ’s economic system. A professor at the University of Colorado praised Railroad Tycoon for teaching him, back in third grade, about the pitfalls of debt and bankruptcy, and more than one Pirates! fan has told me they aced a geography test thanks to their encyclopedic knowledge of Caribbean coastal towns (though I imagine their teachers might have been less thrilled to hear which towns were easiest to sack and loot). A journalist for the website Kotaku credited gaming for his precocious vocabulary as a child, including words like “ziggurat,” “aileron,” “épée,” and “polytheism,” and his readers chimed in with dozens more. My own son learned how to read almost entirely through computer game hint books.

  None of which is to say that our games are designed for children—nor are they not designed for children. Our belief is that a really good game covers all the bases. Bruce Shelley used to joke that we do our research in the children’s section of the library, and it’s not entirely a metaphor. Kids’ books skipped the details, and got right down to the important themes. Their simple illustrations usually translated well to the limits of graphics cards at the time, and the information inside was a solid baseline for what our players would already know coming in. We could layer our own fantasy, humor, and drama on top of it, while remaining confident that everything underneath would resonate with that foundation of joy that adults tended to forget was inside themselves. Certainly the world is more complicated as an adult, but children aren’t dumb, and if the fundamental essence of an idea isn’t enough to capture the interest of a child, then I would argue that it’s not really as interesting as you think it is.

  22

  FUZZY MATH

  Sid Meier’s Civilization Revolution (2008)

  MY OWN FIRST EXPOSURE TO videogames was, as it was for most people my age, the venerable black-and-white tennis match known as Pong. There was a small restaurant down the street from General Instrument where some of us would hang out and have dinner after work, and at some point they installed this weird little table in the lounge with a television screen facing upward underneath the plexiglass surface. The idea was that you could set your drinks and bar snacks on it while you played, but it seemed irreverent to eat on the surface of a TV, so most evenings we would just wander over to play a few rounds before returning to our normal, wooden tables. The most memorable thing about it was that one side of the cabinet had somehow ended up wired backwards, sending the little white line to the left side of the screen when the player turned the knob to the right. So we always agreed that whoever was more skilled had to sit on the broken side to compensate—perhaps my earliest experience in balancing gameplay.

  Rotating dial controls were sometimes called “spinners” in arcade hardware terminology, and truly inveterate nerds recognized them as either potentiometers or rheostats, depending on their function. But to the general public, they were incongruously known as “paddles,” due to their original table tennis associations. A year after Pong’s release, the first four-way gaming joystick—a word which, oddly enough, had its roots in early airplane controls—made its debut in the arcade game Astro Race. It caught on quickly, and by 1977, the Atari 2600 home console offered a standardized plug that could support a potentially limitless number of third-party controllers, in addition to the five different styles produced by Atari themselves.

  The market responded. A 1983 issue of Creative Computing magazine included a 15,000-word hardware review comparing sixteen different joystick brands and eight unique paddle sets, plus eight converters for the less-common plugs those accessories might be required to fit. Some of the products were surprisingly forward-thinking, like Datasoft’s “Le Stick,” which detected motion through a set of liquid mercury switches that triggered whenever the freestanding cylinder was tilted more than twenty degrees in any direction. It’s easy to see why it didn’t last, but toxic metals aside, Datasoft deserves credit for predating the motion sensor craze by a quarter of a century.
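  To make that mechanism concrete, here is a hypothetical sketch of how a tilt-activated stick along those lines might translate its mercury switches into four-way input. The twenty-degree threshold comes straight from the description above; the axis names and the tie-breaking logic are my own illustrative guesses, not Datasoft’s actual circuitry:

```python
TILT_THRESHOLD_DEG = 20.0  # the trigger angle cited in the review

def le_stick_direction(pitch_deg, roll_deg):
    """Map tilt angles (pitch = forward/back, roll = left/right)
    to a four-way joystick direction, or None when held upright."""
    if abs(pitch_deg) < TILT_THRESHOLD_DEG and abs(roll_deg) < TILT_THRESHOLD_DEG:
        return None  # all mercury switches open: no input
    if abs(pitch_deg) >= abs(roll_deg):  # let the dominant axis win
        return "UP" if pitch_deg > 0 else "DOWN"
    return "RIGHT" if roll_deg > 0 else "LEFT"

print(le_stick_direction(5, -8))     # None (inside the dead zone)
print(le_stick_direction(35, 10))    # UP
print(le_stick_direction(-12, -28))  # LEFT
```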

  Soon, however, the third-party manufacturers fell away, and an evolutionary split emerged. On one side, the traditional knobs, buttons, and joysticks of arcade cabinets consolidated into a single proprietary controller for each console system. On the other, the personal computer industry began to drift toward more established business peripherals, namely the mouse and alphanumeric keyboard. Major gaming companies tried to straddle the gap for as long as they could, but in late 1983, the North American console market crashed, with previous annual revenues of $3.2 billion plunging to just $100 million by 1985. The drop was so devastating to Atari in particular that the whole event was simply known as “Atari shock” in Japan. For various reasons, the Japanese market remained stable, and with every console company in America either bankrupt or pivoting sharply to the PC, Japan emerged as the home console champion for the next twenty years.

 
