A Nice Little Place on the North Side: Wrigley Field at One Hundred
Do you see where this is heading?
Cubs attendance “is the least sensitive to performance in all of baseball.” The Yankees’ attendance sensitivity is 0.9, “meaning that attendance moves almost one for one with winning percentage.” Red Sox attendance sensitivity also is 0.9, which must have something to do with why these two teams have done so much winning. The sensitivity of Cubs attendance per game to winning percentage is only 0.6, much less than one. “The league average is one,” write Moskowitz and Wertheim. So “the Cubs are America’s Teflon team,” which must have something to do with why they have done so much losing.
Comparing Cubs and White Sox season attendance numbers from 1998 through 2009, Moskowitz and Wertheim found that Cubs attendance varied between 82 percent and 99 percent of Wrigley Field’s capacity. White Sox attendance was 37 percent of their home field’s capacity in 1999 and 90 percent of capacity in 2006, the year after the team won the World Series. In 2006, the Cubs finished last and averaged 94 percent of Wrigley Field’s capacity. And the 165,801 more fans the Cubs drew into Wrigley than the White Sox attracted to U.S. Cellular Field do not include the many thousands of fans who purchased seats on the rooftops across from Wrigley on Waveland and Sheffield Avenues.
In the 2013 season, the one hundredth played at Wrigley Field, the home team did something never done in any of the previous ninety-nine: It lost fifty home games. The previous record was forty-nine, set in 1971. In 2013, Cubs attendance declined, as it has done in every season since 2008, when the Cubs won the National League Central and drew a franchise-record 3.3 million. But in 2013, even while setting a record for Friendly Confines futility, the team drew about seven hundred more fans per game than did the 1998 team that played in the postseason as winner of the National League wild card.
You can say this for Cub fans: They are not front-runners. You can say this against them: They are incorrigible. In 2002, the Cubs lost 13 percent more games than in 2001, but attendance increased 1 percent. Nothing new there. In 1999, the Cubs lost 14 percent more games than in 1998—and attendance increased 7 percent. Attendance rates at Wrigley are, write Moskowitz and Wertheim, “as steady as a surgeon’s hands.” But what is healthy in surgery is unhealthy in baseball.
In baseball, the difference between excellence and mediocrity is usually not the blockbuster signing of this or that free agent. Rather, it is the cumulative effect of management’s attention to scouting, player development, and so on—which requires time, effort, and, always, money. Because Cub fans fill so many seats no matter what is happening on the field, there is a reduced incentive to pay the expense of organizational excellence.
It was in 1932 that Harper's Magazine quoted P. K. Wrigley on giving the fans a reason other than good baseball for going to the ballpark: “The fun … the sunshine, the relaxation. Our idea is to get the public to go see a ball game, win or lose.” Again, there is a lot of losing in baseball, even for the best teams. If you can’t bear losing, find another sport. And if you do not much mind losing, or if you actually rather enjoy it, you should feel right at home in Wrigley Field. With an acerbic terseness perhaps born of frustration, Moskowitz and Wertheim say, “There is equity in futility.” That is, the Cubs may have had a perverse financial incentive to cultivate the image of “lovable losers.” And speaking of incentives and, as any baseball person must, of beer, they also say, “Attendance at Wrigley Field is actually more sensitive to beer prices—much more—than it is to the Cubs’ winning percentage.”
Moskowitz and Wertheim studied Wrigley Field beer prices, adjusted for inflation, between 1984 and 2009 and concluded that attendance was four times more sensitive to beer prices than to the team’s won-lost record. They do not make clear exactly how they come to that conclusion, but they do offer this tantalizing data: Over the two decades beginning in 1990, while the Cubs were compiling a .486 winning percentage, the team’s management increased ticket prices 67 percent, far above the MLB average of 44.7 percent. By 2009, Cubs ticket prices (average: $48) were the third highest in all of baseball, behind only those of the Red Sox, in the smallest major league park, Fenway ($50), and the Yankees, in the ostentatious new Yankee Stadium ($73). Demand for Cubs tickets remained remarkably inelastic. But the team knew better than to tamper with beer prices, which remained the third lowest in the major leagues. “Only the small-market Pittsburgh Pirates (at $4.75 a beer) and the medium-market Arizona Diamondbacks (at $4.00) had cheaper beer—and their average ticket prices were $15.39 and $14.31, respectively.” “Cub fans,” Moskowitz and Wertheim conclude, “will tolerate bad baseball and high ticket prices but draw the line at bad baseball and expensive beer.”
If you stand in the middle of the intersection of Clark and Addison Streets and slowly turn in a circle, your gaze will fall on a lot of places, including the ballpark, for drinking beer. Of course, Wrigley Field, unlike the various bars and restaurants and rooftops, is not for drinking beer. And yet …
If you believe, as baseball fans are inclined to, that the point of the Big Bang was to set in motion the process—the universe, et cetera—that led to baseball, you should believe that beer was part of the Plan from the start. Beer, it seems, has been crucial to the flourishing of civilization, and the connection between beer and baseball, two of civilization’s better products, has been close and longstanding. Indeed, a case can be made that civilization is a result of, and flourished because of, beer.
For three million years, give or take a bunch, human beings went about the business of evolving from lower primates, and they did so without the assistance and comfort of alcohol. About one hundred thousand years ago they were more or less recognizably human, but they had not yet developed agriculture, so they had to keep moving around to find food. Then, according to the Discovery Channel program How Beer Saved the World, they began—by a happy accident, and even before mankind started baking bread—brewing beer. Humans were nomads, hunter-gatherers who occasionally gathered barley that was growing in the wild. One day, when some of these people were off on extended hunting-and-gathering treks, rain fell on barley they had stored in clay pots. The rain made the barley soggy, which was bad. But it also, with the help of natural sugars and other ingredients in the grain, and also airborne yeasts, started the process of fermentation, which was very good indeed. Eventually, homeward the hunters and gatherers made their weary way, and, being understandably thirsty from their exertions, they took a sip of the resulting fluid in the jars. Thus did humanity’s era of sobriety come to an end.
One sip led to others, and to the desire for more beer, which required more barley, which required systematic agriculture. So humanity vowed to put aside its nomadic ways, to develop the plow, and irrigation, and the wheel, so there could be carts to get surplus barley to markets. While they were at it, they developed writing to record commercial transactions, mathematics to make possible land sales and business computations, and, eventually, the U.S. Department of Agriculture and farm subsidies—all to keep the beer flowing.
An inscription on an ancient Egyptian tomb says that one thousand jugs of beer is about the right provision for the afterlife. The toiling masses who built the pyramids were paid in beer chits—a sort of early version of debit cards—and drank about a gallon of beer a day. It was what moderns disdainfully call near beer, only 3 percent alcohol. But it was nutritious enough to enable the toilers to pile up all that stone.
Beer was not only a staple food and a kind of currency, it was also a medicine. Traces of the antibiotic tetracycline, which was invented (or so we thought) in 1952, were found by puzzled archaeologists in the mummified bones of ancient Egyptians. Long before Alexander Fleming won the 1945 Nobel Prize for medicine as a result of his contribution to the development of penicillin, tetracycline was a health-enhancing residue of beer.
Beer also rescued the Middle Ages from a scourge and killer: water. Living centuries before the discovery of the germ theory of disease, people drank pond water fouled by human sewage, defecating ducks, waste from tanneries, butchers’ offal, and other insalubrious ingredients. Brewing, however, removed many of the microorganisms that made people sick. It was, therefore, probably good that people then drank three hundred quarts of beer a year, which is six times today’s consumption by American adults.
“The search for unpolluted drinking water is as old as civilization itself.” So wrote Steven Johnson in The Ghost Map: The Story of London’s Most Terrifying Epidemic—and How It Changed Science, Cities, and the Modern World. In his account of the cholera epidemic of 1854, Johnson explains that the beginning of civilization occurred with the formation of settled communities, and mass settlements also brought the beginnings of waterborne diseases, often from the settlements’ human wastes, especially feces. “For much of human history,” writes Johnson, “the solution to the chronic public-health issue was not purifying the water supply. The solution was to drink alcohol.”
Alcohol has antibacterial properties that in early human settlements were more beneficial than the risks of alcohol were baneful. As Johnson says, “Dying of cirrhosis of the liver in your forties was better than dying of dysentery in your twenties.” Alcohol is addictive and, consumed immoderately, is a potentially lethal poison. People who drink lots of it—who can “hold their liquor,” as the saying goes—are apt to be those whose bodies, thanks to some genes on chromosome 4 in human DNA, are especially able to produce particular enzymes. Among the early agrarians who abandoned the hunter-gatherer lifestyle and dwelled together in settlements, those who lacked this genetic advantage were doomed by a Darwinian selection that favored those who could drink more. Here is Johnson on those who died young and childless, either from alcohol’s ravages or from waterborne diseases:
Over generations, the gene pool of the first farmers became increasingly dominated by individuals who could drink beer on a regular basis. Most of the world’s population today is made up of descendants of those early beer drinkers, and we have largely inherited their genetic tolerance for alcohol.
Well. If beer is, strictly speaking, a health food, then Wrigley Field, which has been called the world’s largest outdoor singles bar, is actually also a health club, of sorts.
It has been accurately said that the United States is the only nation founded on a good idea: the pursuit of happiness, of which baseball is an important ingredient. But there also is a sense in which America was founded on beer. Within two years of the 1607 founding of Jamestown, Virginia, leaders of that settlement wrote to London, asking that a brewer be sent out because “water drinkers”—the phrase drips disdain—were no basis for a colony. Which was true, but for reasons they could not have then known, not understanding about germs. The Mayflower put passengers ashore in what would become Massachusetts, although its captain had been searching for a landing much farther south; the problem, according to William Bradford’s journals, lay in “our victuals being much spent, especially beer.” While Thomas Jefferson was brewing beer at Monticello, his boon companion James Madison diluted his limited government convictions enough to consider advocating the establishment of a national brewery to provide a wholesome alternative to whiskey. It almost seems that Manifest Destiny pointed toward Wrigleyville.
George Washington, Sam Adams, and Thomas Jefferson were among our Founding Brewers, and beer was integral to the Internet of colonial America—the communications network of taverns, such as the Green Dragon in Boston, where, on the evening of December 16, 1773, some patriots decided to go down to the docks and toss chests of tea into the harbor. When the nation was born, it needed a national anthem, and it found one by giving new words to what had been a drinking song that sometimes served as a sobriety test: If you could sing it, you could have another tankard of beer. In the 1860s, beer, not milk, became the first beverage to be pasteurized. The reason beer could spoil was that it was alive. It contained a hitherto unknown life-form, bacteria, which could make beer sick—and people, too. Hence the cornerstone of modern medicine, the previously mentioned germ theory.
What America needed was not just better medicine but more fun. Fun-loving Benjamin Franklin had understood this when he’d said, “Beer is living proof that God loves us and wants us to be happy.” It was not, however, until the nineteenth century, when German immigrants began arriving in large numbers, that America had a cohort that took fun seriously. The German immigrants were astonished, and not happily, to find that there was no beer culture and, not coincidentally, no culture of pleasure. The German Americans set about rectifying this defect in the Republic by creating beer gardens where people could play cards to the accompaniment of music.
Most beer was drunk in taverns, some of which had basements where ice cut from northern lakes kept the beer cool in the summer. But ice was not always cheap or plentiful, either because of warm winters or because, during part of the nineteenth century, ice was, by weight, America’s biggest export, sent as far as India and China. And American beer that was taken home from taverns in pails would last only a day before spoiling because of the absence of refrigeration.
It was the American preference for one particular form of beer—lager, which has to be brewed slowly and at cold temperatures—that led brewers to drive the development of refrigeration, which made possible a constant supply of beer year-round. It also solved mankind’s problem of food storage and made Las Vegas possible.
Before the mechanics of refrigeration and the technique of pasteurization arrived from Europe, beer had been brewed in batches of seven to ten barrels a day. Now, suddenly, there were the technological prerequisites for the emergence of beer barons. Emerge they did, and some of their German names—such as Adolphus Busch, Gottlieb Heileman, Frederick Pabst, Joseph Schlitz, and Bernhard Stroh—would eventually be emblazoned on the labels of billions of beer bottles and cans.
By the turn of the twentieth century, when the beer business was booming enough to finance advertising (“Budweiser gives punch to the lunch”), most beer was sold in “tied houses”—taverns tied to particular breweries. Soon there were many more taverns than could survive by simply selling beer. So they branched out, doing a brisk business in gambling and prostitution. This, in turn, fueled the Prohibition movement, which was so vividly embodied in Carry Nation, who, as Daniel Okrent writes in Last Call: The Rise and Fall of Prohibition, was “six feet tall, with the biceps of a stevedore, the face of a prison warden, and the persistence of a toothache.” This hatchet-wielding scourge of taverns used a vigorous persuasion technique called “hatchetization.”
On the defensive, brewers argued in vain that distilled spirits, not beer, were the real alcohol problem. Perhaps that was so, but during Prohibition, which arrived in January 1920, spirits fared better than beer because beer is bulky and therefore difficult to smuggle into the country or on your person. As support grew for repealing Prohibition, the Woman’s Christian Temperance Union warned that “no nation ever drank itself out of a depression.” To which the nation responded: Maybe not, but drink might make the Depression more endurable. Prohibition ended in 1933, but serious damage had been done: America had lost much of its taste for beer. Beer consumption did not reach pre-Prohibition quantities until the 1970s. Which was certainly not Wrigley Field’s fault.
In the second half of the nineteenth century, baseball did much to help the nation shed some of its Puritan earnestness and learn to play, or to relax by watching others play. In the second half of the twentieth century, baseball helped the nation reacquire its thirst for beer. In 1950, Heileman’s Old Style lager became not just the only beer then sold at Wrigley Field but the official beer of the place. Although Heileman is now owned by Pabst—these familiar names do endure—Heileman’s association with the Cubs is older than Anheuser-Busch’s with the St. Louis Cardinals, who today play in their third ballpark to bear the Busch name. Wrigley Field, like all ballparks but more than some, performs a function that taverns used to perform: It brings people out of their homes and together for a social drink. Home refrigerators helped prompt the shift of beer drinking from taverns to homes, and by the time Prohibition ended, one-third of all beer sold was not from a tap but in a bottle. By 1940, half was. By 1960, 80 percent was sold in bottles or cans. Today, millions of bottles and cans are emptied in the North Side tavern that also is a ballpark.
To the delight of fans who work while the sun shines, night games came to Major League Baseball on May 24, 1935, in Cincinnati’s Crosley Field. This was the handiwork of Larry MacPhail, grandfather of Andy MacPhail, who would be president of the Cubs through twelve seasons, 1994–2006. By the 1938 season, two of the fourteen ballparks (the St. Louis Cardinals and Browns shared Sportsman’s Park, and the Philadelphia Phillies and Athletics shared Shibe Park) had lights. If the Cubs had had lights that year, the most dramatic home run ever hit by the home team would have been drained of much of its drama.
On September 28, 1938, the Cubs were half a game behind the first-place Pirates as the two teams continued a three-game series at Wrigley Field. Late in the afternoon of an overcast day, the game was tied 5–5 as the Cubs came to bat in the bottom of the ninth, and the home plate umpire announced that the game would be called at the end of the inning if the Cubs did not score. At 5:37 P.M., with two outs, no runners on base, and no one able to see much of anything, Gabby Hartnett hit a two-strike pitch for what would ever after be known as the “Homer in the Gloamin’.” The next day the Cubs pummeled the Pirates 10–1, earning the right to be trounced by the Yankees in a four-game World Series in which they were outscored 22 to 9.
In 1936, when the two leagues had separate governance under their own presidents, the American League gave its members permission to have night games. Until 1942, each American League team was allowed to have only seven a season. This restriction was relaxed in 1942 to accommodate people working long hours in war industries. By 1948, when lights were installed in Briggs Stadium in Detroit, only Wrigley Field was without them. Soon the absence of lights became a symbol of a superior sensibility to some baseball “purists.” Never mind that P. K. Wrigley had bought the materials for installing lights after the 1941 season but then had donated the steel to the war effort.