Cubed

by Nikil Saval


  Discomfort and resistance soon followed. Sociological studies confirmed outright racism from white managers. “I keep hearing the old clichés like, ‘We don’t see why you niggers want so much’ and ‘You blacks are getting all the breaks,’ ” one black manager at a large manufacturing firm said. Another at the same firm mentioned a company-wide memo from a manager which said that “he didn’t want any more blacks in his department because they were lazy and didn’t work.”9 But more often it seemed that office workers expressed a low-level coded or euphemistic fear of environmental change—one that stemmed directly from the old pressure toward social cohesion and conformity for which the office environment was renowned. Black workers, and many people of color more generally, often found themselves coming up against vague currents of prejudice; as a result, they would experience even more acute feelings of paranoia and dread than office life already tended to encourage. When working lunches or casual get-togethers took place, they would routinely not be asked to join. One black woman reported that her peers “never invite me to informal discussions, meetings, and luncheons, and many times they discuss issues related to my job.”10 “I saw younger whites come into the organization and other whites take them aside and tell them things, share information,” one black manager said, recalling his first days in an organization. “I saw the new whites interacting with other people I didn’t even know. That wasn’t done with me or for me. Now I’m feeling really strange.” But because of his isolation, he wasn’t immediately ready to see these actions for what they were: “At first—not for a long time, really—I didn’t attribute my weird feelings to racism. I thought it was me! I thought I was doing something wrong.”11 Only when that same manager began to mention his concerns to other black workers at the company did he get confirmation of the same kinds of biases: “Glory! Man, I felt … good! I was relieved. It wasn’t just me!”12

  It wasn’t just me: though the specific feeling of being a black manager in a white corporation led to a very particular kind of isolation, the feeling of being alone and responsible for one’s fate was increasingly common in the office environment in general. The willingness to blame oneself for failures or mistakes was a powerful temptation. Partly this stemmed from the hopeful prognostications about knowledge work and the increasing importance of education. Boosters throughout the 1960s and 1970s, such as Drucker and the economist Gary Becker (popularizer of the term “human capital”), held that education naturally led to employment and that a more deeply educated populace would lead to better and more thoughtful jobs.

  But with office work, this proved not only untrue but spectacularly so. Not only had white-collar work changed little, but much of it had arguably gotten less demanding and more rationalized. Yet getting a job in an office increasingly required more education than the work itself strictly demanded. In fact, the growth in white-collar employment by 1970 had been on the lower rungs of the ladder rather than in the “knowledge work” prophesied by so many. And in various measurements of white-collar work—such as one study of 125 branch offices of a New York bank—the level of education and the level of performance of these workers were inversely related.13

  Offices were filling up with overeducated workers whose expectations were gradually running up against their actual possibilities for advancement. And the “human relations” paradigm was losing its ability to soothe them. Education had the effect of creating a certain aura around work in an office that the office usually failed to live up to. Yet rather than forming a frustrated “white-collar proletariat” and demanding changes, workers tended to blame themselves in the same breath that they (borrowing some language from the student movement) blamed “the system.” In one interview, a manager at a multinational, identified as Howard Carver, decried the “aridity, the petty politicking and the scary power scramble” of his world. “The fact is, the company, the bureaucracy, can only use a small part of a man’s capabilities and yet it demands so much in time, in loyalty, in petty politics, in stupidities … the ratio of morons, time-wasters, timeservers, petty politicians and scared rabbits is so high here that it’s discouraging, and I’m beginning to feel only little bully boys and smart operators can claw their way through middle management.” Yet the same man blamed the failure of his career, its dead-ending in the very middle of middle management, on a small social mistake: he had wondered aloud, in the presence of a vice president, when the CEO of his company would finally retire. It turned out the vice president was secretly the right-hand man of the CEO, his eyes and ears for the rest of the company. “That’s how I blew it,” he said. “It’s been ten years since that day and while I’ve gotten on the escalator and gone up routinely, all the good posts have eluded me.” It was a minor moment of office politics, and because he knew the game, he believed he had failed. Yet his final description of the corporate world is bleak—as grim as any of the factory worker tales that littered Studs Terkel’s Working and far from the stirring tales of corporate power at mid-century:

  I’ve looked around … and I can see, not malevolence, not conspiracy, not a sinister force operating in the world for its own hidden purposes or trying to bend the public to its will nor, on the bright side, as so many would have you believe, a hard-working fulfilling existence, a set of challenges and excitements that can command the best of truly good men, but a trivialization of human effort and aspiration in a chaotic and mindless drudgery, a sidetracking of valuable human resources, and, for most truly intelligent men who have so much to offer, as I once did, in the end a career in a wasteland.14

  On the outside, the offices were changing as well, refracting in a peculiar way the ill will inside them. Until the 1970s, some brand of architectural “modernism”—usually meaning the International Style—had prevailed without question, almost monolithically consistent across the country. Developers, planners, and politicians supported modernism, at once inflexible as a form and adaptable for multipurpose use, amenable to both government and corporations, an all-purpose glassy, boxy style—or occasionally a “brutalist” concrete—for any kind of building. The building as glass envelope had gone essentially unquestioned: despite the rise in space planning, few architects devoted much thought to the interiors of these buildings.

  At the same time, despite two oil shocks, stagflation, and, in 1981, a spike in real interest rates, knocking the jobless rate up to 10 percent and precipitating what was then the worst economic crisis since the Great Depression, the blast radius of the speculative boom in office development, detonated in the 1960s, expanded implacably through the 1970s and 1980s. Nothing could stop the office space binge. Fifty-four million square feet of office space were added in New York in the 1970s, forty-six million in the 1980s. The two main towers of the World Trade Center, the tallest buildings on the planet, steadily grew to cast a long shadow on a city that, in the mid-1970s, was on the verge of bankruptcy. Designed by Minoru Yamasaki, one of the premier architects of modernism, their unbroken upward sweep of tiny windows and neo-Gothic tracery was then perceived by many critics as a bizarre and anonymous—not to say hubristic—testament to New York’s new emphasis on the banking and finance sectors of its economy. Lewis Mumford described their enormousness as “purposeless gigantism and technological exhibitionism.” Meanwhile, the critic Charles Jencks saw specters of fascism:

  The effect of extreme repetition may be monotony or a hypnotic trance: positively it can elicit feelings of the sublime and the inevitable because it so incessantly returns to the same theme. A musical figure, repeated at length, such as that in Bolero, acts not just as a form of mental torture but as a pacifier. Repetitive architecture can put you to sleep. Both Mussolini and Hitler used it as a form of thought control knowing that before people can be coerced they first have to be hypnotized and bored.15

  When they opened in 1973, their ten million square feet of space wasn’t filled and wouldn’t be for several years. In 1977, the city suffered a catastrophic blackout; the towers loomed over the city, lightless, a symbol more forbidding than anything dreamed up by Kubrick in 2001.

  Downtown business districts all over the country added tower after tower in a desperate attempt to refill empty city coffers with a new corporate tax base. San Francisco, which had been a dense but low-rise city for decades, began to capitalize on the growing computer industry to its south and its connection with the Pacific Rim economies to its west, led by Japan. It added an exorbitant amount of office space in just a few years, prompting a movement by many residents against what they saw as a catastrophic “Manhattanization.” By 1981, the city had nearly quadrupled its annual increase in office space, to 2,156,500 square feet a year from an average of 573,000 in 1964. Only Boston had a higher proportion of office space to population. Boston itself crowned its skyline with I. M. Pei’s glassier-than-thou John Hancock Tower in 1976. Soon after it was built, it became uncomfortably clear that it was structurally unsound, unprepared for the city’s chill blast of high-speed winter winds. Like something out of a disaster movie, large panes from the building’s curtain wall began to loosen, dislodge, and careen to the ground, littering the downtown pavement with shards of glass. In any case, the slow return of corporate headquarters to metropolitan cores hadn’t stopped the flight of back-office operations out to the suburbs (possibly fleeing all the falling glass in the cities). A pastoralia of office parks was springing up in commuter corridors with delirious abandon: the Raleigh–Durham–Chapel Hill Triangle; the Boston Tech Corridor; Silicon Valley; and the D.C.-centered swath of the Northern Virginia suburbs.

  Even as the amount of office space grew, the way it was housed was starting to signal a change in the way people were thinking about architecture—with implications for how people should work as well. All precepts of modernism that had attempted to legislate how people should live, move, and labor were coming under attack. In 1961, Jane Jacobs had delivered the monumental treatise The Death and Life of Great American Cities, a devastating attack on the effect of architectural modernism on the American city. Ostensibly a brief against the planning mistakes and social costs of urban “renewal” programs, Death and Life also lodged an aesthetic critique of the way modernism had insisted on imposing superblocks—housing and office towers—against the natural, spontaneous, and time-honored order of street life. Where people like her political opponent Robert Moses had set up cities to be amenable to cars, the city of Jacobs’s imagination was rooted in tightly knit communities that depended on pedestrian life. It was a version of cities heavily tilted in favor of face-to-face interaction, small public spaces, and low-rise density over high-rise grandiosity. One could hear echoes of the critique in Robert Propst’s own thinking about “fortuitous encounters” in the office and his emphasis on flexible, “forgivable” design—design that catered to human needs, that didn’t destroy already existing cultures.

  Without quite intending it, Jacobs’s critique might have become one of the first foundation stones in the giddy edifice housing all the artistic movements gathered up under the term “postmodernism.” It had its first strong proponents in the field of architecture. In the hands of architects and critics like Charles Jencks and Robert Venturi, modernism—especially in the figure of Le Corbusier—came under attack for its blind utopianism, its willingness to ignore context and scale and landscape in favor of large-scale projects of social reengineering. Jencks cited the infamous failure of a public housing project, the Pruitt-Igoe homes in St. Louis, as the death knell of modernism. “Modern Architecture died in St. Louis on July 15, 1972, at 3:32 p.m. (or thereabouts),” he wrote in The Language of Post-modern Architecture, “when the infamous Pruitt-Igoe scheme, or rather several of its slab blocks, were given the final coup de grâce by dynamite.”16 Jencks noted that Pruitt-Igoe had been sanctified by the Le Corbusier acolyte organization, the International Congress of Modern Architecture, and had won an award from the American Institute of Architects when the buildings were inaugurated in 1951. Consisting of fourteen-story blocks spaced apart by swaths of greenery, it was perfect Corbusianism. But though hospitable to rational ideals, it was seen by many as hostile to human needs. Over time the buildings began to fall apart, and crime within them began to rise. Though the reasons for this were complex—largely deriving from the disappearance of St. Louis’s manufacturing jobs—the legend swiftly arose that the design itself had destroyed the project from within.

  Critics drew the necessary conclusion: that modernism was antihuman. And it was commercial architecture that revealed this best. For Jencks, modernism had ignored context to such an extent that it had essentially modeled every building as an office building. He claimed that no architect stopped to ask himself, “Are I-beams and plate glass appropriate to housing?” Nor, when they subsequently and deliberately confused the language of architecture for working and living, did they realize that “the net result would be to diminish and compromise both functions by equating them: working and living would become interchangeable on the most banal, literal level, and unarticulated on a higher, metaphorical plane. The psychic overtones to these two different activities would remain unexplored, accidental, truncated.”17

  Yet the answer that the postmodernists proposed wasn’t to separate work and life more, to pursue a deeper purism, but rather to confuse everything more vigorously and in a playful spirit. One of the founding documents of the movement, Venturi’s Learning from Las Vegas (co-authored with the architects Steven Izenour and Denise Scott Brown), said it all with its title. Rather than proclaiming the tragic purity of contemporary architecture, as the modernists did, architects would borrow from the historical landscape with a kind of studious abandon, combining in a single building neoclassical styles and neo-Gothic motifs (it was often the “revived” style, rather than the original, that attracted them). Venturi, Scott Brown, and Izenour also paid special attention to the vernacular or pop cultural landscape—kitsch hotels, classic American diners, even gas stations and hot dog stands—that had been constructed by developers or second-rate builders instead of star architects. The gaiety of this mélange was what attracted Venturi in particular, who had redescribed it, in an uncharacteristically haughty phrase, as “complexity and contradiction in architecture.” Although it attempted to look wild and populist, the air of superficiality was a kind of diligens negligentia, a cultivated negligence: the postmodernists’ thinking was in fact being processed with deep self-consciousness through a new brand of hypertheorization of the nature of building (often taking place in the heady pages of the journal Oppositions). Men trained as modernists (and they were virtually all men) began to make their names as postmodernists: Frank Gehry, Charles Moore, Robert A. M. Stern, Michael Graves, Peter Eisenman. Their houses and projects were deliberately eclectic, scaled down, making more than halfhearted genuflections to the context of their surrounding landscape. Their position of openness to bizarre forces moving in from the fringes paralleled the movement in office design toward letting users articulate their own spaces, such as Robert Propst’s hope that workers in the Action Office would decorate their walls to express their individuality.

  But while postmodernism was deployed in a smattering of houses, museums, and university buildings, it would take an office building—a symbol of corporate power—to cement its arrival as a cultural force. One figure emerged from the shadows to set the tone for postmodernist office buildings. It was Philip Johnson, erstwhile partner of Mies van der Rohe—the man who had brought the International Style to the United States. Now in his seventies, bald, ever more impish, wearing thick round glasses in the style of Le Corbusier, he had outlived the modernists he had supported, confronting a future of which he would be proclaimed the living master. Whatever fervors had led him to linger in the wilds of fascism or subsequently embrace the fastidious cool of modernism, he had shed. In his grand old age, he cultivated a charmingly weak attention span. “My direction is clear: traditionalism,” he said, seemingly stern in the old modernist, aphoristic way, before coyly and folksily describing his lax habit of combining various styles in his buildings: “I try to pick up what I like throughout history … I pick up anything, any old point in time or place.”18 Following his conviction that the truth was not found but made by personality, he had developed a conspicuous persona, one that could be easily talked about by carefully selected friends and reproduced in a fawning media. “Architecture in the main is something that is more apt to be run by popes, kings and generals than by public vote,” Johnson once said, explaining his way of working, “and so I got interested in getting things done in a grand way.”19 Johnson held court at the Four Seasons Restaurant in the Seagram Building that he had designed. Its theatrical lighting was perfect for highlighting the most powerful power lunch in the world of architecture. Commissions came to him there, though he also dispensed largesse among the younger architects who crowded around him. In a manner hitherto unknown, the figure of the revolutionary architect had become a celebrity.

  When the American Telephone and Telegraph Company—AT&T, popularly known as Ma Bell—asked him to design its new headquarters in New York, it had found the perfect choice. Johnson’s playful, witty approach to architecture’s past made him at once a paragon of the postmodernist revolt and a natural medium for corporations wanting to express their renewed power. Whatever corporate, futuristic ethos modernism had once expressed was now gone, dissipated amid a profusion of black boxes and gray civic centers that everyone now hated. Glass and concrete were the media of bureaucracy, of the creaking old American welfare state that, in the 1980s, a bold president would be given a tremendous mandate to dismantle.

  The design that Johnson (through the aegis of his firm, Johnson/Burgee) provided would be the most admired and reviled building of the new decade, a consummate symbol of a new corporate culture. Following the past masters of the skyscraper form, like Burnham and Sullivan, Johnson divided the building like a column, into base, shaft, and capital. AT&T’s original headquarters from 1922 was also a classic skyscraper in this way, lavishly outfitted in marble, bronze, and alabaster and peppered with a profusion of ornamental columns.20 Johnson therefore gestured back to the 1920s, when he chose to cover his steel frame not, as per usual, with a glass curtain wall but rather with acres of rosy-pink granite. Consisting of stacked panels running up to ten inches thick, the facade shone richly through the long summer months of Manhattan sunshine—and required six thousand more tons of steel than usual to support.21 Other aspects of the building were exaggerated as well. He elongated the base of the building vertically, making it into a large, yawning loggia; the lobby was perhaps the most lavish entrance to any office building in the city. This was a thirty-eight-story building that rose to the height of sixty.22 Rather than creating an open public space as he and Mies had earlier done with the Seagram Building, Johnson opened up a corridor framed with columns. Johnson would later explain that this space “was basically tailored to AT&T—it is an imperial space. AT&T didn’t want lingerie stores in the lobby. They said ‘Make it the front door into our empire. Let’s make it so you’ll be impressed when you go by.’ ”23 And, most controversially and impishly, Johnson split the usual flat pediment with an angled inverted arch, scooped out of the air, nicknamed Chippendale after the eighteenth-century English furniture maker who used that gesture as his signature. It instantly made the building the most recognizable and infamous new addition to the crowded Manhattan skyline and earned it the nickname Chippendale Building. An aging George Nelson commented favorably, arguing that it was “high time things got dinguses on top.”24 Acerbic Village Voice critic Michael Sorkin had more choice words. “Not to put too fine a point on it, the building sucks,” he wrote in his review. “The so-called ‘post-modern’ styling in which AT&T has been tarted up is simply a graceless attempt to disguise what is really just the same old building by cloaking it in this week’s drag and by trying to hide behind the reputations of the blameless dead.”25

 
