Smaller Faster Lighter Denser Cheaper


by Robert Bryce


  * Note that the density figure for the engine itself is a little low as it counts the weight of the entire locomotive—the wheels, etc.—while the other prime movers discussed here generally count only the weight of the engines themselves.

  9

  FROM ENIAC TO iCLOUD

  SMALLER FASTER COMPUTING

  For Apple devotees and the herd of reporters inside the Moscone Center in San Francisco, the June 6, 2011, presentation from Steve Jobs at the Worldwide Developers Conference was familiar: the Apple CEO and design visionary was presenting yet another round of products and services that would further cement the company’s position as one of the world’s most innovative technology providers.1

  But the 2011 meeting in San Francisco soon became notable for another reason: it would be Jobs’s next-to-last public appearance as the CEO of the world’s most famous technology company.2 The following day, he appeared before the Cupertino City Council to pitch the design for the company’s new campus.3 Two months later, the tech titan resigned as Apple’s CEO, and two months after that, he was dead, felled by pancreatic cancer.

  At the time of the meeting, the biggest news from the Worldwide Developers Conference wasn’t Jobs’s health (which was constantly being scrutinized); it was that the visionary entrepreneur was introducing iCloud, a service that would allow users to “automatically and wirelessly store your content in iCloud and automatically and wirelessly push it to all your devices.” Jobs explained that Apple was “going to move the digital hub, the center of your digital life, into the cloud.”

  Other tech companies had been promoting their plans for the “cloud”—the industry’s name for the network of data centers that have become the digital brains of our society—but Apple’s move was different. Other companies had launched phones and tablet computers. Apple produced category killers like the iPhone and the iPad, and by announcing its move into the cloud, Apple was providing an endorsement of the concept. Apple was making ubiquitous computing real. It was taking the task of information storage out of the hands of consumers—or rather, it was moving data off the flash drives and hard drives inside mobile phones, iPads, and laptops—and into massive data centers crammed with servers. Making that happen was no small chore. Apple’s iCloud data center, in Maiden, North Carolina, Jobs explained, was going to be one of the biggest in the world, with some 500,000 square feet (46,451 square meters), making it about five times as large as an average Walmart discount store.4

  Apple was aiming to dominate cloud computing for consumers despite lots of competition. Amazon, the giant online retailer, was selling its service, Cloud Drive. Google was marketing Google Drive, and Microsoft was offering SkyDrive.5 Near the time of Jobs’s announcement, each of those companies was offering about five gigabytes of storage in the cloud for free. And in mid-2013, Flickr, the photo-sharing site, began offering users 1 terabyte of storage space, which was enough to store more than 500,000 photos—all for free.6

  You can’t get any Cheaper than free. Just consider how much Cheaper computing has gotten over the past few years. In 1997, a consumer who needed 5 gigabytes of hard drive storage would have had to pay about $450.7 That money purchased only the hard drive itself with no computing capability around it. Today, that volume of storage is free. Better yet, for consumers, all of that data storage comes with no power cords, cables, or other hardware that can be tripped over, spilled on, or otherwise damaged. Our ability to keep the latest song by Lady Gaga as well as that cute video of Grandma Val’s cat is not only free; the storage itself has become invisible and, for the person using it, weightless.

  No other sector better demonstrates the march of Smaller Faster Lighter Denser Cheaper than computing. In 2013, the inventor and author Ray Kurzweil told the Wall Street Journal that “a kid in Africa with a smartphone is walking around with a trillion dollars of computation circa 1970.”8

  Faster Cheaper: The Volume of Digital Data Created and Shared, projected to 2015

  Source: IDC, 2011.

  Cheaper electronics—whether they are desktop computers, landline phones, or smart phones—are allowing humans to exchange gargantuan quantities of information. Thanks to Faster networks, the volume of that information is growing at a staggering pace. Between 2006 and 2011, the volume of digital information—from YouTube videos to tweets—that was created and shared grew ninefold, to some 2 zettabytes of data.9 For reference, a zettabyte is 1 sextillion bytes, or 1 trillion gigabytes, or 10^21 bytes. Comprehending that volume of information requires a bit of work. To put it in perspective, consider this: if we could store the Library of Congress’s entire collection in digital form, it would fill roughly 200 terabytes of hard-drive space.10 (A terabyte is 10^12 bytes.)

  Recall that in 2011 we humans exchanged about 2 zettabytes of data. And by 2015, forecasters from the consulting firm IDC expect we will be exchanging 8 zettabytes. If that happens, by 2015 we will be exchanging annually about 40 million times as much data as exists in the Library of Congress.11 And much of that data exchange will be happening on a global fiber-optic network that carries photons traveling through glass at roughly 200 million meters per second, about two-thirds of their speed in a vacuum.
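
  The comparison is easy to verify. Here is a minimal Python sketch of the arithmetic, using only the figures quoted above (a 200-terabyte Library of Congress and 2 and 8 zettabytes of annual data traffic):

```python
# Back-of-the-envelope check of the data-volume comparisons cited above.
TERABYTE = 10**12   # bytes
ZETTABYTE = 10**21  # bytes

library_of_congress = 200 * TERABYTE  # rough digital size of the collection
traffic_2011 = 2 * ZETTABYTE          # data created and shared in 2011
traffic_2015 = 8 * ZETTABYTE          # IDC projection for 2015

print(traffic_2011 / library_of_congress)  # ~10 million Libraries of Congress
print(traffic_2015 / library_of_congress)  # ~40 million Libraries of Congress
```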

  The world is now networked, and it’s getting more wired every day. In 2013, Google announced that it was planning to build wireless networks using high-altitude balloons and airships so that it can provide connectivity to people living in rural areas of Africa and Asia.12 This surge in both data exchange and connectivity is a direct result of the push toward Smaller Faster. In 1965, Gordon Moore, the cofounder of Intel, famously declared that computing power—measured by the number of transistors placed on an integrated circuit—would double every two years.13 That declaration, known as Moore’s Law, has been proven right so far. Intel’s latest process can print individual lines on chips that are a thousand times thinner than a human hair.14 Pursuing such density on microprocessors means more computing power. And computing power is like sex, bandwidth, and horsepower: the more we get, the more we want.

  Today, we take cloud services and our ability to get Yelp restaurant reviews on our mobile phones for granted. But just three generations ago, the bulk of the world’s computing was done by “computers”: that is, humans who were facile with numbers and who were paid to do complex math problems all day, every day. The problem was that those humans were just too slow. The quest for Faster led the US military to design and build the world’s first general-purpose electronic computer—ENIAC, short for Electronic Numerical Integrator and Computer.

  World War I was a conflict defined by trenches and big guns. Most of the casualties in that war were caused by artillery fire. The problem with big guns, however, has always been accuracy. Landing a 95-pound (43 kilogram) shell packed with high explosives onto a target 8 miles (13 kilometers) away requires sophisticated mathematics.15 In the late 1930s, as World War II was simmering in Europe, the US military realized it needed to improve the mathematical tables that it used to calculate the trajectories of artillery shells.16

  ENIAC at University of Pennsylvania, sometime between 1947 and 1955. Note the large air-conditioning vents on the ceiling. Then, as now, big computing facilities require lots of cooling. Source: US Army.

  The only way to create those tables was with people who worked at desks and manipulated mechanical calculators like the Marchant Silent Speed, a bulky machine with some 4,000 moving parts that could handle 10-digit numbers. After Japan bombed Pearl Harbor, the US war effort went into overdrive. By 1943, with the Manhattan Project under way, the need for greater computational power became even more apparent. In fact, the computational needs that came with trying to design an atomic weapon made the ones needed for the accurate firing of artillery shells look positively puny.

  The need for something better was obvious to John von Neumann (b. 1903, d. 1957). Born in Budapest to a wealthy Jewish family, he began teaching at the University of Berlin in 1926. Four years later, he moved to the United States to take a position at the Institute for Advanced Study in Princeton, New Jersey. He pioneered several fields of study in mathematics and was so facile with complex subjects that the nuclear physicist Hans Bethe (who won the Nobel Prize in physics in 1967) once said that he “wondered whether a brain like von Neumann’s does not indicate a species superior to that of man.”21 Another colleague recalled von Neumann’s photographic memory. “He was able on once reading a book or article to quote it back verbatim . . . On one occasion, I tested his ability by asking him to tell me how A Tale of Two Cities started, whereupon, without any pause, he immediately began to recite the first chapter. We asked him to stop after ten to fifteen minutes.”22

  ENIAC-on-a-chip. This microchip has processing capacity equal to that of the world’s first computer, ENIAC, which was completed in 1946. Fifty years later, in 1996, a team of students at the University of Pennsylvania led by Professor Jan Van der Spiegel in the Moore School of Electrical Engineering, and backed by the National Science Foundation and Atmel Corporation, replicated the architecture and basic circuitry of ENIAC. They were able to put ENIAC’s capabilities onto a single chip that measured just 8 millimeters square, meaning it had an area of 64 square millimeters.17 Its power requirements: 0.5 watts.18 A bit of simple math shows that ENIAC-on-a-chip was about 350,000 times Smaller than ENIAC and 348,000 times more energy efficient.19 You could fit about six of these ENIACs-on-a-chip onto a single postage stamp.20 Source: Courtesy of Professor Jan Van der Spiegel, University of Pennsylvania, Moore School of Electrical Engineering.
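
  The “350,000 times Smaller” and “348,000 times more energy efficient” figures in the caption are simple ratios, computed from ENIAC’s footprint (240 square feet, about 22.3 square meters) and power draw (174,000 watts), both given later in this chapter. A minimal Python sketch:

```python
# Ratio check for the ENIAC-on-a-chip comparison in the caption above.
eniac_area_m2 = 22.3      # ENIAC's footprint: 240 square feet
eniac_power_w = 174_000   # ENIAC's power draw

chip_area_m2 = 64e-6      # 64 square millimeters, in square meters
chip_power_w = 0.5

print(eniac_area_m2 / chip_area_m2)  # ~348,000 (rounded to ~350,000 above)
print(eniac_power_w / chip_power_w)  # 348,000-fold less power
```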

  As one of the principal coordinators of the Manhattan Project, von Neumann arrived at Los Alamos, New Mexico, in September 1943, ready to begin work on building the first atomic bomb. At his disposal were about twenty human computers to do the mathematics needed. Von Neumann quickly realized that if he and his colleagues—an illustrious group of physicists, mathematicians, and scientists—were going to build an atomic bomb, they were going to need more computing power, a lot more, than what could be provided by a handful of math geeks sitting at desks.23

  George Dyson, in his 2012 book, Turing’s Cathedral, explains that the team at Los Alamos needed to be able to predict how an atomic bomb might behave when detonated. He writes:

  To follow the process from start to finish required modeling the initial propagation of a detonation wave through the high explosive, the transmission of the resulting shock wave through the tamper and into the fissile material (including the reflection of that shock wave as it reached the center), the propagation of another shock wave as the core exploded, the passage of that shock wave (followed by an equally violent rarefaction wave) outward through the remnants of the previous explosion and into the atmosphere, and finally the resulting blast wave’s reflection if the bomb was at or near the ground.24

  Needless to say, the mathematics involved in modeling each of those waves was daunting. Dyson said that it was not a coincidence that the atomic bomb and the electronic computer were born at “exactly the same time. Only the collective intelligence of computers could save us from the destructive powers of the weapons they had allowed us to invent.”25

  Dyson’s explanation provides a quintessential example of both the dangers and benefits of innovation. The desire for Faster computing power gave humans the ability to create weapons capable of destroying the planet many times over. And yet that same computing power now provides us with the capability to create new materials, processes, and medicines.

  In 1943, the US military agreed to provide the funding for ENIAC. Built at the University of Pennsylvania by a team headed by J. Presper Eckert and John W. Mauchly, the massive machine was a wonder of the pre-transistor, pre-integrated circuit era. That meant it had to rely on vacuum tubes. ENIAC contained 17,468 of them, along with 10,000 capacitors, 1,500 relays, 70,000 resistors, and 6,000 manual switches. All of those devices were connected by some five million soldered joints, most of which had to be joined by hand.26

  In 1946, when ENIAC was completed, it weighed 27 tons, covered 240 square feet (22.3 square meters) of floor space, and required 174,000 watts (174 kilowatts) of power.27 The computer consumed so much electricity that when it was switched on, it allegedly caused lights in the rest of Philadelphia to momentarily dim. ENIAC’s enormous power demand was to become a hallmark of the Information Age. And that power demand takes us back to a familiar metric: power density.

  When it was first switched on, ENIAC almost certainly had the highest areal power density of any electronic machine on the planet, about 7,800 watts per square meter. That’s an astounding level of power density, particularly when compared against residential demand. An average home—and here I’m using my home in Austin, Texas, as a reference—has a power density (counting electricity only) of about 5 watts per square meter.28 The entire city of New York, counting the five boroughs and all of its land area, including Central Park, has an areal power density of about 15 watts per square meter.29
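
  The areal power density figures used throughout this book are simply power draw divided by floor area. A minimal Python sketch of that arithmetic, using the numbers quoted above:

```python
# Areal power density = power draw / footprint, in watts per square meter.
def power_density(watts: float, square_meters: float) -> float:
    return watts / square_meters

print(power_density(174_000, 22.3))  # ENIAC: ~7,800 W per square meter
# For comparison (figures cited above): an average home is ~5 W per square
# meter, and New York City as a whole is ~15 W per square meter.
```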

  The Incredible Shrinking Circuit: From the 8086 to Core i7

  Top view of an Intel 8086 processor, circa 1978. Source: Eric Gaba, Wikimedia Commons.

  Bottom view of an Intel Core i7 processor. Source: Wikimedia Commons.

  In 1978, the year I graduated from Bishop Kelley High School in Tulsa, Intel released the 8086 processor, the brain for the first wave of personal computers. The design of the 8086 and its almost identical brother, the 8088, would become the standard for personal computer design. As one analyst put it, the 8086 “paved the way for rapid, exponential progress in computer speed, capacity and price-performance.”30 The 8086 chip sported 29,000 transistors on circuits that were 3 microns wide.31 (A micron is one-millionth of a meter, or 1,000 nanometers.) By 2013, Intel was producing microprocessors with circuits that were just 22 nanometers wide.32 And the company’s flagship processor, the Core i7 Sandy Bridge-E, was packed with 2.27 billion transistors.33 Thus, between 1978 and 2013, Intel increased the transistor count of its best chips roughly 78,000-fold while shrinking its circuit features more than 130-fold.
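
  The 78,000-fold and 130-fold figures follow directly from the chip specifications quoted in this sidebar, and they imply a doubling time close to the two years of Moore’s Law. A minimal Python check:

```python
import math

# Scaling from the 1978 Intel 8086 to the 2013 Core i7 (Sandy Bridge-E),
# using the figures quoted in the sidebar.
transistors_8086 = 29_000
feature_nm_8086 = 3_000   # 3 microns = 3,000 nanometers
transistors_i7 = 2_270_000_000
feature_nm_i7 = 22

growth = transistors_i7 / transistors_8086
print(growth)                           # ~78,000-fold more transistors
print(feature_nm_8086 / feature_nm_i7)  # ~136-fold narrower circuit features

# Implied doubling time over those 35 years, as a rough check on Moore's Law:
print(35 / math.log2(growth))           # ~2.2 years per doubling
```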

  ENIAC’s computing capabilities were impressive at the time. It could perform about 5,000 additions or 400 multiplications in one second.34 It was also an enormously inefficient beast. In December 1945, during what Dyson calls a “shakedown run” of ENIAC’s capabilities—a calculation for the hydrogen bomb—a pair of scientists consumed nearly one million punch cards, which were used to temporarily store the intermediate results of their calculations.35 Needless to say, handling that many punch cards was a cumbersome process.

  While von Neumann and his colleagues understood the importance of ENIAC, he knew that a Smaller Faster more powerful computer could be built. In 1946, he declared, “I am thinking about something much more important than bombs. I am thinking about computers.” Von Neumann understood that speed in computation was worth pursuing in its own right. The value in Faster computers, he said, “lies not only in that one might thereby do in 10,000 times less time problems which one is now doing, or say 100 times more of them in 100 times less time—but rather in that one will be able to handle problems which are considered completely unassailable at present.”36

  Von Neumann convinced a variety of backers to provide the money for MANIAC, or Mathematical and Numerical Integrator and Computer, which was built at the Institute for Advanced Study in Princeton, New Jersey. (At Princeton, the machine is referred to as “the IAS Computer.”) Introduced to the public in 1952, it was the first computer to utilize RAM—short for random access memory—and for that reason alone it was a milestone in the push for Faster Cheaper computing.38

  This photo, likely taken in about 1952, shows mathematician and computer pioneer John von Neumann standing in front of MANIAC, the first computer to utilize random access memory. The row of cylinders to his left is the cathode-ray tubes that provided the RAM for the machine. Author George Dyson says, “The entire digital universe can be traced directly” back to the creation of MANIAC.37 Source: Alan Richards, photographer. From the Shelby White and Leon Levy Archives Center, Institute for Advanced Study, Princeton, NJ.

  Today, RAM is ubiquitous. The most common type of memory, it can be found in billions of electronic devices from dishwashers to cameras. But as Dyson points out, the first actual utilization of RAM was pivotal. He writes that thanks to RAM, “All hell broke loose . . . Random-access memory gave the world of machines access to the powers of numbers—and gave the world of numbers access to the powers of machines.”39 The quantity of information available in MANIAC’s RAM was almost laughably small, just 5 kilobytes, less memory than is required to display a simple icon on a modern computer. But back in 1952, it was nearly miraculous.

  Let’s pause here for a bit of perspective: The computer I’m using to write this chapter is a MacBook Pro, which is equipped with 8 gigabytes of RAM, or about 8 million kilobytes. Therefore, my computer has 1.6 million times as much RAM as was available on MANIAC, a machine that began operating the year after my parents were married. (Walter Bryce married Ann Mahoney in 1951.)
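
  That ratio is a one-line calculation, treating a gigabyte as a million kilobytes, as the text does:

```python
# MacBook Pro RAM (8 gigabytes) versus MANIAC's RAM (5 kilobytes).
macbook_kb = 8 * 10**6  # 8 gigabytes, expressed in kilobytes
maniac_kb = 5
print(macbook_kb / maniac_kb)  # 1,600,000 -- 1.6 million times as much RAM
```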

  Like ENIAC, MANIAC required lots of power. Dyson described MANIAC as being about the size of an industrial refrigerator.40 It measured about 6 feet high, 2 feet wide, and 8 feet long, giving it a footprint of 16 square feet (1.5 square meters). But powering it required 19,500 watts (19.5 kilowatts).41 Therefore, the areal power density of MANIAC was about 13,000 watts per square meter. That’s a remarkably high power density number when you consider that modern data centers are being built for power densities of about 6,000 watts per square meter.
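
  The same power-density arithmetic used for ENIAC applies here; a minimal Python sketch with the figures quoted above:

```python
# MANIAC's areal power density, compared with a modern data center.
maniac_power_w = 19_500
maniac_area_m2 = 1.5   # a 16-square-foot footprint

print(maniac_power_w / maniac_area_m2)  # ~13,000 W per square meter
# Modern data centers cited above are built for roughly 6,000 W per square meter.
```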

  When we look at the history of ENIAC and MANIAC alongside that of modern data centers, we see a clear trend: the more computer power we want, the more electricity we need. As our computing needs grow, so too does power density. And few companies have chased density more avidly than Intel.

 
