How We Got to Now: Six Innovations That Made the Modern World
Conclusion
The Time Travelers
On July 8, 1835, an English baron by the name of William King was married in a small ceremony in the western suburbs of London, at an estate called Fordhook that had once belonged to the novelist Henry Fielding. By all accounts it was a pleasant wedding, though it was a much smaller affair than one might have expected given King’s title and family wealth. The intimacy of the wedding was due to the general public’s fascination with his nineteen-year-old bride, the beautiful and brilliant Augusta Byron, now commonly known by her middle name of Ada, daughter of the notorious Romantic poet Lord Byron. Byron had been dead for a decade, and had not seen his daughter since she was an infant, but his reputation for creative brilliance and moral dissolution continued to reverberate through European culture. There were no paparazzi to hound Baron King and his bride in 1835, but Ada’s fame meant that a certain measure of discretion was required at her wedding.
After a short honeymoon, Ada and her new husband began dividing their time between his family estate in Ockham, another estate in Somerset, and a London home, beginning what promised to be a life of domestic leisure, albeit challenged by the enviable difficulties of maintaining three residences. By 1840, the couple had produced three children, and King had been elevated to an earldom as part of Queen Victoria’s coronation honors.
By the conventional standards of Victorian society, Ada’s life would have seemed any woman’s dream: nobility, a loving husband, and three children, including the all-important male heir. But as she settled into the duties of motherhood and of running a landed estate, she found herself fraying at the edges, drawn to paths that were effectively unheard-of for Victorian women. In the 1840s, it was not outside the bounds of possibility for a woman to be engaged in the creative arts in some fashion, and even to dabble in writing her own fiction or essays. But Ada’s mind was drawn in another direction. She had a passion for numbers.
When Ada was a teenager, her mother, Annabella Byron, had encouraged her study of mathematics, hiring a series of tutors to instruct her in algebra and trigonometry, a radical course of study in an age when women were excluded from important scientific institutions such as the Royal Society, and were assumed to be incapable of rigorous scientific thinking. But Annabella had an ulterior motive in encouraging her daughter’s math skills, hoping that the methodical and practical nature of her studies would override the dangerous influence of her dead father. A world of numbers, Annabella hoped, would save her daughter from the debauchery of art.
Augusta Ada, Countess Lovelace, circa 1840
For a time, it appeared that Annabella’s plan had worked. Ada’s husband had been made Earl of Lovelace, and as a family they seemed on a path to avoid the chaos and unconventionality that had destroyed Lord Byron fifteen years before. But as her third child grew out of infancy, Ada found herself drawn back to the world of math, feeling unfulfilled by the domestic responsibilities of Victorian motherhood. Her letters from the period display a strange mix of Romantic ambition—the sense of a soul larger than the ordinary reality it has found itself trapped in—combined with an intense belief in the power of mathematical reason. Ada wrote about differential calculus with the same passion and exuberance (and self-confidence) that her father wrote about forbidden love:
Owing to some peculiarity in my nervous system, I have perceptions of some things, which no one else has … an intuitive perception of hidden things;—that is of things hidden away from eyes, ears, and the ordinary senses. This alone would advantage me little, in the discovery line, but there is, secondly, my immense reasoning faculties, and my concentrative faculty.
In the late months of 1841, Ada’s conflicted feelings about her domestic life and her mathematical ambitions came to a crisis point, when she learned from Annabella that, in the years before his death, Lord Byron had conceived a daughter with his half sister. Ada’s father was not only the most notorious author of his time; he was also guilty of incest, and the offspring of this scandalous union was a girl Ada had known for many years. Annabella had volunteered the news to her daughter as definitive proof that Byron was a wretch, and that such a rebellious, unconventional lifestyle could only end in ruin.
And so, at the still young age of twenty-five, Ada Lovelace found herself at a crossroads, confronting two very different ways of being an adult in the world. She could resign herself to the settled path of a baroness and live within the boundaries of conventional decorum. Or she could embrace those “peculiarities of [her] nervous system” and seek out some original path for herself and her distinctive gifts.
It was a choice that was deeply situated in the culture of Ada’s time: the assumptions that framed and delimited the roles women could adopt, the inherited wealth that gave her the option of choosing in the first place, and the leisure time to mull over the decision. But the paths in front of her were also carved out by her genes, by the talents and dispositions—even the mania—Ada had inherited from her parents. In choosing between domestic stability and some unknown break from convention, she was, in a sense, choosing between her mother and her father. To stay settled at Ockham Park was the easier path; all the forces of society propelled her toward it. And yet, like it or not, she was still Byron’s daughter. A conventional life seemed increasingly unthinkable.
But Ada Lovelace found a way around the impasse she had confronted in her mid-twenties. In collaboration with another brilliant Victorian who was equally ahead of his time, Ada charted a path that allowed her to push the barriers of Victorian society without succumbing to the creative chaos that had enveloped her father. She became a software programmer.
—
WRITING CODE IN THE MIDDLE of the nineteenth century may seem like a vocation that would be possible only with time travel, but as chance would have it, Ada had met the one Victorian who was capable of giving her such a project: Charles Babbage, the brilliant and eclectic inventor who was in the middle of drafting plans for his visionary Analytical Engine. Babbage had spent the previous two decades concocting state-of-the-art calculators, but in the mid-1830s he had commenced work on a project that would last the rest of his life: designing a truly programmable computer, capable of executing complex sequences of calculations that went far beyond the capabilities of any contemporary machines. Babbage’s Analytical Engine was doomed to a certain practical failure—he was trying to build a digital-age computer with industrial-age mechanical parts—but conceptually it was a brilliant leap forward. Babbage’s design anticipated all the major components of modern computers: the notion of a central processing unit (which Babbage dubbed “the mill”), of random-access memory, and of software that would control the machine, etched on the very same punch cards that would be used to program computers more than a century later.
Ada had met Babbage at the age of seventeen, in one of his celebrated London salons, and the two had kept up a friendly and intellectually lively correspondence over the years. And so when she hit her crossroads in the early 1840s, she wrote a letter to Babbage that suggested he might prove to be a potential escape route from the limitations of life at Ockham Park:
I am very anxious to talk to you. I will give you a hint on what. It strikes me that at some future time, my head may be made by you subservient to some of your purposes & plans. If so, if ever I could be worth or capable of being used by you, my head will be yours.
Babbage, as it turned out, did have a use for Ada’s remarkable head, and their collaboration would lead to one of the founding documents in the history of computing. An Italian engineer had written an essay on Babbage’s machine, and at the recommendation of a friend, Ada translated the text into English. When she told Babbage of her work, he asked why she hadn’t written her own essay on the subject. Despite all her ambition, the thought of composing her own analysis had apparently never occurred to Ada, and so at Babbage’s encouragement, she concocted her own aphoristic commentary, stitched together out of a series of extended footnotes attached to the Italian paper.
Charles Babbage
Those footnotes would ultimately prove to be far more valuable and influential than the original text they annotated. They contained a series of elemental instruction sets that could be used to direct the calculations of the Analytical Engine. These are now considered to be the first examples of working software ever published, though the machinery that could actually run the code wouldn’t be built for another century.
Babbage’s Analytical Engine
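The most celebrated of those instruction sets—the final note, known as Note G—laid out a procedure for computing the Bernoulli numbers, a sequence of fractions important in number theory. (Ada’s numbering convention for the sequence differs from the modern one.) As a minimal modern sketch, not a transcription of her actual instruction table, the same calculation can be expressed via the standard recurrence in a few lines of Python:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return the Bernoulli numbers B_0 through B_n as exact fractions,
    using the recurrence B_m = -1/(m+1) * sum_{k<m} C(m+1, k) * B_k."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * B[k] for k in range(m))
        B.append(Fraction(-s, m + 1))
    return B

# B_1 = -1/2, B_2 = 1/6, B_3 = 0, B_4 = -1/30
print(bernoulli(4))
```

What took a dozen lines here required, in Ada’s notes, an elaborate table of operations, variables, and intermediate results—precisely because she had to spell out every step the Engine’s gears and punch cards would execute.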
There is some dispute over whether Ada was the sole author of these programs, or whether she was refining routines that Babbage himself had worked out previously. But Ada’s greatest contribution lay not in writing out instruction sets, but rather in envisioning a range of utility for the machine that Babbage himself had not considered. “Many persons,” she wrote, “imagine that because the business of the engine is to give its results in numerical notation the nature of its processes must consequently be arithmetical and numerical, rather than algebraical and analytical. This is an error. The engine can arrange and combine its numerical quantities exactly as if they were letters or any other general symbols.” Ada recognized that Babbage’s machine was not a mere number cruncher. Its potential uses went far beyond rote calculation. It might even someday be capable of the higher arts:
Supposing, for instance, that the fundamental relations of pitched sounds in the science of harmony and musical composition were susceptible of such expressions and adaptations, the Engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.
To have this imaginative leap in the middle of the nineteenth century is almost beyond comprehension. It was hard enough to wrap one’s mind around the idea of programmable computers—almost all of Babbage’s contemporaries failed to grasp what he had invented—but somehow, Ada was able to take the concept one step further, to the idea that this machine might also conjure up language and art. That one footnote opened up a conceptual space that would eventually be occupied by so much of early twenty-first-century culture: Google queries, electronic music, iTunes, hypertext. The computer would not just be an unusually flexible calculator; it would be an expressive, representational, even aesthetic machine.
Of course, Babbage’s idea and Lovelace’s footnote proved to be so far ahead of their time that, for a long while, they were lost to history. Most of Babbage’s core insights had to be independently rediscovered a hundred years later, when the first working computers were built in the 1940s, running on electricity and vacuum tubes instead of steam power. The notion of computers as aesthetic tools, capable of producing culture as well as calculation, didn’t become widespread—even in high-tech hubs such as Boston or Silicon Valley—until the 1970s.
Most important innovations—in modern times at least—arrive in clusters of simultaneous discovery. The conceptual and technological pieces come together to make a certain idea imaginable—artificial refrigeration, say, or the lightbulb—and all around the world, you suddenly see people working on the problem, and usually approaching it with the same fundamental assumptions about how that problem is ultimately going to be solved. Edison and his peers may have disagreed about the importance of the vacuum or the carbon filament in inventing the electric lightbulb, but none of them were working on an LED. The predominance of simultaneous, multiple invention in the historical record has interesting implications for the philosophy of history and science: To what extent is the sequence of invention set in stone by the basic laws of physics or information or the biological and chemical constraints of the earth’s environment? We take it for granted that microwaves have to be invented after the mastery of fire, but how inevitable is it that, say, telescopes and microscopes quickly followed the invention of spectacles? (Could one imagine, for instance, spectacles being widely adopted, but then a pause of five hundred years before someone thinks of rejiggering them into a telescope? It seems unlikely, but I suppose it’s not impossible.) The fact that these simultaneous-invention clusters are so pronounced in the fossil record of technology tells us, at the very least, that some confluence of historical events has made a new technology imaginable in a way that it wasn’t before.
What those events happen to be is a murkier but fascinating question. I have tried to sketch a few answers here. Lenses, for instance, emerged out of several distinct developments: glassmaking expertise, particularly as cultivated on Murano; the adoption of glass “orbs” that helped monks read their scrolls later in life; the invention of the printing press, which created a surge in demand for spectacles. (And, of course, the basic physical properties of silicon dioxide itself.) We can’t know for certain the full extent of these influences, and no doubt some influences are too subtle for us to detect after so many years, like starlight from remote suns. But the question is nonetheless worth exploring, even if we are resigned to somewhat speculative answers, the same way we are when we try to wrestle with the causes behind the American Civil War or the droughts of the Dust Bowl era. They’re worth exploring because we are living through comparable revolutions today, set by the boundaries and opportunities of our own adjacent possible. Learning from the patterns of innovation that shaped society in the past can only help us navigate the future more successfully, even if our explanations of that past are not falsifiable in quite the same way that a scientific theory is.
—
BUT IF SIMULTANEOUS INVENTION is the rule, what about the exceptions? What about Babbage and Lovelace, who were effectively a century ahead of just about every other human being on the planet? Most innovation happens in the present tense of the adjacent possible, working with the tools and concepts that are available in that time. But every now and then, some individual or group makes a leap that seems almost like time traveling. How do they do it? What allows them to see past the boundaries of the adjacent possible when their contemporaries fail to do so? That may be the greatest mystery of all.
The conventional explanation is the all-purpose but somewhat circular category of “genius.” Da Vinci could imagine (and draw) helicopters in the fifteenth century because he was a genius; Babbage and Lovelace could imagine programmable computers in the nineteenth century because they were geniuses. No doubt all three were blessed with great intellectual gifts, but history is replete with high-IQ individuals who don’t manage to come up with inventions that are decades or centuries ahead of their time. Some of that time-traveling genius no doubt came from their raw intellectual skills, but I suspect just as much came out of the environment their ideas evolved in, the network of interests and influence that shaped their thinking.
If there is a common thread to the time travelers, beyond the nonexplanation of genius, it is this: they worked at the margins of their official fields, or at the intersection point between very different disciplines. Think of Édouard-Léon Scott de Martinville inventing his sound-recording device a generation before Edison began working on the phonograph. Scott was able to imagine the idea of “writing” sound waves because he had borrowed metaphors from stenography and printing and anatomical studies of the human ear. Ada Lovelace could see the aesthetic possibilities of Babbage’s Analytical Engine because her life had been lived at a unique collision point between advanced math and Romantic poetry. The “peculiarities” of her “nervous system”—that Romantic instinct to see beyond the surface appearances of things—allowed her to imagine a machine capable of manipulating symbols or composing music, in a way that even Babbage himself had failed to do.
To a certain extent, the time travelers remind us that working within an established field is both empowering and restricting at the same time. Stay within the boundaries of your discipline, and you will have an easier time making incremental improvements, opening the doors of the adjacent possible that are directly available to you given the specifics of the historical moment. (There’s nothing wrong with that, of course. Progress depends on incremental improvements.) But those disciplinary boundaries can also serve as blinders, keeping you from the bigger idea that becomes visible only when you cross those borders. Sometimes those borders are literal, geographic ones: Frederic Tudor traveling to the Caribbean and dreaming of ice in the tropics; Clarence Birdseye ice fishing with the Inuit in Labrador. Sometimes the boundaries are conceptual: Scott borrowing the metaphors of stenography to invent the phonautograph. The time travelers tend, as a group, to have a lot of hobbies: think of Darwin and his orchids. When Darwin published his book on pollination four years after Origin of Species, he gave it the wonderfully Victorian title, On the Various Contrivances by Which British and Foreign Orchids are Fertilised by Insects, and on the Good Effects of Intercrossing. We now understand the “good effects of intercrossing” thanks to the modern science of genetics, but the principle applies to intellectual history as well. The time travelers are unusually adept at “intercrossing” different fields of expertise. That’s the beauty of the hobbyist: it’s generally easier to mix different intellectual fields when you have a whole array of them littering your study or your garage.
One of the reasons garages have become such an emblem of the innovator’s workspace is precisely because they exist outside the traditional spaces of work or research. They are not office cubicles or university labs; they’re places away from work and school, places where our peripheral interests have the room to grow and evolve. Experts head off to their corner offices and lecture halls. The garage is the space for the hacker, the tinkerer, the maker. The garage is not defined by a single field or industry; instead, it is defined by the eclectic interests of its inhabitants. It is a space where intellectual networks converge.