Is the Internet Changing the Way You Think?


by John Brockman


  A Level Playing Field

  Martin Rees

  President, the Royal Society; professor of cosmology and astrophysics; master, Trinity College, University of Cambridge; author, Our Final Century: The 50/50 Threat to Humanity’s Survival

  In 2002, three Indian mathematicians—Manindra Agrawal and two of his students, Neeraj Kayal and Nitin Saxena—devised the first provably fast (polynomial-time) algorithm for testing whether a large number is prime, a problem with close ties to cryptography. They posted their results on the Web. Such was the interest that within just a day 20,000 people had downloaded the work, which became the topic of hastily convened discussions in many centers of mathematical research around the world.
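
  As an aside, the problem Agrawal, Kayal, and Saxena settled is how to decide primality efficiently. The trial-division check below is only a minimal sketch of what the question asks, not their algorithm, which runs in time polynomial in the number of digits.

```python
# Trial-division primality check: a toy illustration of the problem,
# not the Agrawal-Kayal-Saxena (AKS) algorithm itself.
def is_prime(n: int) -> bool:
    if n < 2:
        return False
    if n % 2 == 0:
        return n == 2
    d = 3
    while d * d <= n:  # divisors need only be checked up to sqrt(n)
        if n % d == 0:
            return False
        d += 2
    return True

print([p for p in range(30) if is_prime(p)])  # [2, 3, 5, 7, 11, 13, 17, 19, 23, 29]
```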

  This episode—offering instant global recognition to two young Indian students—contrasts starkly with the struggles of a young Indian genius a hundred years ago. Srinivasa Ramanujan, a clerk in Madras, mailed long screeds of mathematical formulas to G. H. Hardy, a professor at Trinity College, Cambridge. Fortunately, Hardy had the percipience to recognize that Ramanujan was not the typical green-ink scribbler who finds numerical patterns in the Bible or the pyramids and that his writings betrayed preternatural insight. Hardy arranged for Ramanujan to come to Cambridge and did all he could to foster Ramanujan’s genius; sadly, culture shock and poor health brought him an early death.

  The Internet enables far wider participation in front-line science; it levels the playing field between researchers in major centers and those in relative isolation, hitherto handicapped by inefficient communication. It has transformed the way science is communicated and debated. More fundamentally, it changes how research is done, what might be discovered, and how students learn.

  And it allows new styles of research. For example, in the old days astronomical information, even if in principle publicly available, was stored on delicate photographic plates. These were not easily accessible, and they were tiresome to analyze. Now such data (and, likewise, large datasets in genetics or particle physics) can be accessed and downloaded anywhere. Experiments, and natural events such as tropical storms or the impact of a comet on Jupiter, can be followed in real time by anyone who’s interested. And the power of huge computing networks can be deployed on large datasets.

  Indeed, scientific discoveries will increasingly be made by “brute force” rather than by insight. IBM’s Deep Blue beat Garry Kasparov not by thinking like him but by exploiting its speed to explore a huge variety of options. There are some high-priority scientific quests—for instance, the recipe for a room-temperature superconductor, or the identification of key steps in the origin of life—that may yield most readily neither to insight nor to experiment but to exhaustive computational searches.
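
  To make the brute-force idea concrete, here is a minimal sketch of an exhaustive computational search: a toy grid search whose scoring function is a made-up stand-in for whatever simulation or experiment a real search (say, for candidate superconductors) would run. Nothing here is drawn from any actual research code.

```python
# Toy exhaustive ("brute force") search: enumerate every candidate in a grid
# and keep the best scorer. The objective below is a made-up stand-in; a real
# search would score each candidate with a simulation or an experiment.
from itertools import product

def score(a: int, b: int, c: int) -> float:
    # Hypothetical objective (higher is better), purely for illustration.
    return -((a - 3) ** 2 + (b - 7) ** 2 + (c - 1) ** 2)

best, best_score = None, float("-inf")
for candidate in product(range(10), repeat=3):  # 1,000 candidates in all
    s = score(*candidate)
    if s > best_score:
        best, best_score = candidate, s

print("best candidate:", best, "score:", best_score)
```

  No insight is involved: the search simply tries everything, and it scales only as far as the combinatorics and the available computing allow.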

  Paul Ginsparg’s arXiv.org archive transformed the literature of physics, establishing a new model for communication over the whole of science. Far fewer people today read traditional journals. These have so far survived as guarantors of quality. But even this role may soon be trumped by a more informal system of quality control, signaled by the approbation of discerning readers (analogous to the grading of restaurants by gastronomic critics), by blogs, or by Amazon-style reviews.

  Clustering of experts in actual institutions will continue, for the same reason that high-tech expertise congregates in Silicon Valley and elsewhere. But the progress of science will be driven by ever more immersive technology where propinquity is irrelevant. Traditional universities will survive insofar as they offer mentoring and personal contact to their students. But it’s less clear that there will be a future for the “mass university,” where the students are offered little more than a passive role in lectures (generally of mediocre quality) with minimal feedback. Instead, the Internet will offer access to outstanding lectures—and in return will offer the star lecturers (and perhaps the best classroom teachers, too) a potentially global reach.

  And it’s not just students but those at the end of their careers whose lives the Internet can transformatively enhance. We oldies, as we become less mobile, will be able to immerse ourselves—right up to the final switch-off, or until we lose our wits completely—in an ever more sophisticated cyberspace that allows virtual travel and continuing engagement with the world.

  Move Aside, Sex

  Seth Lloyd

  Quantum mechanical engineer, MIT; author, Programming the Universe

  I think less. My goal is to transfer my brain’s functions, bit by bit, to the Cloud.

  When I do think, I’m lazier. There’s no point in making the strenuous trek over to the library to find the source when you can get an expurgated electronic version on Google Books right away. And why go look up the exact theorem when you can find an approximate version on Wikipedia?

  OK, you can get burned. Math being what it is, an approximate theorem is typically an untrue theorem. Over the years, I have found most statements in purely scientific reference articles on Wikipedia to be 99.44 percent correct. It’s that last .56 percent that gets you. I just wasted three months and almost published an incorrect result because one clause in the Wikipedia statement of a theorem was wrong. It’s a lucky thing the referee caught my error. In the meanwhile, however, I had used one of the great Internet innovations, the scientific preprint archive, to post the incorrect result on the Internet for everyone to see.

  For hundreds of millions of years, sex was the most efficient method for propagating information of dubious provenance. The origins of all those snippets of junk DNA are lost in the sands of reproductive history. Move aside, sex: The World Wide Web has usurped your role. A single illegal download can disseminate more parasitic bits of information than a host of mating tsetse flies. Indeed, as I looked further afield, I found that it was not just Wikipedia that was in error: Essentially every digital statement of the clause in the theorem of interest was also incorrect. For better or worse, it appears that the only sure way to find the correct statement of a theorem is to trek to the library and find some book written by some dead mathematician—maybe even the one who proved the theorem in the first place.

  In fact, the key to correctness probably lies not so much in the fact that the book was written by that mathematician as in the fact that the book was scrupulously edited by the editor of the series who had invited the mathematician to write the book. Prose, poetry, and theorems posted on the Internet are no less insightful and brilliant than their paper predecessors; they are simply less edited. Moreover, just when we need them most, the meticulously trained editors of our newspapers, journals, and publishing houses are being laid off in droves.

  Life, too, has gone through periods of editorial collapse. During the Cambrian explosion, living systems discovered the evolutionary advantage of complex, multicellular forms. Like the digital organisms of today’s Internet, the new Cambrian life-forms rewrote the rules of habitat after habitat, evolving rapidly in the process. Finally, however, they filled their environment to its carrying capacity; at that point, just being cool, complex, and multicellular was no longer enough to ensure survival. The sharp red pencil of natural selection came out and slashed away the gratuitous sequences of DNA.

  For the moment, however, the ability of the Internet to propagate information promiscuously is largely a blessing. The preprint archives where scientific work (like my wrong paper) is posted for all to read are great levelers: Second- or third-world scientists with modems can access the unedited state of the art in a scientific field as it is produced, rather than months or years later. They, in turn, can produce and post their own unedited preprints, and so on. As long as computer memories keep doubling in capacity every year or two, those stacks of unedited information will keep doubling and doubling, too, swamping the useful and correct in a sea of extraneous bits. Eventually, the laws of physics themselves will stop this exponential explosion of memory space, and we will be forced, once more, to edit. What will happen then?
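
  A back-of-the-envelope sketch of that doubling (the eighteen-month period is an assumption within the "every year or two" range, and the thirty-year horizon is arbitrary):

```python
# If storage capacity doubles every 18 months, how much more is there in 30 years?
months, period = 30 * 12, 18
doublings = months // period   # 20 doublings
growth = 2 ** doublings        # 1,048,576
print(f"{doublings} doublings -> roughly {growth:,} times more capacity")
```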

  Don’t ask me. By then, the full brain transfer to the Cloud should be complete. I hope not to be thinking at all.

  Rivaling Gutenberg
  John Tooby

  Founder of evolutionary psychology; codirector, UC Santa Barbara’s Center for Evolutionary Psychology

  Obliterating whole lineages—diatoms and dinosaurs, corals and crustaceans, ammonites and amphibians—shock waves from the Yucatán meteor impact 65 million years ago rippled through the intricate interdependencies of the planetary ecosystem, turning blankets of life into shrouds in one incandescent geological instant. Knocking out keystone species and toppling community structures, these shifts and extinctions opened up new opportunities, inviting avian and mammalian adaptive radiations and other bursts of innovation that transformed the living world—and eventually opened the way for our placenta-suckled, unprecedentedly luxuriant brains.

  What with one thing and another, now here we are: The Internet and the World Wide Web that runs on it have struck our species’ informational ecology with a similarly explosive impact, their shock waves rippling through our cultural, social, economic, political, technological, scientific, and even cognitive landscapes.

  To understand the nature and magnitude of what is to come, consider the effects of Gutenberg’s ingenious marriage of the grape press, oil-based inks, and his method for inexpensively producing movable type. Before Gutenberg, books were scarce and expensive, requiring months or years of skilled individual effort to produce a single copy. Inevitably, they were primarily prestige goods for aristocrats and clerics, their content devoted to the narrow and largely useless status or ritual preoccupations of their owners. Slow-changing vessels bearing the distant echoes of ancient tradition, books were absent from the lives of all but a tiny fraction of humanity. Books then were travelers from the past rather than signals from the present, their cargo ignorance as often as knowledge. European awareness was parochial in the strict, original sense—limited to direct experience of the parish.

  Yet a few decades after Gutenberg, there were millions of books flooding Europe, many written and owned by a new, book-created middle class, full of new knowledge, art, disputation, and exploration. Mental horizons—once linked to the physical horizon just a few miles away—surged outward.

  Formerly, knowledge of all kinds had been fixed by authority and embedded in hierarchy and was by assumption and intention largely static. Yet the sharp drop in the cost of reproducing books shattered this stagnant and immobilizing mentality. Printing rained new Renaissance texts and newly recovered classical works across Europe; printing catalyzed the scientific revolution; printing put technological and commercial innovation onto an upward arc still accelerating today. Printing ignited the previously wasted intellectual potential of huge segments of the population—people who, without printing, would have died illiterate, uneducated, without voice or legacy.

  Printing summoned into existence increasingly diversified bodies of new knowledge, multiplied productive divisions of labor, midwifed new professions, and greatly expanded the middle class. It threw up voluntary, new meritocratic hierarchies of knowledge and productivity to rival traditional hierarchies of force and superstition. In short, the release of printing technology into human societies brought into being a vast new ecosystem of knowledge—dense, diverse, rapidly changing, rapidly growing, and beyond the ability of any one mind to encompass or any government to control.

  Over the previous millennium, heretics appeared perennially, only to be crushed. Implicitly and explicitly, beyond all question, orthodoxy defined and embodied virtue. But when, after Gutenberg, heretics such as Luther gained access to printing presses, the rapid and broad dissemination of their writings allowed dissidents to muster enough socially coordinated recruits to militarily stalemate attempts by hierarchies to suppress them. Hence, the assumption of a single orthodoxy husbanded by a single system of sanctified authority was broken, beyond all recovery.

  For the same reason that Communist governments would restrict access to Marx’s and Engels’ original writings, the Church made it a death penalty offense (to be preceded by torture) to translate the Bible into the languages people spoke and understood. The radical change in attitude toward authority, and the revaluation of minds even at the bottom of society, can be seen in William Tyndale’s defense of his plan to translate the Bible into English: “I defy the Pope, and all his laws; and if God spares my life, I will cause the boy that drives the plow to know more of the Scriptures than the Pope himself.” (After his translation was printed, he was arrested, tied to the stake, and strangled.) Laymen, even plowboys, who now had access to Bibles (because they could both read and afford them) decided they could interpret sacred texts for themselves without the Church interposing itself as intermediary between book and reader. Humans being what they are, religious wars followed, in struggles to make one or another doctrine (and elite) locally supreme.

  Conflicts such as the Thirty Years’ War (with perhaps 10 million dead and entire territories devastated) slowly awakened Europeans to the costs of violent intolerance, and, starting among dissident Protestant communities, the recognized prerogatives of conscience and judgment devolved onto ever smaller units, eventually coming to rest in the individual (at least in some societies, and always disputed by rulers).

  Freedom of thought and speech—where they exist—were unforeseen offspring of the printing press, and they change how we think. Political assumptions that had endured for millennia became inverted, making it thinkable that political legitimacy should arise from the sanction of the governed, rather than being a natural entitlement of rulers. And science was the most radical of printing’s many offspring.

  Formerly, the social validation of correct opinion had been the prerogative of local force-based hierarchies, based on tradition and intended to serve the powerful. Even disputes in natural philosophy had been settled by appeals to the textual authority of venerated ancients such as Aristotle. What alternative could there be? Yet when the unified front of religious and secular authority began to fragment, logic and evidence could come into play. What makes science distinctive is that it is the human activity in which logic and evidence (suspect, because potentially subversive of authority) are allowed to play at least some role in evaluating claims.

  Galileo—arguably the founder of modern science—was threatened with torture and placed under house arrest not for his scientific beliefs but for his deeper heresies about what validates knowledge. He argued that along with Scripture, which could be misinterpreted, God had written another book, the book of nature—written in mathematics but open for all to see. Claims about the book of nature could be investigated using experiments, logic, and mathematics—a radical proposal that left no role for authority in the evaluation of (nonscriptural) truth. (Paralleling Tyndale’s focus on the literate lay public, Galileo wrote almost all of his books in Italian rather than in Latin.) The Royal Society, founded two decades after Galileo’s death, chose as their motto Nullius in verba (“On the authority of no one”), a principle strikingly at variance with the pre-Gutenberg world.

  The assumptions (e.g., I should be free to think about and question anything), methods (experimentation, statistical inference, model building), and content (evolutionary biology, quantum mechanics, the computational Theory of Mind) of modern thought are unimaginably different from those held by our ancestors living before Gutenberg. All this—to simplify slightly—because of a drop in the cost of producing books.

  So what is happening to us, now that the Internet has engulfed us? The Internet and its cybernetic creatures have dropped, by many more orders of magnitude, the cost in money, effort, and time of acquiring and publishing information. The knowledge (and disinformation) of the species is migrating online, a click away.

  To take just first-order consequences, we see all around us transformations in the making that will rival or exceed the printing revolution—for example, heating up the chain reactions of scientific, technical, and economic innovation by pulling out the moderating rods of distance and delay. Quantity, Stalin said, has a quality all its own. The Internet also unleashes monsters from the id—our evolved mental programs are far more easily triggered by images than by propositions, a reality that jihadi Websites are exploiting in our new round of religious wars.

  Our generation is living through this transformation, so although our cognitive unconscious is hidden from awareness, we can at least report on our direct experience of how our thinking has shifted before and after. I vividly remember my first day of browsing: firing link after link after link, suspended in an endless elation as I surveyed possibility after possibility for twenty hours straight—something I still feel.

  Now my browsing operates out of two states of mind. The first is broad, rapid, intuitive scanning, where I feel free to click without goals, in order to maintain some kind of general scientific and cultural awareness without drowning in the endless sea. The second is a disciplined, focused exploration, where I am careful to ignore partisan pulls and ad hominem distractions, to dispense with my own sympathies or annoyance, to strip everything away except information about causation and paths to potential falsification or critical tests.

  Like a good Kuhnian, I attempt to pay special attention to anomalies in my favored theories, which are easier to identify now that I can scan more broadly. More generally, it seems that the scope of my research has become both broader and deeper, because both cost less. Finally, my mind seems to be increasingly interwoven into the Internet; what I store locally in my own brain seems more and more to be metadata for the parts of my understanding that are stored on the Internet.

  The Shoulders of Giants

  William Calvin

  Neurophysiologist; emeritus professor, University of Washington School of Medicine; author, Global Fever: How to Treat Climate Change

 
