
You Could Look It Up: The Reference Shelf From Ancient Babylon to Wikipedia


by Jack Lynch


  The logarithm made possible an ingenious invention, the slide rule—a convenient way of approximating logarithms without a table. The numbers on a ruler are spaced according to their logarithms, so the physical positioning of the rulers takes the place of looking up numbers in a book. The earliest slide rule was devised by Edmund Gunter around the year 1620, but a great leap forward came from William Oughtred in the 1620s and ’30s. The result “would be the faithful companion of every scientist and engineer for the next 350 years, proudly given by parents to their sons and daughters upon graduation from college”14—a tradition brought to an end only by the invention of the pocket calculator in the 1970s. What had served the world as a calculator was actually a reference book reduced to a few notched sticks.
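  The arithmetic behind the trick is simple: because log(ab) = log a + log b, multiplying two numbers is the same as adding the lengths that stand for their logarithms, which is exactly what sliding one logarithmic scale along another does. Here is a minimal sketch of the principle; the scale length and sample values are purely illustrative, not drawn from any historical rule.

```python
import math

# A slide rule marks each number x at a distance proportional to log10(x).
# Multiplying a by b then amounts to adding the two distances and reading
# off which number sits at the combined position.

SCALE_LENGTH_CM = 25.0  # hypothetical length of one decade of the scale


def position(x: float) -> float:
    """Distance of the mark for x (1 <= x < 10) from the left end of the scale."""
    return SCALE_LENGTH_CM * math.log10(x)


def read_off(distance: float) -> float:
    """Inverse lookup: the number whose mark sits at the given distance."""
    return 10 ** (distance / SCALE_LENGTH_CM)


a, b = 2.0, 3.5
combined = position(a) + position(b)  # slide one scale so the two lengths add
print(read_off(combined))             # ~7.0, i.e. a * b, to slide-rule accuracy
```

  A physical rule can be read only to three significant figures or so, which is one reason the printed tables survived alongside it for work that demanded more precision.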

  Logarithms have real-world applications, but they are generated according to the principles of pure mathematics. They tell us things about numbers, not about the world, and should therefore be true everywhere and always. A mathematician in another universe, charged with calculating base-10 logarithms, should come up with the same answers as ours. There is nothing empirical about them. Other tables of numbers, though, are dependent on real-world facts, and though expressed in the language of mathematics, they are not really mathematical tables. Astronomy, for instance, is an eminently empirical science, since the only way to tell the location of the stars is to look at them.

  Monuments such as Stonehenge and the Goseck Circle tell us people have been keeping track of astronomical phenomena for at least seven thousand years. They got pretty good at it in ancient Babylon, Egypt, Greece, and India—much better than we might expect, since they had to rely on the naked eye to keep track of stars, comets, and planets. Claudius Ptolemy collected the best astronomical observations of his day in his Prokheiroi kanones (Handy Tables), which a modern historian calls “the first mass produced mathematical table.”15 Not until the early modern era did people fully appreciate that the natural world behaves according to laws that can be understood as mathematical relations. As Galileo famously put it, “Philosophy is written in this grand book, the universe, which stands continually open to our gaze. But the book cannot be understood unless one first learns to comprehend the language and read the letters in which it is composed. It is written in the language of mathematics.”16 It was one of the most important discoveries our species ever made.

  The sixteenth and seventeenth centuries were a time of rapidly progressing knowledge. Astronomers now had telescopes; more and more observations and formulas were accumulating, and reference books were there to provide relief. Stars are relatively easy to track, because they are so far away that they seem immobile. For practical purposes, the stars are fixed. The astronomical bodies that are closer to us, though, really do seem to move; we can witness this over the course of hours rather than over centuries. Only complicated calculations can tell where the moon will be visible at any given time—likewise for all the planets.

  The ancients in many cultures worked on simple tables, and in 1080 a group of astronomers from Toledo, Spain, compiled tables that allowed astronomers to predict the location of the sun, moon, and planets with unprecedented precision. In the thirteenth century a group assembled by Alfonso X of Castile worked to update the Toledan tables, and the result was named for the patron. The Alphonsine tables gave mathematical descriptions of the relations between the sun, the moon, and the planets relative to the fixed stars. After circulating in manuscript for decades, the tables were printed in 1483, and new versions appeared over the next three centuries.

  But, while they were adequate for many purposes, the Toledan and Alphonsine tables had serious flaws, and others sought to improve on them. The Danish nobleman Tycho Brahe was the best observational astronomer of the sixteenth century, and his ability to collect data over decades was unmatched. While he was still a student, he realized that the older tables were inaccurate, and he published a series of works based on careful observation of the heavens. In 1592 he produced a catalog of 777 stars—not only the largest such catalog, but the first wholly original one to appear in the West since Ptolemy’s a millennium and a half earlier. Tycho, however, never published most of his observations. He died in 1601, just fifty-four years old, and his Astronomiæ instauratæ progymnasmata appeared after his death.

  Legend says Tycho’s dying request to his assistant, Johannes Kepler, was to publish his observations, but twenty-three years passed between Tycho’s death and the completion of the tables, and then another three years before the book appeared. Kepler disagreed with his predecessor on the geocentric or heliocentric model of the solar system, and he reworked many of Tycho’s equations and observations to make them consistent with the brilliant system devised by Polish canon Nicolaus Copernicus almost a century earlier. At length Kepler produced in 1627 the Tabulæ Rudolphinæ (Rudolphine Tables), named for Holy Roman Emperor Rudolf II, Kepler’s onetime patron, who had died in 1612.

  The Tables were more accurate than any of their predecessors, both because of the careful observations made by Brahe and Kepler and because they were built on the firmer foundation of Copernican astronomy. Simple to use, at least compared to other astronomical tables, they made it possible to determine the longitude of the planets at any time, past, present, or future, and they provided answers an order of magnitude more precise than those of their predecessors. Readers could work through Kepler’s formulas by using the logarithms that had been discovered only a few years earlier. Napier’s work had been published in 1614; Kepler was reading it by 1617, and he saw at once the logarithms’ potential.

  The early numerical tables, like so many early examples of reference genres, were the works of individual scholars working without substantial assistance. But just as dictionaries and encyclopedias eventually grew too large for lone talents, so did numerical tables eventually require substantial teams.

  At the end of the eighteenth century, the French engineer Gaspard Clair François Marie Riche de Prony oversaw one such workshop. Inspired by Adam Smith’s recently published Wealth of Nations (1776), he assembled a team of some sixty unemployed hairdressers to carry out his instructions. (In the wake of the French Revolution there was less call for high-end hairdressers, not least because, thanks to Citizen Joseph-Ignace Guillotin’s eponymous invention, fewer aristocratic heads needed dressing.) The work was carried out on an industrial scale: in just two years, de Prony and his hairdressing computers calculated 10,000 sines to twenty-five decimal places, 2,000 logarithms of sines and tangents to fourteen decimal places, 10,000 logarithms of the proportions of sines and tangents, and the logarithms of numbers from 1 to 10,000 to nineteen decimal places and of numbers from 10,000 to 20,000 to fourteen places. The work filled seventeen folio volumes of manuscript, though they sat unpublished for ninety years.17

  The production of tables achieved assembly-line efficiency in the late 1930s. The American Works Progress (later Projects) Administration, founded in 1935 to provide jobs for “employable workers” during the Great Depression, established the Mathematical Tables Project in 1938 as one of its “small useful projects.” Useful it was, but hardly small: it was one of the largest-scale computing operations in the pre-ENIAC age, headed by a Polish-born mathematician, Gertrude Blanch, who supervised 450 clerks.18 Just as de Prony had learned a lesson from Adam Smith, Blanch took her cue from Henry Ford—she gave each group of workers a single task: some did only addition, some only subtraction. The best were trusted with long division. The resulting tables of logarithms and other functions were published in twenty-eight volumes; in some of them, no one to this day has discovered a single error.

  TITLE: Tabulæ Rudolphinæ, quibus astronomicæ scientiæ, temporum longinquitate collapsæ restauratio continetur; a Phœnice illo astronomorum Tychone, ex illustri & generosa Braheorum in regno Daniæ familiâ oriundo Equite, primum animo concepta et destinata anno Christi M D LXIV

  COMPILER: Johannes Kepler (1571–1630)

  ORGANIZATION: Three sections: the Ptolemaic stars, the stars identified by Brahe, and the southern hemisphere’s stars identified by Pieter Dircksz Keyser; thereafter by logarithmic sines of each minute of the quadrant

  PUBLISHED: Ulm, Germany: Jonas Sauer, September 1627

  PAGES: 247

  ENTRIES: 1,440 stars and 75,000 pieces of information

  SIZE: 13¾″ × 9″ (35 × 23 cm)

  AREA: 213 ft² (19.9 m²)

  WEIGHT: 3 lb. 3 oz. (1.45 kg)

  But the work of Briggs, of Kepler, of de Prony, of Blanch was all rendered obsolete in the last third of the twentieth century, because of something no one could have foreseen in the 1940s, let alone in the 1620s—fast, plentiful, and cheap computers. When the newly invented electronic digital computers were first brought to bear on elaborate calculations, nearly everyone took it for granted that the computers’ job would be to prepare accurate tables. The machines would tirelessly print long tables of logarithms, sines, and cosines, but the comparatively simple operations of adding and subtracting those numbers would continue to be done by hand. Computing time was far too expensive to waste on trivial calculations like adding 3.377306 to 3.213517. According to an often-repeated story, IBM president Thomas J. Watson declared in 1943, “I think there is a world market for maybe five computers.” The remark is almost certainly apocryphal, but it reflects the assumptions of the early digital days: computers were large and expensive, and the mechanical task of manipulating those numbers should be entrusted to minimally skilled laborers. We are left with the strange paradox that mathematical tables were rendered entirely obsolete by the computer, although tables were the main reason computers were invented. The computer itself—the transformative technology of the last sixty years—is an unintentional byproduct of the reference book.
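  To make the expected division of labor concrete: the machine would print the logarithm table once, and the human computer would multiply two numbers by looking up their logarithms, adding them by hand, and then looking up the antilogarithm of the sum. The sketch below reuses the two logarithms mentioned above; treating them as base-10 logarithms of particular numbers is an illustration, not something the text specifies.

```python
# The anticipated workflow: an electronic computer prints the logarithm table
# once; a human clerk then multiplies by adding logarithms drawn from it.

log_a, log_b = 3.377306, 3.213517   # base-10 logarithms, as in the text above
log_product = log_a + log_b         # the "trivial" hand addition: 6.590823

# Taking antilogarithms (in practice, another table lookup) shows what that
# addition buys: the two logs belong to roughly 2,384 and 1,635, and their
# sum to the product of those numbers, about 3.9 million.
print(10 ** log_a, 10 ** log_b, 10 ** log_product)
```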

  CHAPTER 8 ½

  TO BRING PEOPLE TOGETHER

  Societies

  Reading and writing reference books is generally solitary work. Merriam-Webster’s headquarters in Springfield, Massachusetts, has a famously quiet working environment, dating back to Philip Gove’s editorship of the Third New International, where even whispers can earn dirty looks. It’s one of the few remaining places where two honest-to-goodness phone booths can be found, to ensure that conversations don’t disturb the lexicographers hard at work. What Peter Sokolowski of Merriam calls “a powerful culture of silence” keeps the place as hushed as a library.

  Fortunately for those who thrive only in society, though, there are more companionable outlets for indulging in lexicography and even lexicophilia. The national academies that sponsored many of the great national dictionaries are, for the most part, still busy, centuries after their founding, and they can sometimes be positively rowdy compared to the Merriam offices. One of the oldest private organizations, unsupported by any state, is the Philological Society of London; the group, founded in 1842, called for a “new English dictionary” and set in motion the project that became the Oxford English Dictionary. Organizations like the Royal Geographical Society, the American Geographical Society, and the Société de Géographie have promoted cartographical projects. One of the largest scholarly groups for the study of dictionaries is EURALEX, the European Association for Lexicography, founded in 1983; it spawned a series of sibling organizations: AUSTRALEX, the Australasian Association (1990), AFRILEX, the African Association (1995), and ASIALEX, the Asian Association (1997).

  The friendliest society of the lot, though, is the Dictionary Society of North America, a group founded “to bring together people interested in dictionary making, study, collection, and use.” The DSNA was born in 1975 at a colloquium on the history of English dictionaries held at Indiana State University in Terre Haute, Indiana—still the home of one of the best collections of dictionaries in the world. After toying with a number of names—the Society for the Study of English Dictionaries, the Society for the Study of Dictionaries and Lexicography, the Lexic Society, and the Lexicographical Society of America—they settled on the Dictionary Society of North America. The society now boasts more than four hundred members from around the world, and while professional lexicographers and academics are well represented in the membership directory, so, too, are librarians, journalists, book collectors, and plain old enthusiasts.1

  The group meets every other year for a gathering that is part scholarly conference and part egghead bacchanal. Meetings have been held sometimes in big cities (Philadelphia, Montreal, Las Vegas), sometimes in smaller university towns (Ann Arbor, Urbana, Durham). The 2011 meeting in Montreal attracted a diverse group of working lexicographers, educators, historians, literary scholars, even computer scientists from all over the world. There were representatives from the Oxford English Dictionary and from Merriam-Webster; others included graduate students and professors emeriti, some in shorts and sandals, others in natty tailored suits and bow ties. The presentations ranged from explorations of the typography of Robert Cawdrey’s Table Alphabeticall to dictionaries of Caribbean creole, from debates over the layout of learner’s dictionaries to the practical difficulties of transcribing hip-hop lyrics—sometimes the only evidence of the first occurrence of a new word or sense is a bad bootleg recording of a rap concert from the 1970s.

  A reporter covering the event—mostly out of bemusement—went on to identify the lexicographer and language columnist Ben Zimmer, editor of the pathbreaking Visual Thesaurus, as “a major geek.” In some circles that might have led to a libel suit, but most of the DSNA participants embraced the nerdiness of the event, even performing dictionary-related songs at the conference-ending banquet. Peter Sokolowski, editor at large for Merriam-Webster and editor of Merriam-Webster’s French–English Dictionary, was excited by one of the more technical presentations and tweeted to his thousands of followers, “Just heard a talk on French verb classification. Geek out.”

  Anyone who has read this far in this book without being a member of the DSNA should head at once to http://www.dictionarysociety.com/. The subscription to the annual journal, Dictionaries, is worth the price of admission.

  CHAPTER 9

  THE INFIRMITY OF HUMAN NATURE

  Guides to Error

  Index librorum prohibitorum

  1559

  Sir Thomas Browne

  Pseudodoxia Epidemica

  1646

  To care about truth is also to care about falsehood. Any system dedicated to the discovery and dissemination of truth will at some point need to deal with departures from that truth. A few antireference books, therefore, offer dutiful catalogs of the things we should not believe.

  Nowhere is the quest for truth more urgent than in religion, and some of the bitterest struggles with error happen in the religious arena. Modern democratic states have tended toward a live-and-let-live policy, but through most of our history, authorities have been substantially less forgiving. Christianity, for example, has a long history of wrestling with, and often suppressing, error. Jesus was the way, the truth, and the life—but finding that way, knowing that truth, and living that life required first knowing what Jesus actually said. This gave the early figures in the Church the difficult task of figuring out which writings constituted the inspired Scripture. The Hebrew and the Christian Bibles are not unified books, but rather miscellaneous writings collected long after they were written. The pieces were written over more than a millennium, with the earliest dating from the eleventh or tenth century B.C.E. and the latest in the Christian Bible from the end of the first century C.E. But these few dozen books are not the only writings from that period to survive: many other candidates for inclusion were circulating. Identifying the divinely inspired ones was difficult.

  The so-called Muratorian fragment, a list of books of the Bible, was written in the seventh century but seems to be a copy of something much earlier, perhaps from the late second century C.E., between about 170 and 200. This fragment gives the earliest known version of the Christian canon, which both codified the accepted books and rejected others. Churches were forbidden to use the latter in the liturgy. The canon, in other words, is both a list of sacred books and a list of banned books.

  The bans continued with later writings on religious matters. Any books that promoted heresies such as Montanism and Marcionism in the second century and Arianism in the fourth were in due course proscribed by the Church. And once Christianity became the official religion of the Roman Empire, a series of ecumenical councils, beginning with the First Council of Nicaea in 325, went further in delineating the true and the false, with the concomitant banning of anything that did not make the cut. Pope Anastasius banned Origen’s works in the late fourth century, and Pope Gelasius in 496 issued the Gelasian Decree, with a list of authentic Scripture, recommended reading, and heretical and apocryphal books. Early Christians found a justification for their exclusion of offensive doctrine in the Bible itself. Acts 19:19 reads, “Many of them also which used curious arts brought their books together, and burned them before all men.” If the Apostles could burn books, surely the Church was within its rights in doing the same.

  Christians had been registering and tabulating heresy since the earliest days of the Church, but two developments in fifteenth- and sixteenth-century Germany made the project much more urgent. The first was Gutenberg’s invention of the printing press around 1450. The number of books, both the number of new titles and the number of copies of each title, boomed, and the authorities grew uncomfortable. In 1487, Pope Innocent VIII went so far as to decree that all books had to be approved by religious authorities before they could be read; François I of France went further and banned all printed books, with a death penalty for printers, in order to be certain that nothing slipped through. The second development was the Protestant Reformation, which began when Martin Luther nailed his famous ninety-five theses to the church door at Wittenberg in 1517. Now there was a huge new category of heresy, one that threatened a mortal wound to Holy Mother Church. That both of these revolutions, printing and Protestantism, happened in the same region and within a few decades of each other is one of the most significant conjunctions in the history of Europe.

 
