
The Calculus Diaries: How Math Can Help You Lose Weight, Win in Vegas, and Survive a Zombie Apocalypse


by Jennifer Ouellette

Even if the weapon proved impractical, the exercise of creating it gave Archimedes some valuable insights into geometric curves. His array of flat mirrors formed a makeshift parabola: taken together, the straight lines approximated a parabolic curve. Magnify any curved line sufficiently, and it looks more and more like a straight line with each level of magnification. Archimedes realized he could view a circle, for example, dynamically, as an accumulation of an infinite number of smaller pieces added together—triangles, again, in this case—rather than as a static, unchanging whole.

  This is the method he used to prove that the area of a circle is half the product of its radius and its circumference, a result now standard in geometry textbooks. It worked out so well that Archimedes later adapted Eudoxus’s method to calculate the volume of a sphere (a three-dimensional circle) by enclosing it in a cylinder. He considered this solution his greatest achievement, even asking that his tomb be adorned with a sphere contained in a cylinder. Historical records indicate that Cicero, while visiting Syracuse in 75 B.C., located the tomb of Archimedes, which did indeed feature a sphere inside a cylinder.
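
  Archimedes’ triangle trick is easy to check numerically. A minimal Python sketch, with an arbitrary radius and arbitrary polygon sizes, slices the circle into n thin triangles (an inscribed regular polygon) and watches their total area close in on exactly that value, half the radius times the circumference:

    import math

    r = 1.0                                  # arbitrary radius
    target = 0.5 * r * (2 * math.pi * r)     # half of radius times circumference

    # Each of the n triangles has apex angle 2*pi/n and area (1/2)*r*r*sin(2*pi/n).
    for n in (6, 24, 96, 10000):             # Archimedes himself stopped at 96 sides
        area = n * 0.5 * r * r * math.sin(2 * math.pi / n)
        print(f"{n:6d} triangles: area = {area:.6f}  (target {target:.6f})")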

  The problem with the method of exhaustion is that the process literally could go on forever. One would never be able to calculate the exact area under a given curve, because how can one draw an infinite number of rectangles or triangles? Managing infinity is a crucial achievement of calculus. The ancient Greeks had an imperfect understanding of the concept of infinity, as do most of us encountering calculus for the first time. It’s not something easily grasped by our finite human minds. So Archimedes’ methodology still fell short of actually inventing integral calculus. He might well have done so, had he not run afoul of that hotheaded Roman soldier. “Killing Archimedes was one of the biggest Roman contributions to mathematics,” Charles Seife drily observes in Zero: The Biography of a Dangerous Idea. “The Roman era lasted for about seven centuries. In all that time, there were no significant mathematical developments.”

  PICTURE THIS

  While European mathematics languished in the medieval wilderness, a veritable renaissance was brewing in the East—specifically the rise of Baghdad as a cultural mecca for science and mathematics in the ninth century. The driving force behind this intellectual rebirth was the caliph Hārūn ar-Rashīd, who ruled the Islamic Empire from 786 to 809. He insisted on translating the greatest ancient works on math and science from around the world into Arabic—not just the work of the ancient Greeks, but also the achievements of scholars in India, South Asia, and China. His successor, Abu Jafar al-Ma’mūn, went one step further and established the House of Wisdom (Bayt al-Hikma), a scholarly “think tank” to bring together the Islamic world’s greatest minds.

  One of those minds belonged to Abu Jafar al-Khwarizmi, whom we can blame for the development of modern algebra. He dreamed up how to use an equation to describe an unknown, the original x factor. He’s the guy who invented that tedious exercise of “balancing” both sides of an equation by adding, subtracting, multiplying, or dividing by the same amount on each side, a plague for high school students to this day. He called his brainchild “comparing and restoring.” Since the Arabic word for “restoring” is al-jabr, today we know this discipline as algebra.

  Al-Khwarizmi did all this without the benefit of one little character, literally, that we’ve come to take for granted: the equal sign, which didn’t exist until the sixteenth century. He didn’t use modern algebraic notation, either. Instead, he expressed his unknowns in words rather than variables, and his equations in sentences. In essence, that is what a mathematical equation is: a sentence reduced to a symbolic shorthand so that the quantities can be more easily manipulated. Algebra is about symbols, while geometry is about shapes, yet the two share a deep mathematical connection; it would take several hundred years after al-Khwarizmi’s work before East and West merged and that connection was made explicit. Two French mathematicians, Pierre de Fermat and René Descartes, definitively proved the geometry-algebra connection in the early seventeenth century, thereby forging a crucial link in the development of calculus.

  The son of a leather merchant, Fermat was a lawyer by profession, working as a counselor to the parlement of Toulouse. He rose quickly through the ranks, aided by the high death rate of that era, when outbreaks of plague frequently swept through the city. Fermat himself contracted the plague at one point, but proved to be one of the lucky few to survive. Eventually he became a judge near Toulouse, at a time when heretical priests were routinely burned at the stake. I’d surmise that the intellectually minded Fermat appreciated the fact that judges were discouraged from social interactions, lest they be swayed in their verdicts by conflicts of interest. This freed him to spend most evenings holed up in his study, poring over mathematical proofs to his heart’s content.

  Sometime in the 1620s, Fermat first encountered a work of Apollonius of Perga called Plane Loci, exploring two-dimensional curves. Fermat set about proving (in the rigorous, mathematical sense) some of his ancient colleague’s results. He discovered that geometric “statements” of the ancient Greeks could also be rendered algebraically—essentially translating them into x’s, y’s, and the other accoutrements of symbolic equations.

  Any geometric object—a square, a triangle, a curved line—can be represented by an equation. These are the formulae we all had to memorize in geometry class: a circle is y² + x² = a², for example, while Archimedes’ killer parabola is y = ax². Points on a graph are written as pairs of numbers inside parentheses, (x, y), representing a point in space. The x indicates how far along the horizontal axis a point is located from the origin; the y does the same on the vertical axis. If you generate enough points from the equation and connect the dots, you end up with a curve. The more points you plot on your Cartesian grid, the smoother the resulting curve will be.
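
  To make the connect-the-dots recipe concrete, here is a minimal Python sketch that generates evenly spaced (x, y) points from the parabola y = ax²; the coefficient, the range, and the number of points are arbitrary choices, and asking for more points smooths the connected curve:

    def parabola_points(a=1.0, x_min=-2.0, x_max=2.0, n=9):
        """Return n evenly spaced (x, y) points satisfying y = a*x**2."""
        step = (x_max - x_min) / (n - 1)
        return [(x_min + i * step, a * (x_min + i * step) ** 2) for i in range(n)]

    # With only a few points the "curve" is a jagged chain of straight segments;
    # raise n and the connected dots smooth out into the familiar parabola.
    for x, y in parabola_points(n=9):
        print(f"({x:5.2f}, {y:5.2f})")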

  We know these as Cartesian coordinates because the introverted Fermat procrastinated on polishing his work into a publishable format; his ideas didn’t appear until 1637, with the publication of Introduction to Plane and Solid Loci. That same year, Descartes covered much of the same ground in a separate treatise entitled simply, Geometry. Born in 1596, Descartes lost his mother to tuberculosis when he was barely one year old. His father, a member of his provincial parliament, entrusted his son’s education in philosophy and mathematics to the Jesuit priests at a college in La Flèche. But after earning a degree in law in 1616, Descartes “abandoned the study of letters,” opting instead to travel the world to gain as varied an experience as possible.

  Yet he retained an interest in philosophy and mathematics, and actively pursued knowledge in both. One day, the story goes, he lay on his bed watching a fly buzzing through the air. Descartes realized that its position at any moment could be described by three numbers representing its distance along each of three intersecting, mutually perpendicular axes (corresponding to the lines formed by the intersection of the room’s walls in a corner). This insight formed the basis of the Cartesian coordinate system. Descartes—along with Fermat—used this coordinate system to turn figures and shapes into equations and numbers.

  While both Fermat and Descartes independently conceived of the underlying notion of translating between curves and algebraic expressions, people liked Descartes’ treatise a bit better, mostly because his notation was easier to use. But Fermat is the one who realized that it worked both ways: He could also turn an equation into a graph, and work with the resulting curve to glean insights that might not be readily apparent from simply studying the abstract algebra.

  Most notably, Fermat realized that converting the expression into geometry made it easier to find the largest and smallest values within a given range—the maximum and minimum, as we call them today. At any point on a curve, it is possible to draw a straight line, called the tangent, that just touches the curve at exactly that point. You simply study the line that is tangent to the curve at the point of interest and determine its slope.

  If the slope of that tangent line is positive (slanting upward from left to right), the expression is increasing; if negative (slanting downward), the expression is decreasing. The steeper the slope, the faster the expression is increasing or decreasing. Where are the maxima and minima? Wherever the slope of the tangent line flattens out to zero (becomes horizontal) along that curve. Like Archimedes before him, who stopped just short of inventing integral calculus, Fermat came within a hair’s breadth of inventing differential calculus.
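
  Fermat reasoned geometrically, but his recipe translates directly into a few lines of Python: approximate the tangent’s slope at each point with a tiny difference quotient, note its sign, and flag where it flattens to zero. The cubic below is an arbitrary example curve with one maximum and one minimum:

    def f(x):
        return x ** 3 - 3 * x   # example curve: maximum near x = -1, minimum near x = +1

    def slope(f, x, h=1e-6):
        """Central-difference approximation to the slope of the tangent at x."""
        return (f(x + h) - f(x - h)) / (2 * h)

    x = -2.0
    while x <= 2.0:
        s = slope(f, x)
        trend = "increasing" if s > 0 else "decreasing"
        flag = "  <- slope ~ 0: a maximum or minimum" if abs(s) < 0.02 else ""
        print(f"x = {x:5.2f}  slope = {s:8.3f}  ({trend}){flag}")
        x += 0.25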

  So the area under a curve corresponds to the integral, while the slope of the tangent line to a point on that curve corresponds to the derivative. With the merging of algebra and geometry, the stage was set for calculus to make its grand entrance. Ultimately, the credit for inventing calculus is given jointly to Isaac Newton and Gottfried Wilhelm Leibniz, who independently made their revolutionary discoveries in the 1660s and 1670s, giving rise to an epic intellectual battle for the title of Inventor of Calculus.

  CLASH OF THE TITANS

  Isaac Newton hardly needs an introduction. He is almost universally recognized as the father of modern physics via his masterpiece, the Principia, as well as his work on the nature of light published in Opticks toward the end of his illustrious career. The Principia is inarguably one of the most influential scientific books ever written—eighteenth-century mathematician Joseph-Louis Lagrange declared it “the greatest production of a human mind”—yet it is one of the least read. Three volumes of mathematical theory on the nature of gravity and the laws of motion, rendered in excruciatingly pedantic seventeenth-century Latin prose and chock-full of equations, are hardly summer beach reading. Apparently Newton made it deliberately difficult “to avoid being baited by little smatterers in mathematics.” The Great Newton despised dilettantes.

  The son of a yeoman farmer in Lincolnshire, England, who could neither read nor write, Newton was born in 1642, two months after the death of his father, and so premature and small that hardly anyone expected him to survive. His mother, Hannah, married a clergyman named Barnabas Smith when Isaac was only three years old and promptly moved away with her new husband to start a new family, leaving young Isaac behind with his grandparents.

  Hannah wanted him to become a farmer, and when the boy was seventeen, he was expected to take over the family farm. But he proved disastrous at minding the sheep or cows, feeding the chickens, or taking produce to market. Invariably he would be found sprawled under a shady tree with a book, jotting his thoughts down in a notebook, or jumping from one spot to another in the field, trying to determine the length of those jumps. He invented methods for producing chalk and gold ink, and a technique “to make birds drunk,” as well as a phonetic alphabet; he “contrived water wheels and dams” and dabbled in magic tricks. In short, he did anything but the various chores a competent farmer must master.

  Hannah relented and packed Newton off to Cambridge University to pursue the life of the mind, where he earned his undergraduate degree in science and math in 1665. His graduate studies were interrupted by the outbreak of the plague in Cambridge. Students and professors alike fled the city, and Newton returned home for the ensuing year, until the panic (and danger) had passed. He later described this period as “the prime of my age for invention and minded mathematics and [natural] philosophy more than at any time since.” He wasn’t exaggerating. Not only did he work out his three laws of motion and a universal theory of gravity; he also invented the mathematical tool he needed to achieve those insights: calculus.

  “Rather than thinking of a curve as a simple geometrical shape or construction on paper, Newton began to think of curves in real life—not as static structures like buildings or windmills, but as dynamic motions with variable quantities,” Jason Bardi writes in The Calculus Wars. Take that famous (and possibly apocryphal) anecdote about Newton observing an apple falling from a tree and coming up with his critical insights into gravity. The position and speed of the apple are changing at every moment: The apple is still on the tree at what physicists call time zero. (That’s shorthand for “the value of the variable t for time is 0.”) A fraction of a second later, it has started its fall, and another fraction of a second finds it midway from branch to ground, and so forth. The apple’s descent progresses in tiny increments (then called infinitesimals) until it hits the ground or Newton’s head. Plot each tiny point describing position versus time along a Cartesian grid, connect the dots, and you end up with one half of a parabolic curve.
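
  A short Python sketch mimics that plot. Under constant gravity the distance fallen after time t is s = ½·g·t², so tabulating the apple’s position at each tiny time increment traces out half a parabola; the two-meter drop and the size of the increment are invented for illustration:

    G = 9.8      # gravitational acceleration, in m/s^2
    DT = 0.05    # the "infinitesimal" time increment, in seconds

    t = 0.0
    print(" t (s)   distance fallen (m)")
    while True:
        fallen = 0.5 * G * t ** 2      # position at time t
        print(f"{t:5.2f}   {fallen:6.3f}")
        if fallen >= 2.0:              # assume a two-meter drop from branch to ground
            break
        t += DT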

  Once he plotted a curve, Newton drew on Fermat’s prior work and figured out how to find the slope of the tangent line for any point along that curve—the derivative, which he called the fluxion. Then came his key insight: finding the area under the curve (integration) is the reverse of finding the slope of a tangent line (differentiation). That connection between the derivative and the integral is the fundamental theorem of calculus.

  Newton noticed other intriguing connections: The apple’s velocity is the derivative of its position, while its acceleration is the derivative of its velocity. The chain also runs in reverse with the integral. Add up the accumulated acceleration over time, and you get the apple’s velocity; add up the accumulated velocity over time, and you get the apple’s position. Thanks to the fundamental theorem of calculus, it is possible to change one problem into another. If we have an equation that tells us the position of a falling apple, from it we can deduce the equation for the velocity of the apple at any given moment of its fall.
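
  Here is the whole chain in a numerical Python sketch: start from the apple’s position at evenly spaced instants, take differences to get its velocity, then accumulate that velocity and watch the original positions reappear, just as the fundamental theorem promises. The step size and time span are arbitrary:

    G, DT = 9.8, 0.01
    times = [i * DT for i in range(101)]              # 0.0 s to 1.0 s
    position = [0.5 * G * t ** 2 for t in times]      # distance fallen at each instant

    # Differentiate: velocity is the rate of change of position...
    velocity = [(position[i + 1] - position[i]) / DT for i in range(len(times) - 1)]

    # ...and differentiating again gives acceleration, the constant g = 9.8 m/s^2.
    acceleration = (velocity[1] - velocity[0]) / DT

    # Integrate: accumulating velocity over time recovers position.
    recovered, total = [0.0], 0.0
    for v in velocity:
        total += v * DT
        recovered.append(total)

    print(f"acceleration = {acceleration:.2f} m/s^2")
    for i in (0, 50, 100):
        print(f"t = {times[i]:.2f} s   position = {position[i]:.3f} m   "
              f"recovered = {recovered[i]:.3f} m")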

  What made Newton’s method so revolutionary was its universality: The same equations that apply to the speed and position of a falling apple also apply to the planets orbiting the sun, the rate at which a cup of coffee cools, how interest accumulates in a savings account—any system in which one quantity is changing with respect to another. So calculus is a nimble beast, a flexible tool that, with lots of practice and a bit of creativity, can take you from a situation where you have only a little bit of information to one where you have deduced a great deal more.

  In modern calculus, these quantities—position, velocity, acceleration, and so forth—are known as functions, a concept that didn’t exist in Newton’s time. Here’s the kind of textbook definition that, while technically correct, conveys very little actual meaning to the beginning calculus student: “A function is a set of ordered pairs where, for every value of x, there is only one corresponding value for y.” But another way to think of the function is as a link between cause and effect. The variables x and y, for instance, are wholly interdependent, such that, if a change occurs in one of them (cause, or the independent variable), the other changes in response (effect, or the dependent variable). Calculus describes this rate of change. In economics, price is a function of market supply and demand, rising and falling with the whims of consumer appetites. In physics, potential energy is a function of height: The apple’s potential energy is dependent on how high it is in the tree’s branches, and as the apple falls, that potential energy is converted into kinetic energy.
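
  The cause-and-effect reading of a function fits in a few lines of Python. Treating potential energy as a function of height via U = m·g·h, height is the cause (the independent variable) and energy the effect (the dependent one); the apple’s mass and the sample heights below are made-up numbers:

    def potential_energy(height_m, mass_kg=0.1, g=9.8):
        """Potential energy, in joules, of a mass_kg object at height_m."""
        return mass_kg * g * height_m

    # As the apple descends, its potential energy drains away (into kinetic energy).
    for h in (3.0, 2.0, 1.0, 0.0):
        print(f"height = {h:.1f} m  ->  potential energy = {potential_energy(h):.2f} J")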

  In the case of Newton’s apple, the position function is the entire collection of points that, taken together, describe the apple’s position at every single instant during its fall. A similar set of points plotted out for the apple’s velocity at any given moment in time comprises the velocity function. But a function is far more than the sum of its parts: It transcends them.

  Functions are powerful tools because they confer the power of prediction. You no longer need to perform a new calculation to determine the position or velocity of that apple at each moment in time. With the function, you know the apple’s position or velocity at every possible moment in time.

  Historians generally agree that Newton was the first to state the fundamental theorem of calculus and the first to apply derivatives and integrals in a single work (although he didn’t use those terms). The problem is that, like Fermat, he suffered from publication procrastination. Fermat’s dilly-dallying left the field wide open for Descartes to sweep in and claim shared credit for linking algebra and geometry. Newton didn’t publish any of his work on calculus until 1704, in an essay entitled “On the Quadrature of Curves” in the back of Opticks—quadratures being a fancy name for the areas under curves. By that time, Gottfried Wilhelm Leibniz’s version of calculus was already causing a stir in Western Europe. While Fermat and Descartes had a few testy exchanges, on the whole they maintained an air of civility in their mathematical debates. In contrast, Newton’s procrastination led to one of the most bitter controversies in scientific history, dubbed the calculus wars.

  Leibniz was born in Germany in 1646, and he was a stellar student even as a very young child. “Precocious” could have been his middle name (in reality, it was Wilhelm). His father died when he was six, so Leibniz was raised by his mother, who encouraged her son’s intellectual bent. By eight, he was working his way through his father’s substantial library, teaching himself Latin and Greek so he could read the great works of Aristotle and other philosophers. He entered the University of Leipzig at age fifteen and left two years later with his degree in law. Conspicuously absent from his formal education was any study of mathematics; he was entirely self-taught in that discipline.

  A chance meeting with the Dutch scientist Christiaan Huygens ignited Leibniz’s interest in the study of geometry and the mathematics of motion; he described their meeting as “opening a whole new world” to him. He pursued these interests in his spare time, inventing in 1671 a handy little machine called the step reckoner. A forerunner of the modern calculator, the device could add, subtract, multiply, divide, and even extract square roots. His reasoning: “It is unworthy of excellent men to lose hours like slaves in the labor of calculation, which could be safely relegated to anyone else if machines were used.” Why waste perfectly good brainpower on lowly arithmetic?

 
