The Math Book


by DK


  Significant contributions to calculus were also made in Switzerland by the brothers Jacob and Johann Bernoulli, who coined the term “integral” in 1690. Scottish mathematician Colin Maclaurin published his Treatise on Fluxions in 1742, promoting and furthering Newton’s methods and attempting to make them more rigorous. In this work, Maclaurin applies calculus to the study of infinite series of algebraic terms. Meanwhile, Swiss mathematician Leonhard Euler, a close friend of Johann Bernoulli’s sons, was influenced by their ideas on the subject. In particular, he applied the idea of infinitesimals to what is known as the exponential function, e^x. This ultimately led to “Euler’s identity”, e^iπ + 1 = 0, an equation that connects five of the most fundamental mathematical quantities (e, i, π, 0, and 1) in a very simple way.
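Euler’s identity can be checked numerically with Python’s standard cmath module; the result differs from zero only by floating-point rounding:

```python
import cmath

# Euler's identity: e^(iπ) + 1 = 0, verified numerically.
value = cmath.exp(1j * cmath.pi) + 1

# The residue is on the order of 1e-16, i.e. effectively zero.
print(abs(value))
```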

  As the 18th century progressed, calculus proved increasingly useful as a tool for describing and understanding the physical world. In the 1750s, Euler, working in collaboration with French mathematician Joseph-Louis Lagrange, used calculus to provide an equation—the Euler–Lagrange equation—for understanding both fluid (gas and liquid) and solid mechanics. In the early 1800s, French physicist and mathematician Pierre-Simon Laplace developed potential theory, later central to electromagnetic theory, with the help of calculus.

  Assuming I know our instantaneous speed at every possible moment, can I then use that information to determine how far we’ve traveled? Calculus says I can.

  Jennifer Ouellette

  American science writer

  Isaac Newton’s Opticks, a treatise about the reflections and refractions of light, published in 1704, contains the first details of his work in the area of calculus.

  When the values successively assigned to the same variable indefinitely approach a fixed value, so as to end by differing from it as little as desired, this fixed value is called the limit.

  Augustin-Louis Cauchy

  Formalizing the theories

  The various developments in calculus were formalized in 1823 when Augustin-Louis Cauchy formally stated the fundamental theorem of calculus. In essence, this states that the process of differentiation (working out rates of change of a variable represented by a curve) is the inverse of the process of integration (calculating the area beneath a curve). Cauchy’s formalization allowed calculus to be regarded as a unified whole, dealing with infinitesimals in a consistent way using universally agreed notation.
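The inverse relationship Cauchy formalized can be illustrated numerically. In this sketch the function x², the interval, and the step sizes are arbitrary choices: integrating f and then differentiating the running total recovers f itself.

```python
# Numerical illustration of the fundamental theorem of calculus:
# differentiating the running integral of f recovers f.

def f(x):
    # arbitrary example function
    return x * x

def integral(func, a, b, n=100_000):
    # simple midpoint-rule approximation of the area under func from a to b
    h = (b - a) / n
    return sum(func(a + (k + 0.5) * h) for k in range(n)) * h

def derivative(g, x, h=1e-5):
    # central-difference approximation of g'(x)
    return (g(x + h) - g(x - h)) / (2 * h)

def F(x):
    # F(x) = area under f from 0 to x
    return integral(f, 0, x)

# The derivative of the area function matches f: here f(2.0) = 4.0.
print(derivative(F, 2.0))
```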

  The field of calculus was further developed later in the 1800s. In 1854, German mathematician Bernhard Riemann formulated criteria for determining which functions are integrable, based on trapping the area under a function between upper and lower sums.
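Riemann’s idea can be sketched as squeezing the area between sums built from the smallest and largest values of the function on each subinterval; when the two sums converge, the function is integrable. The example f(x) = x² on [0, 1] is an arbitrary choice:

```python
# Lower and upper Riemann sums for a function f on [a, b] with n pieces.
# On each subinterval, approximate the infimum and supremum using the
# endpoint values (exact here because x**2 is monotone on [0, 1]).

def lower_upper_sums(f, a, b, n):
    h = (b - a) / n
    xs = [a + k * h for k in range(n + 1)]
    lower = sum(min(f(xs[k]), f(xs[k + 1])) * h for k in range(n))
    upper = sum(max(f(xs[k]), f(xs[k + 1])) * h for k in range(n))
    return lower, upper

lo, up = lower_upper_sums(lambda x: x * x, 0, 1, 1000)

# Both sums bracket the true area, 1/3, and close in as n grows.
print(lo, up)
```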

  The notation of modern calculus

  ẋ

  Invented by Newton for differentiation.

  ∫

  Invented by Leibniz for integration.

  dy/dx

  Invented by Leibniz for differentiation.

  f'

  Invented by Lagrange for differentiation.

  Ubiquitous applications

  Many advances in physics and engineering have relied on calculus. Albert Einstein used it in his theories of special and general relativity in the early 20th century, and it has been applied extensively in quantum mechanics (dealing with the motion of subatomic particles). Schrödinger’s wave equation, a differential equation published in 1926 by Austrian physicist Erwin Schrödinger, models a particle as a wave whose state can only be determined by using probability. This was groundbreaking in a scientific world that had up until then been governed by certainty.

  Calculus has many important applications today; it is used, for instance, in search engines, construction projects, medical advances, economic models, and weather forecasts. It is difficult to imagine a world without this all-pervasive branch of mathematics, as it would most certainly be one without computers. Many would argue that calculus is the most important mathematical discovery in the last 400 years.

  GOTTFRIED LEIBNIZ

  Born in Leipzig, Germany, in 1646, Gottfried Leibniz was raised in an academic family. His father was a professor of moral philosophy, while his mother was the daughter of a professor of law. In 1667, after completing his university studies, Leibniz became an advisor on law and politics to the Elector of Mainz, a role that enabled him to travel and meet other European scholars. After his employer’s death in 1673, he took up the role of librarian to the Duke of Brunswick in Hanover.

  Leibniz was a celebrated philosopher as well as a mathematician. He never married and died in 1716 to little fanfare. His successes had been overshadowed by his calculus dispute with Newton and were only recognized several years after his death.

  Key works

  1666 On the Art of Combination

  1684 New Method for Maximums and Minimums

  1703 Explanation of Binary Arithmetic

  See also: The Rhind papyrus • Zeno’s paradoxes of motion • Calculating pi • Decimals • The problem of maxima • The area under a cycloid • Euler’s number • Euler’s identity

  IN CONTEXT

  KEY FIGURE

  Gottfried Leibniz (1646–1716)

  FIELDS

  Number theory, logic

  BEFORE

  c. 2000 BCE Ancient Egyptians use a binary system of doubling and halving to carry out multiplication and division.

  c. 1600 English mathematician and astrologer Thomas Harriot experiments with number systems, including binary.

  AFTER

  1854 George Boole uses binary arithmetic to develop Boolean algebra.

  1937 Claude Shannon shows how Boolean algebra could be implemented using electronic circuits and binary code.

  1990 A 16-bit binary code is used to code pixels on a computer screen, allowing it to display more than 65,000 colors.

  In everyday life we are used to the base-10 counting system with its familiar ten digits, 0 to 9. When we count from 10 onward, we put a 1 in the “tens” column and a 0 in the “units” column, and so on, adding columns for hundreds, thousands, and beyond. The binary system is a base-2 counting system and employs just two symbols, 0 and 1. Rather than increasing in multiples of 10, each column represents a power of 2. So the binary number 1011 is not 1,011 but 11 (from right to left: one 1, one 2, no 4s, and one 8).
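The column-by-column conversion described above can be sketched in a few lines of Python:

```python
# Convert a binary string to base 10 column by column, mirroring the
# worked example in the text: 1011 → 11.

def binary_to_decimal(bits: str) -> int:
    total = 0
    for column, bit in enumerate(reversed(bits)):  # rightmost column is 2**0
        total += int(bit) * 2 ** column
    return total

print(binary_to_decimal("1011"))  # one 1, one 2, no 4s, and one 8 → 11
print(bin(11))                    # Python's built-in agrees: '0b1011'
```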

  Binary choices are black and white; in any column there is only ever 1 or 0. This simple “on or off” concept has proved vital in computing, for example, where every number can be represented by a series of switchlike on/off actions.

  Binary numbers are written as 1s and 0s, using a base-2 system. This chart shows how to write the numbers 1 to 10, from the base-10 system, as both binary numbers and binary visuals—which is how a computer would process them—where 1 is “on” and 0 is “off.”

  Binary power revealed

  In 1617, Scottish mathematician John Napier announced a binary calculator based on a chessboard. Each square had a value, and that square’s value was “on” or “off” depending on whether a counter was placed on the square. The calculator could multiply, divide, and even find square roots, but was considered a mere curiosity.

  Around the same time, Thomas Harriot was experimenting with number systems, including the binary system. He was able to convert base-10 numbers to binary and back again, and could also calculate using binary numbers. However, Harriot’s ideas remained unpublished until long after his death in 1621.

  The potential of binary numbers was finally realized by German mathematician and philosopher Gottfried Leibniz. In 1679, he described a calculating machine that worked on binary principles, with open or closed gates to let marbles fall through. Computers work in a similar way, using switches and electricity rather than gates and marbles.

  Leibniz outlined his ideas on the binary system in 1703 in Explanation of Binary Arithmetic, showing how 0s and 1s could represent numbers and so simplify even the most complex operations into a basic binary form. He had been influenced by correspondence with missionaries in China, who introduced him to the I Ching, an ancient Chinese book of divination. The book divided reality into the two opposing poles of yin and yang—one represented as a broken line, the other as an unbroken line. These lines were displayed as six-line hexagrams, combined into a total of 64 different patterns. Leibniz saw links between this binary approach to divination and his work with binary numbers.

  Above all, Leibniz was driven by his religious faith. He wanted to use logic to answer questions about God’s existence and believed that the binary system captured his view of the Universe’s creation, with 0 representing nothingness and 1 representing God.

  Reckoning by twos, that is, by 0 and 1… is the most fundamental way of reckoning for science, and offers up new discoveries, which are… useful, even for the practice of numbers.

  Gottfried Leibniz

  The teaching and commentaries on the I Ching of ancient Chinese philosopher Confucius (551–479 BCE) influenced the work of Leibniz and other 17th–18th-century scientists.

  Bacon’s cipher

  English philosopher and courtier Francis Bacon (1561–1626) was a great dabbler in cryptography, or the science of deciphering codes. He developed what he called a “biliteral” cipher, which used the letters a and b to generate the entire alphabet— a = aaaaa, b = aaaab, c = aaaba, d = aaabb, and so on. If you substitute 0 for a and 1 for b, this becomes a binary sequence. It is an easy code to break, but Bacon realized that a and b do not have to be letters—they can be any two different objects— “… as by bells, by trumpets, by lights and torches… and any instruments of like nature.” It was an ingeniously adaptable cipher, which Bacon could use to “make anything signify anything.” A secret message could be hidden in a group of objects or numbers, or even musical notation. Samuel Morse’s dot–dash telegraph code, which revolutionized communication in the 1800s, and the on/off encoding in a modern computer both have parallels with Bacon’s cipher.
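Bacon’s substitution can be sketched in Python. This version assumes the modern 26-letter alphabet (Bacon’s own table merged i/j and u/v), which matches the listing above: each letter’s position, written as five binary digits, becomes a run of a’s and b’s.

```python
# A sketch of Bacon's biliteral cipher: each letter maps to a 5-symbol
# string of a/b — equivalently a 5-bit binary number, as the text notes.
# Assumes the 26-letter alphabet for simplicity.

ALPHABET = "abcdefghijklmnopqrstuvwxyz"

def encode(ch: str) -> str:
    index = ALPHABET.index(ch.lower())
    # format the letter's position as 5 binary digits, then relabel 0/1 as a/b
    return format(index, "05b").replace("0", "a").replace("1", "b")

def decode(code: str) -> str:
    bits = code.replace("a", "0").replace("b", "1")
    return ALPHABET[int(bits, 2)]

print(encode("d"))       # 'aaabb', as in Bacon's table above
print(decode("aaabb"))   # 'd'
```

As Bacon observed, the a/b symbols need not be letters: any two distinguishable signals, from torches to on/off voltages, carry the same code.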

  See also: Positional numbers • The Rhind papyrus • Decimals • Logarithms • The mechanical computer • Boolean algebra • The Turing machine • Cryptography

  INTRODUCTION

  By the late 1600s, Europe had become established as the cultural and scientific center of the world. The Scientific Revolution was well under way, inspiring a new, rational approach not only to the sciences, but to all aspects of culture and society. The Age of Enlightenment, as this period came to be known, was a time of significant sociopolitical change, and produced an enormous increase in the spread of knowledge and education during the 1700s. It was also a period of considerable progress in mathematics.

  Swiss giants

  Building on the work of Newton and Leibniz, whose ideas were finding practical application in physics and engineering, the brothers Jacob and Johann Bernoulli further developed the theory of calculus, devising the “calculus of variations”, and extended several other mathematical concepts discovered in the 1600s. The elder brother, Jacob, is recognized for his work on number theory, but he also helped develop probability theory, introducing the law of large numbers.

  Along with their mathematically gifted children, the Bernoullis were the leading mathematicians of the early 1700s, making their home town of Basel in Switzerland a center of mathematical study. It was here that Leonhard Euler, the next, and arguably greatest, Enlightenment mathematician, was born and educated. Euler was a contemporary and friend of Daniel and Nicholas Bernoulli, Johann’s sons, and at an early age proved himself a worthy successor to Jacob and Johann. Aged only 20, he suggested a notation for the irrational number e, for which Jacob Bernoulli had calculated an approximate value.

  Euler published numerous books and treatises, and worked in every field of mathematics, often recognizing the links between apparently separate concepts of algebra, geometry, and number theory, which were to become the basis for further fields of mathematical study. For example, his approach to the seemingly simple problem of planning a route through the city of Königsberg, crossing each of its seven bridges only once, uncovered much deeper concepts of topology, inspiring new areas of research.

  Euler’s contributions to all fields of mathematics, but in particular calculus, graph theory, and number theory, were enormous, and he was also influential in standardizing mathematical notation. He is especially remembered for the elegant equation known as “Euler’s identity,” which highlights the connection between fundamental mathematical constants such as e and π.

  Other mathematicians

  The Bernoullis and Euler tended to eclipse the achievements of the many other mathematicians of the 1700s. Among them was Christian Goldbach, a German contemporary of Euler’s. In the course of his career, Goldbach had befriended other influential mathematicians, including Leibniz and the Bernoullis, and corresponded regularly with them about their theories. In a letter to Euler, he proposed the conjecture for which he is best known, that every even integer greater than 2 can be expressed as the sum of two primes, which remains unproven to this day.

  Others contributed to the development of the growing field of probability theory. Georges-Louis Leclerc, Comte de Buffon, for example, applied the principles of calculus to probability and demonstrated the link between pi and probability, while another Frenchman, Abraham de Moivre, described the concept of the normal distribution, and Englishman Thomas Bayes proposed a theorem on the probability of events based on knowledge of the past.

  In the latter part of the 18th century, France became the European center of mathematical enquiry, with Joseph-Louis Lagrange in particular emerging as a significant figure. Lagrange had made his name working with Euler, but later made important contributions to polynomials and number theory.

  New frontiers

  As the century drew to a close, Europe was reeling from political revolutions that had toppled the monarchy in France and given birth to the United States of America. A young German, Carl Friedrich Gauss, published his fundamental theorem of algebra, marking the beginning of a spectacular career and a new period in the history of mathematics.

  IN CONTEXT

  KEY FIGURE

  Isaac Newton (1642–1727)

  FIELD

  Applied mathematics

  BEFORE

  c.330 BCE Aristotle believes it takes force to maintain motion.

  c.1630 Galileo Galilei conducts experiments on motion and finds that friction is a retarding force.

  1674 Robert Hooke writes An Attempt to Prove the Motion of the Earth and hypothesizes what will become Newton’s first law.

  AFTER

  1905 Albert Einstein presents his theory of relativity, which challenges Newton’s view of the force of gravity.

  1977 Voyager 1 is launched. With no friction or drag in space, the craft keeps going due to Newton’s first law, and exits the Solar System in 2012.

  In using mathematics to explain the movement of the planets and of objects on Earth, Isaac Newton fundamentally changed the way we see the Universe. He published his findings in 1687 in the three-volume Philosophiae Naturalis Principia Mathematica (Mathematical Principles of Natural Philosophy), often called the Principia for short.

  Newton’s second and third laws help explain how scales work. When we weigh ourselves, our weight (the mass of an object multiplied by gravity) is a force, measured in newtons. A scale converts newtons back into units of mass, such as pounds or kilograms, by assuming a standard value for gravity.

  How the planets move

  By 1667, Newton had already developed early versions of his three laws of motion and knew about the force needed to enable a body to move in a circular path. He used his knowledge of forces and German astronomer Johannes Kepler’s laws of planetary motion to deduce how elliptical orbits were related to the laws of gravitational attraction. In 1686, English astronomer Edmond Halley persuaded Newton to write up his new physics and its applications to planetary motion.

  In his Principia, Newton used mathematics to show that the consequences of gravity were consistent with what had been observed experimentally. He analyzed the motion of bodies under the action of forces and posited gravitational attraction to explain the movement of the tides, projectiles, and pendulums, and the orbits of planets and comets.
  Laws of motion

  Newton began Principia by stating his three laws of motion. The first says that a force is needed to change a body’s state of motion, and that this force may be from the gravitational attraction between two bodies or an applied force (such as when a snooker cue strikes a ball). The second law explains what is happening when objects are in motion. Newton said that the rate of change of momentum (mass × velocity) of a body is equal to the force acting on it. If a graph is plotted showing velocity against time, then the gradient at any point is the acceleration (the rate of change of velocity).
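The second law can be sketched numerically. In this example the velocity readings and the 5 kg mass are invented for illustration: the gradient between successive points of a velocity–time series gives the acceleration, and multiplying by mass gives the force.

```python
# Velocity readings (m/s) taken once per second — sample data made up
# for illustration, showing uniform acceleration.
times = [0.0, 1.0, 2.0, 3.0, 4.0]
velocities = [0.0, 2.0, 4.0, 6.0, 8.0]

# Gradient between successive readings: Δv / Δt, i.e. the acceleration.
accelerations = [
    (velocities[i + 1] - velocities[i]) / (times[i + 1] - times[i])
    for i in range(len(times) - 1)
]

mass = 5.0                             # kg, assumed
forces = [mass * a for a in accelerations]  # second law: F = m × a

print(accelerations)   # constant 2.0 m/s² throughout
print(forces)          # constant 10.0 N throughout
```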

  Newton’s third law says that if two objects are in contact, the reaction forces between them cancel out, each pushing on the other with an equal force, but in opposing directions. An object resting on a table pushes down on it, and the table pushes back with an equal force. If this were not true, the object would move. Until Einstein’s theory of relativity, the whole of mechanical physics was based on Newton’s three laws of motion.

  ISAAC NEWTON

  Isaac Newton was born on Christmas Day in 1642 in Lincolnshire, England, and was brought up in early childhood by his grandmother. Newton studied at Trinity College, Cambridge, where he showed a fascination for science and philosophy. During the Great Plague in 1665–1666, the university was forced to close, and it was during this period that he formulated his ideas on fluxions (rates of change at a given point in time).

 
