Extraordinary Origins of Everyday Things

by Charles Panati


  A selection of medicinal and toilet soaps available in the nineteenth century. Soap changed little from Phoenician times until the invention of detergents.

  To understand the revolution caused by detergents, it is necessary to examine the evolution and importance of soap as a world commodity.

  Soap has always been made with fats. The Phoenicians in 600 B.C. concocted the world’s first soap by blending goat fat with wood ash. Inveterate traders who sailed the Mediterranean, the Phoenicians introduced soap to the Greeks and the Romans, and according to the Roman writer Pliny the Elder, sold it as a laxative to the Gauls.

  Soap manufacturing was a flourishing business in eleventh-century Venice, and at one point in history the tax on soap was so high that people secretly manufactured their own bars in the darkness of night. The nineteenth-century German chemist Baron Justus von Liebig argued that the wealth of a nation and its degree of civilization could be measured by the quantity of soap it consumed.

  It was during von Liebig’s time that the first commercial cleanser appeared. Adding abrasive, nondissolving substances to soap—something as fine as talc or chalk, or as coarse as pumice or ground quartz—produced an excellent scouring material. With a chick on the front of a red-and-yellow wrapper, Bon Ami, invented in 1886, became one of the most popular of the early heavy-duty cleansers.

  Chemists by then had begun to unravel the mystery of how soap cleans. Soap is composed of molecules with two distinctly different arms: one arm “loves” to grasp onto water molecules, while the other arm “fears” water and latches onto molecules of dirt or grease. With one arm anchored in the water and the other gripping grease, soap molecules lift dirt free and hold it suspended in the wash; thus, when the rinse water is washed away, it takes the dirt and grease with it. Chemists dubbed soap’s water-loving arm “hydrophilic” and its water-fearing arm “hydrophobic.” But soap’s preeminence as an all-purpose cleansing agent was about to be challenged.

  In 1890, a German research chemist, A. Krafft, observed that certain short-chained molecules, themselves not soapy substances, when coupled with alcohol, lathered up like soap. Krafft had produced the world’s first detergent, but at that time the discovery excited no one and remained merely a chemical curiosity.

  Then came World War I. The Allied blockade cut off Germany’s supply of the natural fats used to manufacture lubricants. Fats normally destined for soapmaking were diverted to lubricants, and soap itself became a scarce commodity in the country. Two German chemists, H. Gunther and M. Hetzer, recalled Krafft’s chemical curiosity and concocted the first commercial detergent, Nekal, which they believed would serve only as a wartime substitute for soap. But the detergent’s advantages over soap quickly became apparent. By 1930, much of the industrialized world was manufacturing a wide range of synthetic detergents that left no scum and no residue and were in many respects far superior to soap.

  A very popular all-purpose household detergent of that period was Spic-and-Span. Though a thoroughly modern product, it derived its name from a sixteenth-century Dutch expression used by sailors in speaking of a new ship: Spiksplinternieuw, meaning the ship was new in every spike and splinter of wood. The British later Anglicized the phrase to “Spick and Spannew,” and U.S. sailors Americanized it to “spic and span.” The sailors’ expression entered the vernacular once it became a trade name, and has since been applied to anything spotlessly clean or brand new.

  In 1946, the first successful clothes-washing detergent for the home debuted. Trade named Tide, it appeared just when every housewife in America was deciding she could not live without an automatic washing machine. Tide’s success was rapid and it became the forerunner of the many delicate detergents that would soon crowd supermarket shelves.

  Whiteners and Brighteners. Although it did not require arm-twisting advertising to convince homemakers that for many jobs detergent was superior to soap, one factor that helped launch early detergents was the addition of fluorescent brightening agents that were supposed to get a garment “whiter than white.”

  It was another German chemist, Hans Krais, who in 1929 conceived the idea of incorporating tiny trace quantities of fluorescent substances, actually dyes, into detergents.

  These chemicals enter a garment’s fibers during a wash and do not pull free during rinsing. They become part of the body of the fabric. And they produce their brightening magic through a simple optical process. When the garment is worn in sunlight, the fluorescent substances convert the sun’s invisible ultraviolet rays into slightly bluish visible light. This causes the garment to reflect more light than it otherwise would. The net effect is that the garment appears brighter, though it is no cleaner.

  Hans Krais recognized an additional advantage to the fluorescent chemicals. The tone of the extra light reflected from them lies toward the blue side of the spectrum, and thus complements any natural yellowishness present in the fibers, making them look not only brighter but also whiter. The German chemical company I. G. Farben—which had manufactured Nekal—received the first patents for “optical brighteners” as well.

  Detergents have by no means ousted soap from the field of personal hygiene, but in industrialized countries, consumption of detergents exceeds that of soaps three to one. Today the country with the highest per capita consumption of soap and detergents is the United States; it is followed closely by Switzerland and West Germany. The countries with the lowest soap consumption are Finland, Greece, and Ireland. (See also “Soap,” page 217.)

  Chlorine Bleach: 1774, Sweden

  The earliest written records provide evidence that people were bleaching their clothing five thousand years ago, although the process was tedious and protracted and required considerable space—often entire fields, where clothes were laid out in the sun to whiten and dry.

  The Egyptians, in 3000 B.C., produced—and highly prized—white linen goods; the naturally brownish fabric was soaked in harsh alkaline lyes. Timing was critical to prevent the garment from decomposing into shreds.

  In the thirteenth century, the Dutch emerged as the leading exponents of the bleaching craft, retaining a near monopoly of the industry until the eighteenth century. Most European fabric that was to be used in making white garments was first sent to Holland to be bleached. The Dutch method was only slightly more sophisticated than the one employed by the ancient Egyptians.

  Dutch dyers soaked fabric in alkaline lyes for up to five days. They then washed it clean and spread it on the ground to dry and sun-fade for two to three weeks. The entire process was repeated five or six times; then, to permanently halt the “eating” effect of the alkaline lye, the chemical was neutralized by soaking the fabric in a bath of buttermilk or sour milk, both being acidic. The complete process occupied entire fields and ran on for several months.

  By the early eighteenth century, the British were bleaching bolts of fabric themselves. The only real difference in their method was that dilute sulfuric acid was substituted for buttermilk. A new and simple chemical bleaching compound was needed, and many chemists attempted to produce it. In 1774, Swedish researcher Karl Wilhelm Scheele found the base chemical when he discovered chlorine gas, but it took another chemist, Count Claude Louis Berthollet, who two decades later would be appointed scientific adviser to Napoleon, to realize that the gas dissolved in water to produce a powerful bleach.

  In 1785, Berthollet announced the creation of eau de Javel, a pungent solution he perfected by passing chlorine gas through a mixture of lime, potash, and water. But eau de Javel was never bottled and sold; instead, every professional bleacher of that era had to combine his own ingredients from scratch, and the chlorine gas was highly irritating to the tissues of the eyes, nose, and lungs. Bleaching now required less time and space, but it involved an occupational hazard.

  The situation was improved in 1799. A Scottish chemist from Glasgow, Charles Tennant, discovered a way to transform eau de Javel into a dry powder, ushering in the era of bleaching powders that could simply be poured into a wash. The powders not only revolutionized the bleaching industry; they also transformed ordinary writing paper: For centuries it had been a muddy yellowish-brown in color; Tennant’s chlorine bleach produced the first pure-white sheets of paper. By 1830, Britain alone was producing 1,500 tons of powdered bleach a year. Whites had never been whiter.

  Glass Window: A.D. 600, Germany

  The Romans were the first to draw glass into sheets for windows, around 400 B.C., but their mild Mediterranean climate made glass windows merely a curiosity. Glass was put to more practical purposes, primarily in jewelry making.

  Following the invention of glass blowing, around 50 B.C., higher-quality glass windows were possible. But the Romans used blown glass to fashion drinking cups, in all shapes and sizes, for homes and public assembly halls. Many vessels have been unearthed in excavations of ancient Roman towns.

  The Romans never did perfect sheet glass. They simply didn’t need it. The breakthrough occurred farther north, in the cooler Germanic climates, at the beginning of the Middle Ages. In A.D. 600, the European center of window manufacturing lay along the Rhine River. Great skill and a long apprenticeship were required to work with glass, and those prerequisites are reflected in the name that arose for a glassmaker: “gaffer,” meaning “learned grandfather.” So prized were his exquisite artifacts that the opening in the gaffer’s furnace through which he blew glass on a long rod was named a “glory hole.”

  Glassmakers employed two methods to produce windows. In the cylinder method, the inferior but more widely used of the two, the glassmaker blew molten silica into a sphere, which was then swung to and fro to elongate it into a cylinder. The cylinder was then cut lengthwise and flattened into a sheet.

  In the crown method, a specialty of Normandy glassmakers, the craftsman also blew a sphere, but attached a “punty,” or solid iron rod, to it before cracking off the blowing iron, leaving a hole at one end. The sphere would then be rapidly rotated, and under centrifugal force the hole would expand until the sphere had opened into a disk. Crown glass was thinner than cylinder glass, but it could be made only into very small windowpanes.

  During the Middle Ages, Europe’s great cathedrals, with their towering stained-glass windows, consumed most of the sheet glass manufactured on the Continent. From churches, window glass gradually spread into the houses of the wealthy, and still later, into general use. The largest sheet, or plate, of cylinder glass that could be made then was about four feet across, limiting the size of single-plate windows. Improvements in glass-making technology in the seventeenth century yielded glass measuring up to thirteen by seven feet.

  Gaffers producing glass windows by two ancient methods.

  In 1687, French gaffer Bernard Perrot of Orleans patented a method for rolling plate glass. Hot, molten glass was cast on a large iron table and spread out with a heavy metal roller. This method produced the first large sheets of relatively undistorted glass, fit for use as full-length mirrors.

  Fiberglass. As its name implies, fiberglass consists of finespun filaments of glass made into a yarn that is then woven into a rigid sheet, or some more pliant textile.

  Parisian craftsman Dubus-Bonnel was granted a patent for spinning and weaving glass in 1836, though his process was complex and uncomfortable to execute. It involved working in a hot, humid room so that the slender glass threads would not lose their malleability, and the weaving was performed with painstaking care on a jacquard-type fabric loom. So many contemporaries doubted that glass could be woven like cloth that when Dubus-Bonnel submitted his patent application, he included a small square sample of fiberglass.

  Safety Glass. Ironically, safety glass was discovered as the result of a glass-shattering accident in 1903 in the laboratory of a French chemist, Edouard Benedictus.

  One day in his laboratory, Benedictus climbed a ladder to fetch reagents from a shelf and inadvertently knocked a glass flask to the floor. He heard the glass shatter, but when he glanced down, to his astonishment the broken pieces of the flask still hung together, more or less in their original contour.

  On questioning an assistant, Benedictus learned that the flask had recently held a solution of cellulose nitrate, a liquid plastic, which had evaporated, apparently depositing a thin coating of plastic on the flask’s interior. Because the flask appeared clean, the assistant, in haste, had not washed it but had returned it directly to the shelf.

  As one accident had led Benedictus to the discovery, a series of other accidents directed him toward its application.

  In 1903, automobile driving was a new and often dangerous hobby among Parisians. The very week of Benedictus’s laboratory discovery, a Paris newspaper ran a feature article on the recent rash of automobile accidents. When Benedictus read that most of the drivers seriously injured had been cut by shattered glass windshields, he knew that his unique glass could save lives.

  As he recorded in his diary: “Suddenly there appeared before my eyes an image of the broken flask. I leapt up, dashed to my laboratory, and concentrated on the practical possibilities of my idea.” For twenty-four hours straight, he experimented with coating glass with liquid plastic, then shattering it. “By the following evening,” he wrote, “I had produced my first piece of Triplex [safety glass]—full of promise for the future.”

  Unfortunately, auto makers, struggling to keep down the price of their new luxury products, were uninterested in the costly safety glass for windshields. The prevalent attitude was that driving safety was largely in the hands of the driver, not the manufacturer. Safety measures were incorporated into automobile design to prevent an accident but not to minimize injury if an accident occurred.

  It was not until the outbreak of World War I that safety glass found its first practical, wide-scale application: as the lenses for gas masks. Manufacturers found it relatively easy and inexpensive to fashion small ovals of laminated safety glass, and the lenses provided military personnel with a kind of protection that was desperately needed but had been impossible until that time. After automobile executives examined the proven performance of the new glass under the extreme conditions of battle, safety glass’s major application became car windshields.

  “Window.” There is a poetic image to be found in the origin of the word “window.” It derives from two Scandinavian terms, vindr and auga, meaning “wind’s eye.” Early Norse carpenters built houses as simply as possible. Since doors had to be closed throughout the long winters, ventilation for smoke and stale air was provided by a hole, or “eye,” in the roof. Because the wind frequently whistled through it, the air hole was called the “wind’s eye.” British builders borrowed the Norse term and modified it to “window.” And in time, the aperture that was designed to let in air was glassed up to keep it out.

  Home Air-Cooling System: 3000 B.C., Egypt

  Although the ancient Egyptians had no means of artificial refrigeration, they were able to produce ice by means of a natural phenomenon that occurs in dry, temperate climates.

  Around sundown, Egyptian women placed water in shallow clay trays on a bed of straw. Rapid evaporation from the water surface and from the damp sides of the tray combined with the nocturnal drop in temperature to freeze the water—even though the air temperature never fell to the freezing point. Sometimes only a thin film of ice formed on the surface of the water, but under more favorable conditions of dryness and night cooling, the water froze into a solid slab of ice.

  The salient feature of the phenomenon lay in the air’s low humidity, permitting evaporation, or sweating, which leads to cooling. This principle was appreciated by many early civilizations, which attempted to cool their homes and palaces by conditioning the air. In 2000 B.C., for instance, a wealthy Babylonian merchant created his own (and the world’s first) home air-conditioning system. At sundown, servants sprayed water on the exposed walls and floor of his room, so that the resultant evaporation, combined with nocturnal cooling, generated relief from the heat.

  Cooling by evaporation was also used extensively in ancient India. Each night, the man of the family hung wet grass mats over openings on the windward side of the home. The mats were kept wet throughout the night, either by hand or by means of a perforated trough above the windows, from which water trickled. As the gentle warm wind struck the cooler wet grass, it evaporated the water and cooled the air inside—by as much as thirty degrees.

  Two thousand years later, after the telephone and the electric light had become realities, a simple, effective means of keeping cool on a muggy summer’s day still remained beyond the grasp of technology. As late as the end of the nineteenth century, large restaurants and other public places could do no better than embed air pipes in a mixture of ice and salt and circulate the cooled air by means of fans. Using this cumbersome type of system, the Madison Square Theater in New York City consumed four tons of ice a night.

  The problem bedeviling nineteenth-century engineers was not only how to lower air temperature but how to remove humidity from warm air—a problem appreciated by ancient peoples.

  Air-Conditioning. The term “air-conditioning” came into use years before anyone produced a practical air-conditioning system. The expression is credited to physicist Stuart W. Cramer, who in 1907 presented a paper on humidity control in textile mills before the American Cotton Manufacturers Association. Control of moisture content in textiles by the addition of measured quantities of steam into the atmosphere was then known as “conditioning the air.” Cramer, flipping the gerundial phrase into a compound noun, created a new expression, which became popular within the textile industry. Thus, when an ambitious American inventor named Willis Carrier produced his first commercial air conditioners around 1914, a name was awaiting them.

  An upstate New York farm boy who won an engineering scholarship to Cornell University, Carrier became fascinated with heating and ventilation systems. A year after his 1901 graduation, he tackled his first commercial air-cooling assignment, for a Brooklyn lithographer and printer. Printers had always been plagued by fluctuations in ambient temperature and humidity. Paper expanded or contracted; ink flowed or dried up; colors could vary from one printing to the next.

 
