Extraordinary Origins of Everyday Things


by Charles Panati


  Contact Lenses: 1877, Switzerland

  The first person to propose a contact lens system was the Italian painter, sculptor, architect, and engineer Leonardo da Vinci. In his sixteenth-century Code on the Eye, da Vinci described an optical method for correcting poor vision by placing the eye against a short, water-filled tube sealed at the end with a flat lens. The water came in contact with the eyeball and refracted light rays much the way a curved lens does. Da Vinci’s use of water as the best surface to touch the eye is mirrored today in the high water content of soft contact lenses.

  The acute sensitivity of the human eye means that only an extremely smooth foreign surface can come in contact with it. For centuries, this eliminated contact lenses of glass, which even after polishing remained fairly coarse.

  In the 1680s, French opticians attempted a novel approach to the problem. They placed a protective layer of smooth gelatin over the eyeball, then covered it with a small fitted glass lens. The gelatin represented an attempt to use a medium with high water content. The French lens possessed a major flaw, for it frequently fell out of the wearer’s eye. It remained experimental.

  The first practical contact lenses were developed in 1877 by a Swiss physician, Dr. A. E. Fick. They were hard lenses. Thick, actually. And not particularly comfortable. The glass was either blown or molded to the appropriate curvature, polished smooth, then cut into a lens that covered not only the cornea but the entire eyeball. Wearing them took a serious commitment to vanity. Fick’s lenses, however, demonstrated that vision, in most cases, could be corrected perfectly when refracting surfaces were placed directly on the eye. And they proved that the eye could learn to tolerate, without irreparable damage, a foreign object of glass.

  Glass remained the standard material of hard lenses until 1936. That year, the German firm I. G. Farben introduced the first Plexiglas, or hard plastic, lens, which quickly became the new standard of the industry. It was not until the mid-1940s that American opticians produced the first successful corneal lens, covering only the eye’s central portion. The breakthrough ushered in the era of modern contact lens design. Since that time, scientists have ingeniously altered the physical and chemical composition of lenses, often in an attempt to achieve a surface that duplicates, as closely as possible, the composition of the human eye.

  Today factors other than high water content are regarded as essential in a good lens (such as permeability to oxygen so that living eye cells may breathe). Still, with an instinctive belief in the comfort of water against the eye, many wearers seek out lenses that are up to 80 percent liquid—even though a lens of lower water content might provide better vision correction. Da Vinci, with his 100 percent liquid lens, perhaps realized the psychological appeal of having only water touch the delicate surface of the eyeball.

  Stimulants: Pre-2737 B.C., China

  To achieve altered states of consciousness in religious rites, ancient man used naturally occurring plant stimulants. One of the earliest, and mildest, of recorded stimulants was strongly brewed tea. Although the origins of the beverage are shrouded in Oriental folklore, the legendary Chinese emperor Shen Nung is said to have discovered the kick of tea. An entry in Shen’s medical diary, dated 2737 B.C., declares that tea not only “quenches thirst” but “lessens the desire to sleep.”

  Tea’s stimulant, of course, is caffeine. And the drug, in the form of coffee, became one of the most widely used, and abused, early pick-me-ups. After the discovery of the effects of chewing coffee beans in Ethiopia in A.D. 850, the drug became an addiction in the Near and Middle East. And as coffee spread throughout Europe and Asia, its stimulant effect merited more social and medical comment than its taste.

  Caffeine’s use today continues stronger than ever. Aside from occurring naturally in coffee, tea, and chocolate, caffeine is added to cola drinks and a wide range of over-the-counter drugs. If your medicine chest contains Anacin-3, Dexatrim, Dristan Decongestant, Excedrin, NoDoz, or Slim (to mention a few), you have a caffeine-spiked analgesic or diet aid on the shelf.

  Why is caffeine added?

  In decongestants, it counters the soporific effects of the preparations’ active compounds. In analgesics, caffeine actually enhances (through a mechanism yet unknown) the action of painkillers. And in diet aids, the stimulant is the active ingredient that diminishes appetite. Safe in moderate doses, caffeine can kill. The lethal dose for humans is ten grams, or about one hundred cups of coffee consumed in four hours.

  In this century, a new and considerably more potent class of synthetic stimulants entered the medicine chest.

  Amphetamines. These drugs were first produced in Germany in the 1930s. Their chemical structure was designed to resemble adrenaline, the body’s own fast-acting and powerful stimulant. Today, under such brand names as Benzedrine, Dexedrine, and Preludin (to list a few), they represent a multimillion-dollar pharmaceutical market.

  Commonly known as “speed” or “uppers,” amphetamines were discovered to give more than an adrenaline rush. They produce a degree of euphoria, the ability to remain awake for extended periods, and the suppression of appetite by slowing muscles of the digestive system. For many years, they replaced caffeine as the primary ingredient in popular dieting aids. While their role in weight loss has greatly diminished, they remain a medically accepted mode of treatment for hyperactivity in children and such sleep disorders as narcolepsy.

  In the 1930s, amphetamines existed only in liquid form and were used medically as inhalants to relieve bronchial spasms and nasal congestion. Because of their easy availability, they were greatly abused for their stimulant effects. And when they were produced in tablets, the drugs’ uses and abuses skyrocketed. During World War II, the pills were issued freely to servicemen and widely prescribed to civilians in a cavalier way that would be regarded today as irresponsibility bordering on malpractice.

  By the 1960s, physicians recognized that amphetamines carried addictive risks. The condition known as amphetamine psychosis, which mimics classic paranoid schizophrenia, was identified, and by the end of the decade, legislation curtailed the use of the drugs. Any amphetamine on a medicine chest shelf today is either a prescription drug or an illegal one.

  Sedatives: 1860s, Germany

  Apples and human urine were the main and unlikely ingredients that composed the first barbiturate sedatives, developed in Germany in the 1860s. And the drugs derived their classification title “barbiturate” from a Munich waitress named Barbara, who provided the urine for their experimental production.

  This bizarre marriage of ingredients was compounded in 1865 by German chemist Adolph Baeyer. Unfortunately, the specific reasoning, or series of events, that led him to suspect that the malic acid of apples combines with the urea of urine to induce drowsiness and sleep has been lost to history. What is well documented, however, is the rapid public acceptance of sedatives—to calm anxiety, cure insomnia, and achieve a placid euphoria.

  The period from Baeyer’s discovery to the commercial production of barbiturates spans almost four decades of laboratory research. But once the chemical secrets were unlocked and the ingredients purified, the drugs began to appear rapidly. The first barbiturate sleeping drug, barbital, bowed in 1903, followed by phenobarbital, then scores of similarly suffixed drugs with varying degrees of sedation. Drugs like Nembutal and Seconal acquired street names of “yellow jackets” and “nebbies” and spawned a large illicit drug trade.

  All the barbiturates worked by interfering with nerve impulses within the brain, which, in turn, “calmed the nerves.” Insomniacs alone, estimated in America to number over fifty million, created a huge market. But while sedatives provided a needed respite from wakefulness for many people, they often became addictive.

  Of the many prescription sedatives found in American medicine chests today, one in particular merits mention for its outstanding use and abuse.

  Valium. In 1933, drug researchers discovered a new class of nonbarbiturate sedatives. Known as benzodiazepines, they would soon acquire commercial brand names such as Librium and Valium, and Valium would go on to top the federal government’s list of the twenty most abused drugs in America, surpassing both heroin and cocaine.

  During the first decade following their discovery, benzodiazepines did not attract much attention from drug companies. The belief was that barbiturates were safe, effective, and not terribly addictive, and thus there was no need for an entirely new class of sedating drugs.

  Then medical opinion changed. In the mid-1950s, experiments revealed that benzodiazepines, in substantially smaller doses than barbiturate sedatives, were capable of inducing sleep in monkeys. In addition, the drugs not only sedated; they also diminished aggressive tendencies. Drug companies, learning of the surprising laboratory results with monkeys, began conducting human tests, and in 1960 the world was introduced to the first benzodiazepine sedative, Librium. Three years later, Valium debuted.

  Known as “minor tranquilizers” (compared with the more potent Thorazine, a “major” tranquilizer), Librium and Valium began to be prescribed in record quantities. The reputation of barbiturates by that time had been grimly besmirched, and the new drugs seemed safer, less addictive. They were liberally dispensed as antianxiety agents, muscle relaxants, anticonvulsants, sleeping pills, and as a harmless treatment for the symptoms of alcohol withdrawal. Valium became an industry in itself.

  In time, of course, medical opinion again changed. The benzodiazepines are extremely important and useful drugs, but they, too, possess a great potential for abuse. Today chemists are attempting to tailor-make a new class of nonaddictive sedatives and painkillers, each with a single, targeted function. In the meantime, Americans continue to consume more than five billion sedatives a year, making Valium and its sister drugs almost as familiar a medicine chest item as aspirin.

  Aspirin: 1853, France

  For a fever, physicians in the ancient world recommended a powder made from the bark of the willow tree. Today we know that the bark contains a salicylic compound, related to aspirin, though not as effective, and causing greater gastrointestinal irritation and possible bleeding.

  Aspirin, acetylsalicylic acid, is a man-made variation of the older remedy. It is the world’s most widely used painkiller and anti-inflammatory drug, and it was prepared in France in 1853, then forgotten for the next forty years—rediscovered only when a German chemist began searching for a cure for his father’s crippling arthritis.

  Alsatian chemist Charles Frédéric Gerhardt first synthesized acetylsalicylic acid in 1853, at his laboratory at the University of Montpellier. But from his own limited testing, he did not believe the drug to be a significant improvement over the then-popular salicin, an extract from the bark of the willow tree and the meadowsweet plant, a botanical relative of the rose. Aspirin was ignored, and sufferers of fevers, inflammations, and arthritis continued to take salicin.

  In 1893, a young German chemist, Felix Hoffmann, at the Farbenfabriken Bayer drug firm, had exhausted all the known drugs in attempting to ease his father’s rheumatoid arthritis. Hoffmann knew of Gerhardt’s synthetic alternative to salicin, and in desperation prepared a batch and tested it on his father. To his astonishment, the man-made derivative palliated the disease’s crippling symptoms and almost completely ameliorated its pain.

  Chemists at Bayer, in Düsseldorf, realized Hoffmann had hit on an important new drug. Deciding to produce the compound from the meadowsweet plant, Spiraea ulmaria, the company arrived at the brand name Aspirin by taking the “a” from acetyl, “spir” from the Latin Spiraea, and “in” because it was a popular suffix for medications.

  First marketed in 1899 as a loose powder, Aspirin quickly became the world’s most prescribed drug. In 1915, Bayer introduced Aspirin tablets. The German-based firm owned the brand name Aspirin at the start of World War I, but following Germany’s defeat, the trademark became part of the country’s war reparations demanded by the Allies. At the Treaty of Versailles in June 1919, Germany surrendered the brand name to France, England, the United States, and Russia.

  For the next two years, drug companies battled over their own use of the name. Then, in a famous court decision of 1921, Judge Learned Hand ruled that since the drug was universally known as aspirin, no manufacturer owned the name or could collect royalties for its use. Aspirin with a capital A became plain aspirin. And today, after almost a century of aspirin use and experimentation, scientists still have not entirely discovered how the drug achieves its myriad effects as painkiller, fever reducer, and anti-inflammatory agent.

  Chapter 11

  Under the Flag

  Uncle Sam: 1810s, Massachusetts

  There was a real-life Uncle Sam. This symbol of the United States government and of the national character, in striped pants and top hat, was a meat packer and politician from upstate New York who came to be known as Uncle Sam as the result of a coincidence and a joke.

  The proof of Uncle Sam’s existence was unearthed only a quarter of a century ago, in the yellowing pages of a newspaper published May 12, 1830. Had the evidence not surfaced, doubt about a real-life prototype would still exist, and the character would today be considered a myth, as he was for decades.

  Uncle Sam was Samuel Wilson. He was born in Arlington, Massachusetts, on September 13, 1766, a time when the town was known as Menotomy. At age eight, Sam Wilson served as drummer boy on the village green, on duty the April morning of 1775 when Paul Revere made his historic ride. Though the “shot heard round the world” was fired from nearby Lexington, young Sam, banging his drum at the sight of redcoats, alerted local patriots, who prevented the British from advancing on Menotomy.

  As a boy, Sam played with another youthful patriot, John Chapman, who would later command his own chapter in American history as the real-life Johnny Appleseed. At age fourteen, Sam joined the army and fought in the American Revolution. With independence from Britain won, Sam moved in 1789 to Troy, New York, and opened a meat-packing company. Because of his jovial manner and fair business practices, he was affectionately known to townsfolk as Uncle Sam.

  It was another war, also fought against Britain on home soil, that caused Sam Wilson’s avuncular moniker to be heard around the world.

  During the War of 1812, government troops were quartered near Troy. Sam Wilson’s fair-dealing reputation won him a military contract to provide beef and pork to soldiers. To indicate that certain crates of meat produced at his warehouse were destined for military use, Sam stamped them with a large “U.S.”—for “United States,” though the abbreviation was not yet in the vernacular.

  On October 1, 1812, government inspectors made a routine tour of the plant. They asked a meat packer what the ubiquitously stamped “U.S.” stood for. The worker, himself uncertain, joked that the letters must represent the initials of his employer, Uncle Sam. The error was perpetuated. Soon soldiers began referring to all military rations as bounty from Uncle Sam. Before long, they were calling all government-issued supplies property of Uncle Sam. They even saw themselves as Uncle Sam’s men.

  The first Uncle Sam illustrations appeared in New England newspapers in 1820. At that time, the avuncular figure was clean-shaven and wore a solid black top hat and black tailcoat. The more familiar and colorful image of Uncle Sam we know today arose piecemeal, almost one item at a time, each the contribution of an illustrator.

  Solid red pants were introduced during Andrew Jackson’s presidency. The flowing beard first appeared during Abraham Lincoln’s term, inspired by the President’s own beard, which set a trend at that time. By the late nineteenth century, Uncle Sam was such a popular national figure that cartoonists decided he should appear more patriotically attired. They adorned his red pants with white stripes and his top hat with both stars and stripes. His costume became an embodiment of the country’s flag.

  Uncle Sam at this point was flamboyantly dressed, but by today’s standards of height and weight he was on the short side and somewhat portly.

  It was Thomas Nast, the famous German-born cartoonist of the Civil War and Reconstruction period, who made Uncle Sam tall, thin, and hollow-cheeked. Coincidentally, Nast’s Uncle Sam strongly resembles drawings of the real-life Sam Wilson. But Nast’s model was actually Abraham Lincoln.

  The most famous portrayal of Uncle Sam—the one most frequently reproduced and widely recognized—was painted in this century by American artist James Montgomery Flagg. The stern-faced, stiff-armed, finger-pointing figure appeared on World War I posters captioned: “I Want You for U.S. Army.” The poster, with Uncle Sam dressed in his full flag apparel, sold four million copies during the war years, and more than half a million in World War II. Flagg’s Uncle Sam, though, is not an Abe Lincoln likeness, but a self-portrait of the artist as legend.

  A nineteenth-century meat-packing plant in upstate New York; birthplace of the Uncle Sam legend.

  During these years of the poster’s peak popularity, the character of Uncle Sam was still only a myth. The identity of his prototype first came to light in early 1961. A historian, Thomas Gerson, discovered a May 12, 1830, issue of the New York Gazette newspaper in the archives of the New-York Historical Society. In it, a detailed firsthand account explained how Theodorus Bailey, postmaster of New York City, had witnessed the Uncle Sam legend take root in Troy, New York. Bailey, a soldier in 1812, had accompanied government inspectors on the October day they visited Sam Wilson’s meat-packing plant. He was present, he said, when a worker surmised that the stamped initials “U.S.” stood for “Uncle Sam.”

 
