
Uncle John’s Unsinkable Bathroom Reader


by Bathroom Readers' Institute


  In Venezuela, Garside met 23-year-old Endrina Perez in a Caracas shopping mall. Though they spoke different languages, they developed a deep connection. Perez became Garside’s running partner; he gave her the name “Runningwoman.” (Perez wasn’t the first girl that Garside “befriended” on his journey—there are reports that he’d had at least five girlfriends up to that point, including one woman who followed him on a bicycle for 500 miles through Australia.) Garside and Perez then ran northward through Central America, reportedly making it from Mexico City to the United States—through mostly mountainous terrain—in just 10 days.

  COAST TO COAST

  In September 2000, they crossed the California border from Tijuana. Suddenly, thanks to an article in Sports Illustrated, the Runningman was a minor celebrity. Through his Web site, Garside had been offered sponsorships (Odor-Eaters picked him up for a while) and places to stay. The biggest boost came from a car dealer who provided Team Runningman with a support van that Perez drove while Garside ran.

  Like the fictional Forrest Gump, Garside was joined on his journey through the U.S. by other runners, cyclists, and curious onlookers. A skateboarder pledged to follow Garside from San Francisco all the way across the country, but didn’t actually make it very far. In fact, no one could keep up with Garside’s amazing stamina. But the demanding journey was starting to take a toll on the 34-year-old runner’s mind and body. When Garside arrived in New York, he told reporters, “I’m desperate to finish now. I’ve had enough. I’m missing things like never having more than one pair of shoes or set of clothes. And I’m dying to get a decent cup of tea.”

  About 12% of the Earth’s surface is continually covered in snow.

  THE HOME STRETCH

  Exhausted, Garside decided to skip Africa and Antarctica, having already more than met the four-continent minimum that the record required. But he still had to get back to where he started, so he resumed the trip in Europe. His journey ended in June 2003 in the same place where it officially began: New Delhi, under the India Gate. All told, Garside’s run lasted five years and eight months, took him through 29 countries for a total distance of over 30,000 miles, and wore out 50 pairs of running shoes. Through donations, Garside raised and spent a total of £170,000 ($300,000). After arriving home in England, all that was left to do was wait for the Guinness people to examine the proof and declare him the world record holder.

  But a shadow was hanging over the Runningman’s quest, threatening not only the record, but Garside’s reputation as well. Did he really run the entire distance? Did all of those amazing adventures really happen to him? There were many who claimed that he cheated and lied his way across the world…and they had evidence to back it up.

  For Part II of the Runningman’s story,

  jog on over to page 512.

  FAMOUS DADS OF FAMOUS PEOPLE

  • Uma Thurman’s father was the first American to be ordained a Buddhist monk.

  • Rachel Weisz’s father invented the artificial respirator.

  • Ran Laurie, father of Hugh Laurie (House), won a gold medal in rowing at the 1948 Olympics.

  • Liv Tyler, the daughter of Steven Tyler of Aerosmith, didn’t meet her real father until she was a teenager. She grew up thinking her father was the man who helped raise her, himself a 1970s rock star: Todd Rundgren.

  First president’s wife to be known as the First Lady: Dolley Madison, after her death.

  PROJECT ORION

  Back in the 1940s and ’50s, it seemed that someday everything—cars, houses, ships, planes, you name it—would be powered by atomic energy. That never happened, of course, but here’s an example of what might have been.

  THE NEXT BIG THING

  Even before the Manhattan Project produced the world’s first atomic bomb in 1945, many of the scientists assigned to the project were already thinking about how they could harness the power of the atom—and the atom bomb—for other purposes. The long-term effects of radiation were not yet completely understood, and the scientists were eagerly looking for peacetime applications. A number of them wanted to find a way to use it to power rockets, but nuclear energy’s greatest asset, its enormous power, also presented the biggest challenge. Nuclear fuel contains about 10 million times as much energy as the equivalent amount of rocket fuel, but nuclear reactors generate so much heat that they would melt even the sturdiest rocket engines. And besides—atomic bombs would have destroyed any combustion chamber built to contain them.
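
  A rough energy-density comparison bears out that 10-million figure (the specific values below are our own back-of-the-envelope numbers, not figures from the original scientists): complete fission of uranium-235 releases about $8 \times 10^{13}$ joules per kilogram, while a chemical fuel-and-oxidizer mix stores on the order of $10^{7}$ joules per kilogram:

$$\frac{E_{\text{nuclear}}}{E_{\text{chemical}}} \approx \frac{8 \times 10^{13}\ \text{J/kg}}{1 \times 10^{7}\ \text{J/kg}} \approx 8 \times 10^{6}$$

  That works out to roughly 10 million to one.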

  THINKING OUTSIDE THE BOX

  Nuclear-powered rocketry didn’t appear to show much promise until a scientist named Stanislaw Ulam, formerly of the Manhattan Project, came up with the idea of detonating atom bombs behind a spacecraft instead of inside of one. The idea may sound silly, kind of like throwing sticks of dynamite out of the trunk of a car to propel it down the road, but Ulam thought it would work.

  In a secret paper published in 1955, he and another scientist, Cornelius Everett, proposed building a spacecraft that looked kind of like a pogo stick mounted atop a trash can lid: Atom bombs would be ejected out the rear of the spacecraft to a distance of 100 to 200 feet away and then detonated. The force of the blast would strike the trash can lid (Ulam and Everett called it the “pusher plate”), giving the spacecraft a sharp, powerful shove. Shock absorbers in the pogo stick section would absorb the force of the blow, protecting the crew and the ship from damage. This process would be repeated over and over again during launch or whenever the spacecraft was accelerating or decelerating, with an average detonation rate of one atom bomb per second. Like a pogo stick, the spacecraft would boing! boing! boing! its way through space. Ulam and Everett called their idea “nuclear pulse propulsion.”

  FM radio waves can leave Earth’s atmosphere. AM radio waves cannot.

  Nuclear pulse propulsion might have remained the stuff of science fiction, had the Soviet Union not beaten the United States into space by launching the world’s first artificial satellite, Sputnik, in October 1957. Suddenly America was interested in any idea, no matter how loony, that might help it win the space race against the Russians. In 1958 a government agency called the Advanced Research Projects Agency (ARPA) awarded the General Atomic company of San Diego a $1 million-a-year contract to develop the idea. The program was code-named Project Orion.

  NOT EXACTLY ROCKET SCIENCE

  When the rocket fuel you’re using has limitless power, there’s no need to make the spacecraft small and light, as was the case with the Apollo and Space Shuttle programs. A lot of the money spent on those programs was used to figure out how to reduce the weight of the items needed for a mission, so they could all be crammed aboard a single vehicle with very limited cargo capacity.

  With the Project Orion spacecraft, the opposite was true—bigger really was better, because an enormous amount of mass was needed to absorb both the physical shock and the radiation produced by all those exploding atom bombs. The Project Orion craft would be built from plate steel or similarly strong materials and would be as sturdy as a submarine or a battleship. It might even be built in a shipyard, or at least built by workers skilled in shipbuilding.

  That was the beauty of Project Orion, not to mention one of its strongest selling points: Very little of the design depended on technology that still needed to be developed. (Even the device that ejected the bombs out the back of the ship was existing technology—it was taken from Coca-Cola vending machines.) The United States had learned how to build atom bombs, and it had long known how to build ships. All that remained to be done was to build a ship sturdy enough to be blasted straight up in the air and into outer space.

  Hoover Dam contains enough concrete to pave a two-lane highway from San Francisco to New York.

  Project Orion missions would have been cheap, too: It was estimated that the cost of lifting one pound of cargo into orbit with atomic bombs would have been as little as 1% of the cost of launching it aboard a rocket powered by conventional fuel.

  HIGH-RISE

  One plan produced by Project Orion called for a ship that was as tall as a 15-story building, weighing about 200 times as much as the space shuttle (the pusher plate alone would have weighed 1,000 tons). Two stages of shock absorbers separating the pusher plate from the rest of the spacecraft would have stood several stories tall. The crew quarters would also have been enormous, with enough room for the essentials—a galley, sleeping quarters, and so on—and plenty left over for things like a library, an exercise room…and a bomb shelter.

  The radiation emitted while the atom bombs were being detonated would have been so intense that the crew would have had to retreat into an especially protective room—probably deep in the center of the spacecraft, surrounded on all sides by storage tanks and supplies—and then wait there until the detonations stopped. Time spent in the bomb shelter would have been minimal; the scientists estimated that five minutes’ worth of atom bombs (about 800 bombs) was enough to launch the ship into Earth orbit, and another minute or two of detonations in orbit (about 200 more) was enough to send it to the Moon.

  TO THE MOON AND BEYOND

  The cargo capacity of the ship was so great that it could have carried enough equipment to establish a permanent base on the very first trip to the Moon. And that was just the beginning: The ship was designed to carry 2,600 nukes, enough to take it to another planet and back, so after an inaugural trip to the Moon it could have remained in Earth orbit until NASA was ready to send it on a much longer trip. Interplanetary travel would not have been far off. The program’s motto: “Mars by 1965, Saturn by 1970.”

  There was almost no limit to how big such ships could be or how ambitious their missions could get. Project Orion even contemplated the possibility of building a city-sized, 8 million-ton “interstellar ark” capable of traveling at 10% the speed of light, which would enable it to reach Proxima Centauri, the star closest to the Sun, in less than 50 years. And all it would have needed to get there was 1,080 giant atomic bombs weighing three tons apiece.
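
  That 50-year claim checks out with simple arithmetic (the 4.25-light-year distance to Proxima Centauri is the modern measured value, supplied here for illustration rather than taken from the program’s documents):

$$t = \frac{d}{v} \approx \frac{4.25\ \text{light-years}}{0.1c} = 42.5\ \text{years}$$

  That’s comfortably under the 50-year mark.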

  THANKS…BUT NO THANKS

  As ambitious as Project Orion was, it never did win acceptance from the highest levels of government—how many public officials would have been willing to sign off on a spacecraft that set off hundreds of atomic bombs in the atmosphere as it blasted its way into space? Even the most optimistic estimates conceded that every launch of an Orion spacecraft would release enough radiation to kill 10 people from cancer and sicken thousands more, even if the ship blasted off from the Nevada desert or some other location many miles from civilian population centers. And that was the toll for a successful mission. How many people might have been killed by a disaster similar to the Challenger explosion if the shuttle had been carrying 2,600 atom bombs?

  Even if Project Orion was technologically possible, politically it was dead on arrival. After NASA was created in 1958, the federal government began divvying up existing space programs between NASA, which took over the civilian projects, and the Air Force, which took over anything with a military application. Neither of them wanted Orion. NASA didn’t want to touch nuclear-powered programs, not even after General Atomic downsized its design for use as the second stage of the Saturn V rockets that would soon take astronauts to the Moon. As few as three Saturn V rockets could have carried the materials to assemble an Orion vehicle in orbit, a vehicle that could have taken astronauts all the way to Mars in as little as 125 days. But NASA still said no.

  THE NAME’S FAMILIAR

  Project Orion sputtered along until 1963, when the signing of the Limited Test Ban Treaty, which outlawed the detonation of atomic bombs in the atmosphere, killed it for good. About the only part of the program that survives is the name, and that’s only a coincidence: When NASA began designing a new generation of spacecraft to replace the space shuttle when it retires in 2010, it named the new spacecraft Orion—but in honor of the constellation, not the forgotten atom bomb spaceship.

  Largest joint in the human body: the knee.

  WHERE’D THEY GO?

  Remember when a milkman dropped off a quart each morning? Did a TV repairman ever come to the house to fix your set? Today a lot of once-common jobs are just distant memories.

  TV REPAIRMEN. Before TV there was radio…and radio repairmen. When TV replaced radio as the dominant form of entertainment in the early 1950s, many radio fixers switched over to TVs. Like early radios, TVs were expensive and made up of a lot of tubes, wires, and other parts that could be replaced, but fixing one was specialized work that had to be done by trained professionals. As the average household income increased (from $4,100 in 1955 to $30,000 in 1990), the price of TVs went down (factoring in inflation, a TV cost half as much in 1990 as it did in 1955). By the 1990s, it no longer made sense to pay someone to fix a TV set. If a repairman charged $20 an hour (plus the cost of parts), even a minor repair could cost at least $60. It made more sense to do what most people do today: throw it away and buy a new one. In 1978 there were more than 20,000 TV repair facilities in the United States. Today, there are fewer than 5,000. That number will continue to dwindle as more and more Americans replace their old, difficult- and expensive-to-fix tube-based TVs with stable, tubeless high-definition televisions.

  CHANDLERS. Ancient Romans created wicked candles from papyrus rolled in melted beeswax or animal fat. As the only source of artificial light for centuries, candles became a major commodity in western Europe, where guilds of candle makers (also called chandlers) handcrafted their candles and sold them door to door. As a profession, candle-making began to die out in the 1830s, when inventor Joseph Morgan devised an automated candle-making machine (a piston ejected candles out of a mold as they solidified) that could produce 1,500 in an hour. Then came the death knell: the light bulb, introduced in 1879, which quickly made chandlers obsolete.

  MILKMEN. According to the Department of Agriculture, more than half of the milk sold in the United States in 1950 (along with eggs, ice cream, and butter) was delivered to homes by milkmen, all employed by independent local dairies. By 1963 deliveries had dropped to a third of all milk sold. As of 2001, the last year for which statistics are available, it was less than 0.4%. Milkmen began to disappear in the late 1960s, when dairies were increasingly operated by large national franchises and 98% of American homes had refrigerators, reducing the need for daily milk delivery. But the real milkman killer was probably the supermarket. Stores use milk as a “loss leader”; for instance, milk can cost as much as $4 per gallon wholesale, but they sell it for as little as $2, making up the difference with the profit on other items. That’s an economic system with which small dairies can’t compete.

  “If you steal a clean slate, does it go on your record?”—Anonymous

  BRICK MAKERS. Brick-making began in ancient Persia, and the process remained virtually the same for centuries. Brick makers dug for clay and left it exposed to the elements over winter, allowing the freeze-thaw-freeze-thaw cycle to tenderize the clay. In the spring, they ground the clay into a powder, placed it in a soaking pit with water to get it to the right consistency, then mixed it by hand. Next, a chunk of clay was rolled in sand, placed in a mold, left to dry in the sun for two weeks, then fired in a wood- or coal-burning kiln. As it did with candle-making, the Industrial Revolution of the 19th century ended artisan brick-making. Ever since, bricks have been made in factories with temperature control and other processes that make all bricks uniform.

  COBBLERS. Also known as shoemakers, cobblers arrived in the Americas with the first English colonists who settled at Jamestown (the voyage was funded in part by a London shoemakers’ guild). The trade was fully established by 1616, with cobblers going door to door soliciting made-to-order shoes. It wasn’t until the 1750s that shoemakers abandoned made-to-order work and simply made whatever shoes they wanted to (in various sizes) and offered them for sale. And up until the 1840s, all shoes were made by hand with roughly the same tools used for centuries—knives to cut the leather, pliers to stretch it, awls to punch holes in it, and a needle and thread to stitch it into a shoe. The trade survives today, but only for repairs. Most shoemaking has been mechanized since 1846—the year Elias Howe invented the sewing machine.

  A group of ferrets is a business, a group of gerbils is a horde, and a group of hedgehogs is a prickle.

  FOUNDING FOOD-ERS

  This article is the best thing since sliced bread.

  OTTO ROHWEDDER

  Rohwedder—a jeweler from Davenport, Iowa—was convinced that the world needed evenly sliced bread, and in 1912 he started working on plans and prototypes for his first bread slicer. Bad news: they were lost in a fire in 1917. Good news: since he had to start over, this time he addressed bakers’ complaints that presliced bread would dry out and consumers wouldn’t want to buy it. So in 1928 he came up with a new machine that would both slice and wrap each loaf of bread. The following year, the stock market crashed, sending the country into the Great Depression. Rohwedder had to sell his invention to the Micro-Westco Company, but the company hired him as a VP to work in the newly formed Rohwedder Bakery Machine Division. The concept of sliced bread still didn’t catch on…until the Wonder Bread Company began nationwide sales of sliced, packaged bread in 1930 (using its own version of Rohwedder’s machine). Sliced bread suddenly took off in a big way—along with the pop-up toaster, which had been invented in 1926 but wasn’t a very popular consumer item until presliced bread came along. During World War II, the federal government banned sliced bread, claiming that the use of waxed paper for wrapping it was wasteful. But the no-slicing ban lasted just three months: Public outcry was so loud that the government lifted it (and anyway, the savings turned out to be minimal). Want to see the original model of Rohwedder’s 1928 bread slicer? It’s in the Smithsonian Institution in Washington, D.C.

 
