by Bob Berman
Two previous models of the machine had hardware locks that prevented the device from accidentally switching to its high-dose setting. But the Therac-25 relied solely on software for protection against mishaps.
Because newly written software typically contains one error per five hundred lines of code, and because the Therac-25 had 101,000 lines of code, errors should have been anticipated.
As it turned out, a latent bug on a single line of code upped the radiation intensity. The line was supposed to order the machine to increase the beam strength by a factor of six. Instead it commanded an increase of 10E6, ten raised to the sixth power, which meant a millionfold increase. In practice, the machine couldn’t deliver such power, so it responded by “maxing out” its radiation at full power. This “merely” boosted the dose by a factor of one hundred.
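The arithmetic of that single-line failure can be sketched in a few lines of Python. Everything here is hypothetical (the real Therac-25 ran assembly code on a PDP-11, and these names are invented for illustration); the point is only how a misplaced exponent collides with a hardware ceiling:

```python
# Illustrative sketch only -- hypothetical names, not the actual Therac-25 code.
BASE_INTENSITY = 1.0   # arbitrary units
MACHINE_MAX = 100.0    # the hardware can deliver at most 100x the base intensity

def set_intensity(multiplier):
    """Request a beam intensity; the hardware clamps at its physical maximum."""
    requested = BASE_INTENSITY * multiplier
    return min(requested, MACHINE_MAX)

print(set_intensity(6))        # 6.0   -- what the line was supposed to command
print(set_intensity(10 ** 6))  # 100.0 -- the millionfold request "maxes out"
```

The buggy request still ends up roughly seventeen times the intended dose, even after the hardware ceiling intervenes.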
At the medical facility, the operator saw the message “malfunction 54” appear on the screen. However, the machine’s instruction manual contained no information about what this might mean. So the operator simply overrode the error message by typing the letter p, for “proceed.” The company’s initial insistence that nothing could possibly go wrong kept the machines in place until new hardware protections and direct-dosage monitors eventually solved the problem.
The Therac-25 aside, fatal radiation encounters outside the realm of medical treatment have been extremely rare. But one unfolded in 1946. Louis Alexander Slotin, born in 1910, was a Canadian physicist who worked on the Manhattan Project at the famous (but at the time still secret) Los Alamos National Laboratory. Slotin’s job and expertise, in the years just before and after the production of the first atomic bombs, involved experimenting with enriched uranium and plutonium to determine their critical-mass values—and then actually assembling them as weapons. He became known as the “chief armorer of the United States.”
Slotin’s hazardous “criticality testing” involved bringing subexplosive quantities of those fissile materials to just below the critical-mass value, beyond which a runaway chain reaction would ensue. Imagine if flirting with the possibility of a nuclear chain reaction were part of your job every day. Scientists referred to this kind of work as “tickling the dragon’s tail.” It was Slotin, on July 16, 1945, who assembled the core for the device used in the first detonation of a nuclear weapon, held at Alamogordo, New Mexico, in the now famous test code-named Trinity.
Because he had done it several times before, perhaps Slotin was getting just a bit too casual. On May 21, 1946, with seven other Los Alamos experts standing off to the side observing him, Slotin placed two half spheres of beryllium around a fourteen-pound plutonium core. Slotin was holding the upper nine-inch beryllium hemisphere as though it were a bowling ball, with his left thumb stuck through a hole drilled in it, all the while keeping the half spheres separated with the blade of his screwdriver. It was critical to keep the hemispheres from approaching each other lest they trigger a nuclear reaction. Normally spacers or shims would have been used, but just as professional electricians often work on live wires that you or I would never go near without first killing the circuit breaker, Slotin was performing with the mien of someone very comfortable with his job.
At 3:20 p.m., the screwdriver slipped, and the upper beryllium hemisphere fell downward. It fell less than an inch, and the slip lasted only a single second—but an immediate “critical reaction” ensued. The sudden flood of radiation made the air glow a blue color—later recognized as the Cherenkov effect, caused by subatomic particles moving faster than light travels through air.
Simultaneously, everyone in the room felt a wave of great heat. Slotin later explained that he’d also tasted an intense sourness in his mouth. He immediately yanked his hand upward and threw the beryllium hemisphere onto the floor, which abruptly ended the chain reaction. But it was far too late. He must have already suspected that he’d received a fatal radiation dose. Indeed, he started vomiting before he even got to the hospital, and despite nonstop intensive care, including intravenous fluids, he died nine days later. His symptoms revealed what happens to any animal body when its cells are destroyed by radiation. His sad ordeal included swollen hands, severe diarrhea, intestinal paralysis, gangrene, and finally the failure of vital organs.
Of the seven others in the room, the person standing closest to the plutonium also needed hospitalization for almost a month, but he survived, although the accident left him with permanent neurological and vision problems. He died twenty years later, at the age of fifty-four. Another of the observers also had his life cut short and died nineteen years later, at age forty-two, of acute myeloid leukemia—a typical consequence of very high radiation exposure.
Such exposure to lethal radiation has, thankfully, been very uncommon except among Hiroshima and Nagasaki survivors. As much as they suffered, the vast majority of these survivors were exposed to far less than 300 rems, or 3,000 millisieverts, of radiation, a level that’s fatal to half the people exposed to it.* Yet these were exactly the estimated exposures given to some patients who were subjected to early X-rays. Even when exposure reaches the level of 100–200 rems, or 1,000–2,000 millisieverts, people initially live through the experience, but their cancer risk is greatly increased.
And these were the levels that physicists started self-delivering during the early years of the twentieth century. The reason was simple: for more than two decades before that, there were as many “experts” who thought radiation was salutary as there were scientists who believed it might be harmful. Some early medical applications of radiation were in use for decades after the first nuclear weapons were developed. In 1981, two Montana “health spas” distributed pamphlets advertising the benefits of radon gas in curing “arthritis, sinusitis, migraine, eczema, asthma, hay fever, psoriasis, allergies, diabetes, and other ailments.” The ads explained that sitting in abandoned mine shafts (after paying an entrance fee, of course) and breathing radioactive gases causes your joints to loosen and alleviates various aches and pains. The pamphlets failed to mention that at that time, the medical community had known for more than ten years that radon gas in uranium mines was causing miners to suffer a 500 percent increase in their risk of lung cancer.
Where is radiation found? Absolutely everywhere. It comes up from the ground and rains down from the sun and stars. Our atmosphere blocks some of it, but the higher up you go, the more you get. The average person at sea level gets 360 millirems (or 3.6 millisieverts) per year, of which 82 percent comes from natural sources. But thanks to the exploding use of CT scans during the past quarter century, some authorities now say that the true average US radiation dose is more like 600 mrem annually.
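The two unit systems in this chapter track each other exactly: 1 rem equals 10 millisieverts, so 1 millirem is 0.01 millisievert. A two-line helper (purely for the reader's convenience, not from the book) makes the conversions above easy to check:

```python
# 1 rem = 10 millisieverts (mSv), so 1 millirem (mrem) = 0.01 mSv.
def mrem_to_msv(mrem):
    """Convert a dose in millirems to millisieverts."""
    return mrem * 0.01

print(mrem_to_msv(360))      # 3.6    -- the average annual US background dose
print(mrem_to_msv(300_000))  # 3000.0 -- the 300-rem level fatal to half those exposed
```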
Natural radiation is responsible for some of the spontaneous tumors that have always plagued the human race. But there is ongoing scientific debate over whether very low doses can harm us. Tibetans and Peruvians who live at high altitudes and therefore receive much more radiation than those living at low altitudes do not suffer from higher rates of leukemia, and a major 2006 French study showed no increased cancer incidence in children who live near nuclear power plants. There is experimental evidence from animal studies showing that exposure to radiation can cause genetic defects. However, studies of the survivors of Hiroshima and Nagasaki give no indication that this is the case in humans. Perhaps surprisingly, considering all the knowledge collected on radiation effects, there is still no definite consensus as to whether low levels of ongoing exposure to natural background radiation carries any health risk—even though risk has been demonstrated for exposure at a level just a few times higher. To be specific, consider the natural annual background rate of 360 mrem, meaning the amount one normally gets before receiving radiation from artificial sources, such as medical X-rays. This is widely considered to be harmless. But a single CT scan can deliver twice that much radiation, and medical authorities now assess a person’s future cancer risk from a single CT scan as one in two thousand, which is certainly not zero.
But anyone concerned about radiation can easily calculate his or her personal annual dose. Start with the biggest sources:
Award yourself 26 mrem just for living on the surface of the earth. And don’t blame Gaia—there is no planet in the entire solar system that gets less radiation than we do. We’re actually a relatively safe haven.
Add 5 mrem for each thousand-foot elevation of your home. If you live in Denver, you have to add 25–30 mrem. If your home is in Leadville, Colorado, up at around ten thousand feet, you receive 60 mrem more than the folks in Boston, at sea level, do.
Is your home stone, brick, or concrete? Add 7 mrem. These materials are naturally slightly radioactive. Only wood-frame homes are essentially free of radiation. Your real-estate agent never mentioned that, did she?

Do you have a below-grade basement, which usually means a cellar without any windows or a cellar with narrow windows placed high up, near the basement’s ceiling? If so, you have a huge potential danger if radon is present. The likelihood of this varies greatly depending on which area of the country you live in. For example, Northern California and northern New York State have virtually none. But most of southern New York State has high radon levels just below the ground. If your home is on such a spot, and cracks in the basement are letting it in, add at least 250 mrem annually. This is a biggie—your single greatest radiation source.

Add 40 mrem if you drink water and eat food. Obviously, this is unavoidable. The most radioactive foods are bananas, potatoes, beer, low-sodium salt (salt substitutes), red meat, lima beans, and Brazil nuts, though the radium in Brazil nuts isn’t absorbed by the body.
Add 50 mrem for the natural radiation emanating from within your own body, just as it emanates from all those banana splits and their glowing potassium. (A banana’s potassium-40 doesn’t really glow. Sorry.)
Add 1 mrem from radiation in the air left over from those atomic tests in the 1950s, conducted in northern Russia, in New Mexico, and on several South Pacific islands. If you were alive back then, you knew the world’s politicians were screwing around with everyone’s health. Just be thankful for the 1963 Limited Nuclear Test Ban Treaty. Before it was signed, our atmosphere’s global carbon-14 had doubled, to around fifty tons. Since then, it’s returned to almost its natural quantity, which is half that amount.
One side effect of the change in atmospheric carbon-14 is that it has enabled researchers to use a technique called bomb pulse dating for determining the birth year of any individual. That’s right—they can tell how old you are simply by measuring the amount of carbon-14 in your tooth enamel and in the lenses of your eyes. Researchers have been using such radiometric dating for the last half century. This method relies on the fact that half of any sample of carbon-14 changes into nitrogen in 5,730 years, and that every living or once-living organism, including the cotton clothing worn by entombed pharaohs, has carbon-14.
Out of every trillion ordinary, non-radioactive carbon atoms in the air and therefore in our bodies, there is one of carbon-14 (14C), which acts chemically like ordinary carbon but has two extra neutrons in its nucleus. When we die, we stop taking new carbon in. The most stable, commonest carbon, carbon-12, lasts forever, but half our carbon-14 is gone in 5,730 years. Half the remaining sample (meaning three-quarters of the original) is gone after two half-lives, or 11,460 years.
So by measuring the ratio of 14C to 12C, we can tell how long ago a plant or animal died.
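The half-life arithmetic described above can be written out directly. This sketch (my own illustration, not from the book) computes both directions: how much carbon-14 survives after a given time, and how old a sample must be given its measured carbon-14 level:

```python
import math

HALF_LIFE_C14 = 5730.0  # years

def fraction_remaining(years, half_life=HALF_LIFE_C14):
    """Fraction of a radioactive sample left after `years`."""
    return 0.5 ** (years / half_life)

def age_from_ratio(ratio):
    """Years since death, given the measured carbon-14 level
    as a fraction of the level found in living tissue."""
    return -HALF_LIFE_C14 * math.log2(ratio)

print(fraction_remaining(5730))   # 0.5   -- one half-life
print(fraction_remaining(11460))  # 0.25  -- two half-lives: three-quarters gone
print(age_from_ratio(0.25))       # 11460.0 years
```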
The atmosphere’s ratio of normal 12C to radioactive 14C remains quite steady over time. But thanks to nuclear testing from the mid-1940s to the early 1960s, far more carbon-14 was suddenly in the air. Back then, the media generally sounded louder concerns about strontium-90 than they did about 14C. And for good reason: fallout from seventeen years of intense atmospheric nuclear testing dispersed strontium-90 throughout the entire globe. However, with a half-life of 28.8 years, around 75 percent of it had decayed away by 2017.
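That 75 percent figure is easy to sanity-check with the same half-life formula, if we assume (my simplification, not the author's) that the bulk of the strontium-90 dates from the end of atmospheric testing in 1963:

```python
# Strontium-90 decay check: 28.8-year half-life, counted from 1963.
half_life_sr90 = 28.8
years = 2017 - 1963                         # 54 years, just under two half-lives
remaining = 0.5 ** (years / half_life_sr90)
print(round(1 - remaining, 2))              # 0.73 -- roughly three-quarters decayed
```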
Similarly, nuclear bomb testing also released large amounts of cesium-137—which emits prodigious amounts of gamma radiation, the worst kind—into the atmosphere. But with a half-life of thirty years, that, too, has mostly decayed.
We would have had a bigger problem with carbon-14, given its worrisome 5,730-year half-life. But a wonderful mechanism came to our rescue: removal from the air by natural processes. This carbon has mostly been absorbed into the earth and the seas, so we’re no longer breathing the bulk of it as airborne radioactive carbon dioxide.
The testing of nuclear weapons between 1945 and 1963 unleashed more gamma rays than the earth had cumulatively received since before the Roman Empire. It also blew fifty tons of radioactive carbon-14 into the atmosphere—roughly double what’s present in the air naturally. (U.S. Army Photographic Signal Corps)
We also receive tiny amounts of radiation that only true hypochondriacs need to think about. Still, if you fall into that category, be aware that:
You get 1 mrem for each one thousand miles you travel by jet. A single coast-to-coast round-trip gives you 6 mrem. Think of it as frequent-flyer radiation.
Add 40 mrem for each medical or dental X-ray you have. Not much. Again, it’s those CT scans that present a significant hazard, especially whole-body scans.
The following sources are examples of radiation so minor that you can safely ignore it when you see it mentioned in some media story designed to give readers or viewers a scare:
Wearing an LCD watch delivers .06 mrem annually.
Living within fifty miles of a coal-fired power plant gives you .03 mrem. (That’s because coal and soot are slightly radioactive.)
Having two smoke detectors in the house: .02 mrem.
Living within fifty miles of a nuclear power plant: .009 mrem.
Being in the vicinity of the machine while your airport luggage is X-rayed: .002 mrem.
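The whole checklist reduces to simple addition. Here is a back-of-the-envelope tally using the chapter's values for one hypothetical profile (a Denver resident in a brick house who flies a fair amount; the profile is invented for illustration):

```python
# Annual dose tally in mrem, using the chapter's checklist values.
dose = 0
dose += 26      # living on the surface of the earth
dose += 5 * 5   # Denver sits near 5,000 feet: 5 mrem per 1,000 feet of elevation
dose += 7       # stone, brick, or concrete home
dose += 40      # food and water
dose += 50      # your own body's internal radioactivity
dose += 1       # leftover fallout from the bomb tests
dose += 20      # 20,000 miles of jet travel at 1 mrem per 1,000 miles
print(dose)     # 169 mrem -- well under the 360 mrem national average
```

Add a radon-laced basement (250 mrem or more) and the picture changes dramatically, which is why radon dominates the checklist.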
Scientific debate continues concerning the effect of supersmall amounts of radiation. The Mayo Clinic, the National Cancer Institute, the Health Physics Society, and the vast majority of the world’s epidemiologists have concluded that very low doses of radiation produce no health consequences at all. None. Zilch. Zero. But other scientists believe that very low doses (in the 1 mrem range) might create some small effect along the lines of one cancer death per forty million people.
So let’s say you live in Denver. You have radon in your basement, fly twenty thousand miles a year, and get one full-body CT scan per year. Should you worry? Well, you’ve probably upped your chance of getting cancer someday by about one in a thousand. But because everyone already has around a 20 or 25 percent chance of getting cancer, the increased risk is still relatively minor.
If radiation concerns you, get your basement tested for radon and, if necessary, install a venting fan. That’s a relatively painless way to reduce your radiation exposure by hundreds of millirems. Avoiding unnecessary CT scans, and maybe even skipping a few of those commercial flights, could cut out another 100–1,000 mrems per year.
Oh, yes, one more thing. Don’t even think about moving to Mars. Martian colonists may receive enough radiation in two years to destroy 13 percent of their brains.
Most of us would probably prefer to use our lifetime radiation allowance a little at a time. Soaking up the tropical sun, say, or flying to Bali. But the time may come when a few of us may squander it in a single shot—for example, by becoming an astronaut on an interplanetary odyssey. A two-year mission to Mars would expose you to more than the government’s lifetime radiation allowance for nuclear power plant workers. It’s a problem that will have to be solved if ever we are to colonize the Red Planet.
This presents no small problem for future astronauts. Probably the most hazardous environment is the vicinity of Jupiter, with its enormous radiation-trapping magnetosphere. Its attractive moon Europa, replete with a warm salt-water ocean, is the likeliest place for us to look for extraterrestrial life. But on Europa’s ice-covered surface, a person in a space suit would get a lethal radiation dose every ten seconds—the same dose you’d get standing thirty feet from the core of a one-gigawatt nuclear reactor.
But we’ll let future generations worry about that.
CHAPTER 16
The Atomic Quartet
It may be argued that there were more fundamental discoveries made during the 1890s than there were during any other decade in history. In 1897, in the excitement following Wilhelm Röntgen’s announcement of his discovery of X-rays, the world quickly learned of Joseph John Thomson’s discovery of the electron, the first-ever subatomic particle. At this same time, Lord Rayleigh and the Scotsman William Ramsay were discovering new elements almost monthly. One of those elements, helium, when heated in a lab and viewed through their spectroscope, displayed a set of bright colored lines that, like a fingerprint, perfectly matched the only unidentified solar emissions up to that point and thus revealed the last unknown substance existing in the sun. Because of this, the new element was named for the Greek god of the sun, Helios. It also turned out to be nothing less than the second-most-prevalent element in the universe. Another of these “noble gases” (so called because they don’t deign to sully themselves by combining with oxygen or anything else but generally remain in their pure state) is argon, discovered by Rayleigh and Ramsay in 1894. Argon, a major component of the air we breathe, makes up nearly 1 percent of the atmosphere and is surpassed in abundance only by nitrogen and oxygen, both of which had been discovered more than a century earlier.
Ramsay even passed high-voltage currents through these gases to create the first neon tubes (neon was yet another of his discoveries), and thus he became single-handedly responsible for the nocturnal surrealism that soon dominated the commercial districts of the world’s cities. Ramsay discovered more elements than any human being either before or since.