Strange Glow


by Timothy J Jorgensen


  TRIPLE JEOPARDY

  The island of Bikini itself, the actual site of most of the bomb test detonations, wasn’t resettled until 1969. This occurred once an AEC blue-ribbon panel had estimated the dose that returning islanders would receive from ingesting local food contaminated with radioactivity and deemed the levels to be safe. Unfortunately, the panel based its recommendation to resettle Bikini on a report with a misplaced decimal point—a typographical error that underestimated coconut consumption a hundredfold. Scientist Gordon Dunning, who authored the report, freely admitted, “We just plain goofed.”46

  Since coconut was the Bikinians’ major foodstuff, the typographical error meant that the repatriated islanders had actually ingested massive amounts of radioactivity, perhaps more than any other known population. When the problem was discovered in 1978, they were evacuated from their home island once again.

  This decimal point error was the third catastrophic error that the islanders suffered at the hands of the experts. As we’ve seen, the first error was when Los Alamos nuclear physicists erroneously assumed that lithium-7 would not contribute to a fusion reaction—a mistake that resulted in doubling the hydrogen bomb’s yield, thus driving radioactive fallout farther than anyone expected and right onto the displaced islanders’ laps. The second error was the failure of doctors to appreciate that the physiology of the human thyroid put it at high risk from radioactive iodine. Thus, danger to the islanders was denied or downplayed three times, and the islanders paid the price for mistaken expert opinions.

  When all was said and done, the Marshall Islanders living on fallout-contaminated atolls had breathed, absorbed, drunk, and eaten considerable amounts of radioactivity. In the 1960s, thyroid cancers began to appear, and then other cancers were diagnosed, including leukemias (blood cancers). United States government physicians provided ongoing medical care. During the course of this treatment, Brookhaven National Laboratory scientists collected the islanders’ health data for analysis. Health data was also collected from the natives of Kwajalein Atoll, which was not subject to the fallout, in order to have comparison data on an otherwise similar, but unexposed, group (i.e., a control population). This data set remains the largest study of the health effects of fallout radioactivity on humans, and future safety limits for fallout radioisotope exposures would be based largely on findings from the Marshall Islands health study. The Marshall Islanders became to fallout what the radium girls were to radium. And both groups were used by the rest of us as miners’ canaries.

  The Brookhaven study of the Marshall Islanders officially ended in 1998. Much was learned about the risks posed by radioactivity absorbed into the body. Currently, the Department of Energy (DOE) continues to provide annual health screenings for radiation-exposed Marshallese at clinics located both in the Marshall Islands and the United States. The DOE includes information on the islanders’ ongoing health conditions in its annual report to Congress.

  In June 1983, the United States and the Republic of the Marshall Islands entered into an agreement in which the former recognized the contributions and sacrifices made by the Marshallese as a result of US nuclear testing. A Marshall Islands Nuclear Claims Tribunal was established to dispense compensation for personal injury and property damage to those deemed to have health problems caused by the testing. As of 2003, over $83 million in personal injury awards had been granted to a total of 1,865 individuals who claimed a total of 2,027 radiation-related ailments.47 By May 2009, all funds awarded by the US Congress were completely exhausted, with $45.8 million in unpaid personal injury claims still owed to the Marshallese fallout victims.48 Many have died while waiting to be paid. It is estimated that over 50% of the valid claimants died before realizing the full amount of their awarded compensation.49 Currently, the tribunal exists in name only; that is, until Congress restores funding, which by all accounts is extremely unlikely to happen.

  THE SALT OF THE EARTH

  Biological and chemical weapons have never proved to be effective weapons of mass destruction largely because they are hard to control. Once biological and chemical weapons are unleashed, environmental conditions can throw the weaponized agents back in the faces of their users. Such was the case during the Battle of Loos, France, in World War I, when shifting wind blew chlorine gas released on German soldiers back into the trenches of the British soldiers. If a weapon’s lethality can’t be controlled, it isn’t of much military use. In that regard, radioactive fallout presents a similar military problem; it can come back and get you. After seeing how fallout-contaminated battleships were as incapacitated by radioactivity as if they had been sunk, Admiral William H. P. Blandy (1890–1954), the first commander of the Bikini Atoll nuclear bomb tests, remarked, “This is a form of poison warfare.”50

  Uncontrollable fallout is a major obstacle to effective use of nuclear weapons. But if unwanted atmospheric fallout were minimized, while radioactivity in the targeted blast area were maximized, it might provide yet another means of making nuclear weapons more lethal to enemy combatants. Furthermore, the targeted radioactivity could make the enemy’s lands uninhabitable, thus thwarting recovery and deterring them from future aggression. This concept is reminiscent of the idea of the Roman general, Scipio Aemilianus (185–129 BC), who allegedly salted the earth of defeated Carthage at the end of the Third Punic War (146 BC) in order to avert a Fourth Punic War.51

  Neutrons provide an opportunity to salt the earth in another manner due to one of their unique physical properties. As it turns out, any high-speed neutrons that don’t happen to strike an atomic nucleus while traveling through matter eventually slow down to a stop and become “free” neutrons. Nature, however, doesn’t like free neutrons. Neutrons are supposed to live in atomic nuclei, not float around on their own without companion protons. So, once these neutrons stop moving, nearby atomic nuclei suck them up. The nuclei of some elements are better at doing this than others; nevertheless, they all do it to some extent, and the result is always the same. The nucleus with an additional neutron is now neutron rich. That is, it has moved farther away from the one-to-one neutron-to-proton ratio that stable nuclei prefer. And yes, as you may have guessed, most of these neutron-rich nuclei are then radioactive. The extra neutron bloats their stomachs, so the nuclei want to convert the neutron to a proton, and spit out a beta particle, to relieve their indigestion. These brand-new beta emitters have a range of half-lives, mostly short. There are some, however, like cobalt-60, that have half-lives as long as a year or more. Quite long enough to salt the earth.

  This unique ability of neutrons to make elements radioactive led physicist Leó Szilárd in 1950 to propose a concept for a new weapon usually called the cobalt bomb (or sometimes salted bomb).52 Such a nuclear bomb would have mixed within its warhead large quantities of cobalt-59, a nonradioactive metal that is very efficient at absorbing stopped neutrons. When this happens, the stable cobalt-59 becomes radioactive cobalt-60, with a half-life of five years. Not long enough to make land permanently uninhabitable, but likely long enough to put it out of commission for the remainder of the war.
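  The half-life arithmetic behind that claim is simple: after each half-life, half of the remaining cobalt-60 is gone, so the surviving fraction after t years is 2 raised to the power −(t/T½). A minimal Python sketch, using the five-year figure quoted above (the function name is ours, purely illustrative):

```python
# Fraction of cobalt-60 activity remaining after t years, assuming the
# ~5-year half-life quoted in the text (illustrative sketch only).

def fraction_remaining(t_years: float, half_life_years: float = 5.0) -> float:
    """Exponential decay law: N(t)/N0 = 2 ** (-t / T_half)."""
    return 2.0 ** (-t_years / half_life_years)

for t in (5, 10, 25, 50):
    print(f"after {t:2d} years: {fraction_remaining(t):.4f} of original activity")
```

  After five half-lives (25 years), about 3 percent of the activity remains; after ten (50 years), about 0.1 percent. That timescale matches the text’s point: long enough to deny the land for the rest of a war, but not forever.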

  No cobalt bombs are known to exist, but the concept has captured the public’s imagination. Cobalt bombs, or other types of salted bombs, have made regular appearances throughout pop culture, including some novels (On the Beach, by Nevil Shute), movies (Goldfinger and Dr. Strangelove), and television programs (Star Trek).

  A BITE WORSE THAN ITS BARK

  Before we leave hydrogen bombs and the fallout problems they pose, one other thing should be mentioned. Their efficiency in killing enemy troops can be greatly enhanced. Hydrogen bombs are like fission bombs—they destroy mostly by shock waves and fire. And although this may be a moot point to the civilian population, enemy armies can make manned fortifications that are resistant to shock waves and fire. It is, therefore, conceivable that soldiers might survive even a hydrogen bombing and then carry on the battle in the resulting wasteland. Neutrons can provide a solution to this problem of combatants who refuse to die.

  Neutrons, as we know, are both highly penetrating through lead (and other metallic shielding) and highly lethal compared to other types of radiation. Capitalizing on these two characteristics of neutrons, weapon scientists have been able to produce what are called enhanced radiation bombs with greatly increased yields of neutrons. This is achieved by using atomically inert shell casings, devices called x-ray mirrors, and other modifications to fusion bombs that allow a maximum number of neutrons to escape from the blast. These neutrons are able to penetrate armor and kill hiding soldiers by radiation sickness, even when the shock waves and fire do not.

  Such enhanced radiation weapons are more familiarly called neutron bombs, and urban legend has it that they kill people but spare buildings. Unfortunately for buildings, this is not true. Neutron bombs are just as damaging to buildings, but are better able to seek out and kill any humans hiding inside.

  Although neutron bombs are still fusion bombs, they are considered tactical weapons, rather than strategic weapons,53 because they can be built with smaller kilotonnages (tens to hundreds of kilotons) than conventional fusion bombs (thousands of kilotons) and are, therefore, more the size of very large fission bombs. As a result, individual neutron bombs conceivably could be used tactically to take out portions of large cities, as opposed to very large fusion bombs (e.g., 50,000 KT; 50 MT) that would completely destroy an area the size of the state of Rhode Island.

  DIVER DOWN

  The native Bikinians have still not returned to Bikini Atoll because of concerns about residual radioactivity that remains in the wild foodstuffs. In 1997, however, Bikini Atoll was opened to brief visits and tourism. With environmental radiation levels just slightly above the normal background, the Bikini Islands are now quite safe to visit and have become a popular destination for both scuba divers and fly fishermen. Bikini offers divers the opportunity to explore hundreds of different wrecks of historic American warships that were sunk in the atoll lagoon as part of the nuclear testing program. The lagoon waters are shallow and warm, with magnificent underwater visibility. For fly fishermen, Bikini offers an abundance of game fish, which have thrived and multiplied, unthreatened by any human presence.

  CHAPTER 9

  AFTER THE DUST SETTLES: MEASURING THE CANCER RISK OF RADIATION

  When you can measure what you are speaking about and express it in numbers you know something about it, but when you cannot … your knowledge is of a meager and unsatisfactory kind.

  —Lord Kelvin

  The average human has one breast and one testicle.

  —Des MacHale

  THE COLOR PURPLE

  Purple spots. That’s what the Japanese atomic bomb survivors feared most. In the years following the atomic bombings of Japan, the survivors began to realize that they were not completely out of the woods as far as health risks were concerned. Even those who never had any symptoms of radiation sickness were still prone to higher than normal rates of death. And the prelude to their death often took the form of small purplish blotches (petechiae) on their skin. Just as a sneeze was once feared to be a precursor to the plague by sixth-century Europeans, purple skin spots were thought to be harbingers of leukemia by the postwar Japanese. The difference was that the Europeans were wrong about the sneeze.

  Dr. Fumio Shigeto was exposed to the bomb’s radiation while waiting for a streetcar on that fateful August morning of 1945. He had just arrived in Hiroshima the week before, having recently been appointed the new vice-director of the Red Cross Hospital. Suffering only minor injuries from the blast, he made his way to work, where his subordinate, Dr. Terufumi Sasaki, was already busy treating bomb victims.

  It was Dr. Shigeto who first discovered that all the hospital’s x-ray film had been exposed by the bombing. In his younger days, he had spent some time studying radiology. He immediately grasped the implication of the exposed film, and was one of the first doctors at the scene to understand that many of the bomb victims were suffering from radiation sickness. Still, this would not be the only health effect of radiation that he was first to recognize.

  Dr. Shigeto continued to work at the hospital for many years. Since the patients came from the immediate vicinity, over time he saw many patients who had survived the bombing. In 1948, he noticed before anyone else that leukemia rates were elevated among the bomb survivors.1 Even the ones who had never previously exhibited any signs of radiation sickness sometimes succumbed to leukemia and died.

  Initially, Dr. Shigeto’s reports of elevated leukemia rates among bomb survivors were met with skepticism from the Japanese Ministry of Health and Welfare, as well as American public health officials. The Americans were particularly suspicious of reports from Japanese doctors about new health problems related to radiation. It was thought that the Japanese had a vested interest in exaggerating these problems, because they could then push for bomb victim reparations from the US government. In the postwar years, the bar was high for new claims about radiation health effects, especially if they came from the Japanese. Anticipating the doubters, Shigeto had religiously kept his own statistics on leukemia incidence among his patients, and his data suggested that leukemia was, in fact, on the rise.2 With time, as the leukemia rates reached unprecedented levels, it finally became well accepted that radiation can produce leukemia.

  By 1953, the elevated leukemia rates in the atomic bomb survivors had peaked, and began to subside the following year. Nevertheless, starting in 1955, increased rates of many other types of cancer were becoming apparent. The rates would keep rising until finally, in 1982, these cancers too would start to abate. In the end, a wide variety of cancers seemed to be elevated among atomic bomb survivors, and radiation came to be seen as an unselective carcinogen, with an ability to induce cancer in almost all body tissues. This distinguished radiation from the various chemical carcinogens, each of which tended to produce cancer in a limited spectrum of tissues. Radiation did not discriminate. Virtually all tissues seemed to be at some level of risk for cancer.

  When a 19-year-old Marshall Islander, Lekoj Anjain, who had been an infant on Rongelap Atoll in 1954 when the Bravo hydrogen bomb test occurred, developed an acute case of leukemia in 1972, it was widely acknowledged that his radiation exposure was the likely cause.3 He was flown to Brookhaven National Laboratory in New York, where the leukemia diagnosis was confirmed, and was then transferred to the National Institutes of Health in Maryland, where he was treated. Unfortunately, his disease was incurable at the time, and he died at the end of the year.

  Contracting leukemia at age 19 is not unheard of, but it is rare. (The median age of leukemia occurrence is 65.) His was the first Marshall Islander death attributable to radiation, and the cause of death was leukemia, consistent with the Japanese experience. This supported the contention that leukemias are the first cancers to appear in a population exposed to high radiation doses.

  THE MOST IMPORTANT PEOPLE LIVING

  By the time the atomic bomb survivors first started coming down with cancer, mankind’s experience with high-dose radiation exposures was vast: uranium miners, the first x-ray and radium patients, early radiologists, radium dial painters, the two atomic bombings in Japan, and Marshall Island fallout. In all of these encounters with radiation, the legacy was a lot of suffering for many people. Yes, it was quite evident that high doses of ionizing radiation could produce cancer. But exactly how dangerous was it? And what about exposures to lower doses of radiation that everyone is subjected to? Were they something to worry about as well?

  The different populations of radiation victims held the answers to these questions, but many scientists, many years of study, and a tremendous amount of money would be required to gather the necessary data and wring out useful information about cancer risk. Nonetheless, the epidemiologists—scientists who study the patterns of disease within populations—saw a great scientific opportunity to answer fundamental questions that people had wondered about since radiation was first discovered, and to answer those questions in a measurable way.

  Of all the radiation-exposed populations, the atomic bomb survivors held the most promise for highly precise and reliable findings. There are a number of scientific reasons for this that are beyond our scope here, but we can elaborate on the two main reasons: (1) Hiroshima and Nagasaki had very large numbers of people who represented all ages and both sexes, and were exposed simultaneously to a wide range of radiation doses over their whole bodies; and (2) individual doses for bomb survivors could potentially be determined very accurately based on an individual’s exact location at the time the bomb detonated (with some corrections made for building structures and ground contours).4 Just as most New Yorkers know exactly where they were standing when the World Trade Center was attacked on September 11, 2001, most atomic bomb survivors knew their precise location when the atomic bombs were detonated, and that information revealed their radiation dose.

  These two factors—large numbers of people who had reliable individual dose estimates—translated into very good statistics. And good statistics meant that the epidemiological studies would have the power to answer scientific questions without having to worry that the study’s findings were driven merely by chance.5 When you’re trying to determine the association between any particular exposure and any particular health outcome, statistical power is the name of the game. The atomic bomb population had more statistical power than any radiation population study either before or since, and it is unlikely ever to be surpassed in either the quantity or quality of data.

  All that was needed to capitalize on this unique opportunity to better understand radiation’s risk to human health was to track the exposed people over their entire lives; then it could be determined exactly what health effects cropped up among these people and, further, the rates at which these health effects appeared could be measured. No small feat. If the rates proved to be elevated in a radiation-exposed population relative to an unexposed control population, radiation could be inferred to be the likely cause. Should the study be successful, the product would be a complete catalog of all of radiation’s health effects. What’s more, there would also be a reliable estimate of the risk per unit radiation dose (i.e., the risk rate) for each health outcome.
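  The risk-rate idea in that last sentence reduces to simple arithmetic: subtract the control population’s disease rate from the exposed population’s rate, then divide the excess by the average dose. The Python sketch below uses invented numbers purely for illustration; none of these figures come from the actual survivor data, and real epidemiological analyses fit far more sophisticated dose-response models:

```python
# Back-of-the-envelope excess risk per unit dose. All numbers are
# hypothetical; this only illustrates the arithmetic described in the text.

def excess_risk_per_sievert(exposed_rate: float, control_rate: float,
                            mean_dose_sv: float) -> float:
    """Excess absolute risk per sievert: (exposed - control) / mean dose."""
    return (exposed_rate - control_rate) / mean_dose_sv

# Hypothetical cancer rates per 1,000 person-years:
exposed = 2.4    # rate in the radiation-exposed group
control = 1.8    # rate in the unexposed control group
dose = 0.2       # mean dose to the exposed group, in sieverts

# Excess of ~0.6 cases per 1,000 person-years, spread over 0.2 Sv,
# gives ~3 excess cases per 1,000 person-years per sievert.
print(excess_risk_per_sievert(exposed, control, dose))
```

  The point of the comparison group is visible in the formula itself: without the control rate to subtract, there is no way to tell how much of the exposed group’s cancer burden is the ordinary background rate and how much is radiation’s doing.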

 
