Abstract
Analyses of cancer mortalities among Japanese atomic bomb survivors, limited to the lowest DS86 (1950–1985) dose groups, suggest that excess radiogenic risks per unit dose for single exposures below 20 cGy may be two to three times higher than those in the 20- to 200-cGy range. Low dose exposures of survivors, down to a few centigrays, have resulted in excess leukemias, and prenatal exposures have resulted in brain damage. In normal populations, in utero exposures and exposures of children to X rays at doses below and just above 1 cGy have resulted in excess cancers. Some, though not all, studies of nuclear workers exposed to much lower fractionated doses found excess radiogenic risk values per unit dose more than three times those for the 0- to 20-cGy dose range among the Japanese survivors. These positive findings of cancers conflict with the assumption that low dose, low rate exposures have a reduced biological effectiveness compared with linear extrapolations from higher doses [the dose rate effectiveness factor (DREF) hypothesis]. Several radiation studies on human or hybrid human cells found that, in contrast with some high dose animal studies (the main basis for the DREF hypothesis), mutation yields are linearly related to accumulated exposures and are not diminished by fractionation. Genetic effects among nuclear workers, not found among the offspring of atomic bomb survivors, have been linked to doses in the centigray range. First-day mortality, neonatal mortality, and stillbirth rates have also been linked to contamination from radioactive fallout. Several of these apparent discrepancies could be reconciled if more complex radiation interactions with biological functions were recognized (for example, the established, nonlinear, radiation-related effects on cell membranes). These would result in supralinear (convex) dose-effect curves at the lowest doses, becoming linear at medium doses.
Several relevant epidemiologic studies, and mutagenesis studies on human cells, are summarized briefly in this report.