Concept: Radiation poisoning
Evaluation of radiation doses and associated risk from the Fukushima nuclear accident to marine biota and human consumers of seafood
- Proceedings of the National Academy of Sciences of the United States of America
- Published over 5 years ago
Radioactive isotopes originating from the damaged Fukushima nuclear reactor in Japan following the earthquake and tsunami in March 2011 were found in resident marine animals and in migratory Pacific bluefin tuna (PBFT). Publication of this information resulted in a worldwide response that caused public anxiety and concern, although PBFT captured off California in August 2011 contained activity concentrations below those from naturally occurring radionuclides. To link the radioactivity to possible health impairments, we calculated doses, attributable to the Fukushima-derived and the naturally occurring radionuclides, to both the marine biota and human fish consumers. We showed that doses in all cases were dominated by the naturally occurring alpha-emitter (210)Po and that Fukushima-derived doses were three to four orders of magnitude below (210)Po-derived doses. Doses to marine biota were about two orders of magnitude below the lowest benchmark protection level proposed for ecosystems (10 µGy⋅h(-1)). The additional dose from Fukushima radionuclides to humans consuming tainted PBFT in the United States was calculated to be 0.9 and 4.7 µSv for average consumers and subsistence fishermen, respectively. Such doses are comparable to, or less than, the dose all humans routinely obtain from naturally occurring radionuclides in many food items, medical treatments, air travel, or other background sources. Although uncertainties remain regarding the assessment of cancer risk at low doses of ionizing radiation to humans, the dose received from PBFT consumption by subsistence fishermen can be estimated to result in two additional fatal cancer cases per 10,000,000 similarly exposed people.
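The closing risk figure follows from linear no-threshold (LNT) arithmetic: excess cases ≈ dose × risk coefficient × population. A minimal sketch, assuming a nominal fatal-cancer risk coefficient of about 5 × 10⁻² per sievert (an assumption in the ICRP range, not the authors' exact parameter):

```python
# Hypothetical sketch of the LNT arithmetic behind the abstract's risk figure;
# the risk coefficient is an assumed nominal value, not the study's own.

RISK_PER_SV = 5e-2  # assumed fatal-cancer risk per sievert under LNT

def excess_fatal_cancers(dose_sv: float, population: int) -> float:
    """Expected excess fatal cancers when `population` people each receive `dose_sv`."""
    return dose_sv * RISK_PER_SV * population

# Subsistence fishermen: 4.7 uSv additional dose from Fukushima radionuclides
cases = excess_fatal_cancers(4.7e-6, 10_000_000)  # on the order of 2 per 10 million
```

With these assumed inputs the sketch lands near the abstract's "two additional fatal cancer cases per 10,000,000 similarly exposed people".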
Radiation dose rates now and in the future for residents neighboring restricted areas of the Fukushima Daiichi Nuclear Power Plant
- Proceedings of the National Academy of Sciences of the United States of America
- Published over 4 years ago
Radiation dose rates were evaluated in three areas neighboring a restricted area within a 20- to 50-km radius of the Fukushima Daiichi Nuclear Power Plant in August-September 2012 and projected to 2022 and 2062. Study participants wore personal dosimeters measuring external dose equivalents, almost entirely from deposited radionuclides (groundshine). External dose rate equivalents owing to the accident averaged 1.03, 2.75, and 1.66 mSv/y in the village of Kawauchi, the Tamano area of Soma, and the Haramachi area of Minamisoma, respectively. Internal dose rates estimated from dietary intake of radiocesium averaged 0.0058, 0.019, and 0.0088 mSv/y in Kawauchi, Tamano, and Haramachi, respectively. Dose rates from inhalation of resuspended radiocesium were lower than 0.001 mSv/y. In 2012, the average annual doses from radiocesium were close to the average background radiation exposure (2 mSv/y) in Japan. Accounting only for the physical decay of radiocesium, mean annual dose rates in 2022 were estimated as 0.31, 0.87, and 0.53 mSv/y in Kawauchi, Tamano, and Haramachi, respectively. These simple and conservative estimates are comparable with variations in the background dose, and are unlikely to exceed the ordinary permissible dose rate (1 mSv/y) for the majority of the Fukushima population. Health risk assessment indicates that post-2012 doses will increase lifetime solid cancer, leukemia, and breast cancer incidence by 1.06%, 0.03%, and 0.28%, respectively, in Tamano. This assessment was derived from short-term observations with uncertainties and did not evaluate the first-year dose or radioiodine exposure. Nevertheless, this estimate provides perspective on the long-term radiation exposure levels in the three regions.
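The decay-only projection can be reproduced from the two radiocesium half-lives. A minimal sketch, assuming Cs-134 contributed roughly 65% of the external dose rate in 2012 (an assumption consistent with the two isotopes' relative gamma emissions, not a figure reported by the study):

```python
import math

# Decay-only projection of the external radiocesium dose rate. The 65/35
# Cs-134/Cs-137 dose split in 2012 is an illustrative assumption.

T_HALF = {"Cs-134": 2.065, "Cs-137": 30.17}  # physical half-lives in years

def projected_dose_rate(d0: float, frac_cs134: float, years: float) -> float:
    """Dose rate after `years` of pure physical decay of the two isotopes."""
    decay = lambda t_half: math.exp(-math.log(2) * years / t_half)
    return d0 * (frac_cs134 * decay(T_HALF["Cs-134"])
                 + (1 - frac_cs134) * decay(T_HALF["Cs-137"]))

# Kawauchi: 1.03 mSv/y measured in 2012, projected 10 years ahead to 2022
d_2022 = projected_dose_rate(1.03, 0.65, 10.0)  # close to the study's 0.31 mSv/y
```

Under these assumptions the projection lands near the 0.31 mSv/y the study reports for Kawauchi in 2022; the fast-decaying Cs-134 component is almost gone after a decade.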
Analyses of (131)I, (137)Cs and (134)Cs in airborne aerosols were carried out on daily samples in Vilnius, Lithuania, after the Fukushima accident during March-April 2011. The activity concentrations of (131)I and (137)Cs ranged from 12 μBq/m(3) and 1.4 μBq/m(3) to 3700 μBq/m(3) and 1040 μBq/m(3), respectively. The activity concentration of (239,240)Pu in one aerosol sample collected from 23 March to 15 April 2011 was found to be 44.5 nBq/m(3). The two maxima found in radionuclide concentrations were related to complicated long-range air mass transport from Japan across the Pacific, North America and the Atlantic Ocean to Central Europe, as indicated by modelling. HYSPLIT backward trajectories and meteorological data were used to interpret the variations in activity of the measured radionuclides observed at the site of investigation. (7)Be and (212)Pb activity concentrations and their ratios were used as tracers of vertical transport of air masses. The Fukushima data were compared with data obtained during the Chernobyl accident and in the post-Chernobyl period. The activity concentrations of (131)I and (137)Cs were found to be four orders of magnitude lower than those measured during the Chernobyl accident. The (134)Cs/(137)Cs activity ratio was around 1, with only small variations. The (238)Pu/(239,240)Pu activity ratio in the aerosol sample was 1.2, indicating the presence of spent fuel of a different origin from that of the Chernobyl accident.
A cataract is a clouding of the lens that reduces light transmission to the retina, and it decreases the visual acuity of the bearer. The prevalence of cataracts in natural populations of mammals, and their potential ecological significance, are poorly known. Cataracts have been reported to arise from high levels of oxidative stress, and a major cause of oxidative stress is ionizing radiation. We investigated whether elevated frequencies of cataracts are found in the eyes of bank voles Myodes glareolus collected from natural populations in areas with varying levels of background radiation in Chernobyl. We found high frequencies of cataracts in voles collected from different areas in Chernobyl. The frequency of cataracts was positively correlated with age, and in females also with the accumulated radiation dose. Furthermore, the number of offspring in female voles was negatively correlated with cataract severity. The results suggest that cataracts primarily develop as a function of ionizing background radiation, most likely as a plastic response to high levels of oxidative stress. It is therefore possible that the elevated levels of background radiation in Chernobyl affect the ecology and fitness of local mammals both directly, through, for instance, reduced fertility, and indirectly, through increased cataractogenesis.
Radiation cataracts develop as a consequence of the effects of ionizing radiation on the developing lens of the eye, with an opaque lens reducing or eliminating the bearer's ability to see. Therefore, we would expect cataracts to be associated with reduced fitness in free-living animals.
An assessment of the external and internal radiation exposure levels, which includes calculation of effective doses from chronic radiation exposure and assessment of long-term radiation-related health risks, has become mandatory for residents living near the nuclear power plant in Fukushima, Japan. Data for all primary and secondary school children in Minamisoma who participated in both external and internal screening programs were employed to assess the annual additional effective dose acquired due to the Fukushima Daiichi nuclear power plant disaster. In total, 881 children took part in both internal and external radiation exposure screening programs between 1 April 2012 and 31 March 2013. The additional effective doses ranged from 0.025 to 3.49 mSv/year, with a median of 0.70 mSv/year. While 99.7% of the children (n = 878) had no detectable internal contamination, 90.3% of the additional effective dose was the result of external radiation exposure. This finding is relatively consistent with the doses estimated by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR). The present study showed that the level of annual additional effective doses among children in Minamisoma has been low, even after inter-individual differences were taken into account. The dose from internal radiation exposure was negligible, presumably due to the success of contaminated food control.
A record-based case-control study of natural background radiation and the incidence of childhood leukaemia and other cancers in Great Britain during 1980-2006
- Leukemia: official journal of the Leukemia Society of America, Leukemia Research Fund, U.K.
- Published over 6 years ago
We conducted a large record-based case-control study testing associations between childhood cancer and natural background radiation. Cases (27 447) born and diagnosed in Great Britain during 1980-2006 and matched cancer-free controls (36 793) were from the National Registry of Childhood Tumours. Radiation exposures were estimated for the mother's residence at the child's birth from national databases, using the County District mean for gamma rays, and a predictive map based on domestic measurements grouped by geological boundaries for radon. There was a 12% excess relative risk (ERR) (95% CI 3, 22; two-sided P=0.01) of childhood leukaemia per millisievert of cumulative red bone marrow dose from gamma radiation; the analogous association for radon was not significant (ERR 3%; 95% CI -4, 11; P=0.35). Associations for other childhood cancers were not significant for either exposure. The excess risk was insensitive to adjustment for measures of socio-economic status. The statistically significant leukaemia risk reported in this reasonably powered study (power ∼50%) is consistent with high-dose-rate predictions. Substantial bias is unlikely, and we cannot identify mechanisms by which confounding might plausibly account for the association, which we regard as likely to be causal. The study supports the extrapolation of high-dose-rate risk models to protracted exposures at natural background exposure levels.
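Under the linear ERR model such studies fit, relative risk scales as RR = 1 + ERR × dose. A minimal sketch using the headline central estimate of 12% per mSv; the 4 mSv example dose is illustrative, not a value from the study:

```python
# Illustrative linear excess-relative-risk (ERR) model built from the study's
# central estimate: 12% ERR of childhood leukaemia per mSv of cumulative
# red-bone-marrow gamma dose (95% CI 3-22%).

ERR_PER_MSV = 0.12

def relative_risk(cumulative_dose_msv: float) -> float:
    """Relative risk versus zero dose under a linear ERR model."""
    return 1.0 + ERR_PER_MSV * cumulative_dose_msv

# e.g. a hypothetical child with 4 mSv cumulative marrow dose from gamma background
rr = relative_risk(4.0)  # 1 + 0.12 * 4 = 1.48
```

The wide confidence interval means the same dose could correspond to anything from a 12% to an 88% relative increase at 4 mSv.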
In this report, we have reviewed the basic features of the accident processes and radioactivity releases that occurred in the Chernobyl accident (1986) and in the Fukushima-1 accident (2011). The Chernobyl accident was a power-surge accident that was caused by a failure of control of a fission chain reaction, which instantaneously destroyed the reactor and building, whereas the Fukushima-1 accident was a loss-of-coolant accident in which the reactor cores of three units were melted by decay heat after losing the electricity supply. Although the quantity of radioactive noble gases released from Fukushima-1 exceeded the amount released from Chernobyl, the area of land severely contaminated by (137)Cesium ((137)Cs) was 10 times smaller around Fukushima-1 than around Chernobyl. The differences in the accident process are reflected in the composition of the discharged radioactivity as well as in the composition of the ground contamination. Volatile radionuclides (such as (132)Te-(132)I, (131)I, (134)Cs and (137)Cs) contributed to the gamma-ray exposure from the ground contamination around Fukushima-1, whereas a greater variety of radionuclides contributed significantly around Chernobyl. When radioactivity deposition occurred, the radiation exposure rate near Chernobyl is estimated to have been 770 μGy h(-1) per initial (137)Cs deposition of 1000 kBq m(-2), whereas it was 100 μGy h(-1) around Fukushima-1. Estimates of the cumulative exposure for 30 years are 970 and 570 mGy per initial deposition of 1000 kBq m(-2) for Chernobyl and Fukushima-1, respectively. Of these exposures, 49% and 98% were contributed by radiocesium ((134)Cs + (137)Cs) around Chernobyl and Fukushima-1, respectively.
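The 30-year cumulative exposures come from integrating the decaying ground-shine dose rate over time. A generic single-nuclide sketch of that integral (weathering and soil migration, which full estimates also involve, are ignored here; the starting dose rate is illustrative, not the paper's):

```python
import math

# Cumulative external dose over t_years from a nuclide whose ground-shine
# dose rate decays with physical half-life t_half. The 10 mGy/y starting
# rate below is an illustrative assumption, not a value from the review.

def cumulative_dose(d0_per_year: float, t_half: float, t_years: float) -> float:
    """Closed-form integral of d0 * exp(-lambda * t) from 0 to t_years."""
    lam = math.log(2) / t_half
    return d0_per_year / lam * (1 - math.exp(-lam * t_years))

# Cs-137 (half-life 30.17 y) ground shine starting at 10 mGy/y, over 30 years:
total = cumulative_dose(10.0, 30.17, 30.0)  # well below the naive 10 * 30 = 300 mGy
```

Even for long-lived (137)Cs, roughly half the initial activity is gone within the 30-year window, which is why the cumulative dose falls well short of the initial rate multiplied by 30 years.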
The concentrations of natural radionuclides in soils from the area affected by uranium mining at Stara Planina Mountain in Serbia were studied and compared with the results obtained from an area with no mining activities (background area). In the affected area, the concentrations ranged from 1.75 to 19.2 mg kg(-1) for uranium and from 1.57 to 26.9 mg kg(-1) for thorium, several-fold higher than those in the background area. The Th/U, K/U, and K/Th activity ratios were also determined and compared with data from similar studies worldwide. The external gamma dose rate in air due to uranium, thorium, and potassium at 1 m above ground level in the area affected by uranium mining was found to be 91.3 nGy h(-1), about two-fold higher than that in the background area. The results of this preliminary study indicate the importance of radiological evaluation of the area and implementation of remedial measures in order to prevent further dispersion of radionuclides in the environment.
An assessment of the radiological situation due to exposure to gamma radiation, radon and thoron was carried out at selected former uranium mining and processing sites in the Central Asian countries of Kazakhstan, Kyrgyzstan, Uzbekistan and Tajikistan. Gamma dose rate measurements were made using various field instruments, and radon/thoron measurements were carried out using discriminative radon ((222)Rn)/thoron ((220)Rn) solid state nuclear track detectors (SSNTD). The detectors were exposed for an extended period of time, covering at least three seasonal periods in a year, in different outdoor and indoor public and residential environments at the selected uranium legacy sites. The results showed that gamma, Rn and Tn doses were in general low, which implies a low or relatively low radiological risk. The major radiation hazard is posed by abandoned radioactive filtration material that some residents of Minkush (Kyrgyzstan) had used as insulation over an extended period. Annual radiation doses of several hundred mSv could be received as a consequence of using this material domestically. In addition, the gamma and Rn/Tn dose rates at Digmai, Tajikistan, could reach several tens of mSv/a. The doses of ionizing radiation deriving from external radiation (gamma dose rate) and from indoor radon and thoron with their short-lived progeny in several cases exceeded the recommended annual effective dose threshold of 10 mSv. At none of the sites investigated did the individual annual effective doses exceed 30 mSv, the internationally recommended value for considering intervention. Current doses of ionizing radiation do not represent a serious hazard to the health of the resident public, but this issue should be adequately addressed to further reduce needless exposure of the resident public to ionizing radiation.