Concept: Nuclear weapon
The contamination of Japan after the Fukushima accident has been investigated mainly for volatile fission products, but only sparsely for actinides such as plutonium, and only small actinide releases were estimated for Fukushima. Plutonium is, moreover, still omnipresent in the environment from earlier atmospheric nuclear weapons tests. We investigated soil and plants sampled at different hot spots in Japan, searching for reactor-borne plutonium using its isotopic ratio (240)Pu/(239)Pu. Using accelerator mass spectrometry, we clearly demonstrated the release of Pu from the Fukushima Daiichi power plant: while most samples contained only the radionuclide signature of fallout plutonium, at least one vegetation sample has an isotope ratio (0.381 ± 0.046) showing that its Pu originates from a nuclear reactor ((239+240)Pu activity concentration 0.49 Bq/kg). Plutonium content and isotope ratios differ considerably even between very close sampling locations, e.g. the soil and the plants growing on it. This strong localization indicates a particulate Pu release, which poses a high radiological risk if incorporated.
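The source-attribution logic here can be sketched as a simple screening rule: a measured (240)Pu/(239)Pu ratio significantly above the commonly cited global-fallout average of roughly 0.18 points to a reactor origin. This is a minimal illustration, not the paper's actual statistical treatment, and the threshold and k-sigma rule are assumptions:

```python
# Screening sketch: is a measured 240Pu/239Pu atom ratio consistent
# with global fallout, or does it indicate reactor-borne Pu?
# FALLOUT_RATIO ~0.18 is the commonly cited global-fallout average;
# the 2-sigma rule below is an illustrative choice, not the paper's method.

FALLOUT_RATIO = 0.18  # assumed global-fallout mean 240Pu/239Pu

def is_reactor_signature(ratio: float, sigma: float, k: float = 2.0) -> bool:
    """True if the measured ratio exceeds the fallout value by more
    than k standard deviations of the measurement."""
    return ratio - k * sigma > FALLOUT_RATIO

print(is_reactor_signature(0.381, 0.046))  # the vegetation sample -> True
print(is_reactor_signature(0.18, 0.02))    # a fallout-like sample -> False
```

With the reported uncertainty, the vegetation sample sits well clear of the fallout value even at two sigma, which is what makes the attribution robust.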
For reliable detection of explosives, a combination of methods integrated within a single measurement platform may increase detection performance. However, the efficient field testing of such measurement platforms requires the use of inexplosive simulants that are detectable by a wide range of methods. Physical parameters such as simulant density, elemental composition and crystalline structure must closely match those of the target explosive. The highly discriminating bulk detection characteristics of nuclear quadrupole resonance (NQR) especially constrain simulant design. This paper describes the development of an inexplosive RDX simulant suited to a wide range of measurement methods, including NQR. Measurements are presented that confirm an RDX NQR response from the simulant. The potential use of the simulant for field testing a prototype handheld NQR-based RDX detector is analyzed. Only modest changes in prototype operation during field testing would be required to account for the use of simulant rather than real explosive.
The 2011 Fukushima disaster led to increases in multiple risks (e.g., lifestyle diseases and radiation exposure) and fear among the public. Here, we assessed the additional risks of cancer caused by radiation and of diabetes related to the disaster, and the cost-effectiveness of countermeasures against these conditions. Our study included residents of the cities of Minamisoma and Soma (10-40 km and 35-50 km north of the Fukushima Daiichi (No. 1) Nuclear Power Station, respectively). We used the loss of life expectancy (LLE) as an indicator to compare the risks of radiation exposure and diabetes. We also estimated the cost-effectiveness of radiation-related countermeasures, including restricted food distribution, decontamination, and whole-body counter tests and interventions. Metformin therapy was selected as a representative management for diabetes. The diabetes-related LLEs among residents were 4.1 (95% confidence interval: 1.4-6.8) ×10⁻² years for the whole population and 8.0 (2.7-13.2) ×10⁻² years for those in their 40s to 70s, in a scenario that considered the additional incidence of diabetes during the first 10 years. The cancer-related LLEs caused by lifetime exposure to radiation were 0.69 (2.5-97.5 percentile: 0.61-0.79) ×10⁻² years for the whole population and 0.24 (0.20-0.29) ×10⁻² years for those in their 40s to 70s. The diabetes-related LLEs among residents in this scenario were thus 5.9-fold and 33-fold higher than those attributed to average radiation exposure for the whole population and for the 40s-to-70s age group, respectively. The costs per life-year saved of the radiation countermeasures (i.e., restricted food distribution, decontamination, and whole-body counter tests and interventions) were >1 to >4 orders of magnitude higher than those of general health checkups and conventional management for diabetes. Our findings indicate that countermeasures to mitigate diabetes are warranted. Policy-makers' and individuals' understanding of multiple risks after any disaster will be essential to saving the lives of victims.
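The fold-differences quoted above follow directly from the abstract's point estimates (all LLEs in units of 10^-2 years); a quick arithmetic check:

```python
# Worked check of the quoted fold-differences: diabetes-related loss
# of life expectancy (LLE) divided by radiation-related LLE, using
# the point estimates from the abstract (units: 1e-2 years).
lle_diabetes_all = 4.1     # whole population, diabetes scenario
lle_diabetes_40_70 = 8.0   # ages 40s to 70s
lle_radiation_all = 0.69   # lifetime radiation exposure, whole population
lle_radiation_40_70 = 0.24

print(round(lle_diabetes_all / lle_radiation_all, 1))   # 5.9 (-fold)
print(round(lle_diabetes_40_70 / lle_radiation_40_70))  # 33 (-fold)
```

Both ratios reproduce the 5.9-fold and 33-fold figures reported in the study.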
- FASEB journal : official publication of the Federation of American Societies for Experimental Biology
Tendons are often injured and heal poorly. Whether this is caused by a slow tissue turnover is unknown, since existing data provide diverging estimates of tendon protein half-life that range from 2 mo to 200 yr. With the purpose of determining life-long turnover of human tendon tissue, we used the (14)C bomb-pulse method. This method takes advantage of the dramatic increase in atmospheric levels of (14)C, produced by nuclear bomb tests in 1955-1963, which is reflected in all living organisms. Levels of (14)C were measured in 28 forensic samples of Achilles tendon core and 4 skeletal muscle samples (donor birth years 1945-1983) with accelerator mass spectrometry (AMS) and compared to known atmospheric levels to estimate tissue turnover. We found that Achilles tendon tissue retained levels of (14)C corresponding to atmospheric levels several decades before tissue sampling, demonstrating a very limited tissue turnover. The tendon concentrations of (14)C approximately reflected the atmospheric levels present during the first 17 yr of life, indicating that the tendon core is formed during height growth and is essentially not renewed thereafter. In contrast, (14)C levels in muscle indicated continuous turnover. Our observation provides a fundamental premise for understanding tendon function and pathology, and likely explains the poor regenerative capacity of tendon tissue. Heinemeier, K. M., Schjerling, P., Heinemeier, J., Magnusson, S. P., Kjaer, M. Lack of tissue renewal in human adult Achilles tendon is revealed by nuclear bomb (14)C.
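The bomb-pulse matching step can be sketched as a table lookup: tissue formed in year Y carries the atmospheric (14)C level of year Y, so candidate formation years are those whose atmospheric level matches the tissue measurement. The table below is a crude, purely illustrative stand-in for the real atmospheric record (which peaked around 1963-64 and has declined since); it is not measured data:

```python
# Sketch of bomb-pulse dating: find years whose atmospheric 14C level
# matches a measured tissue level. The Delta-14C values here are
# invented placeholders shaped like the real bomb pulse, NOT real data.
ATMOSPHERIC_D14C = {
    1950: 0, 1955: 50, 1960: 250, 1964: 900, 1970: 550,
    1980: 270, 1990: 150, 2000: 90, 2010: 40,
}

def candidate_formation_years(tissue_d14c, tolerance=30):
    """Years whose (illustrative) atmospheric level matches the tissue."""
    return sorted(y for y, v in ATMOSPHERIC_D14C.items()
                  if abs(v - tissue_d14c) <= tolerance)

# Because the pulse rises and then falls, one tissue value can match
# two epochs; extra context (e.g. donor birth year) resolves this.
print(candidate_formation_years(260))  # -> [1960, 1980]
```

This two-solution ambiguity is why the method is combined with donor birth years, as in the forensic samples used here.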
Weapons-grade uranium and plutonium could be used as nuclear explosives with extreme destructive potential. The problem of their detection, especially in standard cargo containers during transit, has been described as “searching for a needle in a haystack” because of the inherently low rate of spontaneous emission of characteristic penetrating radiation and the ease of its shielding. Currently, the only practical approach for uncovering well-shielded special nuclear materials is by use of active interrogation using an external radiation source. However, the similarity of these materials to shielding and the required radiation doses that may exceed regulatory limits prevent this method from being widely used in practice. We introduce a low-dose active detection technique, referred to as low-energy nuclear reaction imaging, which exploits the physics of interactions of multi-MeV monoenergetic photons and neutrons to simultaneously measure the material’s areal density and effective atomic number, while confirming the presence of fissionable materials by observing the beta-delayed neutron emission. For the first time, we demonstrate identification and imaging of uranium with this novel technique using a simple yet robust source, setting the stage for its wide adoption in security applications.
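As a toy illustration of the dual-probe idea, not the authors' reconstruction algorithm: a transmission measurement with monoenergetic photons yields areal density via Beer-Lambert attenuation, and the ratio of neutron to photon attenuation can serve as a crude proxy for effective atomic number. All coefficients below are invented for illustration:

```python
import math

# Toy two-probe sketch: photon transmission gives areal density via
# Beer-Lambert; the neutron-to-photon attenuation ratio is a rough
# proxy for effective atomic number. Coefficients are invented; the
# real technique additionally observes beta-delayed neutron emission
# to confirm fissionable material.

def areal_density_g_cm2(transmission, mu):
    """Invert T = exp(-mu * x) for the areal density x (g/cm^2)."""
    return -math.log(transmission) / mu

def attenuation_ratio(t_photon, t_neutron):
    """Neutron-to-photon attenuation ratio, a crude Z-effective proxy."""
    return math.log(t_neutron) / math.log(t_photon)

MU_PHOTON = 0.05  # assumed photon mass attenuation coefficient, cm^2/g

print(round(areal_density_g_cm2(math.exp(-2.5), MU_PHOTON), 1))  # 50.0
```

Two independent attenuation channels constrain two unknowns (areal density and effective Z), which is what lets the method distinguish dense shielding from actual special nuclear material.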
An assessment of external and internal radiation exposure levels, including calculation of effective doses from chronic radiation exposure and assessment of long-term radiation-related health risks, has become mandatory for residents living near the nuclear power plant in Fukushima, Japan. Data for all primary and secondary school children in Minamisoma who participated in both external and internal screening programs were used to assess the annual additional effective dose acquired due to the Fukushima Daiichi nuclear power plant disaster. In total, 881 children took part in both internal and external radiation exposure screening programs between 1 April 2012 and 31 March 2013. The additional effective doses ranged from 0.025 to 3.49 mSv/year, with a median of 0.70 mSv/year. While 99.7% of the children (n = 878) showed no detectable internal contamination, 90.3% of the additional effective dose was the result of external radiation exposure. This finding is broadly consistent with the doses estimated by the United Nations Scientific Committee on the Effects of Atomic Radiation (UNSCEAR). The present study showed that the annual additional effective doses among children in Minamisoma have been low, even after inter-individual differences were taken into account. The dose from internal radiation exposure was negligible, presumably owing to the success of contaminated-food controls.
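The dose bookkeeping behind these figures is simple: the annual additional effective dose per child is the sum of the external and internal components, and the external share is the ratio of the external component to the total. A minimal sketch, with illustrative inputs rather than individual records from the study:

```python
# Minimal dose-bookkeeping sketch for the screening programme:
# annual additional effective dose = external + internal component.
# The input values are illustrative, not actual study records.

def additional_dose(external_msv, internal_msv):
    total = external_msv + internal_msv
    return total, external_msv / total  # (mSv/year, external fraction)

total, ext_frac = additional_dose(0.65, 0.05)
print(round(total, 2))     # 0.7 mSv/year, the scale of the reported median
print(round(ext_frac, 2))  # 0.93
```

In the actual cohort the internal term was near zero for almost all children, which is why external exposure dominates the total.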
Measurement of soil contamination levels has been considered a feasible method for dose estimation of internal radiation exposure following the Chernobyl disaster by means of aggregate transfer factors; however, it is still unclear whether the estimation of internal contamination based on soil contamination levels is universally valid or incident specific.
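The aggregate transfer factor (Tag) approach mentioned above amounts to a single multiplication: foodstuff activity concentration (Bq/kg) is estimated as Tag times the soil deposition density (Bq/m²). A minimal sketch, where the Tag value is a placeholder for illustration rather than a recommended or incident-specific value:

```python
# Aggregate transfer factor sketch: activity concentration in a
# foodstuff (Bq/kg) = Tag (m^2/kg) * soil deposition (Bq/m^2).
# The Tag value used here is an illustrative placeholder only.

def food_activity_bq_kg(deposition_bq_m2, tag_m2_kg):
    return deposition_bq_m2 * tag_m2_kg

print(food_activity_bq_kg(100_000, 1e-3))  # -> 100.0 Bq/kg
```

Whether a single Tag transfers between incidents (Chernobyl vs. Fukushima) is exactly the open question the passage raises.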
The Fukushima Daiichi Nuclear Power Plant (FDNPP) accident caused serious contamination of the environment. The release of Pu isotopes renewed considerable public concern because they pose a large risk of internal radiation exposure. In this review, we summarize and analyze published studies on the release of Pu from the FDNPP accident, based on environmental sample analyses and ORIGEN model simulations. Our analysis emphasizes the environmental distribution of the released Pu isotopes, Pu isotopic composition as a means of identifying whether the releases originated in the damaged reactors or the spent fuel pools, and estimates of the amounts of Pu isotopes released by the accident. It indicates that a trace amount of Pu (ca. 2 × 10⁻⁵% of the core inventory) was released into the environment from the damaged reactors, but not from the spent fuel pools located in the reactor buildings. Regarding possible Pu contamination of the marine environment, the limited studies available suggest that no extra Pu input from the FDNPP accident could be detected in the western North Pacific 30 km off the Fukushima coast. Finally, we identify remaining knowledge gaps concerning the release of Pu into the environment and recommend issues for future study.
Estimation of time changes in radiocaesium in foodstuffs is key to predicting the long term impact of the Fukushima accident on the Japanese diet. We have modelled >4000 measurements, spanning 50 years, of (137)Cs in foodstuffs and whole diet in Japan after nuclear weapons testing (NWT) and the Chernobyl accident. Broadly consistent long term trends in (137)Cs activity concentrations are seen between different agricultural foodstuffs; whole diet follows this general trend with remarkably little variation between averages for different regions of Japan. Model blind tests against post-NWT data for the Fukushima Prefecture showed good predictions for radiocaesium in whole diet, spinach and Japanese radish (for which good long term test data were available). For the post-Fukushima period to 2015, radiocaesium in the average diet followed a declining time trend consistent with that seen after NWT and Chernobyl. Data for different regions post-Fukushima show a high degree of mixing of dietary foodstuffs between regions: significant over-estimates of average dietary (137)Cs were made when it was assumed that only regionally-produced food was consumed. Predictions of mean committed effective internal doses from dietary (137)Cs (2011 to 2061) in non-evacuated parts of the Fukushima Prefecture show that average internal dose is relatively low. This study focused on average regional ingestion dose rates and does not attempt to make site specific predictions. However, temporal trends identified could form a basis for site specific predictions of long term activity concentrations in agricultural products and diet both outside and (to assess potential re-use) inside currently evacuated areas.
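Long-term declines of this kind are often modelled as a sum of exponentials: physical decay of (137)Cs plus fast and slow ecological loss components. The sketch below uses this generic form; the ecological half-lives and the fast/slow split are illustrative placeholders, not the values fitted in the study:

```python
import math

# Generic long-term trend model for 137Cs in foodstuffs: a fast and a
# slow ecological-loss component, each also subject to physical decay.
# The ecological half-lives and fast fraction are assumed placeholders.

T_PHYS = 30.17                       # 137Cs physical half-life, years
T_ECO_FAST, T_ECO_SLOW = 1.0, 20.0   # assumed ecological half-lives, years
FRAC_FAST = 0.8                      # assumed fast-component fraction

def activity(c0, t_years):
    """Activity concentration at time t, starting from c0 (e.g. Bq/kg)."""
    lam_p = math.log(2) / T_PHYS
    lam_f = math.log(2) / T_ECO_FAST + lam_p
    lam_s = math.log(2) / T_ECO_SLOW + lam_p
    return c0 * (FRAC_FAST * math.exp(-lam_f * t_years)
                 + (1 - FRAC_FAST) * math.exp(-lam_s * t_years))

print(round(activity(100.0, 0), 1))  # 100.0 at t = 0, then monotonic decline
```

After the fast component has died away, the slow component dominates, which is why post-NWT, post-Chernobyl, and post-Fukushima dietary trends can look broadly consistent decades on.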
- Journal of nuclear medicine : official publication, Society of Nuclear Medicine
The availability of (99m)Tc for single-photon imaging in diagnostic nuclear medicine is crucial, and current availability is based on the (99)Mo/(99m)Tc generator fabricated from fission-based molybdenum (F (99)Mo) produced using high enriched uranium (HEU) targets. Because of risks related to nuclear material proliferation, the use of HEU targets is being phased out and alternative strategies for production of both (99)Mo and (99m)Tc are being evaluated intensely. There are evidently no plans to replace the limited number of reactors that have provided most of the (99)Mo, so the uninterrupted, dependable availability of (99m)Tc is a crucial issue. For these reasons, new options being pursued include both reactor- and accelerator-based strategies to sustain the continued availability of (99m)Tc without the use of HEU. In this paper, the scientific and economic issues of transitioning from HEU to non-HEU are discussed. In addition, the comparative advantages, disadvantages, technical challenges, present status, future prospects, security concerns, economic viability, and regulatory obstacles are reviewed. The international actions in progress toward evolving possible alternative strategies to produce (99)Mo or (99m)Tc are analyzed as well. The breadth of technologies and new strategies under development to provide (99)Mo and (99m)Tc reflects both the broad interest in and the importance of the pivotal role of (99m)Tc in diagnostic nuclear medicine.
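The generator concept rests on standard parent-daughter ingrowth: (99m)Tc activity grows into an eluted (99)Mo column according to the two-member Bateman solution. The half-lives (65.94 h for (99)Mo, 6.01 h for (99m)Tc) and the ~87.5% branching of (99)Mo decays feeding the (99m)Tc state are well-established nuclear data; treat the specific elution time below as an illustrative choice:

```python
import math

# 99mTc ingrowth in a 99Mo/99mTc generator, via the two-member
# Bateman equation. Half-lives and branching are standard nuclear
# data; the activities and times are illustrative.

T_MO = 65.94    # 99Mo half-life, hours
T_TC = 6.01     # 99mTc half-life, hours
BRANCH = 0.875  # fraction of 99Mo decays feeding the 99mTc state

def tc99m_activity(a_mo0, t_hours):
    """99mTc activity at t hours after elution, given 99Mo activity a_mo0."""
    lam_mo = math.log(2) / T_MO
    lam_tc = math.log(2) / T_TC
    return (BRANCH * a_mo0 * lam_tc / (lam_tc - lam_mo)
            * (math.exp(-lam_mo * t_hours) - math.exp(-lam_tc * t_hours)))

# Ingrowth peaks roughly 23 h after elution, which is why generators
# are conventionally eluted about once a day:
print(tc99m_activity(1000.0, 23) > tc99m_activity(1000.0, 5))  # True
```

This daily-elution rhythm, set entirely by the two half-lives, is what any replacement production route (reactor- or accelerator-based) has to feed.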