Journal: Science of the Total Environment
Growing evidence suggests that anthropogenic litter, particularly plastic, represents a highly pervasive and persistent threat to global marine ecosystems. Multinational research is progressing to characterise its sources, distribution and abundance so that interventions aimed at reducing future inputs and clearing extant litter can be developed. Citizen science projects, whereby members of the public gather information, offer a low-cost method of collecting large volumes of data with considerable temporal and spatial coverage. Furthermore, such projects raise awareness of environmental issues and can lead to positive changes in behaviours and attitudes. We present data collected over a decade (2005-2014 inclusive) by Marine Conservation Society (MCS) volunteers during beach litter surveys carried out along the British coastline, with the aim of increasing knowledge of the composition, spatial distribution and temporal trends of coastal debris. Unlike many citizen science projects, the MCS beach litter survey programme gathers information on the number of volunteers, duration of surveys and distances covered. This comprehensive information provides an opportunity to standardise data for variation in sampling effort among surveys, enhancing the value of outputs and robustness of findings. We found that plastic is the main constituent of anthropogenic litter on British beaches and that the majority of traceable items originate from land-based sources, such as public littering. We identify the coasts of the Western English Channel and Celtic Sea as experiencing the highest relative litter levels. Increasing trends over the 10-year period were detected for a number of individual item categories, yet no statistically significant change in total (effort-corrected) litter was detected. We discuss the limitations of the dataset and make recommendations for future work.
The study demonstrates the value of citizen science data in providing insights that would otherwise not be possible due to logistical and financial constraints of running government-funded sampling programmes on such large scales.
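The effort correction described above can be illustrated with a minimal sketch: raw item counts are normalised by distance surveyed and volunteer effort so that surveys of different sizes become comparable. The unit (items per 100 m per volunteer-hour) and the field names are assumptions for illustration, not the MCS protocol itself.

```python
# Hypothetical effort-correction sketch (not the MCS method): normalise raw
# litter counts by distance surveyed and by volunteer-hours of effort.

def effort_corrected_density(items, metres_surveyed, volunteers, hours):
    """Litter items per 100 m per volunteer-hour (illustrative unit)."""
    if metres_surveyed <= 0 or volunteers <= 0 or hours <= 0:
        raise ValueError("effort fields must be positive")
    return items / (metres_surveyed / 100.0) / (volunteers * hours)

# Two invented surveys with very different effort levels.
surveys = [
    {"items": 480, "metres_surveyed": 200, "volunteers": 6, "hours": 2.0},
    {"items": 150, "metres_surveyed": 100, "volunteers": 3, "hours": 1.0},
]
densities = [effort_corrected_density(**s) for s in surveys]
```

Without the correction, the first survey would look three times as littered as the second; after normalising for effort, the second beach actually has the higher relative litter level.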
Chloride concentrations in the northern U.S. streams included in this study have increased substantially over time, with average concentrations approximately doubling from 1990 to 2011 and outpacing the rate of urbanization in the northern U.S. Historical data were examined for 30 monitoring sites on 19 streams that had chloride concentration and flow records of 18 to 49 years. Chloride concentrations in most studied streams increased in all seasons (13 of 19 in all seasons; 16 of 19 during winter); maximum concentrations occurred during winter. Increasing concentrations during non-deicing periods suggest that chloride was stored in hydrologic reservoirs, such as the shallow groundwater system, during the winter and slowly released in baseflow throughout the year. Streamflow dependency was also observed, with chloride concentrations increasing as streamflow decreased, a result of dilution during rainfall- and snowmelt-induced high-flow periods. The influence of chloride on aquatic life increased with time; 29% of the sites studied exceeded the USEPA chronic water quality criterion of 230 mg/L on an average of more than 100 individual days per year during 2006-2011. The rapid rate of chloride concentration increase in these streams is likely due to some combination of increased road salt application rates, increased baseline concentrations, and greater snowfall in the Midwestern U.S. during the latter portion of the study period.
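The exceedance-day metric used above can be sketched as a simple count of days on which a daily concentration record exceeds the 230 mg/L chronic criterion. The daily values below are invented for illustration; they are not the study's data.

```python
# Illustrative count of days exceeding the USEPA chronic chloride criterion
# (230 mg/L) in a daily concentration record; all values are invented.
CHRONIC_CRITERION_MG_L = 230.0

def exceedance_days(daily_mg_l):
    """Number of days on which concentration exceeds the chronic criterion."""
    return sum(1 for c in daily_mg_l if c > CHRONIC_CRITERION_MG_L)

winter = [310.0, 295.5, 240.0, 180.0]  # hypothetical winter (deicing) values
summer = [120.0, 150.0, 235.0, 90.0]   # hypothetical non-deicing values
total = exceedance_days(winter + summer)
```

Note that the summer record also contributes an exceedance, consistent with the finding that stored chloride released in baseflow raises concentrations year-round.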
The widespread distribution of unconventional oil and gas (UO&G) wells and other facilities in the United States potentially exposes millions of people to air and water pollutants, including known or suspected carcinogens. Childhood leukemia is a particular concern because of the disease severity, vulnerable population, and short disease latency. A comprehensive review of carcinogens and leukemogens associated with UO&G development is not available and could inform future exposure monitoring studies and human health assessments. The objective of this analysis was to assess the evidence of carcinogenicity of water contaminants and air pollutants related to UO&G development. We obtained a list of 1177 chemicals in hydraulic fracturing fluids and wastewater from the U.S. Environmental Protection Agency and constructed a list of 143 UO&G-related air pollutants through a review of scientific papers published through 2015 using PubMed and ProQuest databases. We assessed carcinogenicity and evidence of increased risk for leukemia/lymphoma of these chemicals using International Agency for Research on Cancer (IARC) monographs. The majority of compounds (>80%) were not evaluated by IARC and therefore could not be reviewed. Of the 111 potential water contaminants and 29 potential air pollutants evaluated by IARC (119 unique compounds), 49 water and 20 air pollutants were known, probable, or possible human carcinogens (55 unique compounds). A total of 17 water and 11 air pollutants (20 unique compounds) had evidence of increased risk for leukemia/lymphoma, including benzene, 1,3-butadiene, cadmium, diesel exhaust, and several polycyclic aromatic hydrocarbons. Though information on the carcinogenicity of compounds associated with UO&G development was limited, our assessment identified 20 known or suspected carcinogens that could be measured in future studies to advance exposure and risk assessments of cancer-causing agents. 
Our findings support the need for investigation into the relationship between UO&G development and risk of cancer in general and childhood leukemia in particular.
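The reduction from separate water and air pollutant counts to "unique compounds" above is a set-union deduplication: compounds flagged in both media are counted once. The chemical names below are a small invented subset for illustration, not the study's actual lists.

```python
# Sketch of how overlapping water- and air-pollutant lists reduce to unique
# compounds; these short lists are invented, not the study's 1177/143 lists.
water_carcinogens = {"benzene", "cadmium", "naphthalene", "formaldehyde"}
air_carcinogens = {"benzene", "1,3-butadiene", "diesel exhaust", "cadmium"}

unique_carcinogens = water_carcinogens | air_carcinogens  # union deduplicates
shared = water_carcinogens & air_carcinogens              # found in both media
```

With two compounds appearing in both media, the 4 + 4 entries collapse to 6 unique compounds, mirroring how 49 water and 20 air carcinogens collapse to 55 unique compounds in the study.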
Tritium concentrations in Japanese precipitation samples collected after the March 2011 accident at the Fukushima Dai-ichi Nuclear Power Plant (FNPP1) were measured. Values exceeding the pre-accident background were detected at three out of seven localities (Tsukuba, Kashiwa and Hongo) southwest of the FNPP1 at distances varying between 170 and 220 km from the source. The highest tritium content was found in the first rainfall in Tsukuba after the accident; however, concentrations were 500 times lower than the regulatory limit for tritium in drinking water. Tritium concentrations decreased steadily and rapidly with time, becoming indistinguishable from pre-accident values within five weeks. The atmospheric tritium activity in the vicinity of the FNPP1 during the earliest stage of the accident was estimated to be 1.5×10³ Bq/m³, which is potentially capable of producing rainwater exceeding the regulatory limit, but only in the immediate vicinity of the source.
Urban wastewater treatment plants (UWTPs) are among the main sources of antibiotic release into the environment. The occurrence of antibiotics may promote the selection of antibiotic resistance genes (ARGs) and antibiotic resistant bacteria (ARB), which pose health risks to humans and animals. In this paper, the fate of ARB and ARGs in UWTPs, focusing on different processes/technologies (i.e., biological processes, advanced treatment technologies and disinfection), was critically reviewed. The mechanisms by which biological processes influence the development/selection of ARB and the transfer of ARGs are still poorly understood. Advanced treatment technologies and disinfection processes are regarded as major tools to control the spread of ARB into the environment. Despite intense efforts in recent years to control the spread of antibiotic resistance in the environment, important gaps remain. In particular, it is important to: (i) improve risk assessment studies in order to allow accurate estimates of the maximal abundance of ARB in UWTP effluents that would not pose risks to human and environmental health; and (ii) understand the factors and mechanisms that drive the maintenance and selection of antibiotic resistance in wastewater habitats. The final objective is to implement wastewater treatment technologies capable of assuring the production of UWTP effluents with an acceptable level of ARB.
Anthropogenic activity is affecting the global climate through the release of greenhouse gases (GHGs) such as CO2 and CH4. About a third of anthropogenic GHGs are produced by agriculture, including livestock farming and horticulture. A large proportion of the UK’s horticultural farming takes place on drained lowland peatlands, which release significant amounts of CO2 to the atmosphere. This study set out to establish whether raising the water table from the currently used −50 cm to −30 cm could reduce GHG emissions from agricultural peatlands while simultaneously maintaining current levels of horticultural productivity. A factorial design experiment used agricultural peat soil collected from the Norfolk Fens (among the largest of the UK’s lowland peatlands under intensive cultivation) to assess the effects of water table level, elevated CO2, and agricultural production on GHG fluxes and the crop productivity of radish, one of the most economically important fenland crops. The results of this study show that a water table of −30 cm can increase the productivity of the radish crop while also reducing soil CO2 emissions, without a resultant release of CH4 to the atmosphere, under both ambient and elevated CO2 concentrations. Elevated CO2 increased dry shoot biomass, but not bulb or root biomass, suggesting no immediate advantage of future CO2 levels for horticultural farming on peat soils. Overall, raising the water table could make an important contribution to global warming mitigation without having a detrimental impact on crop yield.
The West African cocoa belt, reaching from Sierra Leone to southern Cameroon, is the origin of about 70% of the world’s cocoa (Theobroma cacao), which in turn is the basis of the livelihoods of about two million farmers. We analyze cocoa’s vulnerability to climate change in the West African cocoa belt, based on climate projections for the 2050s of 19 Global Circulation Models under the Intergovernmental Panel on Climate Change intermediate emissions scenario RCP 6.0. We use a combination of a statistical model of climatic suitability (Maxent) and the analysis of individual, potentially limiting climate variables. We find that: 1) contrary to expectation, maximum dry season temperatures are projected to become as limiting or more limiting for cocoa than dry season water availability; 2) to reduce the vulnerability of cocoa to excessive dry season temperatures, the systematic use of adaptation strategies such as shade trees in cocoa farms will be necessary, reversing the current trend of shade reduction; 3) there is a strong differentiation of climate vulnerability within the cocoa belt, with the most vulnerable areas near the forest-savanna transition in Nigeria and eastern Côte d'Ivoire, and the least vulnerable areas in the southern parts of Cameroon, Ghana, Côte d'Ivoire and Liberia; 4) this spatial differentiation of climate vulnerability may lead to future shifts in cocoa production within the region, with the opportunity of losses being partially compensated by gains, but also the risk of local production expansion leading to new deforestation. We conclude that adaptation strategies for cocoa in West Africa need to operate at several levels, from the consideration of tolerance to high temperatures in cocoa breeding programs and the promotion of shade trees in cocoa farms, to policies incentivizing the intensification of cocoa production on existing farms where future climate conditions permit and the establishment of new farms in already deforested areas.
Present-day lead pollution is an environmental hazard of global proportions. A correct determination of natural lead levels is very important in order to evaluate anthropogenic lead contributions. In this paper, the anthropogenic signature of early metallurgy in Southern Iberia during the Holocene, more specifically during Late Prehistory, was assessed by means of a multiproxy approach: comparison of atmospheric lead pollution, fire regimes, deforestation, mass sediment transport, and archeological data. Although the onset of metallurgy in Southern Iberia is a matter of controversy, here we show the oldest lead pollution record from Western Europe in a continuous paleoenvironmental sequence, which suggests clear lead pollution caused by metallurgical activities since ~3900 cal BP (Early Bronze Age). This lead pollution was especially important during the Late Bronze and Early Iron ages. At the same time, since ~4000 cal BP, an increase in fire activity is observed in this area, coupled with deforestation and increased erosion rates. This study also shows that the lead pollution record locally reached near-present-day values many times in the past, suggesting intensive use and manipulation of lead during those periods in this area.
Accurate estimates of chlorophyll-a concentration (Chl-a) from remotely sensed data are challenging for inland waters due to their optical complexity. In this study, a framework for Chl-a estimation is established for optically complex inland waters based on a combination of water optical classification and two semi-empirical algorithms. Three spectrally distinct water types (Type I to Type III) are first identified using a clustering method performed on remote sensing reflectance (R(rs)) from datasets containing 231 samples from Lake Taihu, Lake Chaohu, Lake Dianchi, and the Three Gorges Reservoir. Classification criteria for each optical water type are subsequently defined for MERIS images based on the spectral characteristics of the three water types. The criteria assign every R(rs) spectrum to one of the three water types by comparing the values of band 7 (central band: 665 nm), band 8 (central band: 681.25 nm), and band 9 (central band: 708.75 nm) of MERIS images. Based on this classification, type-specific three-band algorithms (TBA) and type-specific advanced three-band algorithms (ATBA) are developed for each water type using the same datasets. Pre-classification decreases errors for both algorithms, with the mean absolute percent error (MAPE) on the calibration datasets decreasing from 36.5% to 23% for TBA, and from 40% to 28% for ATBA. The accuracy of the two algorithms on the validation data indicates that optical classification eliminates the need to adjust the optimal locations of the three bands or to re-parameterize when estimating Chl-a for other waters. The classification criteria and the type-specific ATBA are additionally validated on two MERIS images. The framework of first classifying optical water types based on reflectance characteristics and subsequently developing type-specific algorithms for each water type is a valid scheme for reducing errors in Chl-a estimation for optically complex inland waters.
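The band-comparison classification step can be sketched as follows. The specific decision rule below (assigning a spectrum to a type according to which of the three MERIS bands has the highest reflectance) and the band-to-type mapping are illustrative assumptions, not the paper's exact criteria; only the band centres are taken from the text.

```python
# Hedged sketch: assign an Rrs spectrum to one of three optical water types
# by comparing values at MERIS band 7 (665 nm), band 8 (681.25 nm) and
# band 9 (708.75 nm). The max-band rule and type mapping are illustrative
# assumptions, not the study's published criteria.
MERIS_BANDS_NM = {7: 665.0, 8: 681.25, 9: 708.75}

def classify_water_type(rrs):
    """rrs: dict mapping MERIS band number -> Rrs value (sr^-1)."""
    peak_band = max(MERIS_BANDS_NM, key=lambda band: rrs[band])
    return {7: "Type I", 8: "Type II", 9: "Type III"}[peak_band]

# Invented spectrum with a reflectance peak near 709 nm, as is common
# in highly turbid, Chl-a-rich waters.
sample = {7: 0.012, 8: 0.010, 9: 0.018}
wtype = classify_water_type(sample)
```

Once a spectrum is typed, the matching type-specific TBA or ATBA parameterization would be applied, rather than a single global model.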
The persistence of chemicals is a key parameter in their environmental risk assessment. Extrapolating their biodegradability in aqueous systems to soil systems would improve environmental impact assessment. This study compares the fate of ¹⁴C/¹³C-labelled 2,4-D (2,4-dichlorophenoxyacetic acid) and ibuprofen in OECD tests 301 (ready biodegradability in aqueous systems) and 307 (soil). In the aqueous systems, 85% of 2,4-D and 68% of ibuprofen were mineralised, indicating ready biodegradability, but only 57% and 45%, respectively, in soil. Parent compounds and metabolites decreased to <2% of the spiked amounts in both systems. In soil, 36% of 2,4-D and 30% of ibuprofen were bound as non-extractable residues (NER). NER formation in the abiotic controls was half as high as in the biotic treatments. However, mineralisation, biodegradation and abiotic residue formation are competing processes; assuming the same extent of abiotic NER formation in abiotic and biotic systems may therefore overestimate the abiotic contribution in the biotic systems. Mineralisation was described by a logistic model for the aquatic systems and by a two-pool first-order degradation model for the soil systems. This agrees with the different abundance of microorganisms in the two systems, but precludes direct comparison of the fitted parameters. Nevertheless, the maximum mineralisable amounts determined by the models were similar in both systems, although the maximum mineralisation rate was about 3.5 times higher in the aqueous systems than in the soil system for both compounds; these parameters may thus be extrapolated from aqueous to soil systems. However, the maximum mineralisable amount is calculated by extrapolation to infinite time and includes intermediately formed biomass derived from the labelled carbon. The amount of labelled carbon within microbial biomass residues is higher in the soil system, resulting in lower degradation rates.
Further evaluation of these relationships requires comparison data on more chemicals and from different soils.
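The two model forms named above (a logistic curve for the aqueous systems and a two-pool first-order model for soil) can be sketched as simple functions of time. The functional forms are standard for these model families; all parameter values below are invented for illustration, chosen only so that the plateaus of the two systems are of similar magnitude, echoing the finding that the maximum mineralisable amounts were similar.

```python
# Illustrative forms of the two mineralisation models: logistic (aqueous)
# and two-pool (biphasic) first-order (soil). All parameters are invented.
import math

def logistic_mineralisation(t, m_max, k, t_half):
    """Cumulative % mineralised at time t: logistic rise to plateau m_max."""
    return m_max / (1.0 + math.exp(-k * (t - t_half)))

def two_pool_first_order(t, m_fast, k_fast, m_slow, k_slow):
    """Fast and slow pools, each mineralising with first-order kinetics."""
    return (m_fast * (1.0 - math.exp(-k_fast * t))
            + m_slow * (1.0 - math.exp(-k_slow * t)))

# Hypothetical fits: aqueous test evaluated at day 28 (OECD 301 duration),
# soil test at day 120 (OECD 307 duration).
aqueous_28d = logistic_mineralisation(28.0, m_max=85.0, k=0.5, t_half=7.0)
soil_120d = two_pool_first_order(120.0, m_fast=60.0, k_fast=0.1,
                                 m_slow=25.0, k_slow=0.005)
```

The slow pool's small rate constant is why the soil system sits well below its asymptote at the end of the test: the maximum mineralisable amount (here 85% for both systems) is only reached by extrapolation to infinite time, as the abstract notes.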