Concept: Surface water
Efficient meltwater drainage through supraglacial streams and rivers on the southwest Greenland ice sheet
- Proceedings of the National Academy of Sciences of the United States of America
- Published almost 3 years ago
Thermally incised meltwater channels that flow each summer across melt-prone surfaces of the Greenland ice sheet have received little direct study. We use high-resolution WorldView-1/2 satellite mapping and in situ measurements to characterize supraglacial water storage, drainage pattern, and discharge across 6,812 km² of southwest Greenland in July 2012, after a record melt event. Efficient surface drainage was routed through 523 high-order stream/river channel networks, all of which terminated in moulins before reaching the ice edge. Low surface water storage (3.6 ± 0.9 cm), negligible impoundment by supraglacial lakes or topographic depressions, and high discharge to moulins (2.54-2.81 cm·d⁻¹) indicate that the surface drainage system conveyed its own storage volume to the bed in <2 d. Moulin discharges mapped inside ∼52% of the source ice watershed for Isortoq, a major proglacial river, totaled ∼41-98% of observed proglacial discharge, highlighting the importance of supraglacial river drainage to true outflow from the ice edge. However, Isortoq discharges tended to be lower than runoff simulations from the Modèle Atmosphérique Régional (MAR) regional climate model (0.056-0.112 km³·d⁻¹ vs. ∼0.103 km³·d⁻¹) and, when integrated over the melt season, totaled just 37-75% of MAR, suggesting nontrivial subglacial water storage even in this melt-prone region of the ice sheet. We conclude that (i) the interior surface of the ice sheet can be efficiently drained under optimal conditions, (ii) digital elevation models alone cannot fully describe supraglacial drainage and its connection to subglacial systems, and (iii) predicting outflow from climate models alone, without recognition of subglacial processes, may overestimate true meltwater export from the ice sheet to the ocean.
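The <2 d turnover figure follows directly from dividing the mapped surface storage by the moulin discharge; a quick back-of-envelope check using the values quoted in the abstract:

```python
# Residence time implied by the reported storage and moulin discharge
# (values taken from the abstract above).
storage_cm = 3.6                      # mean supraglacial water storage, cm
discharge_cm_per_day = (2.54, 2.81)   # discharge range to moulins, cm/d

for q in discharge_cm_per_day:
    turnover_d = storage_cm / q       # days to convey one storage volume
    print(f"q = {q} cm/d -> {turnover_d:.2f} d")  # 1.42 d and 1.28 d
```

Both ends of the discharge range give a turnover time well under 2 days, consistent with the stated conclusion.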
- Proceedings of the National Academy of Sciences of the United States of America
- Published over 2 years ago
Compared with nutrient levels and habitat degradation, the importance of agricultural pesticides in surface water may have been underestimated due to a lack of comprehensive quantitative analysis. Increasing pesticide contamination reduces regional aquatic biodiversity; for example, macroinvertebrate family richness falls by ∼30% at pesticide concentrations equal to the legally accepted regulatory threshold levels (RTLs). This study provides a comprehensive meta-analysis of 838 peer-reviewed studies (>2,500 sites in 73 countries) that evaluates, for the first time to our knowledge on a global scale, the exposure of surface waters to particularly toxic agricultural insecticides. We tested whether measured insecticide concentrations (MICs; i.e., quantified insecticide concentrations) exceed their RTLs and how risks depend on insecticide development over time and the stringency of environmental regulation. Our analysis reveals that MICs occur rarely (an estimated 97.4% of analyses found no MICs) and that scientific monitoring data are completely lacking for ∼90% of global cropland. Most importantly, of the 11,300 MICs, 52.4% (5,915 cases; 68.5% of the sites) exceeded the RTL for either surface water (RTLSW) or sediments. Thus, the biological integrity of global water resources is at substantial risk. RTLSW exceedances depend on catchment size, sampling regime, and sampling date; are significantly higher for newer-generation insecticides (i.e., pyrethroids); and are high even in countries with stringent environmental regulations. These results suggest the need for worldwide improvements to current pesticide regulations and agricultural pesticide application practices, and for intensified research on the presence and effects of pesticides under real-world conditions.
Photodegradation may be the most important elimination process for cephalosporin antibiotics in surface water. Cefazolin (CFZ) and cephapirin (CFP) underwent mainly direct photolysis (t½ = 0.7 and 3.9 h, respectively), while cephalexin (CFX) and cephradine (CFD) were transformed mainly by indirect photolysis, a process in which a bicarbonate-enhanced nitrate system contributed most to the loss rates of CFX, CFD, and cefotaxime (CTX) (t½ = 4.5, 5.3, and 1.3 h, respectively). Laboratory data suggested that bicarbonate enhances the phototransformation of CFD and CFX in natural waters. When considered together, NO₃⁻, HCO₃⁻, and DOM closely simulated the photolysis behavior in the Jingmei River and were the strongest determinants of the fate of cephalosporins. TOC and byproducts were investigated and identified. Direct photolysis led to decarboxylation of CFD, CFX, and CFP. All cephalosporins underwent transformation only (no mineralization) through direct photolysis; their byproducts were found to be even less photolabile and more toxic (via the Microtox test). CFZ exhibited the strongest acute toxicity after just a few hours, which may be largely attributed to its 5-methyl-1,3,4-thiadiazole-2-thiol moiety. Many pharmaceuticals were previously known to undergo direct sunlight photolysis and transformation in surface waters; however, the synergistic increase in toxicity caused by this cocktail of pharmaceutical photobyproducts cannot be ignored and warrants future research attention.
The photodegradation and biotic transformation of the pharmaceuticals lidocaine (LDC), tramadol (TRA), and venlafaxine (VEN), and of the metabolites O-desmethyltramadol (ODT) and O-desmethylvenlafaxine (ODV), in the aquatic environment have been investigated. Photodegradation experiments were carried out using a medium-pressure Hg lamp (laboratory experiments) and natural sunlight (field experiments). Degradation of the target compounds followed a first-order kinetic model. Rates of direct photodegradation (light absorption by the compounds themselves) at pH 6.9 were very low for all target analytes (≤0.0059 h⁻¹ using the Hg lamp and ≤0.0027 h⁻¹ under natural sunlight), while rates of indirect photodegradation (degradation of the compounds via photosensitizers) in river water at pH 7.5 were approximately 59 (LDC), 5 (TRA), 8 (VEN), 15 (ODT), and 13 (ODV) times higher than the rates obtained in ultrapure water. The accelerated photodegradation of the target compounds in natural water is attributed mainly to the formation of hydroxyl radicals through photochemical reactions. Biotic (microbial) degradation of the target compounds in surface water occurred at very low rates (≤0.00029 h⁻¹). The half-lives determined from the field experiments, considering all possible degradation mechanisms in river water (direct photodegradation, indirect photodegradation, and biotic degradation), were 31 (LDC), 73 (TRA), 51 (VEN), 21 (ODT), and 18 h (ODV).
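Because degradation follows first-order kinetics, the quoted rate constants translate into half-lives via t½ = ln 2 / k; a small illustrative check using the upper-bound rates from the abstract:

```python
import math

# First-order decay: C(t) = C0 * exp(-k*t), so t_half = ln(2) / k.
def half_life_h(k_per_h):
    return math.log(2) / k_per_h

# Direct photolysis under the Hg lamp (k <= 0.0059 h^-1): t_half >= ~117 h,
# i.e., direct photolysis alone is slow for these compounds.
print(f"{half_life_h(0.0059):.0f} h")

# Biotic degradation (k <= 0.00029 h^-1) is slower still: t_half >= ~2390 h.
print(f"{half_life_h(0.00029):.0f} h")
```

These lower bounds on the half-lives make clear why indirect photodegradation, tens of times faster, dominates the observed field half-lives of 18-73 h.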
Free-living amoebae (FLA) are potential reservoirs of Legionella in aquatic environments, but the parasitic relationship between various Legionella and amoebae remains unclear. In this study, surface water samples were collected from two rivers to evaluate parasitic Legionella. Warmer water temperature proved critical to the presence of Legionella, suggesting that amoebae may help maintain Legionella in natural environments because warmer temperatures could enhance parasitisation of Legionella in amoebae. We then used immunomagnetic separation (IMS) to identify extracellular Legionella and remove most free Legionella before detecting the parasitic ones in selectively enriched amoebae. Legionella pneumophila was detected by all approaches, confirming that the pathogen is a facultative amoebal parasite. By contrast, two obligate amoebal parasites, Legionella-like amoebal pathogens (LLAPs) 8 and 9, were detected only in enriched amoebae. However, several uncultured Legionella were detected only in the extracellular samples. Because the presence of potential hosts, namely Vermamoeba vermiformis, Acanthamoeba spp., and Naegleria gruberi, was confirmed in the samples that contained intracellular Legionella, the uncultured Legionella may survive independently of amoebae. Immunomagnetic separation combined with amoebae enrichment may thus provide a useful reference approach for detecting parasitic Legionella in surface waters.
We organized a crowdsourcing experiment in the form of a snapshot sampling campaign to assess the spatial distribution of nitrogen solutes, namely nitrate, ammonium, and dissolved organic nitrogen (DON), in German surface waters. In particular, we investigated (i) whether crowdsourcing is a reasonable sampling method in hydrology and (ii) how population density, soil humus content, and arable land affect nitrogen solute concentrations and surface water quality. The statistical analyses revealed a significant correlation of nitrate with arable land (0.46) and with soil humus content (0.37), but only a weak correlation with population density (0.12). DON correlations were weak but significant with humus content (0.14) and arable land (0.13). The mean contribution of DON to total dissolved nitrogen was 22%. Following the European Water Framework Directive, samples were classified as water quality class II or better for nitrate and ammonium (53% and 82%, respectively). Crowdsourcing turned out to be a useful method for assessing the spatial distribution of stream solutes, as a considerable number of samples was collected with comparatively little effort.
Target 6.4 of the recently adopted Sustainable Development Goals (SDGs) deals with the reduction of water scarcity. To monitor progress towards this target, two indicators are used: indicator 6.4.1, measuring water use efficiency, and indicator 6.4.2, measuring the level of water stress (WS). This paper aims to identify whether the currently proposed indicator 6.4.2 considers the different elements that need to be accounted for in a WS indicator. WS indicators compare water use with water availability. We identify seven essential elements: 1) both gross and net water abstraction (or withdrawal) provide important information for understanding WS; 2) WS indicators need to incorporate environmental flow requirements (EFR); 3) temporal and 4) spatial disaggregation is required in a WS assessment; 5) both renewable surface water and groundwater resources, including their interaction, need to be accounted for as renewable water availability; 6) alternative available water resources, such as fossil groundwater and desalinated water, need to be accounted for as well; 7) WS indicators need to account for water storage in reservoirs, water recycling, and managed aquifer recharge. Indicator 6.4.2 considers many of these elements, but there is room for improvement. It is recommended that WS also be measured based on net abstraction, in addition to the current measurement based on gross abstraction alone. The indicator does incorporate EFR. Temporal and spatial disaggregation is indeed defined as a goal at the more advanced monitoring levels, which also call for a differentiation between surface water and groundwater resources. Regarding elements 6 and 7, however, there are some shortcomings, for which we provide recommendations. In addition, indicator 6.4.2 monitors only blue WS; it gives no information on green or green-blue water scarcity or on water quality. Within the SDG indicator framework, some of these topics are covered by other indicators.
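For context, indicator 6.4.2 expresses the level of water stress as freshwater withdrawal divided by total renewable freshwater resources net of environmental flow requirements; a minimal sketch of that calculation, with purely hypothetical basin figures:

```python
# SDG indicator 6.4.2: level of water stress (%) =
#   total freshwater withdrawal /
#   (total renewable freshwater resources - environmental flow requirements)
#   * 100
def water_stress_pct(withdrawal_km3, renewable_km3, efr_km3):
    return 100.0 * withdrawal_km3 / (renewable_km3 - efr_km3)

# Hypothetical basin: 40 km3/yr withdrawn, 200 km3/yr renewable resources,
# 60 km3/yr reserved as environmental flow.
print(f"{water_stress_pct(40.0, 200.0, 60.0):.1f} %")  # 28.6 %
```

The sketch makes the paper's point concrete: the result hinges on whether withdrawal is gross or net, and on how EFR and the renewable-resource term are defined.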
Identification of management practices associated with preharvest pathogen contamination of produce fields is crucial to the development of effective Good Agricultural Practices (GAPs). A cross-sectional study was conducted to (i) determine management practices associated with a Salmonella- or Listeria monocytogenes-positive field and (ii) quantify the frequency of these pathogens in irrigation and non-irrigation water sources. Over five weeks, 21 produce farms in New York State were visited. Field-level management practices were recorded for 263 fields, and 600 environmental samples (soil, drag swab, and water) were collected and analyzed for Salmonella and L. monocytogenes. Management practices were evaluated for their association with the presence of a pathogen-positive field. Salmonella and L. monocytogenes were detected in 6.1% and 17.5% of fields (n=263), and in 11% and 30% of water samples (n=74), respectively. The majority of pathogen-positive water samples came from non-irrigation surface water sources. Multivariate analysis showed that manure application within the past year increased the odds of a Salmonella-positive field (odds ratio [OR] 16.7), while the presence of a buffer zone had a protective effect (OR 0.1). Irrigation (within 3 days of sample collection, OR 6.0), reported wildlife observation (within 3 days of sample collection, OR 6.1), and soil cultivation (within 7 days of sample collection, OR 2.9) all increased the likelihood of an L. monocytogenes-positive field. Our findings provide new data that will assist growers in science-based evaluation of their current GAPs and in implementing preventive controls that reduce the risk of preharvest contamination.
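The odds ratios above come from multivariate logistic regression; the underlying quantity can be illustrated with a simple 2x2 table (the counts below are hypothetical, not the study's data):

```python
# Odds ratio from a 2x2 exposure/outcome table:
#   OR = (a/b) / (c/d), where a, b = positive/negative fields among
#   exposed farms and c, d = positive/negative among unexposed farms.
def odds_ratio(a, b, c, d):
    return (a / b) / (c / d)

# e.g., 10 of 14 manured fields positive vs. 3 of 10 unmanured (hypothetical)
print(round(odds_ratio(10, 4, 3, 7), 2))  # 5.83
```

An OR above 1 (such as the reported 16.7 for manure application) means the practice is associated with higher odds of a positive field; the buffer-zone OR of 0.1 indicates the reverse.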
Natural waters serve as habitat for a wide range of microorganisms, a proportion of which may be derived from fecal material. A number of watershed models have been developed to understand and predict the fate and transport of fecal microorganisms within complex watersheds, as well as to determine whether microbial water quality standards can be satisfied under site-specific meteorological and/or management conditions. The aim of this review is to highlight and critically evaluate developments in the modeling of microbial water quality of surface waters over the last 10 years and to discuss the future of model development and application at the watershed scale, with a particular focus on fecal indicator organisms (FIOs). In doing so, an agenda of research opportunities is identified to help deliver improvements in the modeling of microbial water quality draining through complex landscape systems. This comprehensive review therefore provides a timely steer to help strengthen future modeling capability of FIOs in surface water environments and provides a useful resource to complement the development of risk management strategies to reduce microbial impairment of freshwater sources.
Oil and natural gas development in the Bakken shale play of North Dakota has grown substantially since 2008. This study provides a comprehensive overview and analysis of the water quantity and management impacts of this development by (1) estimating water demand for hydraulic fracturing in the Bakken from 2008 to 2012; (2) compiling volume estimates for maintenance water, or brine dilution water; (3) calculating water intensities normalized by the amount of oil produced, or estimated ultimate recovery (EUR); (4) estimating domestic water demand associated with the large oil services population; (5) analyzing the change in wastewater volumes from 2005 to 2012; and (6) examining the existing water sources used to meet demand. Water use for hydraulic fracturing in the North Dakota Bakken grew more than five-fold, from 770 million gallons in 2008 to 4.3 billion gallons in 2012. First-year wastewater volumes grew in parallel, from an annual average of 1,135,000 gallons per well in 2008 to 2,905,000 gallons in 2012, exceeding the mean volume of water used in hydraulic fracturing and surpassing typical 4-year wastewater totals for the Barnett, Denver, and Marcellus basins. Surprisingly, domestic water demand from the temporary oilfield services population in the region may be comparable to the regional water demand from hydraulic fracturing. Existing groundwater resources are inadequate to meet the demand for hydraulic fracturing, but surface water resources appear adequate, provided that access is available.
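The growth and normalization steps above reduce to simple arithmetic; a quick check of the reported fracturing-water growth, plus a sketch of the EUR-normalized water-intensity calculation the study describes (the per-well figures are hypothetical):

```python
# Growth in hydraulic fracturing water use (values from the abstract).
use_2008_gal = 770e6
use_2012_gal = 4.3e9
print(f"growth: {use_2012_gal / use_2008_gal:.1f}x")  # growth: 5.6x

# Water intensity normalized by estimated ultimate recovery (EUR).
def water_intensity_gal_per_bbl(water_gal, eur_bbl):
    return water_gal / eur_bbl

# Hypothetical well: 2.0 million gal of fracturing water, 500,000 bbl EUR.
print(f"{water_intensity_gal_per_bbl(2.0e6, 5.0e5):.1f} gal/bbl")  # 4.0 gal/bbl
```

Normalizing by EUR, rather than comparing raw volumes, is what allows water use per barrel of oil to be compared across plays with very different well productivities.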