Concept: Surface water
Efficient meltwater drainage through supraglacial streams and rivers on the southwest Greenland ice sheet
- Proceedings of the National Academy of Sciences of the United States of America
- Published over 3 years ago
Thermally incised meltwater channels that flow each summer across melt-prone surfaces of the Greenland ice sheet have received little direct study. We use high-resolution WorldView-1/2 satellite mapping and in situ measurements to characterize supraglacial water storage, drainage pattern, and discharge across 6,812 km² of southwest Greenland in July 2012, after a record melt event. Efficient surface drainage was routed through 523 high-order stream/river channel networks, all of which terminated in moulins before reaching the ice edge. Low surface water storage (3.6 ± 0.9 cm), negligible impoundment by supraglacial lakes or topographic depressions, and high discharge to moulins (2.54-2.81 cm⋅d⁻¹) indicate that the surface drainage system conveyed its own storage volume to the bed in <2 d. Moulin discharges mapped inside ∼52% of the source ice watershed for Isortoq, a major proglacial river, totaled ∼41-98% of observed proglacial discharge, highlighting the importance of supraglacial river drainage to true outflow from the ice edge. However, Isortoq discharges were lower than runoff simulations from the Modèle Atmosphérique Régional (MAR) regional climate model (0.056-0.112 km³⋅d⁻¹ vs. ∼0.103 km³⋅d⁻¹) and, when integrated over the melt season, totaled just 37-75% of MAR, suggesting nontrivial subglacial water storage even in this melt-prone region of the ice sheet. We conclude that (i) the interior surface of the ice sheet can be efficiently drained under optimal conditions, (ii) digital elevation models alone cannot fully describe supraglacial drainage and its connection to subglacial systems, and (iii) predicting outflow from climate models alone, without recognition of subglacial processes, may overestimate true meltwater export from the ice sheet to the ocean.
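The "<2 d" figure follows directly from dividing the stored water depth by the daily discharge to moulins. A minimal back-of-the-envelope sketch, using only the storage and discharge values quoted in the abstract:

```python
# Turnover time of the supraglacial drainage system:
# stored water depth divided by daily discharge to moulins.
storage_cm = 3.6                      # mean surface water storage (± 0.9 cm)
discharge_cm_per_day = (2.54, 2.81)   # reported range of discharge to moulins

turnover_days = [storage_cm / q for q in discharge_cm_per_day]
print([round(t, 2) for t in turnover_days])  # -> [1.42, 1.28], i.e. < 2 d
```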
- Proceedings of the National Academy of Sciences of the United States of America
- Published about 3 years ago
Compared with nutrient levels and habitat degradation, the importance of agricultural pesticides in surface water may have been underestimated due to a lack of comprehensive quantitative analysis. Increasing pesticide contamination results in decreasing regional aquatic biodiversity, i.e., macroinvertebrate family richness is reduced by ∼30% at pesticide concentrations equaling the legally accepted regulatory threshold levels (RTLs). This study provides a comprehensive meta-analysis of 838 peer-reviewed studies (>2,500 sites in 73 countries) that evaluates, for the first time to our knowledge on a global scale, the exposure of surface waters to particularly toxic agricultural insecticides. We tested whether measured insecticide concentrations (MICs; i.e., quantified insecticide concentrations) exceed their RTLs and how risks depend on insecticide development over time and stringency of environmental regulation. Our analysis reveals that MICs occur rarely (i.e., an estimated 97.4% of analyses conducted found no MICs) and there is a complete lack of scientific monitoring data for ∼90% of global cropland. Most importantly, of the 11,300 MICs, 52.4% (5,915 cases; 68.5% of the sites) exceeded the RTL for either surface water (RTLsw) or sediments. Thus, the biological integrity of global water resources is at a substantial risk. RTLsw exceedances depend on the catchment size, sampling regime, and sampling date; are significantly higher for newer-generation insecticides (i.e., pyrethroids); and are high even in countries with stringent environmental regulations. These results suggest the need for worldwide improvements to current pesticide regulations and agricultural pesticide application practices and for intensified research efforts on the presence and effects of pesticides under real-world conditions.
Photodegradation may be the most important elimination process for cephalosporin antibiotics in surface water. Cefazolin (CFZ) and cephapirin (CFP) underwent mainly direct photolysis (t½ = 0.7 and 3.9 h, respectively), while cephalexin (CFX) and cephradine (CFD) were transformed mainly by indirect photolysis, in which a bicarbonate-enhanced nitrate system contributed most to the loss rates of CFX, CFD, and cefotaxime (CTX) (t½ = 4.5, 5.3, and 1.3 h, respectively). Laboratory data suggested that bicarbonate enhances the phototransformation of CFD and CFX in natural water environments. When considered together, NO₃⁻, HCO₃⁻, and DOM closely simulated the photolysis behavior observed in the Jingmei River and were the strongest determinants of the fate of cephalosporins. TOC and byproducts were investigated and identified. Direct photolysis led to decarboxylation of CFD, CFX, and CFP. Transformation only (no mineralization) of all cephalosporins was observed through direct photolysis; byproducts were found to be even less photolabile and more toxic (via the Microtox test). CFZ exhibited the strongest acute toxicity after just a few hours, which may be largely attributed to its 5-methyl-1,3,4-thiadiazole-2-thiol moiety. Many pharmaceuticals were previously known to undergo direct sunlight photolysis and transformation in surface waters; however, the synergistic increase in toxicity caused by this cocktail of pharmaceutical photobyproducts cannot be ignored and warrants future research attention.
The photodegradation and biotic transformation of the pharmaceuticals lidocaine (LDC), tramadol (TRA) and venlafaxine (VEN), and of the metabolites O-desmethyltramadol (ODT) and O-desmethylvenlafaxine (ODV), in the aquatic environment have been investigated. Photodegradation experiments were carried out using a medium-pressure Hg lamp (laboratory experiments) and natural sunlight (field experiments). Degradation of the target compounds followed a first-order kinetic model. Rates of direct photodegradation (light absorption by the compounds themselves) at pH 6.9 were very low for all of the target analytes (⩽0.0059 h⁻¹ using a Hg lamp and ⩽0.0027 h⁻¹ using natural sunlight), while rates of indirect photodegradation (degradation of the compounds through photosensitizers) in river water at pH 7.5 were approximately 59 (LDC), 5 (TRA), 8 (VEN), 15 (ODT) and 13 times (ODV) higher than the rates obtained from the experiments in ultrapure water. The accelerated photodegradation of the target compounds in natural water is attributed mainly to the formation of hydroxyl radicals through photochemical reactions. Biotic (microbial) degradation of the target compounds in surface water has been shown to occur at very low rates (⩽0.00029 h⁻¹). The half-lives determined from the field experiments were 31 (LDC), 73 (TRA), 51 (VEN), 21 (ODT) and 18 h (ODV), considering all possible mechanisms of degradation for the target compounds in river water (direct photodegradation, indirect photodegradation and biotic degradation).
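The rate constants and half-lives quoted above are linked by the first-order kinetic model the study fits: C(t) = C₀·e^(−kt), so t½ = ln 2 / k. A short sketch using the reported upper-bound rate constant for direct photolysis:

```python
import math

def half_life(k_per_hour):
    """Half-life (h) for a first-order rate constant k (h^-1): t_half = ln2 / k."""
    return math.log(2) / k_per_hour

# Direct photolysis under the Hg lamp (k <= 0.0059 h^-1) implies a
# half-life of roughly 117 h, i.e. very slow compared with the
# 18-73 h half-lives observed in river water.
print(round(half_life(0.0059)))        # -> 117

# Conversely, the 31 h field half-life of LDC corresponds to an
# overall rate constant of ln2 / 31 ≈ 0.022 h^-1.
print(round(math.log(2) / 31, 3))      # -> 0.022
```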
Free-living amoebae (FLA) are potential reservoirs of Legionella in aquatic environments. However, the parasitic relationship between various Legionella and amoebae remains unclear. In this study, surface water samples were gathered from two rivers for evaluating parasitic Legionella. Warmer water temperature is critical to the existence of Legionella. This result suggests that amoebae may be helpful in maintaining Legionella in natural environments because warmer temperatures could enhance parasitisation of Legionella in amoebae. We next used immunomagnetic separation (IMS) to identify extracellular Legionella and remove most free Legionella before detecting the parasitic ones in selectively enriched amoebae. Legionella pneumophila was detected in all the approaches, confirming that the pathogen is a facultative amoebae parasite. By contrast, two obligate amoebae parasites, Legionella-like amoebal pathogens (LLAPs) 8 and 9, were detected only in enriched amoebae. However, several uncultured Legionella were detected only in the extracellular samples. Because the presence of potential hosts, namely Vermamoeba vermiformis, Acanthamoeba spp. and Naegleria gruberi, was confirmed in the samples that contained intracellular Legionella, uncultured Legionella may survive independently of amoebae. Immunomagnetic separation and amoebae enrichment may have referential value for detecting parasitic Legionella in surface waters.
Many current watershed modeling efforts incorporate both surface water and groundwater, since exchanges between the two require special attention under a changing climate. The influence of groundwater dynamics on water and energy balance components is investigated in the Snake River Basin (SRB) by coupling the Variable Infiltration Capacity (VIC) and MODFLOW models (VIC-MF) for the period 1986 through 2042. A 4.4% increase in base flows and a 10.3% decrease in peak flows are estimated by VIC-MF compared to the VIC model in the SRB. The VIC-MF model shows significant improvement in the streamflow simulation (Nash-Sutcliffe efficiency [NSE] of 0.84) at King Hill, where the VIC model could not capture the effect of spring discharge in the streamflow simulation (NSE of -0.30); however, the streamflow estimates show an overall decreasing trend. Two climate scenarios representing median and high radiative forcings (representative concentration pathways 4.5 and 8.5) show an average increase in water table elevations of between 2.1 and 2.6 m (6.9 and 8.5 feet) through the year 2042. The spatial patterns of these exchanges show a higher groundwater elevation of 15 m (50 feet) in the downstream area and a lower elevation of up to 3 m (10 feet) in the upstream area. Broadly, this study supports results of previous work demonstrating that integrated assessment of groundwater and surface water enables stakeholders to balance pumping, recharge, and base flow needs and to manage watersheds subjected to human pressures more sustainably.
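The Nash-Sutcliffe efficiency used to score the streamflow simulations is one minus the ratio of model error variance to the variance of the observations; NSE = 1 is a perfect fit, while NSE ≤ 0 means the model is no better than simply predicting the observed mean. A self-contained sketch with made-up flow series (not the study's data):

```python
def nse(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 - SSE / SST."""
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    sst = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / sst

# Illustrative daily flows (arbitrary units), not from the SRB study:
obs = [120.0, 150.0, 300.0, 220.0, 130.0]
sim = [110.0, 160.0, 280.0, 230.0, 140.0]
print(round(nse(obs, sim), 3))  # -> 0.965
```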
Target 6.4 of the recently adopted Sustainable Development Goals (SDGs) deals with the reduction of water scarcity. To monitor progress towards this target, two indicators are used: Indicator 6.4.1 measuring water use efficiency and 6.4.2 measuring the level of water stress (WS). This paper aims to identify whether the currently proposed indicator 6.4.2 considers the different elements that need to be accounted for in a WS indicator. WS indicators compare water use with water availability. We identify seven essential elements: 1) both gross and net water abstraction (or withdrawal) provide important information to understand WS; 2) WS indicators need to incorporate environmental flow requirements (EFR); 3) temporal and 4) spatial disaggregation is required in a WS assessment; 5) both renewable surface water and groundwater resources, including their interaction, need to be accounted for as renewable water availability; 6) alternative available water resources need to be accounted for as well, like fossil groundwater and desalinated water; 7) WS indicators need to account for water storage in reservoirs, water recycling and managed aquifer recharge. Indicator 6.4.2 considers many of these elements, but there is room for improvement. It is recommended that WS be measured based on net abstraction as well, in addition to the current practice of measuring WS based on gross abstraction alone. The indicator does incorporate EFR. Temporal and spatial disaggregation is indeed defined as a goal at more advanced monitoring levels, which also call for a differentiation between surface water and groundwater resources. However, regarding elements 6 and 7 there are some shortcomings, for which we provide recommendations. In addition, indicator 6.4.2 is only one indicator, which monitors blue WS but gives no information on green or green-blue water scarcity or on water quality. Within the SDG indicator framework, some of these topics are covered by other indicators.
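Indicator 6.4.2 compares withdrawal with availability net of environmental flow requirements. A hedged sketch of the FAO formula, WS = 100 · TFWW / (TRWR − EFR), where TFWW is total freshwater withdrawal, TRWR total renewable water resources, and EFR the environmental flow requirement; the numbers below are purely illustrative:

```python
def water_stress(withdrawal_km3, renewable_km3, efr_km3):
    """SDG indicator 6.4.2: freshwater withdrawal as a percentage of
    renewable resources after subtracting environmental flow requirements."""
    return 100.0 * withdrawal_km3 / (renewable_km3 - efr_km3)

# Hypothetical basin: 30 km3/yr withdrawn, 120 km3/yr renewable,
# 40 km3/yr reserved for environmental flows.
print(round(water_stress(30.0, 120.0, 40.0), 1))  # -> 37.5 (% of exploitable resources)
```

Note that this gross-abstraction form is exactly what the paper argues should be complemented by a net-abstraction variant.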
We organized a crowdsourcing experiment in the form of a snapshot sampling campaign to assess the spatial distribution of nitrogen solutes, namely, nitrate, ammonium and dissolved organic nitrogen (DON), in German surface waters. In particular, we investigated (i) whether crowdsourcing is a reasonable sampling method in hydrology and (ii) what the effects of population density, soil humus content and arable land were on actual nitrogen solute concentrations and surface water quality. The statistical analyses revealed significant correlations of nitrate with arable land (0.46) and with soil humus content (0.37), but only a weak correlation with population density (0.12). DON correlations were weak but significant with humus content (0.14) and arable land (0.13). The mean contribution of DON to total dissolved nitrogen was 22%. Samples were classified as water quality class II or above, following the European Water Framework Directive, for nitrate and ammonium (53% and 82%, respectively). Crowdsourcing turned out to be a useful method for assessing the spatial distribution of stream solutes, as a considerable number of samples was collected with comparatively little effort.
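The reported associations are correlation coefficients; a minimal Pearson correlation sketch with made-up data (the arrays below are illustrative, not the campaign's measurements):

```python
def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical catchments: fraction of arable land vs. nitrate (mg/L):
arable_frac = [0.1, 0.3, 0.5, 0.7, 0.9]
nitrate_mgL = [1.2, 2.0, 2.1, 3.5, 3.0]
print(round(pearson(arable_frac, nitrate_mgL), 2))  # -> 0.89
```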
There have been many studies of the diverse impacts of invasions by alien plants but few have assessed impacts on water resources. We reviewed the information on the impacts of invasions on surface runoff and groundwater resources at stand to catchment scales and covering a full annual cycle. Most of the research is South African so the emphasis is on South Africa’s major invaders with data from commercial forest plantations where relevant. Catchment studies worldwide have shown that changes in vegetation structure and the physiology of the dominant plant species result in changes in surface runoff and groundwater discharge, whether they involve native or alien plant species. Where there is little change in vegetation structure (e.g. leaf area [index], height, rooting depth and seasonality) the effects of invasions generally are small or undetectable. In South Africa, the most important woody invaders typically are taller and deeper rooted than the native species. The impacts of changes in evaporation (and thus runoff) in dryland settings are constrained by water availability to the plants and, thus, by rainfall. Where the dryland invaders are evergreen and the native vegetation (grass) is seasonal the increases can reach 300-400 mm/yr. Where the native vegetation is evergreen (shrublands) the increases are about 200-300 mm/yr. Where water availability is greater (riparian settings or shallow water tables), invading tree water-use can reach 1.5-2.0 times that of the same species in a dryland setting. So riparian invasions have a much greater impact per unit area invaded than dryland invasions. The available data are scattered and incomplete, and there are many gaps and issues that must be addressed before a thorough understanding of the impacts at the site scale can be gained and used in extrapolating to watershed scales, and in converting changes in flows to water supply system yields.
Identification of management practices associated with preharvest pathogen contamination of produce fields is crucial to the development of effective Good Agricultural Practices (GAPs). A cross-sectional study was conducted to (i) determine management practices associated with a Salmonella or Listeria monocytogenes positive field and (ii) quantify the frequency of these pathogens in irrigation and non-irrigation water sources. Over five weeks, 21 produce farms in New York State were visited. Field-level management practices were recorded for 263 fields, and 600 environmental samples (soil, drag swab, and water) were collected and analyzed for Salmonella and L. monocytogenes. Management practices were evaluated for their association with the presence of a pathogen-positive field. Salmonella and L. monocytogenes were detected in 6.1% and 17.5% of fields (n=263), and 11% and 30% of water samples (n=74), respectively. The majority of pathogen-positive water samples were from non-irrigation surface water sources. Multivariate analysis showed that manure application within a year increased the odds of a Salmonella-positive field (odds ratio [OR] 16.7), while presence of a buffer zone had a protective effect (OR 0.1). Irrigation (within 3 days of sample collection, OR 6.0), reported wildlife observation (within 3 days of sample collection, OR 6.1), and soil cultivation (within 7 days of sample collection, OR 2.9) all increased the likelihood of an L. monocytogenes-positive field. Our findings provide new data that will assist growers with science-based evaluation of their current GAPs and implementation of preventive controls that reduce the risk of preharvest contamination.
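The reported odds ratios come from standard 2×2 exposure/outcome tables (e.g., OR 16.7 for manure application within a year). A sketch with hypothetical counts, not the study's actual field data:

```python
def odds_ratio(exposed_pos, exposed_neg, unexposed_pos, unexposed_neg):
    """Odds ratio from a 2x2 table: (a*d) / (b*c), where a/b are
    pathogen-positive/negative exposed fields and c/d the unexposed ones."""
    return (exposed_pos * unexposed_neg) / (exposed_neg * unexposed_pos)

# Hypothetical: 12 of 20 manure-treated fields positive vs.
# 5 of 50 untreated fields positive.
print(round(odds_ratio(12, 8, 5, 45), 1))  # -> 13.5
```

An OR above 1 indicates the practice is associated with higher odds of a pathogen-positive field; below 1 (like the buffer-zone OR of 0.1) indicates a protective association.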