Concept: Chemical substance
Whisky is distilled to around 70% alcohol by volume (vol-%) and then diluted to about 40 vol-% for bottling, and it is often drunk after further slight dilution to enhance its taste. The taste of whisky is primarily associated with amphipathic molecules such as guaiacol, but why and how dilution enhances the taste is not well understood. We carried out computer simulations of water-ethanol mixtures in the presence of guaiacol, providing atomistic detail on the structure of the liquid mixture. We found that guaiacol preferentially associates with ethanol and is therefore found primarily at the liquid-air interface in mixtures containing up to 45 vol-% ethanol. At ethanol concentrations of 59 vol-% or higher, guaiacol is increasingly surrounded by ethanol molecules and is driven into the bulk. This indicates that the taste of guaiacol in whisky would be enhanced by dilution prior to bottling. Our findings may apply to other flavour-giving amphipathic molecules and could contribute to optimising the production of spirits for desired tastes. Furthermore, our work sheds light on the molecular structure of water-alcohol mixtures containing small solutes and reveals that interactions with water may already be negligible at 89 vol-% ethanol.
Plastic debris litters aquatic habitats globally; the majority of it is microscopic (< 1 mm) and is ingested by a wide range of species. The risks associated with such small fragments come from the material itself and from chemical pollutants that sorb to it from the surrounding water. The hazards associated with this complex mixture of plastic and accumulated pollutants are largely unknown. Here, we show that fish exposed to a mixture of polyethylene with chemical pollutants sorbed from the marine environment bioaccumulate these pollutants and suffer liver toxicity and pathology. Fish fed virgin polyethylene fragments also show signs of stress, although less severe than fish fed marine polyethylene fragments. We provide baseline information regarding the bioaccumulation of chemicals and the associated health effects from plastic ingestion in fish, and demonstrate that future assessments should consider the complex mixture of plastic material and its associated chemical pollutants.
The Fukushima nuclear accident of March 2011 released radioactive materials into the environment across the entire Northern Hemisphere, and the Japanese government is spending large amounts of money to clean up contaminated residential areas and agricultural fields. However, the exact physical and chemical properties of the radioactive materials are still unknown. This study directly observed spherical Cs-bearing particles emitted during a relatively early stage (March 14-15) of the accident. In contrast to the Cs-bearing radioactive materials currently assumed, these particles are larger, contain Fe, Zn, and Cs, and are water-insoluble. Our simulation indicates that the spherical Cs-bearing particles fell onto the ground mainly by dry deposition. The discovery of these spherical Cs particles will be key to understanding the processes of the accident and to accurately evaluating both the health impacts and the particles' residence time in the environment.
The ecological impacts of emerging pollutants such as pharmaceuticals are not well understood. The lack of experimental approaches for identifying pollutant effects in realistic settings (that is, low doses, complex mixtures, and variable environmental conditions) supports the widespread perception that these effects are often unpredictable. To address this, we developed a novel screening method (GSA-QHTS) that couples the computational power of global sensitivity analysis (GSA) with the experimental efficiency of quantitative high-throughput screening (QHTS). We present a case study in which GSA-QHTS allowed the identification of the main pharmaceutical pollutants (and their interactions) driving biological effects of low-dose complex mixtures at the microbial population level. The QHTS experiments involved the integrated analysis of nearly 2700 observations from an array of 180 unique low-dose mixtures, representing the most complex and data-rich experimental mixture-effect assessment of pharmaceutical pollutants to date. An ecological scaling-up experiment confirmed that this subset of pollutants also affects typical freshwater microbial community assemblages. Contrary to our expectations, and challenging established scientific opinion, the bioactivity of the mixtures was not predicted by the null mixture models, and the main drivers identified by GSA-QHTS were overlooked by the current effect assessment scheme. Our results suggest that current chemical effect assessment methods overlook a substantial number of ecologically dangerous chemical pollutants, and they introduce a new operational framework for their systematic identification.
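For context, the standard null mixture model mentioned above is concentration addition, which predicts a mixture's effective concentration from the components' individual ones. The following is a minimal sketch of that textbook formula; the mixture fractions and EC50 values are hypothetical, for illustration only, and are not data from the study.

```python
# Minimal sketch of the concentration-addition (CA) null model: the
# predicted mixture EC50 is the reciprocal of the fraction-weighted sum
# of the reciprocals of the components' single-compound EC50s.
# All numbers below are hypothetical, for illustration only.

def ca_ec50(fractions, ec50s):
    """CA-predicted mixture EC50: 1 / sum(p_i / EC50_i)."""
    if abs(sum(fractions) - 1.0) > 1e-9:
        raise ValueError("mixture fractions must sum to 1")
    return 1.0 / sum(p / e for p, e in zip(fractions, ec50s))

# Two hypothetical pharmaceuticals mixed 50:50, with single-compound
# EC50s of 10 and 40 ug/L:
print(ca_ec50([0.5, 0.5], [10.0, 40.0]))  # -> 16.0
```

A deviation of the observed mixture EC50 from this CA prediction is what flags interactions such as those the GSA-QHTS screen was designed to detect.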
Since its public introduction in 2005, the IUPAC InChI chemical structure identifier has become the worldwide standard for representing defined chemical structures. This article describes the extensive use and dissemination of the InChI and InChIKey structure representations by and for the worldwide chemistry community, the chemical information community, and major publishers and disseminators of chemical and related scientific content in manuscripts and databases.
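As a small illustration of the InChIKey representation mentioned above: a standard InChIKey is three hyphen-separated blocks of 14, 10 and 1 uppercase letters. The regex below is our own illustrative format check, not part of the official IUPAC InChI software; the aspirin key used as an example is the well-known standard one.

```python
import re

# Illustrative format check for an InChIKey: three hyphen-separated
# blocks of 14, 10 and 1 uppercase letters. This sketch only checks
# the layout; it does not verify that the hash was derived correctly.
INCHIKEY_RE = re.compile(r"^[A-Z]{14}-[A-Z]{10}-[A-Z]$")

def looks_like_inchikey(key: str) -> bool:
    return INCHIKEY_RE.fullmatch(key) is not None

# The standard InChIKey of aspirin:
print(looks_like_inchikey("BSYNRYMUTXBXSQ-UHFFFAOYSA-N"))  # True
print(looks_like_inchikey("not-an-inchikey"))              # False
```

The fixed-length, hashed form is what makes InChIKeys convenient for indexing and cross-database linking, in contrast to the variable-length InChI string itself.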
Targeted journal curation as a method to improve data currency at the Comparative Toxicogenomics Database
- Database: The Journal of Biological Databases and Curation
The Comparative Toxicogenomics Database (CTD) is a public resource that promotes understanding of the effects of environmental chemicals on human health. CTD biocurators read the scientific literature and manually curate a triad of chemical-gene, chemical-disease and gene-disease interactions. Typically, articles for CTD are selected using a chemical-centric approach by querying PubMed to retrieve a corpus containing the chemical of interest. Although this technique ensures adequate coverage of knowledge about the chemical (i.e. data completeness), it does not necessarily reflect the most current state of all toxicological research in the community at large (i.e. data currency). Keeping databases current with the most recent scientific results, as well as providing a rich historical background from legacy articles, is a challenging process. To address this issue of data currency, CTD designed and tested a journal-centric approach to curation to complement our chemical-centric method. We first identified priority journals based on defined criteria. Next, over 7 weeks, three biocurators reviewed 2425 articles from three consecutive years (2009-2011) of three targeted journals. From this corpus, 1252 articles contained relevant data for CTD, and 52 752 interactions were manually curated. Here, we describe our journal selection process, two methods of document delivery for the biocurators and the analysis of the resulting curation metrics, including data currency, and both intra-journal and inter-journal comparisons of research topics. Based on our results, we expect that curation of selected journals can (i) be easily incorporated into the curation pipeline to complement our chemical-centric approach; (ii) build content more evenly for chemicals, genes and diseases in CTD (rather than biasing data by chemicals-of-interest); (iii) reflect developing areas in environmental health and (iv) improve overall data currency for chemicals, genes and diseases.
Database URL: http://ctdbase.org/
Planet Earth's biosphere has evolved over billions of years as a balanced bio-geological system, ultimately sustained by solar power and by the large-scale cycling of elements largely run by the global environmental microbiome. Humans have been part of this picture for most of their existence, but the industrial revolution that began in the 19th century, and the subsequent advances in medicine, chemistry, agriculture and communications, have disturbed these balances to an unprecedented degree - and the problem has only worsened in the last 20 years. Human overpopulation and industrial growth, along with the unsustainable use of natural resources, have driven many sites - and perhaps the planetary ecosystem as a whole - beyond recovery by spontaneous natural means, even if the immediate causes could be stopped. The most conspicuous indications of this state of affairs include the massive change in land use, the accelerating increase in greenhouse gas levels, the frequent natural disasters associated with climate change and the growing amount of non-recyclable waste (e.g. plastics and recalcitrant chemicals) that we release into the environment. While the whole planet is afflicted by chemical pollution and anthropogenic emissions at a global scale, the ongoing development of systems and synthetic biology, metagenomics, modern chemistry and key concepts from ecological theory allows us to tackle this phenomenal challenge and to propose large-scale interventions aimed at reversing and even improving the situation. This involves (i) identification of the key reactions or processes that need to be re-established (or created outright) for ecosystem reinstallation, (ii) implementation of these reactions in natural or designer hosts able to self-replicate and deliver the corresponding activities when and where needed, in a fashion guided by sound ecological modelling, (iii) dispersal of the niche-creating agents at a global scale and (iv) containment, monitoring and risk assessment of the whole process.
PubChem is an open repository for chemical structures, biological activities and biomedical annotations. Semantic Web technologies are emerging as an increasingly important approach to distribute and integrate scientific data. Exposing PubChem data to Semantic Web services may help enable automated data integration and management, as well as facilitate interoperable web applications.
Concentration addition (CA) has been proposed as a reasonable default approach for the ecological risk assessment of chemical mixtures. However, CA cannot predict mixture toxicity in some effect zones if not all components have definite effective concentrations at the given effect level, for example when some components induce hormesis. In this paper, we developed a new method for predicting the toxicity of various types of binary mixtures: an interpolation method based on Delaunay triangulation (DT) and Voronoi tessellation (VT) together with a training set of direct equipartition ray design (EquRay) mixtures, abbreviated IDVequ. First, EquRay was employed to design the basic concentration compositions of five binary mixture rays. The toxic effects of the single components and the mixture rays at different times and various concentrations were determined by time-dependent microplate toxicity analysis. Second, the concentration-toxicity data of the pure components and the various mixture rays served as the training set. DT triangles and VT polygons were constructed from the concentration vertices of the training set. The toxicities of unknown mixtures were then predicted by linear interpolation and natural-neighbour interpolation over these vertices. IDVequ successfully predicted the toxicities of various types of binary mixtures.
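The linear-interpolation step of such a triangulation-based scheme can be illustrated with a toy example: inside one Delaunay triangle of training-set concentration pairs, the effect at an untested mixture is the barycentric average of the effects measured at the triangle's vertices. The triangle coordinates and effect values below are hypothetical, not the authors' data, and this sketch covers only the linear (not the natural-neighbour) interpolation.

```python
# Sketch: linear interpolation of a binary-mixture effect inside one
# Delaunay triangle via barycentric coordinates. Vertices are
# hypothetical (conc_A, conc_B) points from an assumed training set.

def barycentric_weights(p, a, b, c):
    """Barycentric coordinates of point p in triangle (a, b, c)."""
    (x, y), (x1, y1), (x2, y2), (x3, y3) = p, a, b, c
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (x - x3) + (x3 - x2) * (y - y3)) / det
    w2 = ((y3 - y1) * (x - x3) + (x1 - x3) * (y - y3)) / det
    return w1, w2, 1.0 - w1 - w2

def interpolate_effect(p, triangle, effects):
    """Linearly interpolate the effect at mixture p from the vertex effects."""
    weights = barycentric_weights(p, *triangle)
    return sum(w * e for w, e in zip(weights, effects))

# Hypothetical vertices (conc_A, conc_B) and measured effects (%):
tri = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
eff = [0.0, 60.0, 40.0]
print(interpolate_effect((0.5, 0.5), tri, eff))  # -> 50.0
```

In the full method, the query mixture would first be located in the triangulation built from all training vertices; this toy example fixes a single triangle to keep the arithmetic visible.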
Use of a modified GreenScreen tool to conduct a screening-level comparative hazard assessment of conventional silver and two forms of nanosilver
- Environmental Health: A Global Access Science Source
Increased concern about the potential health and environmental impacts of chemicals, including nanomaterials, in consumer products is driving demand for greater transparency regarding potential risks. Chemical hazard assessment is a powerful tool to inform product design, development and procurement, and it has been integrated into alternatives assessment frameworks. The extent to which assessment methods originally designed for conventionally sized materials can be used for nanomaterials, which have size-dependent physical and chemical properties, has not been well established. We contracted with a certified GreenScreen profiler to conduct three GreenScreen hazard assessments: one for conventional silver and one for each of two forms of nanosilver. The contractor summarized the publicly available literature and used defined GreenScreen hazard criteria and expert judgment to assign and report hazard classification levels, along with indications of confidence in those assignments. Where data were not available, a data gap (DG) was assigned. An aggregated benchmark score (BM) was then derived from the individual endpoint scores.