BACKGROUND: Despite considerable global investigation over several decades, the roles of vitamin D in health and disease remain unclear. One recognised issue is the difficulty of accurately measuring the active forms of vitamin D. Recent advances include new methods that address potential interference by excluding epimers and isobars; however, there is no evidence that epimers are without function. The aim of this study was therefore to develop and validate, for the first time, a new assay to simultaneously measure levels of 6 forms of vitamin D along with two epimers. The assay was applied to multilevel certified reference material calibrators and 25 pooled human serum samples obtained from the Vitamin D External Quality Assessment Scheme (DEQAS) to demonstrate its efficiency. RESULTS: The assay simultaneously measures 8 vitamin D analogues over the following calibration ranges and LODs (in nmol/L): 1alpha25(OH)2D2 [0.015-1; 0.01], 1alpha25(OH)2D3 [0.1-100; 0.01], 25OHD3 [0.5-100; 0.025], 3-epi-25OHD3 [0.1-100; 0.05], 25OHD2 [0.5-100; 0.025], 3-epi-25OHD2 [0.1-100; 0.05], vitamin D3 [0.5-100; 0.05] and vitamin D2 [0.5-100; 0.05], using stanozolol-d3 as the internal standard. Certified reference material calibrators and external quality control samples (DEQAS) were analysed to meet the standards outlined by the National Institute of Standards and Technology (NIST). Validation steps included recovery, inter- and intra-day precision and accuracy, limit of detection, and analysis of each analyte over a linear range. All validation parameters met the acceptance criteria of the Food and Drug Administration (FDA) guidelines and the NIST standards. All eight analogues were quantified, with the 25OHD levels commensurate with DEQAS data.
CONCLUSIONS: This report details the application of a new LC-MS/MS-based assay for the efficient analysis of eight analogues of vitamin D across a range of samples, a significant advance over existing methods. Simultaneous measurement of the 8 vitamin D analogues does not compromise the assay's ability to quantify the commonly used biomarker of vitamin D status (25OHD). The results demonstrate the feasibility of applying the assay in research and clinical practice in a way that i) excludes misleading measurements caused by epimers and isobars and ii) quantifies the excluded components, facilitating further in vivo investigation into the roles of the ubiquitous epimers.
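As a minimal, hypothetical sketch of the internal-standard quantification described above (all peak areas and values below are invented, not from the study): the analyte/internal-standard peak-area ratio is fit against calibrator concentration, and the fitted line is inverted for unknowns.

```python
import numpy as np

def build_calibration(concs, analyte_areas, is_areas):
    """Fit analyte/internal-standard peak-area ratio vs. known concentration (nmol/L)."""
    ratios = np.asarray(analyte_areas) / np.asarray(is_areas)
    slope, intercept = np.polyfit(concs, ratios, 1)
    return slope, intercept

def quantify(slope, intercept, analyte_area, is_area):
    """Invert the calibration line to recover an unknown concentration."""
    return (analyte_area / is_area - intercept) / slope

# Hypothetical calibrators spanning the 0.5-100 nmol/L range for 25OHD3,
# with a fixed internal-standard response (all numbers invented).
concs = np.array([0.5, 5.0, 25.0, 50.0, 100.0])
analyte_areas = concs * 20.0                 # perfectly linear detector response
is_areas = np.full_like(concs, 1000.0)

slope, intercept = build_calibration(concs, analyte_areas, is_areas)
unknown = quantify(slope, intercept, analyte_area=900.0, is_area=1000.0)
```

With the invented linear response above, the unknown quantifies to 45 nmol/L; a real method would also check the result falls inside the validated linear range and above the LOD.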
Electromagnetic (EM) tracking systems are highly susceptible to field distortion. The interference can cause measurement errors of up to a few centimeters in clinical environments, which limits the reliability of these systems. Unless corrected for, this measurement error imperils the success of clinical procedures. It is therefore essential to dynamically calibrate EM tracking systems and compensate for the measurement error caused by the field-distorting objects commonly present in clinical environments. We propose to combine a motion model with observations from redundant EM sensors and compensate for field distortions in real time. We employ a simultaneous localization and mapping (SLAM) technique to accurately estimate the pose of the tracked instrument while creating the field distortion map. We conducted experiments with 6-degrees-of-freedom motions in the presence of field-distorting objects in research and clinical environments. We applied our approach to improve EM tracking accuracy and compared our results to a conventional sensor fusion technique. Using our approach, the maximum tracking error was reduced by 67% for position measurements and by 64% for orientation measurements. Currently, clinical applications of EM trackers are hampered by these adverse distortion effects. Our approach introduces a novel method for dynamic field distortion compensation that is independent of pre-operative calibrations or external tracking devices, enabling reliable EM navigation in a range of potential applications.
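The core fusion idea, a motion model combined with redundant sensor observations, can be illustrated in a deliberately simplified one-dimensional form (this is a generic Kalman-filter sketch, not the authors' SLAM pipeline, and all parameters are invented):

```python
import numpy as np

def kalman_step(x, P, z1, z2, dt=0.1, q=1e-3, r=0.05):
    """One predict/update cycle fusing two redundant position sensors."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
    Q = q * np.eye(2)                       # process noise
    H = np.array([[1.0, 0.0], [1.0, 0.0]])  # both sensors observe position
    R = r * np.eye(2)                       # sensor noise
    x = F @ x                               # predict state
    P = F @ P @ F.T + Q
    z = np.array([z1, z2])                  # stacked redundant readings
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)                 # update with both measurements
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Track a target moving at 1 unit/s, observed by two noise-free sensors.
x, P = np.array([0.0, 0.0]), np.eye(2)
for k in range(1, 51):
    true_pos = 0.1 * k
    x, P = kalman_step(x, P, true_pos, true_pos)
```

In the full method, disagreement between the redundant sensors additionally informs a map of the field distortion; this sketch only shows the state-estimation half of that loop.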
Next-generation sequencing (NGS) provides a broad investigation of the genome, and it is being readily applied for the diagnosis of disease-associated genetic features. However, the interpretation of NGS data remains challenging owing to the size and complexity of the genome and the technical errors that are introduced during sample preparation, sequencing and analysis. These errors can be understood and mitigated through the use of reference standards - well-characterized genetic materials or synthetic spike-in controls that help to calibrate NGS measurements and to evaluate diagnostic performance. The informed use of reference standards, and associated statistical principles, ensures rigorous analysis of NGS data and is essential for its future clinical use.
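One routine statistical use of reference standards is benchmarking a variant call set against the standard's known truth set; a minimal sketch with hypothetical variant identifiers:

```python
def call_set_performance(called, truth):
    """Sensitivity and precision of an NGS call set measured against a
    reference standard's truth set (variant IDs here are invented)."""
    called, truth = set(called), set(truth)
    tp = len(called & truth)                # true positives
    return tp / len(truth), tp / len(called)

# Invented example: three calls, of which two are in the truth set.
sens, prec = call_set_performance(
    called=["chr1:1000A>G", "chr2:500C>T", "chr3:42G>A"],
    truth=["chr1:1000A>G", "chr2:500C>T", "chr4:7T>C"],
)
```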
The mixing of magmas is a common phenomenon in explosive eruptions. Concentration variance is a useful metric of this process, and its decay (CVD) with time is an inevitable consequence of the progress of magma mixing. To calibrate this petrological/volcanological clock, we performed a time series of high-temperature magma mixing experiments. The results of these experiments demonstrate that compositional variance decays exponentially with time. With this calibration, the CVD rate (CVD-R) becomes a new geochronometer for the time lapse from the initiation of mixing to eruption. The resulting technique is fully independent of the typically unknown advective history of mixing, a notorious uncertainty that plagues the application of many diffusional analyses of magmatic history. Using the calibrated CVD-R technique, we obtained mingling-to-eruption times for three explosive volcanic eruptions from Campi Flegrei (Italy) in the range of tens of minutes, which in turn imply ascent velocities of 5-8 meters per second. We anticipate the routine application of the CVD-R geochronometer to the eruptive products of active volcanoes in order to constrain typical “mixing to eruption” time lapses, so that monitoring activities can be targeted at the relevant timescales and signals during volcanic unrest.
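The exponential-decay clock can be sketched numerically: fit the decay rate from a time series, then invert it to date a sample. All times, rates and variances below are synthetic, not the study's calibration.

```python
import numpy as np

# Hypothetical experimental time series (seconds): relative concentration
# variance decaying exponentially, CVD(t) = exp(-k * t), with k invented.
times = np.array([0.0, 60.0, 120.0, 240.0, 480.0])
k_true = 0.005
variance = np.exp(-k_true * times)

# Calibrate: on a log scale the decay is linear, ln(CVD) = -k * t.
k_fit = -np.polyfit(times, np.log(variance), 1)[0]

def time_since_mixing(cvd_observed, k):
    """Invert CVD(t) = exp(-k t) to estimate a mixing-to-eruption time."""
    return -np.log(cvd_observed) / k

# e.g. a natural sample retaining 10% of its initial compositional variance
t_est = time_since_mixing(0.10, k_fit)
```

The inversion is what makes the clock independent of the advective stirring history: only the surviving variance and the calibrated rate enter the estimate.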
People and societies seek to combat harmful events. However, because resources are limited, every wrong righted leaves another wrong left unchecked. Responses must therefore be calibrated to the magnitude of the harm. One underappreciated factor that affects this calibration may be people’s oversensitivity to intent. Across a series of studies, people saw intended harms as worse than unintended harms, even though the two harms were identical. This harm-magnification effect occurred for both subjective and monetary estimates of harm, and it remained when participants were given incentives to be accurate. The effect was fully mediated by blame motivation. People may therefore focus on intentional harms to the neglect of unintentional (but equally damaging) harms.
The certification of a new standard reference material for small-angle scattering [NIST Standard Reference Material (SRM) 3600: Absolute Intensity Calibration Standard for Small-Angle X-ray Scattering (SAXS)], based on glassy carbon, is presented. Creation of this SRM relies on the intrinsic primary calibration capabilities of the ultra-small-angle X-ray scattering technique. This article describes how the intensity calibration has been achieved and validated in the certified Q range, Q = 0.008-0.25 Å⁻¹, together with the purpose, use and availability of the SRM. The intensity calibration afforded by this robust and stable SRM should be applicable universally to all SAXS instruments that employ a transmission measurement geometry, working with a wide range of X-ray energies or wavelengths. The validation of the SRM SAXS intensity calibration using small-angle neutron scattering (SANS) is discussed, together with the prospects for including SANS in a future renewal certification.
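In practice, such a standard is used to place measured intensities on an absolute scale; a minimal sketch with invented intensities (not the SRM's certified values):

```python
import numpy as np

def absolute_scale_factor(i_measured_std, i_certified_std):
    """Instrument scale factor from a calibration standard: certified over
    measured intensity, averaged across the certified Q range."""
    return float(np.mean(np.asarray(i_certified_std) /
                         np.asarray(i_measured_std)))

# Invented glassy-carbon intensities at a few Q points within the
# certified range, on the instrument's arbitrary scale vs. absolute units.
factor = absolute_scale_factor([2.0, 4.0, 8.0], [1.0, 2.0, 4.0])

# The same factor then rescales any sample measured under the same
# transmission geometry and exposure conditions.
sample_absolute = factor * np.array([10.0, 6.0])
```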
Current biodiversity assessment and biomonitoring are largely based on the morphological identification of selected bioindicator taxa. Recently, several attempts have been made to use eDNA metabarcoding as an alternative tool. Until now, however, most applied metabarcoding studies have been based on the taxonomic assignment of sequences, which links them to the ecology of known morphospecies. Usually, only a small portion of metabarcoding data can be used, owing to limited reference databases and a lack of phylogenetic resolution. Here, we investigate the possibility of overcoming these limitations by using a taxonomy-free approach that computes a molecular index directly from eDNA data without any reference to morphotaxonomy. As a case study, we use the benthic diatom index, commonly used for monitoring the biological quality of rivers and streams. We analysed 87 epilithic samples from Swiss rivers whose ecological status had been established based on the microscopic identification of diatom species. We compared the diatom index derived from eDNA data obtained with or without taxonomic assignment. Our taxonomy-free approach yields promising results, providing a correct assessment for 77% of the examined sites. The main advantage of this method is that almost 95% of OTUs could be used for index calculation, compared to 35% with the taxonomic assignment approach. Its main limitations are under-sampling and the need to calibrate the index against the microscopic assessment of diatom communities. Once calibrated, however, the taxonomy-free molecular index can be easily standardized and applied in routine biomonitoring as a complementary tool allowing fast and cost-effective assessment of the biological quality of watercourses.
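A taxonomy-free index of this general kind can be sketched in two steps: calibrate an ecological optimum per OTU from sites with known status, then score new sites as an abundance-weighted mean of those optima. The matrices and quality scores below are invented, and this is a generic sketch rather than the study's exact index.

```python
import numpy as np

# Hypothetical calibration set: rows = sites, columns = OTUs (relative
# read abundances), plus a microscopy-based quality score per site.
reads = np.array([[0.6, 0.4, 0.0],
                  [0.2, 0.5, 0.3],
                  [0.0, 0.3, 0.7]])
site_quality = np.array([1.0, 2.0, 3.0])

# Calibrate: each OTU's optimum is the abundance-weighted mean quality
# of the sites in which it occurs -- no taxonomic assignment needed.
otu_optima = (reads.T @ site_quality) / reads.sum(axis=0)

def molecular_index(abundances, optima):
    """Taxonomy-free index: abundance-weighted mean of OTU optima."""
    return np.dot(abundances, optima) / abundances.sum()

# Score a new site from its OTU read profile alone.
new_site = np.array([0.1, 0.2, 0.7])
score = molecular_index(new_site, otu_optima)
```

Because every OTU that recurs across the calibration sites receives an optimum, the index can use nearly the whole OTU table rather than only the taxonomically assignable fraction.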
Valid objective measurement is integral to increasing our understanding of physical activity and sedentary behaviours. However, no population-specific cut points have been calibrated for children with intellectual disabilities. Therefore, this study aimed to calibrate and cross-validate the first population-specific accelerometer intensity cut points for children with intellectual disabilities.
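Cut-point calibration of this kind is often done by testing candidate thresholds against a criterion measure and keeping the one with the best sensitivity/specificity trade-off; a sketch using Youden's J, with entirely invented epoch data (not the study's cut points):

```python
import numpy as np

def calibrate_cut_point(counts, is_mvpa, candidates):
    """Choose the accelerometer count threshold that maximizes Youden's J
    (sensitivity + specificity - 1) against a criterion measure."""
    best_cut, best_j = None, -1.0
    for cut in candidates:
        pred = counts >= cut
        sens = np.sum(pred & is_mvpa) / np.sum(is_mvpa)
        spec = np.sum(~pred & ~is_mvpa) / np.sum(~is_mvpa)
        if sens + spec - 1.0 > best_j:
            best_cut, best_j = cut, sens + spec - 1.0
    return best_cut, best_j

# Invented epochs: counts per epoch and a criterion-measure label saying
# whether the epoch was truly moderate-to-vigorous physical activity.
counts = np.array([100, 200, 500, 1500, 2000, 3000])
is_mvpa = counts >= 1000                     # synthetic ground truth
best_cut, best_j = calibrate_cut_point(counts, is_mvpa, [500, 1000, 2000])
```

Cross-validation would then repeat the classification step on a held-out subsample to check that the chosen threshold generalizes.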
Secondary calibrations (calibrations based on the results of previous molecular dating studies) are commonly applied in divergence time analyses of groups that lack fossil data; however, the consequences of applying secondary calibrations in a relaxed-clock approach are not fully understood. I tested whether applying the posterior estimate from a primary study as a prior distribution in a secondary study results in consistent age and uncertainty estimates. I compared age estimates from simulations with 100 randomly replicated secondary trees. On average, the 95% credible intervals of node ages for secondary estimates were significantly younger and narrower than the primary estimates. The primary and secondary age estimates were significantly different in 97% of the replicates after Bonferroni corrections. Error of greater magnitude was associated with deeper nodes than with shallower ones, but the opposite was found when error was standardized by median node age, and a significant positive relationship was found between the number of tips/age of secondary trees and the total amount of error. When two secondarily calibrated nodes were analyzed, estimates remained significantly different, and although the minimum and median estimates carried less error, maximum age estimates and credible interval widths carried more. The shape of the prior also influenced error: applying a normal, rather than uniform, prior distribution resulted in greater error. In summary, secondary calibrations give a false impression of precision, and the distribution of age estimates shifts away from what the primary analysis would infer. These results suggest that secondary calibrations should not be applied as the only source of calibration in divergence time analyses that test time-dependent hypotheses until the additional error associated with them is properly modeled to account for the increased uncertainty in age estimates.
Sequencing DNA fragments associated with proteins following in vivo cross-linking with formaldehyde (known as ChIP-seq) has been used extensively to describe the distribution of proteins across genomes. It is not widely appreciated that this method merely estimates a protein’s distribution and cannot reveal changes in occupancy between samples. To address this, we tagged orthologous proteins in Saccharomyces cerevisiae and Candida glabrata with the same epitope; the two species’ sequences have diverged to such a degree that most DNA fragments longer than 50 bp are unique to just one of them. By mixing defined numbers of C. glabrata cells (the calibration genome) with S. cerevisiae samples (the experimental genomes) prior to chromatin fragmentation and immunoprecipitation, it is possible to derive a quantitative measure of occupancy (the occupancy ratio, OR) that enables a comparison of occupancies not only within but also between genomes. We demonstrate for the first time that this ‘internal standard’ calibration method satisfies the sine qua non for quantifying ChIP-seq profiles, namely linearity over a wide range. Crucially, by employing functional tagged proteins, our calibration process distinguishes genuine association within ChIP-seq profiles from background noise. Our method is applicable to any protein, not merely highly conserved ones, and obviates the need for the time-consuming, expensive, and technically demanding quantification of ChIP by qPCR, which can only be performed on individual loci. As we demonstrate for the first time in this paper, calibrated ChIP-seq represents a major step towards documenting the quantitative distributions of proteins along chromosomes in different cell states, which we term biological chromodynamics.
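One common formulation of such a spike-in occupancy ratio is sketched below. The read counts are invented, and the exact normalization used in the paper may differ in detail; the point is only the structure: experimental IP recovery is scaled by the calibration genome's recovery, with the input (whole-cell extract) counts correcting for the mixing proportion.

```python
def occupancy_ratio(ip_exp, ip_cal, input_exp, input_cal):
    """Spike-in normalization sketch: experimental IP signal scaled by
    calibration-genome recovery, with input counts correcting for the
    cell-mixing proportion. All arguments are uniquely mapped read counts."""
    return (ip_exp / input_exp) * (input_cal / ip_cal)

# Invented read counts for two conditions sharing the same spike-in mix.
or_a = occupancy_ratio(ip_exp=8000, ip_cal=2000,
                       input_exp=40000, input_cal=10000)
or_b = occupancy_ratio(ip_exp=4000, ip_cal=2000,
                       input_exp=40000, input_cal=10000)

# Because both conditions are normalized to the same internal standard,
# their ORs are directly comparable: condition A shows twice the occupancy.
fold_change = or_a / or_b
```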