Concept: Robust statistics
BACKGROUND: To evaluate institutional nursing care performance against national comparative statistics (benchmarks), approximately one in every three major healthcare institutions (over 1,800 hospitals) across the United States has joined the National Database for Nursing Quality Indicators® (NDNQI®). With over 18,000 hospital units currently contributing data for nearly 200 quantitative measures, reliable and efficient screening of input data for all quantitative measures is critical to the integrity, validity, and on-time delivery of NDNQI reports. METHODS: Using Monte Carlo simulation and quantitative NDNQI indicator examples, we compared two ad hoc methods based on robust scale estimators, the interquartile range (IQR) and the median absolute deviation about the median (MAD), with the classic, theoretically based minimum covariance determinant (FAST-MCD) approach for initial univariate outlier detection. RESULTS: While the theoretically based FAST-MCD, used in one dimension, can be sensitive and is better suited to identifying groups of outliers because of its high breakdown point, the ad hoc IQR and MAD approaches are fast, easy to implement, and can be more robust and efficient, depending on the distributional properties of the underlying measure of interest. CONCLUSION: Given the highly skewed distributions of most NDNQI indicators within a short data-screening window, the FAST-MCD approach, when applied to one-dimensional raw data, can produce higher false-alarm rates for potential outliers than the IQR and MAD at the same pre-set critical value, and thus overburden data quality control at both the data-entry and administrative ends in our setting.
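The two ad hoc screens compared above are straightforward to implement. A minimal sketch in Python (NumPy), with invented data and conventional cutoffs (1.5×IQR fences; robust z-scores above 3.5, using the 1.4826 normal-consistency factor) that are common defaults rather than values taken from the study:

```python
import numpy as np

def iqr_outliers(x, k=1.5):
    """Flag points outside the Tukey fences [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

def mad_outliers(x, k=3.5):
    """Flag points whose robust z-score |x - median| / (1.4826*MAD) exceeds k."""
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    robust_z = np.abs(x - med) / (1.4826 * mad)
    return robust_z > k

data = np.array([4.2, 4.0, 4.1, 3.9, 4.3, 4.1, 12.0])  # one gross outlier
print(iqr_outliers(data))  # only the 12.0 entry is flagged
print(mad_outliers(data))  # only the 12.0 entry is flagged
```

On a clean symmetric sample both rules agree; on the heavily skewed indicators discussed in the abstract, the choice of estimator and cutoff drives the false-alarm rate.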
Background: Formulation and evaluation of public health policy commonly employ science-based mathematical models. For instance, the epidemiological dynamics of TB are dominated, in general, by flow between actively and latently infected populations; modelling is therefore central to planning public health interventions. However, models are highly uncertain because they are based on observations that are geographically and temporally distinct from the population to which they are applied. Aims: We aim to demonstrate the advantages of info-gap theory, a non-probabilistic approach to severe uncertainty in situations where worst cases cannot be reliably identified and probability distributions are unreliable or unavailable. Info-gap is applied here to mathematical modelling of epidemics and to the analysis of public health decision-making. Methods: Applying info-gap robustness analysis to tuberculosis/HIV (TB/HIV) epidemics, we illustrate the critical role of incorporating uncertainty when formulating recommendations for interventions. Robustness is assessed as the magnitude of uncertainty that can be tolerated by a given intervention. We illustrate the methodology by exploring interventions that alter the rates of diagnosis, cure, relapse and HIV infection. Results: We demonstrate several policy implications. Equivalences among alternative rates of diagnosis and relapse are identified. The impact of initial TB and HIV prevalence on robustness to uncertainty is quantified. In some configurations, increased aggressiveness of intervention improves the predicted outcome but reduces the robustness to uncertainty. Similarly, predicted outcomes may be better at larger target times, but may also be more vulnerable to model error. Conclusions: The info-gap framework is useful for managing model uncertainty and is attractive when uncertainties in model parameters are extreme.
When a public health model underlies guidelines, info-gap decision theory provides valuable insight into the confidence of achieving agreed-upon goals.
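The robustness calculation described above can be sketched numerically. In this hypothetical toy model (all names and numbers are illustrative, not taken from the TB/HIV study), the uncertain quantity is a transmission multiplier with nominal value 1, the uncertainty set at horizon h is the interval [1 − h, 1 + h], and robustness is the largest h for which the worst-case projected caseload still meets a critical requirement:

```python
import numpy as np

def robustness(perform, h_grid, critical):
    """Info-gap robustness: the largest uncertainty horizon h such that the
    worst-case performance over the uncertainty set U(h) still meets the goal."""
    h_hat = 0.0
    for h in h_grid:
        # worst case over the fractional-error uncertainty set U(h) = [1-h, 1+h]
        worst = max(perform(u) for u in np.linspace(1 - h, 1 + h, 201))
        if worst <= critical:
            h_hat = h
        else:
            break
    return h_hat

# Toy outcome model: projected cases scale linearly with the uncertain
# transmission multiplier u (nominal u = 1); all numbers are invented.
beta_nominal, contacts = 0.3, 1000
cases = lambda u: beta_nominal * u * contacts

h_grid = np.linspace(0, 1, 101)
print(round(robustness(cases, h_grid, critical=365), 2))  # ~0.21
```

The printed value says the requirement of at most 365 cases is met even if the transmission multiplier is wrong by about 21%; a more aggressive goal (smaller critical value) would shrink this robustness, which is the trade-off the abstract highlights.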
Considerable progress in wireless power transfer has been made in the realm of non-radiative transfer, which employs magnetic-field coupling in the near field. A combination of circuit resonance and impedance transformation is often used to help achieve efficient transfer of power over a predetermined distance comparable to the size of the resonators. The development of non-radiative wireless power transfer has paved the way towards real-world applications such as wireless powering of implantable medical devices and wireless charging of stationary electric vehicles. However, it remains a fundamental challenge to create a wireless power transfer system in which the transfer efficiency is robust against variation of the operating conditions. Here we propose theoretically and demonstrate experimentally that a parity-time-symmetric circuit incorporating a nonlinear gain saturation element provides robust wireless power transfer. Our results show that the transfer efficiency remains near unity over a distance variation of approximately one metre, without the need for any tuning. This is in contrast with conventional methods, where high transfer efficiency can only be maintained by constantly tuning the frequency or the internal coupling parameters as the transfer distance or the relative orientation of the source and receiver units is varied. The use of a nonlinear parity-time-symmetric circuit should enable robust wireless power transfer to moving devices or vehicles.
Quantifying diversity is of central importance for the study of structure, function and evolution of microbial communities. The estimation of microbial diversity has received renewed attention with the advent of large-scale metagenomic studies. Here, we consider what the diversity observed in a sample tells us about the diversity of the community being sampled. First, we argue that one cannot reliably estimate the absolute and relative number of microbial species present in a community without making unsupported assumptions about species abundance distributions. The reason for this is that sample data do not contain information about the number of rare species in the tail of species abundance distributions. We illustrate the difficulty in comparing species richness estimates by applying Chao's estimator of species richness to a set of in silico communities: they are ranked incorrectly in the presence of large numbers of rare species. Next, we extend our analysis to a general family of diversity metrics ('Hill diversities'), and construct lower and upper estimates of diversity values consistent with the sample data. The theory generalizes Chao's estimator, which we retrieve as the lower estimate of species richness. We show that Shannon and Simpson diversity can be robustly estimated for the in silico communities. We analyze nine metagenomic data sets from a wide range of environments, and show that our findings are relevant for empirically sampled communities. Hence, we recommend the use of Shannon and Simpson diversity rather than species richness in efforts to quantify and compare microbial diversity. The ISME Journal advance online publication, 14 February 2013; doi:10.1038/ismej.2013.10.
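The quantities compared above — species richness, Shannon and Simpson diversity as members of the Hill family, and Chao's richness estimator — can all be computed from a vector of species counts. A minimal sketch (the sample vector is invented for illustration; the bias-corrected form of Chao1 is used):

```python
import numpy as np

def hill_diversity(counts, q):
    """Hill diversity of order q from a vector of species counts.
    q=0: species richness; q=1: exp(Shannon entropy); q=2: inverse Simpson."""
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()
    if q == 1:  # limit case, defined via Shannon entropy
        return np.exp(-np.sum(p * np.log(p)))
    return np.sum(p ** q) ** (1.0 / (1.0 - q))

def chao1(counts):
    """Bias-corrected Chao1 lower-bound estimate of species richness."""
    counts = np.asarray(counts)
    s_obs = np.sum(counts > 0)
    f1 = np.sum(counts == 1)   # singletons
    f2 = np.sum(counts == 2)   # doubletons
    return s_obs + f1 * (f1 - 1) / (2.0 * (f2 + 1))

sample = [10, 8, 5, 2, 2, 1, 1, 1]          # toy abundance vector
print(hill_diversity(sample, 0))            # observed richness: 8.0
print(round(hill_diversity(sample, 1), 2))  # effective species count (Shannon)
print(round(hill_diversity(sample, 2), 2))  # inverse Simpson
print(chao1(sample))                        # richness lower bound: 9.0
```

Note that the Chao1 estimate exceeds the observed richness only through the singleton and doubleton counts; rare species entirely absent from the sample leave no trace, which is why richness, unlike Shannon and Simpson diversity, cannot be bounded from above using sample data alone.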
- IEEE transactions on image processing : a publication of the IEEE Signal Processing Society
Recently, there has been significant interest in fractal image coding that is robust against outliers. However, the known robust fractal coding methods (e.g., HFIC and LAD-FIC) are not optimal: besides their high computational cost, they use the corrupted domain block as the independent variable in the robust regression model, which may adversely affect the robust estimation of the fractal parameters (depending on the noise level). This paper presents a Huber fitting plane-based fractal image coding (HFPFIC) method. The method builds Huber fitting planes (HFPs) for the domain and range blocks, respectively, ensuring that the independent variable in the robust model is uncorrupted. On this basis, a new matching error function is introduced to robustly evaluate the best scaling factor. Meanwhile, a decomposition criterion based on the median absolute deviation (MAD) about the median is proposed to achieve fast adaptive quadtree partitioning for images corrupted by salt & pepper noise. To reduce computational cost, the no-search method is applied to speed up the encoding process. Experimental results show that the proposed HFPFIC yields superior performance over conventional robust fractal image coding methods in both encoding speed and the quality of the restored image. Furthermore, the no-search method significantly reduces encoding time, to less than 2.0 s for the HFPFIC, with acceptable degradation in image quality. In addition, we show that, combined with the MAD decomposition scheme, the HFP technique used as a robust method can further reduce encoding time while maintaining image quality.
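The MAD-based quadtree criterion is the easiest part of this pipeline to illustrate in isolation. In the minimal sketch below (the threshold tau, block sizes and test image are invented, not the paper's settings), a block is split while its MAD exceeds tau; because isolated salt & pepper pixels barely move the median, they do not force spurious splits the way they would under a variance criterion:

```python
import numpy as np

def mad(block):
    """Median absolute deviation about the median of a pixel block."""
    med = np.median(block)
    return np.median(np.abs(block - med))

def quadtree(block, x, y, min_size, tau, leaves):
    """Recursively split a square block while its MAD exceeds tau."""
    n = block.shape[0]
    if n <= min_size or mad(block) <= tau:
        leaves.append((x, y, n))  # record block origin and size
        return
    h = n // 2
    quadtree(block[:h, :h], x, y, min_size, tau, leaves)
    quadtree(block[:h, h:], x, y + h, min_size, tau, leaves)
    quadtree(block[h:, :h], x + h, y, min_size, tau, leaves)
    quadtree(block[h:, h:], x + h, y + h, min_size, tau, leaves)

rng = np.random.default_rng(0)
img = rng.integers(0, 256, (16, 16)).astype(float)
img[:8, :8] = 128.0  # one smooth quadrant; the rest is busy texture
leaves = []
quadtree(img, 0, 0, 4, tau=1.0, leaves=leaves)
print(len(leaves))  # 13: the smooth quadrant stays whole, busy ones split
```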
The recent stunning rise in power conversion efficiencies (PCEs) of perovskite solar cells (PSCs) has triggered intense research worldwide. However, high PCE values have often been reached with poor stability and at an illuminated area of typically less than 0.1 cm². We used heavily doped inorganic charge extraction layers in planar PSCs to achieve very rapid carrier extraction, even with 10 to 20 nm thick layers, avoiding pinholes and eliminating local structural defects over large areas. This robust inorganic nature allowed for the fabrication of PSCs with an aperture area >1 cm² showing a PCE >15%, certified by an accredited photovoltaic calibration laboratory. Hysteresis in the current-voltage characteristics was eliminated, and the PSCs were stable: >90% of the initial PCE remained after 1000 hours of light soaking.
Hydrolysates of lignocellulosic biomass, used as substrates for the sustainable production of fuels and chemicals, often contain high amounts of phenolic compounds that inhibit the production microorganisms. Quantification of these inhibitory compounds may help in understanding possible difficulties in bioprocessing and further the development of more efficient, robust and tolerant processes. A separation method based on capillary electrophoresis with UV detection was developed for the simultaneous quantification of 10 phenolic compounds that may have inhibitory properties. Intraday relative standard deviations were less than 0.7% for migration times and between 2.6% and 6.4% for peak areas; interday relative standard deviations were less than 3.0% for migration times and between 5.0% and 7.2% for peak areas. The method was applied to demonstrate that Saccharomyces cerevisiae was able to decrease the concentrations of vanillin, coniferyl aldehyde, syringaldehyde, acetoguaiacone and cinnamic acid during cultivation, whereas the concentrations of phenols increased.
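The figures of merit quoted above are relative standard deviations (RSD, the coefficient of variation expressed in percent). A minimal sketch with invented replicate migration times:

```python
import numpy as np

def rsd_percent(values):
    """Relative standard deviation (sample SD / mean) in percent."""
    values = np.asarray(values, float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical replicate migration times (min) for one analyte
times = [8.42, 8.44, 8.43, 8.45, 8.43]
print(round(rsd_percent(times), 2))  # ~0.14% in this invented example
```

An intraday RSD computed this way can then be compared with the <0.7% migration-time value reported above.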
A one-step automated synthesis of the dopamine transporter ligand [18F]FECNT from the chlorinated precursor
- Journal of labelled compounds & radiopharmaceuticals
The use of the [18F]-labelled nortropane derivative 2β-carbomethoxy-3β-(4-chlorophenyl)-8-(2-fluoroethyl)nortropane (FECNT) as a dopamine transporter ligand for PET imaging depends on an efficient radiosynthesis method. Herein, the automated synthesis of [18F]FECNT from its chlorinated precursor in the commercially available SynChrom [18F] R&D module has been developed. The synthesis unit was readily configured for the one-step synthesis from the corresponding chlorinated precursor. The radiolabeling process involved a classical [18F]fluoride nucleophilic substitution performed at 110 °C for 12 min, followed by HPLC and SPE purification. Crude [18F]FECNT was obtained with a radiolabeling yield of 59 ± 12% (n = 5). The average uncorrected amount of [18F]FECNT in the final formulated dose was 2.0 ± 0.5 GBq (32 ± 7% overall decay-corrected yield), with radiochemical purity over 99% and specific activity of 55 GBq/µmol. The total duration of the procedure was 80-90 min. This automated radiosynthesis of [18F]FECNT with high radiochemical purity may provide a simple and robust method of radiopharmaceutical preparation for routine clinical applications.
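The decay-corrected yield quoted above relates the activity measured at the end of synthesis back to the start of synthesis through the physical half-life of fluorine-18 (109.77 min). A minimal sketch with illustrative numbers (2.0 GBq measured after an assumed 85 min procedure, within the 80-90 min range reported above):

```python
import math

F18_HALF_LIFE_MIN = 109.77  # physical half-life of fluorine-18

def decay_correct(activity, elapsed_min):
    """Back-correct a measured activity to the start of synthesis."""
    return activity * 2 ** (elapsed_min / F18_HALF_LIFE_MIN)

start_equivalent = decay_correct(2.0, 85)
print(round(start_equivalent, 2))  # ~3.42 GBq referenced to start of synthesis
```

Dividing this start-referenced activity by the starting [18F]fluoride activity gives the overall decay-corrected yield; the numbers here are for illustration only.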
The choice of the primary endpoint in a randomized clinical trial is of paramount importance, and the combination of several endpoints may be a reasonable option. Gómez and Lagakos (2013) developed a method that quantifies how much more efficient it can be to use a composite instead of an individual relevant endpoint. From the frequencies of the component endpoints observed in the control group and the relative treatment effects on each individual endpoint, the asymptotic relative efficiency (ARE) can be computed. This paper presents the applicability of the ARE method as a practical and objective tool for evaluating which components, among the plausible ones, are more efficient in the construction of the primary endpoint. The method is illustrated with two real cardiovascular clinical trials and is extended to allow for different dependence structures between the times to the individual endpoints. The influence of this choice on the recommendation of whether or not to use the composite endpoint as the primary endpoint is studied. We conclude that the recommendation between the composite and the relevant endpoint depends only on the frequencies of the endpoints and the relative effects of the treatment.
Converting signals at low intensities between different electromagnetic modes is an asset for future information technologies. In general, slightly asymmetric optical nanoantennas enable the coupling between bright and dark modes that they sustain. However, the conversion efficiency might be very low. Here, we show that the additional incorporation of a quantum emitter allows us to tremendously enhance this efficiency. The enhanced local density of states cycles the quantum emitter between its upper and lower level at an extremely high rate, hence converting the energy very efficiently. The process is robust with respect to possible experimental tolerances, and adds a new ingredient to be exploited while studying and applying coupling phenomena in optical nanosystems.