SciCombinator

Discover the latest and most talked-about scientific content & concepts.

Concept: Robust statistics

170

BACKGROUND: To evaluate institutional nursing care performance against national comparative statistics (benchmarks), approximately one in every three major healthcare institutions (over 1,800 hospitals) across the United States has joined the National Database of Nursing Quality Indicators® (NDNQI®). With over 18,000 hospital units contributing data for nearly 200 quantitative measures at present, reliable and efficient input data screening of all quantitative measures is critical to the integrity, validity, and on-time delivery of NDNQI reports. METHODS: Using Monte Carlo simulation and quantitative NDNQI indicator examples, we compared two ad hoc methods based on robust scale estimators, the interquartile range (IQR) and the median absolute deviation from the median (MAD), with the classic, theoretically based fast minimum covariance determinant (FAST-MCD) approach for initial univariate outlier detection. RESULTS: While the theoretically based FAST-MCD, used in one dimension, can be sensitive and is better suited to identifying groups of outliers because of its high breakdown point, the ad hoc IQR and MAD approaches are fast, easy to implement, and can be more robust and efficient, depending on the distributional properties of the underlying measure of interest. CONCLUSION: Given the highly skewed distributions of most NDNQI indicators and a short data-screening window, the FAST-MCD approach, when applied to one-dimensional raw data, can overestimate the false-alarm rate for potential outliers relative to the IQR and MAD at the same preset critical value, thus overburdening data quality control at both the data-entry and administrative ends in our setting.
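
As a rough illustration of the two ad-hoc screens compared above, the sketch below flags univariate outliers with Tukey's IQR fences and with a MAD-based robust z-score. The cutoffs (1.5×IQR, robust z > 3.5) are conventional defaults, not the NDNQI preset critical values, and the skewed test data are simulated, not NDNQI indicators.

```python
import numpy as np

def iqr_outliers(x, k=1.5):
    """Flag points outside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

def mad_outliers(x, cutoff=3.5):
    """Flag points whose robust z-score |x - median| / (1.4826 * MAD) exceeds cutoff.
    The 1.4826 factor makes MAD consistent with the standard deviation under normality."""
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    robust_z = np.abs(x - med) / (1.4826 * mad)
    return robust_z > cutoff

rng = np.random.default_rng(0)
# Skewed data (as with most NDNQI indicators) plus two gross entry errors.
x = np.concatenate([rng.lognormal(0.0, 0.5, 500), [25.0, 40.0]])
print("IQR flags:", iqr_outliers(x).sum(), "| MAD flags:", mad_outliers(x).sum())
```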

Concepts: Median, Dimension, Absolute deviation, Normal distribution, Standard deviation, Robust statistics, Outlier, Median absolute deviation

168

Background: Formulation and evaluation of public health policy commonly employ science-based mathematical models. For instance, the epidemiological dynamics of TB are dominated, in general, by flow between actively and latently infected populations; modelling is therefore central to planning public health interventions. However, models are highly uncertain because they are based on observations that are geographically and temporally distinct from the population to which they are applied. Aims: We aim to demonstrate the advantages of info-gap theory, a non-probabilistic approach to severe uncertainty for cases in which worst cases cannot be reliably identified and probability distributions are unreliable or unavailable. Info-gap is applied here to mathematical modelling of epidemics and to the analysis of public health decision-making. Methods: Applying info-gap robustness analysis to tuberculosis/HIV (TB/HIV) epidemics, we illustrate the critical role of incorporating uncertainty when formulating recommendations for interventions. Robustness is assessed as the magnitude of uncertainty that can be tolerated by a given intervention. We illustrate the methodology by exploring interventions that alter the rates of diagnosis, cure, relapse, and HIV infection. Results: We demonstrate several policy implications. Equivalences among alternative rates of diagnosis and relapse are identified. The impact of initial TB and HIV prevalence on the robustness to uncertainty is quantified. In some configurations, increased aggressiveness of intervention improves the predicted outcome but also reduces the robustness to uncertainty. Similarly, predicted outcomes may be better at larger target times, but may also be more vulnerable to model error. Conclusions: The info-gap framework is useful for managing model uncertainty and is attractive when uncertainties in model parameters are extreme. When a public health model underlies guidelines, info-gap decision theory provides valuable insight into the confidence of achieving agreed-upon goals.
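
To make the robustness notion concrete, here is a minimal info-gap sketch under stated assumptions: a toy one-parameter surrogate model (not the paper's TB/HIV system) in which predicted prevalence grows with an uncertain relapse rate, and a fractional-error uncertainty set around the nominal rate. The robustness of a performance goal is the largest uncertainty horizon whose worst case still meets that goal.

```python
import numpy as np

def predicted_prevalence(relapse, diagnosis=0.8, cure=0.7):
    """Toy surrogate for a model's predicted TB prevalence as a function of
    the relapse rate. NOT the paper's TB/HIV model; it only stands in for
    any model output that is monotone in the uncertain parameter."""
    return relapse / (relapse + diagnosis * cure)

def info_gap_robustness(r_nominal, goal, h_max=2.0, steps=2000):
    """Info-gap robustness: the largest horizon h such that the worst case
    over the set {r : |r - r_nominal| <= h * r_nominal} still satisfies
    the requirement (predicted prevalence <= goal)."""
    h_hat = 0.0
    for h in np.linspace(0.0, h_max, steps):
        worst = predicted_prevalence(r_nominal * (1.0 + h))  # prevalence grows with relapse
        if worst <= goal:
            h_hat = h
        else:
            break
    return h_hat

# A more demanding goal buys less robustness to model uncertainty.
for goal in (0.30, 0.25, 0.20):
    print(goal, round(info_gap_robustness(0.1, goal), 3))
```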

Concepts: Public health, Epidemiology, Physics, Decision theory, Uncertainty, Robust statistics, Minimax, Info-gap decision theory

28

Quantifying diversity is of central importance for the study of structure, function and evolution of microbial communities. The estimation of microbial diversity has received renewed attention with the advent of large-scale metagenomic studies. Here, we consider what the diversity observed in a sample tells us about the diversity of the community being sampled. First, we argue that one cannot reliably estimate the absolute and relative number of microbial species present in a community without making unsupported assumptions about species abundance distributions. The reason is that sample data contain no information about the number of rare species in the tail of the species abundance distribution. We illustrate the difficulty of comparing species richness estimates by applying Chao’s estimator of species richness to a set of in silico communities: they are ranked incorrectly in the presence of large numbers of rare species. Next, we extend our analysis to a general family of diversity metrics (‘Hill diversities’) and construct lower and upper estimates of the diversity values consistent with the sample data. The theory generalizes Chao’s estimator, which we retrieve as the lower estimate of species richness. We show that Shannon and Simpson diversity can be robustly estimated for the in silico communities. We analyze nine metagenomic data sets from a wide range of environments and show that our findings are relevant for empirically sampled communities. Hence, we recommend the use of Shannon and Simpson diversity rather than species richness in efforts to quantify and compare microbial diversity. The ISME Journal advance online publication, 14 February 2013; doi:10.1038/ismej.2013.10.
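
A minimal sketch of the quantities discussed above, assuming plug-in estimates from raw sample counts: Chao's (bias-corrected) lower-bound richness estimator, and Hill diversities of order q, which recover species richness (q = 0), exponential Shannon diversity (q = 1) and inverse Simpson diversity (q = 2).

```python
import numpy as np

def hill_diversity(counts, q):
    """Hill number of order q from sample counts (plug-in estimate).
    q=0: species richness; q=1: exp(Shannon); q=2: inverse Simpson."""
    p = np.asarray(counts, float)
    p = p[p > 0] / p.sum()
    if q == 1:  # limit case of the general formula
        return np.exp(-np.sum(p * np.log(p)))
    return np.sum(p ** q) ** (1.0 / (1.0 - q))

def chao1(counts):
    """Chao's lower-bound richness estimator, driven entirely by the
    numbers of singletons (f1) and doubletons (f2); bias-corrected form."""
    counts = np.asarray(counts)
    s_obs = (counts > 0).sum()
    f1 = (counts == 1).sum()
    f2 = (counts == 2).sum()
    return s_obs + f1 * (f1 - 1) / (2.0 * (f2 + 1))

counts = [120, 80, 40, 10, 5, 2, 1, 1, 1]  # hypothetical sample abundances
print(chao1(counts), hill_diversity(counts, 1), hill_diversity(counts, 2))
```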

Concepts: Statistics, Mathematics, Estimation theory, Estimator, Approximation, Estimation, Microorganism, Robust statistics

28

Recently, there has been significant interest in robust fractal image coding for robustness against outliers. However, the known robust fractal coding methods (HFIC, LAD-FIC, etc.) are not optimal: besides their high computational cost, they use the corrupted domain block as the independent variable in the robust regression model, which may adversely affect the robust estimation of the fractal parameters (depending on the noise level). This paper presents a Huber fitting plane-based fractal image coding (HFPFIC) method. The method builds Huber fitting planes (HFPs) for the domain and range blocks, respectively, ensuring that the robust model uses an uncorrupted independent variable. On this basis, a new matching error function is introduced to robustly evaluate the best scaling factor. Meanwhile, a decomposition criterion based on the median absolute deviation (MAD) about the median is proposed to achieve fast adaptive quadtree partitioning of images corrupted by salt-and-pepper noise. To reduce computational cost, a no-search method is applied to speed up the encoding process. Experimental results show that the proposed HFPFIC yields superior performance over conventional robust fractal image coding methods in both encoding speed and the quality of the restored image. Furthermore, the no-search method significantly reduces encoding time, to less than 2.0 s for the HFPFIC, with acceptable degradation of image quality. In addition, we show that, combined with the MAD decomposition scheme, the HFP technique used as a robust method can further reduce encoding time while maintaining image quality.
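
The MAD-based decomposition criterion lends itself to a short sketch: split a block while the MAD of its pixels about their median exceeds a threshold, otherwise keep it as a leaf. Unlike a variance criterion, the MAD largely ignores salt-and-pepper outliers. The threshold and minimum block size below are hypothetical choices for illustration; the paper's exact criterion and parameters are not reproduced here.

```python
import numpy as np

def mad(block):
    """Median absolute deviation of a pixel block about its median."""
    m = np.median(block)
    return np.median(np.abs(block - m))

def quadtree(block, y, x, threshold, min_size, leaves):
    """Recursively quarter a square block while its MAD exceeds the
    threshold; record (y, x, height, width) for each leaf block."""
    h, w = block.shape
    if mad(block) <= threshold or h <= min_size:
        leaves.append((y, x, h, w))
        return
    h2, w2 = h // 2, w // 2
    for dy, dx in [(0, 0), (0, w2), (h2, 0), (h2, w2)]:
        quadtree(block[dy:dy + h2, dx:dx + w2], y + dy, x + dx,
                 threshold, min_size, leaves)

rng = np.random.default_rng(1)
img = rng.integers(0, 256, (64, 64)).astype(float)  # hypothetical test image
leaves = []
quadtree(img, 0, 0, threshold=20.0, min_size=4, leaves=leaves)
print(len(leaves), "leaf blocks")
```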

Concepts: Regression analysis, Function, Median, Absolute deviation, Standard deviation, Errors and residuals in statistics, Robust statistics, Median absolute deviation

27

The recent stunning rise in the power conversion efficiency (PCE) of perovskite solar cells (PSCs) has triggered intense research worldwide. However, high PCE values have often been reached with poor stability and at an illuminated area of typically less than 0.1 cm². We used heavily doped inorganic charge extraction layers in planar PSCs to achieve very rapid carrier extraction, even with layers only 10-20 nm thick, avoiding pinholes and eliminating local structural defects over large areas. This robust inorganic nature allowed the fabrication of PSCs with an aperture area >1 cm² showing a PCE >15%, certified by an accredited photovoltaic calibration laboratory. Hysteresis in the current-voltage characteristics was eliminated, and the PSCs were stable: >90% of the initial PCE remained after 1,000 hours of light soaking.

Concepts: Solar cell, Photovoltaics, Energy conversion, Energy conversion efficiency, Photovoltaic module, Robust statistics, P-n junction, Photovoltaic array

27

Hydrolysates of lignocellulosic biomass, used as substrates for the sustainable production of fuels and chemicals, often contain high amounts of phenolic compounds that inhibit the production organisms. Quantification of these inhibitory compounds may help in understanding difficulties in bioprocessing and in developing more efficient, robust and tolerant processes. A separation method based on capillary electrophoresis with UV detection was developed for the simultaneous quantification of 10 phenolic compounds that may act as inhibitors. Intraday relative standard deviations were less than 0.7% for migration times and between 2.6% and 6.4% for peak areas. Interday relative standard deviations were less than 3.0% for migration times and between 5.0% and 7.2% for peak areas. The method was applied to demonstrate that Saccharomyces cerevisiae was able to decrease the concentrations of vanillin, coniferyl aldehyde, syringaldehyde, acetoguaiacone and cinnamic acid during cultivation, whereas the concentrations of phenols increased.
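
The repeatability figures quoted above are relative standard deviations; the one-line computation, shown here with hypothetical replicate migration times, is:

```python
import numpy as np

def rsd_percent(values):
    """Relative standard deviation (%) = 100 * sample std / mean, the
    repeatability figure quoted for migration times and peak areas."""
    v = np.asarray(values, float)
    return 100.0 * v.std(ddof=1) / v.mean()

migration_times = [4.02, 4.04, 4.03, 4.05, 4.03]  # hypothetical replicates, minutes
print(f"RSD = {rsd_percent(migration_times):.2f}%")
```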

Concepts: Gel electrophoresis, Saccharomyces cerevisiae, Saccharomyces pastorianus, Saccharomyces, Phenols, Vanillin, Phenolic compounds in wine, Robust statistics

25

The use of the [18F]-labelled nortropane derivative 2β-carbomethoxy-3β-(4-chlorophenyl)-8-(2-fluoroethyl)nortropane (FECNT) as a dopamine transporter ligand for PET imaging depends on an efficient radiosynthesis method. Herein, the automated synthesis of [18F]FECNT from its chlorinated precursor in the commercially available SynChrom [18F] R&D module has been developed. The synthesis unit was readily configured for the one-step synthesis from the corresponding chlorinated precursor. The radiolabeling process involved classical [18F]fluoride nucleophilic substitution performed at 110 °C for 12 min, followed by HPLC and SPE purification. Crude [18F]FECNT was obtained with a radiolabeling yield of 59 ± 12% (n = 5). The average uncorrected amount of [18F]FECNT in the final formulated dose was 2.0 ± 0.5 GBq (32 ± 7% overall decay-corrected yield), with radiochemical purity over 99% and a specific activity of 55 GBq/µmol. The total duration of the procedure was 80-90 min. This automated radiosynthesis of [18F]FECNT with high radiochemical purity may provide a simple and robust radiopharmaceutical preparation method for routine clinical applications.
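
The relation between uncorrected and decay-corrected yields is plain half-life arithmetic (the physical half-life of 18F is about 109.8 min). In the sketch below, the 19% uncorrected yield is back-calculated for illustration only; the abstract reports the decay-corrected figure directly.

```python
import math

F18_HALF_LIFE_MIN = 109.77  # physical half-life of fluorine-18, minutes

def decay_corrected_yield(uncorrected_fraction, synthesis_minutes):
    """Correct a radiochemical yield back to the start of synthesis by
    dividing by the fraction of 18F remaining after the elapsed time."""
    remaining = 0.5 ** (synthesis_minutes / F18_HALF_LIFE_MIN)
    return uncorrected_fraction / remaining

# Hypothetical numbers: an 85-min synthesis (mid-range of the reported
# 80-90 min) with a 19% uncorrected yield corresponds to roughly the
# 32% decay-corrected figure quoted above.
print(f"{decay_corrected_yield(0.19, 85):.2f}")
```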

Concepts: Nucleophilic substitution, Substitution reaction, Positron emission tomography, The Final, Nucleophile, Dopamine, Yield, Robust statistics

25

The choice of the primary endpoint in a randomized clinical trial is of paramount importance, and combining several endpoints may be a reasonable option. Gómez and Lagakos (2013) developed a method that quantifies how much more efficient it can be to use a composite endpoint instead of an individual relevant endpoint. From the frequencies with which the component endpoints are observed in the control group and the relative treatment effects on each individual endpoint, the Asymptotic Relative Efficiency (ARE) can be computed. This paper presents the ARE method as a practical and objective tool for evaluating which components, among the plausible ones, are more efficient in the construction of the primary endpoint. The method is illustrated with two real cardiovascular clinical trials and is extended to allow for different dependence structures between the times to the individual endpoints. The influence of this choice on the recommendation of whether or not to use the composite endpoint as the primary endpoint is studied. We conclude that the recommendation between the composite and the relevant endpoint depends only on the frequencies of the endpoints and the relative effects of the treatment.
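
A simplified sketch of why composite endpoints trade frequency against effect size, assuming independent exponential component times (a far narrower setting than the ARE method, which also handles dependence structures and censoring): the composite hazard ratio is a hazard-weighted average of the component hazard ratios, so a frequent component with a weak effect dilutes a rare component with a strong one.

```python
import math

def composite_summary(p1, hr1, p2, hr2, tau=1.0):
    """Under independent exponential components, the composite (time to
    first event) is exponential with the summed hazard, so its hazard
    ratio is a hazard-weighted average of the component hazard ratios."""
    lam1 = -math.log(1 - p1) / tau  # control-group hazards from event frequencies
    lam2 = -math.log(1 - p2) / tau
    hr_comp = (lam1 * hr1 + lam2 * hr2) / (lam1 + lam2)
    p_comp = 1 - math.exp(-(lam1 + lam2) * tau)  # control-group composite frequency
    return p_comp, hr_comp

# Rare relevant endpoint with a strong effect plus a frequent additional
# endpoint with a weak effect: the composite is more frequent (~33%) but
# its treatment effect is diluted (HR ~0.91 vs. 0.60).
print(composite_summary(p1=0.05, hr1=0.60, p2=0.30, hr2=0.95))
```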

Concepts: Epidemiology, Clinical trial, Estimation theory, Estimator, ClinicalTrials.gov, Maximum likelihood, Pharmaceutical industry, Robust statistics

24

Converting signals between different electromagnetic modes at low intensities is an asset for future information technologies. In general, slightly asymmetric optical nanoantennas enable coupling between the bright and dark modes that they sustain; however, the conversion efficiency can be very low. Here, we show that the additional incorporation of a quantum emitter enhances this efficiency tremendously. The enhanced local density of states cycles the quantum emitter between its upper and lower levels at an extremely high rate, hence converting the energy very efficiently. The process is robust with respect to realistic experimental tolerances and adds a new ingredient to be exploited in studying and applying coupling phenomena in optical nanosystems.

Concepts: Photon, Energy, Fundamental physics concepts, Physics, Light, Electromagnetic radiation, Frequency, Robust statistics

17

Loneliness and social isolation are major problems for older adults. Interventions and activities aimed at reducing social isolation and loneliness are widely advocated as a solution to this growing problem. The aim of this study was to conduct an integrative review to identify the range and scope of interventions that target social isolation and loneliness among older people, to gain insight into why interventions are successful, and to determine the effectiveness of those interventions. Six electronic databases were searched from 2003 until January 2016 for literature on interventions with a primary or secondary outcome of reducing or preventing social isolation and/or loneliness among older people. Data evaluation followed Evidence for Policy and Practice Information and Co-ordinating Centre guidelines, and data analysis used a descriptive thematic method for synthesising data. The review identified 38 studies. A range of interventions was described, relying on differing mechanisms for reducing social isolation and loneliness. The majority of interventions reported some success in reducing social isolation and loneliness, but the quality of evidence was generally weak. Factors associated with the most effective interventions included adaptability, a community development approach, and productive engagement. A wide range of interventions has been developed to tackle social isolation and loneliness among older people. However, the quality of the evidence base is weak, and further research is required to provide more robust data on the effectiveness of interventions. Furthermore, there is an urgent need to develop further the theoretical understanding of how successful interventions mediate social isolation and loneliness.

Concepts: Middle age, Sociology, Data, Old age, Robust statistics, Conducting, 2016