Concept: Standard deviation
- Proceedings of the National Academy of Sciences of the United States of America
- Published almost 6 years ago
“Climate dice,” describing the chance of unusually warm or cool seasons, have become more and more “loaded” in the past 30 y, coincident with rapid global warming. The distribution of seasonal mean temperature anomalies has shifted toward higher temperatures, and the range of anomalies has increased. An important change is the emergence of a category of summertime extremely hot outliers, more than three standard deviations (3σ) warmer than the climatology of the 1951-1980 base period. This hot extreme, which covered much less than 1% of Earth’s surface during the base period, now typically covers about 10% of the land area. It follows that we can state, with a high degree of confidence, that extreme anomalies such as those in Texas and Oklahoma in 2011 and Moscow in 2010 were a consequence of global warming, because their likelihood in the absence of global warming was exceedingly small. We discuss practical implications of this substantial, growing climate change.
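The 3σ arithmetic above can be illustrated with a minimal sketch, assuming normal distributions; the shift and widening parameters below are chosen for illustration only, not fitted to the paper's data:

```python
# Illustrative sketch: how a shift and widening of the temperature-anomaly
# distribution inflates the area beyond the +3-sigma threshold of the
# 1951-1980 baseline (parameters are illustrative, not the paper's fit).
from math import erfc, sqrt

def tail_above(threshold, mean=0.0, sd=1.0):
    """P(X > threshold) for X ~ Normal(mean, sd)."""
    z = (threshold - mean) / sd
    return 0.5 * erfc(z / sqrt(2))

base = tail_above(3.0)                        # baseline climate: N(0, 1)
shifted = tail_above(3.0, mean=1.0, sd=1.5)   # warmer, more variable climate

print(f"baseline P(>3 sigma) = {base:.4%}")    # ~0.13% of area
print(f"shifted  P(>3 sigma) = {shifted:.2%}")  # ~9% of area
```

A modest shift of the mean combined with a wider spread is enough to move the +3σ tail from a fraction of a percent to around a tenth of the area, the same order as the observed change described above.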
BACKGROUND: Chronic childhood malnutrition remains common in India. As part of an initiative to improve maternal and child health in urban slums, we collected anthropometric data from a sample of children followed up from birth. We described the proportions of underweight, stunting, and wasting in young children, and examined their relationships with age. METHODS: We used two linked datasets: one based on institutional birth weight records for 17 318 infants, collected prospectively, and one based on follow-up of a subsample of 1941 children under five, collected in early 2010. RESULTS: Mean birth weight was 2736 g (SD 530 g), with a low birth weight (<2500 g) proportion of 22%. 21% of infants had low weight-for-age standard deviation (z) scores at birth (<-2 SD). At follow-up, 35% of young children had low weight for age, 17% low weight for height, and 47% low height for age. Downward change in weight for age was greater in children who had been born with higher z scores. DISCUSSION: Our data support the idea that much of the growth faltering was explained by faltering in height for age, rather than by wasting. Stunting appeared to be established early, and the subsequent decline in height for age was limited. Our findings suggest a focus on a younger age-group than the children over the age of three who are prioritized by existing support systems. FUNDING: The trial during which the birth weight data were collected was funded by the ICICI Foundation for Inclusive Growth (Centre for Child Health and Nutrition) and The Wellcome Trust (081052/Z/06/Z). Subsequent collection, analysis, and development of the manuscript were funded by a Wellcome Trust Strategic Award: Population Science of Maternal and Child Survival (085417ma/Z/08/Z). D Osrin is funded by The Wellcome Trust (091561/Z/10/Z).
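The weight-for-age z scores above follow the standard formula z = (value − reference median) / reference SD, with z < −2 classed as underweight; a minimal sketch, using placeholder reference values rather than the actual WHO growth tables:

```python
# Minimal sketch of weight-for-age z-scoring as described above; the
# reference median and SD below are placeholders, NOT the WHO tables.
def z_score(value, ref_median, ref_sd):
    """Standard deviation (z) score relative to a reference population."""
    return (value - ref_median) / ref_sd

def classify_underweight(weight_g, ref_median_g=3300.0, ref_sd_g=450.0):
    """Return True if the weight-for-age z score is below -2 SD."""
    return z_score(weight_g, ref_median_g, ref_sd_g) < -2.0

print(classify_underweight(2300.0))  # True: more than 2 SD below reference
print(classify_underweight(3100.0))  # False: within 2 SD of reference
```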
BACKGROUND: The symptom of tongue deviation is observed in a stroke or transient ischemic attack. Nevertheless, there is much room for the interpretation of the tongue deviation test. The crucial factor is the lack of an effective quantification method of tongue deviation. If we can quantify the features of the tongue deviation and scientifically verify the relationship between the deviation angle and a stroke, the information provided by the tongue will be helpful in recognizing a warning of a stroke. METHODS: In this study, a quantification method of the tongue deviation angle was proposed for the first time to characterize stroke patients. We captured the tongue images of stroke patients (15 males and 10 females, ranging between 55 and 82 years of age); transient ischemic attack (TIA) patients (16 males and 9 females, ranging between 53 and 79 years of age); and normal subjects (14 males and 11 females, ranging between 52 and 80 years of age) to analyze whether the method is effective. In addition, we used the receiver operating characteristic (ROC) curve for the sensitivity analysis, and determined the threshold value of the tongue deviation angle for the warning sign of a stroke. RESULTS: The means and standard deviations of the tongue deviation angles of the stroke, TIA, and normal groups were 6.9 ± 3.1, 4.9 ± 2.1, and 1.4 ± 0.8 degrees, respectively. Analyzed by the unpaired Student’s t-test, the p-value between the stroke group and the TIA group was 0.015 (>0.01), indicating no significant difference in the tongue deviation angle. The p-values between the stroke group and the normal group, as well as between the TIA group and the normal group, were both less than 0.01. These results show the significant differences in the tongue deviation angle between the patient groups (stroke and TIA patients) and the normal group.
These results also imply that the tongue deviation angle can effectively distinguish the patient group (stroke and TIA patients) from the normal group. With respect to the visual examination, 40% and 32% of stroke patients, 24% and 16% of TIA patients, and 4% and 0% of normal subjects were found to have tongue deviations when physicians “A” and “B” examined them. This variation shows the necessity of a quantification method in a clinical setting. In the ROC analysis, the area under the curve (AUC = 0.96) indicates good discrimination. A tongue deviation angle greater than the optimum threshold value (3.2°) predicts a risk of stroke. CONCLUSIONS: In summary, we developed an effective quantification method to characterize the tongue deviation angle, and we confirmed the feasibility of recognizing the tongue deviation angle as an early warning sign of an impending stroke.
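The ROC-based choice of an angle cutoff can be sketched with Youden's J statistic (sensitivity + specificity − 1); the angle values below are synthetic, not the study's measurements, and the study's exact threshold-selection rule is not stated in the abstract:

```python
# Hedged sketch: pick the deviation-angle cutoff that maximizes
# Youden's J = sensitivity + specificity - 1. Data are synthetic.
import numpy as np

def best_threshold(angles, labels):
    """Return (threshold, J) maximizing Youden's J; labels: 1 = patient."""
    angles = np.asarray(angles, dtype=float)
    labels = np.asarray(labels, dtype=int)
    best = (None, -1.0)
    for t in np.unique(angles):
        pred = angles >= t                     # predict "patient" above cutoff
        sens = np.mean(pred[labels == 1])      # true positive rate
        spec = np.mean(~pred[labels == 0])     # true negative rate
        j = sens + spec - 1.0
        if j > best[1]:
            best = (t, j)
    return best

patients = [4.5, 5.2, 6.9, 8.0, 3.8]  # synthetic stroke/TIA angles (degrees)
normals = [0.9, 1.4, 2.1, 1.0, 2.8]   # synthetic control angles (degrees)
t, j = best_threshold(patients + normals, [1] * 5 + [0] * 5)
print(t, j)
```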
BACKGROUND: To evaluate institutional nursing care performance in the context of national comparative statistics (benchmarks), approximately one in every three major healthcare institutions (over 1,800 hospitals) across the United States have joined the National Database for Nursing Quality Indicators® (NDNQI®). With over 18,000 hospital units contributing data for nearly 200 quantitative measures at present, reliable and efficient input data screening of all quantitative measures is critical to the integrity, validity, and on-time delivery of NDNQI reports. METHODS: With Monte Carlo simulation and quantitative NDNQI indicator examples, we compared two ad-hoc methods using robust scale estimators, the Inter-Quartile Range (IQR) and the Median Absolute Deviation from the median (MAD), to the classic, theoretically based Minimum Covariance Determinant (FAST-MCD) approach for initial univariate outlier detection. RESULTS: While the theoretically based FAST-MCD used in one dimension can be sensitive and is better suited for identifying groups of outliers because of its high breakdown point, the ad-hoc IQR and MAD approaches are fast, easy to implement, and can be more robust and efficient, depending on the distributional properties of the underlying measure of interest. CONCLUSION: Given the highly skewed distributions of most NDNQI indicators within a short data-screening window, the FAST-MCD approach, when used in a one-dimensional raw-data setting, can yield higher false-alarm rates for potential outliers than the IQR and MAD approaches with the same pre-set critical value, thus overburdening data quality control at both the data-entry and administrative ends in our setting.
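The two ad-hoc robust screens compared above can be sketched in a few lines; the cutoff multipliers used here are common conventions (1.5×IQR fences, 3×MAD), not necessarily NDNQI's exact settings:

```python
# Sketch of the two ad-hoc univariate outlier screens described above:
# flag values beyond the robust fences. Multipliers are common defaults.
import numpy as np

def iqr_outliers(x, k=1.5):
    """Tukey fences: flag values beyond k * IQR from the quartiles."""
    q1, q3 = np.percentile(x, [25, 75])
    iqr = q3 - q1
    return (x < q1 - k * iqr) | (x > q3 + k * iqr)

def mad_outliers(x, k=3.0):
    """Flag values more than k scaled-MAD units from the median."""
    med = np.median(x)
    mad = np.median(np.abs(x - med)) * 1.4826  # consistency factor (normal)
    return np.abs(x - med) > k * mad

x = np.array([2.1, 2.3, 2.2, 2.4, 2.2, 9.7])  # one gross data-entry error
print(iqr_outliers(x))  # flags only 9.7
print(mad_outliers(x))  # flags only 9.7
```

Because both the IQR and the MAD ignore extreme values when estimating scale, a single gross entry error does not inflate the fences, which is what makes these screens robust for skewed indicator data.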
OBJECTIVE: To clearly define the proportions of benign vs malignant histologic findings in resected renal masses through an in-depth review of contemporary published data to assist in preoperative risk assessment. MATERIALS AND METHODS: PubMed and select oncology congresses were searched for publications that identify the histologic classification of resected renal masses in a representative sample from the contemporary data: [search] incidence AND (renal cell carcinoma AND benign); incidence AND (renal tumor AND benign); percentage AND (renal cell carcinoma AND benign); limit 2003-2011. RESULTS: We identified 26 representative studies meeting the inclusion criteria and incorporating 27,272 patients. The frequency of benign tumors ranged from 7% to 33%, with most studies within a few percentage points of the mean (14.5% ± 5.2%, median 13.9%). Clear cell renal cell carcinoma occurred in 46% to 83% of patients, with a mean of 68.3% (median 61.3%, SD 11.9%). An inverse relationship between tumor size and benign pathologic features was identified in 14 of 19 (74%) studies that examined an association between tumor size and pathologic characteristics. A statistically significant correlation between clear cell renal cell carcinoma and tumor size was identified in 13 of 19 studies (68%). The accuracy of preoperative cross-sectional imaging was low in the 2 studies examining computed tomography (17%). CONCLUSION: Benign renal tumors represent ∼15% of surgically resected renal masses and are more prevalent among small clinical T1a lesions. Noninvasive preoperative differentiation between more and less aggressive renal masses would be an important clinical advance that could allow clinicians greater diagnostic confidence and guide patient management through improved risk stratification.
AIMS: We sought to describe the management of patients with atrial fibrillation (AF) in Europe after the release of the 2010 AF Guidelines of the European Society of Cardiology. METHODS AND RESULTS: The PREFER in AF registry enrolled consecutive patients with AF from January 2012 to January 2013 in 461 centres in seven European countries. Seven thousand two hundred and forty-three evaluable patients were enrolled, aged 71.5 ± 11 years, 60.1% male, with a CHA2DS2-VASc score of 3.4 ± 1.8 (mean ± standard deviation). Thirty per cent of patients had paroxysmal, 24.0% had persistent, 7.2% had long-standing persistent, and 38.8% had permanent AF. Oral anticoagulation was used in the majority of patients: 4799 patients (66.3%) received a vitamin K antagonist (VKA) as monotherapy, 720 patients (9.9%) a combination of VKA and antiplatelet agents, and 442 patients (6.1%) a new oral anticoagulant (NOAC). Antiplatelet agents alone were given to 808 patients (11.2%), and no antithrombotic therapy to 474 patients (6.5%). Of 7034 evaluable patients, 5530 (78.6%) were adequately rate controlled (mean heart rate 60-100 bpm). Half of the patients (50.7%) received rhythm control therapy by electrical cardioversion (18.1%), pharmacological cardioversion (19.5%), antiarrhythmic drugs (amiodarone 24.1%, flecainide or propafenone 13.5%, sotalol 5.5%, dronedarone 4.0%), or catheter ablation (5.0%). CONCLUSION: The management of AF patients in 2012 has adapted to recent evidence and guideline recommendations. Oral anticoagulant therapy with a VKA (the majority) or a NOAC is given to over 80% of eligible patients, including those at risk for bleeding. Rate is often adequately controlled, and rhythm control therapy is widely used.
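For reference, the CHA2DS2-VASc stroke-risk score reported above is a sum of standard clinical components; a sketch (the function and argument names are our own):

```python
# Sketch of the CHA2DS2-VASc stroke-risk score (standard components;
# function and parameter names here are illustrative, not from a library).
def cha2ds2_vasc(chf, hypertension, age, diabetes, stroke_tia,
                 vascular_disease, female):
    score = 0
    score += 1 if chf else 0               # Congestive heart failure
    score += 1 if hypertension else 0      # Hypertension
    score += 2 if age >= 75 else (1 if age >= 65 else 0)  # Age bands
    score += 1 if diabetes else 0          # Diabetes mellitus
    score += 2 if stroke_tia else 0        # Prior stroke/TIA/thromboembolism
    score += 1 if vascular_disease else 0  # Vascular disease
    score += 1 if female else 0            # Sex category (female)
    return score

# Example: a 72-year-old woman with hypertension and diabetes scores 4.
print(cha2ds2_vasc(False, True, 72, True, False, False, True))  # 4
```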
Wireless sensor networks (WSNs) have become increasingly diversified and are today able to support high-data-rate applications, such as multimedia. In this case, per-packet channel handshaking/switching may induce additional overheads, such as energy consumption, delays and, therefore, data loss. One solution is to perform stream-based channel allocation, where channel handshaking is performed once before transmitting the whole data stream. Stream-based channel allocation is more critical in multichannel WSNs, where channels of different quality/stability are available and high performance requires sensor nodes to switch to the best among the available channels. In this work, we focus on devising mechanisms that perform channel quality/stability estimation in order to improve the accommodation of stream-based communication in multichannel wireless sensor networks. For channel quality assessment, we have formulated a composite metric, which we call channel rank measurement (CRM), that can demarcate channels into good, intermediate and bad quality on the basis of the standard deviation of the received signal strength indicator (RSSI) and the average of the link quality indicator (LQI) of the received packets. CRM is then used to generate a data set for training a supervised machine learning-based algorithm (which we call the Normal Equation based Channel quality prediction (NEC) algorithm) so that it can perform instantaneous channel rank estimation of any channel. Subsequently, two robust extensions of the NEC algorithm are proposed (the Normal Equation based Weighted Moving Average Channel quality prediction (NEWMAC) algorithm and the Normal Equation based Aggregate Maturity Criteria with Beta Tracking based Channel weight prediction (NEAMCBTC) algorithm), which perform channel quality estimation on the basis of both current and past values of channel rank estimation.
Finally, simulations were carried out in MATLAB, and the results show that the extended version of the NEAMCBTC algorithm (Ext-NEAMCBTC) outperforms the compared techniques in terms of channel quality and stability assessment. It also minimizes channel-switching overheads (in terms of switching delays and energy consumption) for accommodating stream-based communication in multichannel WSNs.
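A CRM-style ranking as described above rewards a high mean LQI and penalizes RSSI variability; the weighting, normalization, and class cutoffs below are illustrative assumptions, not the paper's exact formula:

```python
# Illustrative CRM-style channel ranking: good channels combine a stable
# RSSI (low standard deviation) with a high average LQI. The weighting
# and thresholds are assumptions for the sketch, not the paper's values.
import numpy as np

def channel_rank(rssi_samples, lqi_samples, w=0.5):
    rssi_sd = np.std(rssi_samples)   # stability: lower is better
    lqi_mean = np.mean(lqi_samples)  # quality: higher is better
    # Normalize LQI to [0, 1] (IEEE 802.15.4 LQI is commonly 0-255)
    # and penalize RSSI variability (scaled by an assumed 10 dB span).
    return w * (lqi_mean / 255.0) - (1 - w) * (rssi_sd / 10.0)

def classify(score, good=0.35, bad=0.15):
    """Demarcate a channel into good / intermediate / bad quality."""
    return "good" if score >= good else ("bad" if score < bad else "intermediate")

stable = channel_rank([-60, -61, -60, -59], [220, 230, 225, 228])
noisy = channel_rank([-60, -75, -52, -80], [120, 90, 150, 60])
print(classify(stable), classify(noisy))  # good bad
```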
Tornadoes cause loss of life and damage to property each year in the United States and around the world. The largest impacts come from ‘outbreaks’ consisting of multiple tornadoes closely spaced in time. Here we find an upward trend in the annual mean number of tornadoes per US tornado outbreak for the period 1954-2014. Moreover, the variance of this quantity is increasing more than four times as fast as the mean. The mean and variance of the number of tornadoes per outbreak vary according to Taylor’s power law of fluctuation scaling (TL), with parameters that are consistent with multiplicative growth. Tornado-related atmospheric proxies show similar power-law scaling and multiplicative growth. Path-length-integrated tornado outbreak intensity also follows TL, but with parameters consistent with sampling variability. The observed TL power-law scaling of outbreak severity means that extreme outbreaks are more frequent than would be expected if mean and variance were independent or linearly related.
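Taylor's power law, as used above, states that variance scales as a power of the mean, variance ≈ a·mean^b, so the exponent b is the slope of a log-log regression; a minimal sketch with synthetic data (b = 2 corresponds to the multiplicative-growth regime the abstract describes):

```python
# Sketch of estimating Taylor's power-law (TL) exponent: since
# variance = a * mean^b implies log(var) = log(a) + b * log(mean),
# b is the slope of the log-log regression. Data below are synthetic.
import numpy as np

def taylor_exponent(means, variances):
    logm, logv = np.log(means), np.log(variances)
    slope, intercept = np.polyfit(logm, logv, 1)
    return slope

means = np.array([2.0, 4.0, 8.0, 16.0])
variances = 0.5 * means ** 2  # exact TL with a = 0.5, b = 2 (synthetic)
print(taylor_exponent(means, variances))  # recovers b = 2
```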
Transendocardial stem cell injection in patients with ischemic cardiomyopathy (ICM) improves left ventricular function and structure but has ill-defined effects on ventricular arrhythmias. We hypothesized that mesenchymal stem cell (MSC) implantation is not proarrhythmic. Post hoc analyses were performed on ambulatory ECGs collected from the POSEIDON and TAC-HFT trials. Eighty-eight subjects (mean age 61 ± 10 years) with ICM (mean EF 32.2% ± 9.8%) received treatment with MSCs (n = 48), placebo (n = 21), or bone marrow mononuclear cells (BMCs) (n = 19). Heart rate variability (HRV) and ventricular ectopy (VE) were evaluated over 12 months. VE did not change in any group following MSC implantation. However, in patients with ≥ 1 VE run (defined as ≥ 3 consecutive premature ventricular complexes in 24 hours) at baseline, there was a decrease in VE runs at 12 months in the MSC group (p = .01), but not in the placebo group (p = .07; intergroup comparison: p = .18). In a subset of the MSC group, the HRV measure standard deviation of normal intervals was 75 ± 30 msec at baseline and increased to 87 ± 32 msec (p = .02) at 12 months, and the root mean square of intervals between successive complexes was 36 ± 30 msec at baseline and increased to 58.2 ± 50 msec (p = .01) at 12 months. In patients receiving MSCs, there was no evidence of ventricular proarrhythmia, manifested as sustained or nonsustained ventricular ectopy or worsened HRV. Signals of improvement in ventricular arrhythmias and HRV in the MSC group suggest a need for further studies of the antiarrhythmic potential of MSCs. © Stem Cells Translational Medicine 2017.
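The two HRV measures reported above, the standard deviation of normal-to-normal intervals (SDNN) and the root mean square of successive differences (RMSSD), can be computed directly from a series of RR intervals; a minimal sketch with example values:

```python
# Sketch of the two HRV measures from the abstract: SDNN is the standard
# deviation of normal-to-normal RR intervals; RMSSD is the root mean
# square of successive RR-interval differences. Example data only.
import numpy as np

def sdnn(rr_ms):
    """Standard deviation of RR intervals (sample SD), in ms."""
    return float(np.std(rr_ms, ddof=1))

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences, in ms."""
    diffs = np.diff(rr_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

rr = np.array([810.0, 790.0, 845.0, 800.0, 830.0])  # example RR intervals (ms)
print(f"SDNN  = {sdnn(rr):.1f} ms")
print(f"RMSSD = {rmssd(rr):.1f} ms")
```

SDNN reflects overall variability over the recording, while RMSSD is dominated by beat-to-beat (parasympathetically mediated) changes, which is why both were reported separately above.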
Purpose: To determine if patient survival and mechanisms of right ventricular failure in pulmonary hypertension could be predicted by using supervised machine learning of three-dimensional patterns of systolic cardiac motion. Materials and Methods: The study was approved by a research ethics committee, and participants gave written informed consent. Two hundred fifty-six patients (143 women; mean age ± standard deviation, 63 years ± 17) with newly diagnosed pulmonary hypertension underwent cardiac magnetic resonance (MR) imaging, right-sided heart catheterization, and 6-minute walk testing with a median follow-up of 4.0 years. Semiautomated segmentation of short-axis cine images was used to create a three-dimensional model of right ventricular motion. Supervised principal components analysis was used to identify patterns of systolic motion that were most strongly predictive of survival. Survival prediction was assessed by using difference in median survival time and area under the curve with time-dependent receiver operating characteristic analysis for 1-year survival. Results: At the end of follow-up, 36% of patients (93 of 256) died, and one underwent lung transplantation. Poor outcome was predicted by a loss of effective contraction in the septum and free wall, coupled with reduced basal longitudinal motion. When added to conventional imaging and hemodynamic, functional, and clinical markers, three-dimensional cardiac motion improved survival prediction (area under the receiver operating characteristic curve, 0.73 vs 0.60, respectively; P < .001) and provided greater differentiation according to difference in median survival time between high- and low-risk groups (13.8 vs 10.7 years, respectively; P < .001). Conclusion: A machine-learning survival model that uses three-dimensional cardiac motion predicts outcome independent of conventional risk factors in patients with newly diagnosed pulmonary hypertension. Online supplemental material is available for this article.