Concept: Cumulative distribution function
Recent progress in Affective Computing (AC) has enabled the integration of physiological cues and spontaneous expressions to reveal a subject's emotional state. Owing to the lack of an effective technique for evaluating multimodal correlations, experience and intuition play a central role in current AC studies when fusing affective cues or modalities, sometimes with unexpected outcomes. This study demonstrates a dynamic correlation between two such affective cues, physiological changes and spontaneous expressions, obtained through a combination of stereo-vision-based tracking and imaging photoplethysmography (iPPG) in a designed protocol involving 20 healthy subjects. The two cues were sampled into a Statistical Association Space (SAS) to evaluate their dynamic correlation. The probability densities in the SAS are found to increase as the peaks of the two cues are approached. Moreover, the complex shape of the high-probability-density region in the SAS suggests a nonlinear correlation between the two cues. Finally, the cumulative distribution on the zero time-difference surface is found to be small (<0.047), demonstrating a lack of simultaneity. These results show that the two cues have a close interrelation that is both asynchronous and nonlinear, in which a peak in one cue heralds a peak in the other.
When lightning strikes soil, it may generate a cylindrical tube of glass known as a fulgurite. The morphology of a fulgurite is ultimately a consequence of the energy of the lightning strike that formed it, and hence fulgurites may be useful in elucidating the energy-frequency distribution of cloud-to-ground lightning. Fulgurites from sand mines in Polk County, Florida, USA were collected and analyzed to determine their morphologic properties. Here we show that the energy per unit length of lightning strikes within quartz sand has a geometric mean of ~1.0 MJ/m, and that the distribution of energy per length versus frequency is lognormal. Energy per length is determined from fulgurites as a function of diameter, and frequency is determined both by cumulative number and by cumulative length. This distribution parallels those determined for a number of lightning parameters measured in actual atmospheric discharge events, such as charge transferred, voltage, and action integral. This methodology suggests a potentially useful pathway for elucidating the energy and damage potential of lightning strikes.
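The lognormal energy-frequency relationship reported above can be sketched numerically. The sample below is synthetic: the geometric mean of ~1.0 MJ/m follows the abstract, while the spread (`sigma`) is an assumed value for illustration, not taken from the fulgurite data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical energy-per-length samples (MJ/m) drawn from a lognormal
# distribution whose geometric mean is ~1.0 MJ/m, as reported for the
# Florida fulgurites; sigma is an assumed spread, not a measured value.
energies = rng.lognormal(mean=np.log(1.0), sigma=1.2, size=10_000)

# For a lognormal sample, the geometric mean is exp(mean of log-values).
geo_mean = np.exp(np.log(energies).mean())
print(round(geo_mean, 2))  # close to 1.0
```

The same log-transform is how a lognormal hypothesis is usually checked in practice: if the logs of the measurements look Gaussian, the measurements themselves are consistent with a lognormal distribution.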
We used an environmental justice screening tool (CalEnviroScreen 1.1) to compare the distribution of environmental hazards and vulnerable populations across California communities.
We explore a spatially implicit patch-occupancy model of a population on a landscape with continuous-valued heterogeneous habitat quality, primarily considering the case where the habitat quality of a site affects the mortality rate but not the fecundity of individuals at that site. Two analytical approaches to the model are constructed, by summing over the sites in the landscape and by integrating over the range of habitat quality. We obtain results relating the equilibrium population density and all moments of the probability distribution of the habitat quality of occupied sites, and relating the probability distributions of total habitat quality and occupied habitat quality. Special cases are considered for landscapes where habitat quality has either a uniform or a linear probability density function. For these cases, we demonstrate habitat association, where the quality of occupied sites is higher than the overall mean quality of all sites; the discrepancy between the two is reduced at larger population densities. The variance of the quality of occupied sites may be greater or less than the overall variance of habitat quality, depending on the distribution of habitat quality across the landscape. Increasing the variance of habitat quality is also shown to increase the ability of a population to persist on a landscape.
Decision Making for Risk Management: A Comparison of Graphical Methods for Presenting Quantitative Uncertainty
- Risk Analysis: An Official Publication of the Society for Risk Analysis
Previous research has shown that people err when making decisions aided by probability information. Surprisingly, there has been little exploration into the accuracy of decisions made using many commonly employed probabilistic display methods. Two experiments examined the ability of a comprehensive set of such methods to effectively communicate critical information to a decision maker and to influence confidence in decision making. The second experiment also investigated the performance of these methods under time pressure, a situational factor known to exacerbate judgmental errors. Ten commonly used graphical display methods were randomly assigned to participants. Across eight scenarios in which a probabilistic outcome was described, participants answered questions on graph interpretation (e.g., the mean) and made behavioral choices (i.e., act; do not act) based on the provided information. Results indicated that decision-maker accuracy differed by graphical method: error bars and boxplots led to the greatest mean-estimation and behavioral-choice accuracy, whereas complementary cumulative probability distribution functions were associated with the highest probability-estimation accuracy. Under time pressure, participant performance decreased when making behavioral choices.
In corrosion assessment, ultrasonic wall-thickness measurements are often presented in the form of a color map. However, this gives little quantitative information on the distribution of the thickness measurements. The collected data can instead be used to form an empirical cumulative distribution function (ECDF), which gives the fraction of the surface with less than a certain thickness. It has been speculated that the ECDF could be used to draw conclusions about larger areas from inspection data of smaller sub-sections. A detailed understanding of the errors introduced by such an approach is required to be confident in its predictions. There are two major sources of error: the actual thickness variation due to the morphology of the surface, and the interaction of the signal-processing algorithm with the recorded ultrasonic signals. Parallel experimental and computational studies were performed using three surfaces generated with Gaussian height distributions. The surfaces were machined onto mild steel plates and ultrasonic C-scans were performed, while the distributed point source method was used to perform equivalent simulations. ECDFs corresponding to each of these surfaces (for both the experimental and computational data) are presented, and their variation with changing surface roughness and different timing algorithms is discussed.
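As a minimal sketch of how an ECDF reports the fraction of a surface below a given thickness, the snippet below builds one from a synthetic Gaussian thickness map; the nominal thickness and roughness values are assumed for illustration, not taken from the study.

```python
import numpy as np

def ecdf(values):
    """Empirical CDF: sorted values and the fraction at or below each one."""
    x = np.sort(np.asarray(values, dtype=float))
    y = np.arange(1, x.size + 1) / x.size
    return x, y

# Illustrative wall-thickness map (mm): Gaussian height distribution,
# mirroring the machined surfaces in the study (values assumed).
rng = np.random.default_rng(1)
thickness_map = 10.0 + 0.2 * rng.standard_normal((50, 50))

x, y = ecdf(thickness_map.ravel())

# Fraction of the surface thinner than 9.8 mm, i.e. one standard
# deviation below nominal (~16% for a Gaussian surface).
frac_thin = y[np.searchsorted(x, 9.8, side="right") - 1]
print(round(frac_thin, 3))
```

Comparing ECDFs computed from small sub-sections against the full-map ECDF is one way to probe the extrapolation error the abstract describes.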
The forehead was studied as a possible sampling site for capturing changes in volatile organic compound (VOC) profiles associated with psychological stress. Skin VOCs were sampled with a polydimethylsiloxane (PDMS) coupon, and the recovered VOCs were analysed by two-stage thermal desorption gas chromatography-mass spectrometry. Fifteen young adult volunteers (19-26 years) participated in two interventions run in a randomised crossover design. One intervention, termed 'Neutral', required the participants to listen to peaceful music; the other, a 'Paced Auditory Serial Addition Task' (PASAT), required them to perform a series of rapid mental arithmetic calculations in a challenging environment that induced a stress response. Skin-VOC samples were taken during each intervention. The resultant data were processed with dynamic background compensation, deconvolved, and registered to a common retention-index scale. The importance of freezing skin-patch samplers to -80 °C was established during the method-development phase of this study. The cumulative distribution function of the GC-MS data indicates that PDMS coupons may be selective towards the lower-volatility VOC components in skin. The frequency distribution of the GC-MS data was approximately log-normal, and on the basis of this study a further two orders of magnitude improvement in sensitivity may be required before the complete skin-VOC profile can be characterised. Multivariate analysis involving Pareto scaling prior to partial least squares discriminant analysis identified four VOCs with the highest probability of contributing to the variance between the two states, and the responses to these VOCs were modelled with principal component analysis (PCA).
Two VOCs, benzoic acid and n-decanoic acid, were upregulated (14- and 8-fold, respectively) and appear to be PASAT-sensitive, with areas under the curve (AUC) for their receiver operating characteristic (ROC) curves of 0.813 and 0.852, respectively. A xylene isomer and 3-carene were downregulated by 75% and 97%, respectively, and found to be predictive of the neutral intervention (ROC AUC values of 0.898 and 0.929, respectively). VOC profiles in skin appear to change with stress, whether due to increased elimination, elevated bacterial activity, or perhaps increased oxidative pathways.
Despite being a paradigm of quantitative linguistics, Zipf's law for words suffers from three main problems: its formulation is ambiguous, its validity has not been tested rigorously from a statistical point of view, and it has not been confronted with a representatively large number of texts. Thus, the current support for Zipf's law in texts can be summarized as anecdotal. We address these issues by studying three different versions of Zipf's law and fitting them to all available English texts in the Project Gutenberg database (more than 30,000 texts). To do so we use state-of-the-art fitting and goodness-of-fit tests, carefully tailored to the peculiarities of text statistics. Remarkably, one of the three versions of Zipf's law, consisting of a pure power-law form in the complementary cumulative distribution function of word frequencies, is able to fit more than 40% of the texts in the database (at the 0.05 significance level), over the whole domain of frequencies (from 1 to the maximum value), and with only one free parameter (the exponent).
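A pure power law in the complementary CDF, S(f) ∝ f^(-α), has the exponent as its single free parameter. The sketch below generates synthetic frequencies from such a law and recovers α with the maximum-likelihood (Hill) estimator; this continuous-variable version is a simplified stand-in for the discrete fits used in the study, and `alpha_true` is an assumed value.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic "word frequencies": draw from a continuous power law whose
# complementary CDF is S(f) = f**(-alpha) for f >= 1 (alpha assumed).
alpha_true = 1.0
u = rng.random(50_000)
freqs = u ** (-1.0 / alpha_true)  # inverse-transform sampling

# Hill / maximum-likelihood estimate of the CCDF exponent, the single
# free parameter of this version of Zipf's law (f_min = 1 here).
alpha_hat = freqs.size / np.log(freqs).sum()
print(round(alpha_hat, 2))  # close to alpha_true
```

The inverse-transform step works because if S(f) = f^(-α), then f = (1 - F)^(-1/α), and 1 - F is uniform when F is; the MLE then follows from the log-likelihood of the Pareto density α f^(-α-1).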
Random number generation is crucial in many aspects of everyday life, as online security and privacy depend ultimately on the quality of random numbers. Many current implementations are based on pseudo-random number generators, but information security requires true random numbers for sensitive applications like key generation in banking, defence or even social media. True random number generators are systems whose outputs cannot be determined, even if their internal structure and response history are known. Sources of quantum noise are thus ideal for this application due to their intrinsic uncertainty. In this work, we propose using resonant tunnelling diodes as practical true random number generators based on a quantum mechanical effect. The output of the proposed devices can be directly used as a random stream of bits or can be further distilled using randomness extraction algorithms, depending on the application.
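One classic randomness-extraction step of the kind mentioned above is von Neumann debiasing, which turns a biased but independent bit stream into an unbiased (shorter) one. The sketch below is illustrative only and is not tied to any particular hardware generator.

```python
def von_neumann_extract(bits):
    """Von Neumann debiasing: map bit pairs 01 -> 0, 10 -> 1,
    and discard 00 and 11. For independent bits with any fixed
    bias, the surviving output bits are unbiased."""
    out = []
    for a, b in zip(bits[::2], bits[1::2]):
        if a != b:
            out.append(a)
    return out

raw = [1, 1, 0, 1, 1, 0, 0, 0, 0, 1]
print(von_neumann_extract(raw))  # [0, 1, 0]
```

The price of the unbiased output is throughput: on average at least three-quarters of the raw bits are discarded, which is why practical designs often prefer stronger extractors when the raw rate is limited.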
To estimate the cumulative probability (c) of arrest by age 28 years in the United States by disability status, race/ethnicity, and gender.