Concept: Paul Ekman
Little is known about the spread of emotions beyond dyads. Yet it is important for explaining the emergence of crowd behaviors. Here, we experimentally addressed whether emotional homogeneity within a crowd might result from a cascade of local emotional transmissions where the perception of another’s emotional expression produces, in the observer’s face and body, sufficient information to allow for the transmission of the emotion to a third party. We reproduced a minimal element of a crowd situation and recorded the facial electromyographic activity and the skin conductance response of an individual C observing the face of an individual B watching an individual A displaying full-body expressions of either joy or fear. Critically, individual B did not know that she was being watched. We show that emotions of joy and fear displayed by A were spontaneously transmitted to C through B, even when the emotional information available in B’s face could not be explicitly recognized. These findings demonstrate that one is tuned to react to others' emotional signals and to unintentionally produce subtle but sufficient emotional cues to induce emotional states in others. This phenomenon could be the mark of a spontaneous cooperative behavior whose function is to communicate survival-value information to conspecifics.
Until recently, research in animal welfare science has mainly focused on negative experiences like pain and suffering, often neglecting the importance of assessing and promoting positive experiences. In rodents, specific facial expressions have been found to occur in situations thought to induce negatively valenced emotional states (e.g., pain, aggression and fear), but none have yet been identified for positive states. Thus, this study aimed to investigate if facial expressions indicative of a positive emotional state are exhibited in rats. Adolescent male Lister Hooded rats (Rattus norvegicus, N = 15) were individually subjected to a Positive and a mildly aversive Contrast Treatment over two consecutive days in order to induce contrasting emotional states and to detect differences in facial expression. The Positive Treatment consisted of playful manual tickling administered by the experimenter, while the Contrast Treatment consisted of exposure to a novel test room with intermittent bursts of white noise. The number of positive ultrasonic vocalisations was greater in the Positive Treatment compared to the Contrast Treatment, indicating the experience of differentially valenced states in the two treatments. The main findings were that Ear Colour became significantly pinker and Ear Angle was wider (ears more relaxed) in the Positive Treatment compared to the Contrast Treatment. All other quantitative and qualitative measures of facial expression, which included Eyeball height-to-width Ratio, Eyebrow height-to-width Ratio, Eyebrow Angle, visibility of the Nictitating Membrane, and the established Rat Grimace Scale, did not show differences between treatments. This study contributes to the exploration of positive emotional states, and thus good welfare, in rats as it identified the first facial indicators of positive emotions following a positive heterospecific play treatment.
Furthermore, it provides improvements to the photography technique and image analysis for the detection of fine differences in facial expression, and also adds to the refinement of the tickling procedure.
Unlike frozen snapshots of facial expressions that we often see in photographs, natural facial expressions are dynamic events that unfold in a particular fashion over time. But how important are the temporal properties of expressions for our ability to reliably extract information about a person’s emotional state? We addressed this question experimentally by gauging human performance in recognizing facial expressions with varying temporal properties relative to that of a statistically optimal (“ideal”) observer. We found that people recognized emotions just as efficiently when viewing them as naturally evolving dynamic events, temporally reversed events, temporally randomized events, or single images frozen in time. Our results suggest that the dynamic properties of human facial movements may play a surprisingly small role in people’s ability to infer the emotional states of others from their facial expressions.
Diagnostic features of emotional expressions are differentially distributed across the face. The current study examined whether these diagnostic features are preferentially attended to even when they are irrelevant for the task at hand or when faces appear at different locations in the visual field. To this aim, fearful, happy and neutral faces were presented to healthy individuals in two experiments while measuring eye movements. In Experiment 1, participants had to accomplish an emotion classification, a gender discrimination or a passive viewing task. To differentiate fast, potentially reflexive, eye movements from a more elaborate scanning of faces, stimuli were either presented for 150 or 2000 ms. In Experiment 2, similar faces were presented at different spatial positions to rule out the possibility that eye movements only reflect a general bias for certain visual field locations. In both experiments, participants fixated the eye region much longer than any other region in the face. Furthermore, the eye region was attended to more pronouncedly when fearful or neutral faces were shown whereas more attention was directed toward the mouth of happy facial expressions. Since these results were similar across the other experimental manipulations, they indicate that diagnostic features of emotional expressions are preferentially processed irrespective of task demands and spatial locations. Saliency analyses revealed that a computational model of bottom-up visual attention could not explain these results. Furthermore, as these gaze preferences were evident very early after stimulus onset and occurred even when saccades did not allow for extracting further information from these stimuli, they may reflect a preattentive mechanism that automatically detects relevant facial features in the visual field and facilitates the orientation of attention towards them. 
This mechanism might crucially depend on amygdala functioning and it is potentially impaired in a number of clinical conditions such as autism or social anxiety disorders.
Sexual arousal is a motivational state that moves humans toward situations that inherently pose a risk of disease transmission. Disgust is an emotion that adaptively moves humans away from such situations. Incongruously, sexual activity is essential to human fitness yet involves strong disgust elicitors. Using an experimental paradigm, we investigated how these two states interact. Women (final N=76) were assigned to one of four conditions: rate disgust stimuli then watch a pornographic clip; watch a pornographic clip then rate disgust stimuli; rate fear stimuli then watch a pornographic clip; or watch a pornographic clip then rate fear stimuli. Women’s genital sexual arousal was measured with vaginal photoplethysmography, and their disgust and fear reactions were measured via self-report. We did not find that baseline disgust propensity predicted sexual arousal in women who were exposed to neutral stimuli before erotic content. In the Erotic-before-Disgust condition, we did not find that sexual arousal straightforwardly predicted decreased image disgust ratings. However, we did find some evidence that sexual arousal increased self-reported disgust in women with high trait disgust and decreased self-reported disgust in women with low trait disgust. Women who were exposed to disgusting images before erotic content showed significantly less sexual arousal than women in the control condition or women exposed to fear-inducing images before erotic content. In the Disgust-before-Erotic condition, the degree of self-reported disgust was negatively correlated with genital sexual arousal. Hence, in the conflict between the ultimate goals of reproduction and disease avoidance, cues of the presence of pathogens significantly reduce the motivation to engage in mating behaviors that, by their nature, entail a risk of pathogen transmission.
Impairments in social communication are a core feature of Autism Spectrum Disorder (ASD). Because the ability to infer other people’s emotions from their facial expressions is critical for many aspects of social communication, deficits in expression recognition are a plausible candidate marker for ASD. However, previous studies on facial expression recognition produced mixed results, which may be due to differences in the sensitivity of the many tests used and/or the heterogeneity among individuals with ASD. To ascertain whether expression recognition may serve as a diagnostic marker (which distinguishes people with ASD from a comparison group) or a stratification marker (which helps to divide ASD into more homogeneous subgroups), a crucial first step is to move beyond identification of mean group differences and to better understand the frequency and severity of impairments.
Facial expressions convey key cues of human emotions, and may also be important for interspecies interactions. The universality hypothesis suggests that six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) should be expressed by similar facial expressions in close phylogenetic species such as humans and nonhuman primates. However, some facial expressions have been shown to differ in meaning between humans and nonhuman primates like macaques. This ambiguity in signalling emotion can lead to an increased risk of aggression and injuries for both humans and animals. This raises serious concerns for activities such as wildlife tourism, where humans closely interact with wild animals. Understanding what factors (i.e., experience and type of emotion) affect the ability to recognise the emotional state of nonhuman primates, based on their facial expressions, can enable us to test the validity of the universality hypothesis, as well as reduce the risk of aggression and potential injuries in wildlife tourism.
Takotsubo syndrome (TTS) is typically provoked by negative stressors such as grief, anger, or fear, leading to the popular term ‘broken heart syndrome’. However, the role of positive emotions in triggering TTS remains unclear. The aim of the present study was to analyse the prevalence and characteristics of patients with TTS following pleasant events, which are distinct from the stressful or undesirable episodes commonly triggering TTS.
Facial expression of emotion is a foundational aspect of social interaction and nonverbal communication. In this study, we use a computer-animated 3D facial tool to investigate how dynamic properties of a smile are perceived. We created smile animations where we systematically manipulated the smile’s angle, extent, dental show, and dynamic symmetry. Then we asked a diverse sample of 802 participants to rate the smiles in terms of their effectiveness, genuineness, pleasantness, and perceived emotional intent. We define a “successful smile” as one that is rated effective, genuine, and pleasant in the colloquial sense of these words. We found that a successful smile can be expressed via a variety of different spatiotemporal trajectories, involving an intricate balance of mouth angle, smile extent, and dental show combined with dynamic symmetry. These findings have broad applications in a variety of areas, such as facial reanimation surgery, rehabilitation, computer graphics, and psychology.
Facial expressions are important for humans in communicating emotions to conspecifics and enhancing interpersonal understanding. Many muscles producing facial expressions in humans are also found in domestic dogs, but little is known about how humans perceive dog facial expressions, and which psychological factors influence people’s perceptions. Here, we asked 34 observers to rate the valence, arousal, and the six basic emotions (happiness, sadness, surprise, disgust, fear, and anger/aggressiveness) from images of human and dog faces with Pleasant, Neutral and Threatening expressions. We investigated how the subjects' personality (the Big Five Inventory), empathy (Interpersonal Reactivity Index) and experience of dog behavior affect the ratings of dog and human faces. Ratings of both species followed similar general patterns: human subjects classified dog facial expressions from pleasant to threatening very similarly to human facial expressions. Subjects with higher emotional empathy evaluated Threatening faces of both species as more negative in valence and higher in anger/aggressiveness. More empathetic subjects also rated the happiness of Pleasant humans, but not dogs, higher, and they were quicker in their valence judgments of Pleasant human, Threatening human and Threatening dog faces. Experience with dogs correlated positively with ratings of Pleasant and Neutral dog faces. Personality also had a minor effect on the ratings of Pleasant and Neutral faces in both species. The results imply that humans perceive human and dog facial expressions in a similar manner, and that the perception of both species is influenced by psychological factors of the evaluators. Empathy, in particular, affects both the speed and intensity of rating dogs' emotional facial expressions.