In general, sad music is thought to cause us to experience sadness, which is considered an unpleasant emotion. As a result, the question arises as to why we listen to sad music if it evokes sadness. One possible answer is that we may actually feel positive emotions when we listen to sad music. This suggestion may appear counterintuitive; however, in this study, by dividing musical emotion into perceived emotion and felt emotion, we investigated this potential emotional response to music. We hypothesized that felt and perceived emotion may not actually coincide in this respect: sad music would be perceived as sad, but the experience of listening to sad music would evoke positive emotions. A total of 44 participants listened to musical excerpts and provided data on perceived and felt emotions by rating 62 descriptive words or phrases related to emotions on a scale that ranged from 0 (not at all) to 4 (very much). The results revealed that the sad music was perceived as more tragic, whereas actually listening to the sad music led participants to feel more romantic, more blithe, and less tragic emotions than those they perceived in the same music. Thus, the participants experienced ambivalent emotions when they listened to the sad music. After considering the possible reasons that the sad music induced listeners to experience emotional ambivalence, we concluded that a new model is essential for examining the emotions induced by music, and that this model must entertain the possibility that what we experience when listening to music is vicarious emotion.
Research in emotion regulation has largely focused on how people manage their own emotions, but there is a growing recognition that the ways in which we regulate the emotions of others also are important. Drawing on work from diverse disciplines, we propose an integrative model of the psychological and neural processes supporting the social regulation of emotion. This organizing framework, the ‘social regulatory cycle’, specifies at multiple levels of description the act of regulating another person’s emotions as well as the experience of being a target of regulation. The cycle describes the processing stages that lead regulators to attempt to change the emotions of a target person, the impact of regulation on the processes that generate emotions in the target, and the underlying neural systems.
Socialization of Early Prosocial Behavior: Parents' Talk about Emotions is Associated with Sharing and Helping in Toddlers
- Infancy: the official journal of the International Society on Infant Studies
- Published over 7 years ago
What role does socialization play in the origins of prosocial behavior? We examined one potential socialization mechanism, parents' discourse about others' emotions with very young children in whom prosocial behavior is still nascent. Two studies are reported, one of sharing in 18- and 24-month-olds (n = 29), and one of instrumental and empathy-based helping in 18- and 30-month-olds (n = 62). In both studies, parents read age-appropriate picture books to their children and the content and structure of their emotion-related and internal state discourse were coded. Results showed that children who helped and shared more quickly and more often, especially in tasks that required more complex emotion understanding, had parents who more often asked them to label and explain the emotions depicted in the books. Moreover, it was parents' elicitation of children’s talk about emotions rather than parents' own production of emotion labels and explanations that explained children’s prosocial behavior, even after controlling for age. Thus, it is the quality, not the quantity, of parents' talk about emotions with their toddlers that matters for early prosocial behavior.
Research on the development of selective trust has shown that young children do not indiscriminately trust all potential informants. They are likely to seek and endorse information from individuals who have proven competent or benign in the past. However, research on trust among adults raises the possibility that children might also be influenced by the emotions expressed by potential informants. In particular, they might trust individuals expressing more positive emotion. Indeed, young children’s trust in particular informants based on their past behaviour might be undermined by their currently expressed emotions. To examine this possibility, we tested the selective trust of fifty 4- and 5-year-olds in two steps. We first confirmed that children are likely to invest more trust in individuals expressing more positive emotion. We then showed that even if children have already formed an impression of two potential informants based on their behavioural record, their choices about whose claims to trust are markedly influenced by the degree of positive emotion currently expressed by the two informants. By implication, the facial emotions expressed by potential informants can undermine young children’s selective trust based on the behavioural record of those informants.
Infants may recognize facial expressions of emotion more readily when familiar faces express the emotions. Studies 1 and 2 investigated whether familiarity influences two metrics of emotion processing: Categorization and spontaneous preference. In Study 1 (n = 32), we replicated previous findings showing an asymmetrical pattern of categorization of happy and fearful faces in 6.5-month-old infants, and extended these findings by demonstrating that infants' categorization did not differ when emotions were expressed by familiar (i.e., caregiver) faces. In Study 2 (n = 34), we replicated the spontaneous preference for fearful over happy expressions in 6.5-month-old infants, and extended these findings by demonstrating that the spontaneous preference for fear was also present for familiar faces. Thus, infants' performance on two metrics of emotion processing did not differ depending on face familiarity.
- Nursing philosophy: an international journal for healthcare professionals
- Published over 3 years ago
Philosophical and empirical work on the nature of the emotions is extensive, and there are many theories of emotions. However, all agree that emotions are not knee-jerk reactions to stimuli and are open to rational assessment or warrant. This paper's focus is on the condition or conditions for compassion as an emotion and the likelihood that it or they can be met in nursing practice. Thus, it attempts to keep, as far as possible, compassion as an emotion separate from both moral norms and professional norms. This is because empirical or causal conditions that can make experiencing and acting out of compassion difficult seem especially relevant in nursing practice. I consider how theories of emotion in general, and of compassion in particular, are somewhat contested, but all recent accounts agree that emotions are not totally immune to reason. Then, using accounts of the constitutive conditions of the emotion of compassion, I will show how these conditions are often likely to be quite fragile or unstable in practice, and particularly so within much nursing practice. In addition, some of the conditions for compassion will be shown to be problematic for nursing practice. It is difficult to keep ideas of compassion separate from morality, and this connection is noticeable in the claims made of compassion for nursing; I will therefore briefly highlight one such connection: the need for normative theory to give an account of the value that emotions such as compassion presume, and the fact that compassionate motivation is separate from moral motivation and may conflict with it. The fragility or instability of the emotion of compassion in practice has implications for both what can be expected and what should be expected of compassion, at least if what is wanted is a realist rather than idealist account of "should."
Although the Emotions as a Child Scale (EAC) has been widely used in research with children and adolescents, no peer-reviewed studies have examined its factor structure using factor analytic methods. Likewise, the measurement equivalence of the scale across gender and race/ethnicity has never been investigated. To address these gaps, this study examines the factor structure of the scale in late adolescence and emerging adulthood, compares it to previous theory-driven models, and evaluates its measurement invariance across gender and 2 racial/ethnic groups. Participants were 1,087 individuals participating in a larger community-based study of adolescent health (M = 19.35 years, SD = 1.19). Results of exploratory and confirmatory factor analyses suggest that a 2-factor model from a shortened version of the scale (3 items were eliminated from each emotion scale), involving supportive and unsupportive socialization strategies, is a good alternative to the original 5-factor structure for researchers interested in a broader conceptualization of emotion socialization strategies. This 2-factor model of the shortened scale showed stronger measurement invariance across gender than across racial/ethnic groups. Future studies addressing racial/ethnic differences with this measure should compare results with and without imposing corresponding invariance constraints on noninvariant items. Future work should also replicate these findings in other age and racial/ethnic groups and examine the predictive utility of the abbreviated 2-factor model for emotion-related outcomes across development.
To explore the emotion work undertaken by practitioners with responsibility for safeguarding child wellbeing, and to establish whether there is a relationship between emotion work, role visibility, professional wellbeing, and the effectiveness of supportive frameworks.
The perception of emotions is an important component of human social interaction in everyday life. The ability to recognize emotions from another person's facial expressions is therefore a key prerequisite.
As technology advances, robots and virtual agents will be introduced into the home and healthcare settings to assist individuals, both young and old, with everyday living tasks. Understanding how users recognize an agent's social cues is therefore imperative, especially in social interactions. Facial expression, in particular, is one of the most common non-verbal cues used to display and communicate emotion in on-screen agents (Cassell, Sullivan, Prevost, & Churchill, 2000). Age is important to consider because age-related differences in emotion recognition of human facial expression have been documented (Ruffman et al., 2008), with older adults showing a deficit in the recognition of negative facial expressions. Previous work has shown that younger adults can effectively recognize facial emotions displayed by agents (Bartneck & Reichenbach, 2005; Courgeon et al., 2009, 2011; Breazeal, 2003); however, little research has compared in depth younger and older adults' ability to label a virtual agent's facial emotions, an important consideration because social agents will be required to interact with users of varying ages. If such age-related differences exist for recognition of virtual agent facial expressions, we aim to understand whether those differences are influenced by the intensity of the emotion, the dynamic formation of the emotion (i.e., a neutral expression developing into an emotional expression through motion), or the type of virtual character, which varied in human-likeness. Study 1 investigated the relationship between age-related differences, the implication of dynamic formation of emotion, and the role of emotion intensity in recognition of the facial expressions of a virtual agent (iCat). Study 2 examined age-related differences in recognition of emotions expressed by three types of virtual characters differing in human-likeness (non-humanoid iCat, synthetic human, and human).
Study 2 also investigated the role of configural and featural processing as a possible explanation for age-related differences in emotion recognition. First, our findings show age-related differences in the recognition of emotions expressed by a virtual agent, with older adults showing lower recognition of anger, disgust, fear, happiness, sadness, and neutral expressions. These age-related differences might be explained by older adults having difficulty discriminating similarities in the configural arrangement of facial features for certain emotions; for example, older adults often mislabeled fear as the configurally similar emotion of surprise. Second, our results did not provide evidence that dynamic formation improves emotion recognition; in general, however, greater emotion intensity did improve recognition. Lastly, we found that emotion recognition, for both older and younger adults, differed by character type, from best to worst: human, synthetic human, and then iCat. Our findings provide guidance for design, as well as for the development of a framework of age-related differences in emotion recognition.