What is it like to be invisible? This question has long fascinated humankind and has been the central theme of many classic literary works. Recent advances in materials science suggest that invisibility cloaking of the human body may be possible in the not-so-distant future. However, it remains unknown how invisibility affects body perception and embodied cognition. To address these questions, we developed a perceptual illusion of having an entire invisible body. Through a series of experiments, we characterized the multisensory rules that govern the elicitation of the illusion and showed that the experience of having an invisible body reduces the social anxiety response to standing in front of an audience. This study provides an experimental model of what it is like to be invisible and shows that this experience affects bodily self-perception and social cognition.
Effect of frequent interruptions of prolonged sitting on self-perceived levels of energy, mood, food cravings and cognitive function
- The International Journal of Behavioral Nutrition and Physical Activity
- Published about 3 years ago
While physical activity has been shown to improve cognitive performance and well-being, office workers are essentially sedentary. We compared the effects of physical activity performed as (i) a single bout in the morning or (ii) microbouts spread across the day with those of (iii) a day spent sitting, on mood, energy levels, and cognitive function.
- Proceedings of the National Academy of Sciences of the United States of America
- Published over 6 years ago
The brain processes temporal statistics to predict future events and to categorize perceptual objects. These statistics, called expectancies, are found in music perception, and they span a variety of different features and time scales. Specifically, there is evidence that music perception involves strong expectancies regarding the distribution of melodic intervals, namely, the distances between consecutive notes within a melody. The recent availability of a large Western music dataset, consisting of the historical record condensed as melodic interval counts, has opened new possibilities for data-driven analysis of musical perception. In this context, we present an analytical approach that, based on cognitive theories of music expectation and machine learning techniques, recovers a set of factors that accurately identifies historical trends and stylistic transitions between the Baroque, Classical, Romantic, and Post-Romantic periods. We also offer a plausible musicological and cognitive interpretation of these factors, allowing us to propose them as data-driven principles of melodic expectation.
Psychophysiological evidence suggests that music and language are intimately coupled such that experience/training in one domain can influence processing required in the other domain. While the influence of music on language processing is now well-documented, evidence of language-to-music effects has yet to be firmly established. Here, using a cross-sectional design, we compared the performance of musicians to that of tone-language (Cantonese) speakers on tasks of auditory pitch acuity, music perception, and general cognitive ability (e.g., fluid intelligence, working memory). While musicians demonstrated superior performance on all auditory measures, comparable perceptual enhancements were observed for Cantonese participants, relative to English-speaking nonmusicians. These results provide evidence that tone-language background is associated with higher auditory perceptual performance for music listening. Musicians and Cantonese speakers also showed superior working memory capacity relative to nonmusician controls, suggesting that in addition to basic perceptual enhancements, tone-language background and music training might also be associated with enhanced general cognitive abilities. Our findings support the notion that tone-language speakers and musically trained individuals outperform English-speaking listeners in the perceptual-cognitive processing necessary for basic auditory as well as complex music perception. These results illustrate bidirectional influences between the domains of music and language.
We contrasted the predictive power of three measures of semantic richness for a large set of concrete and abstract concepts on lexical decision and naming tasks: number of features (NF), contextual dispersion (CD), and a novel measure, number of semantic neighbors (NSN). NSN (but not NF) facilitated processing for abstract concepts, while NF (but not NSN) facilitated processing for the most concrete concepts, consistent with claims that linguistic information is more relevant for abstract concepts in early processing. Additionally, converging evidence from two datasets suggests that when NSN and CD are controlled for, the features that most facilitate processing are those associated with a concept’s physical characteristics and real-world contexts. These results suggest that rich linguistic contexts (many semantic neighbors) facilitate early activation of abstract concepts, whereas concrete concepts benefit more from rich physical contexts (many associated objects and locations).
Cognitive theories of deception posit that lying requires more cognitive resources than telling the truth. In line with this idea, it has been demonstrated that deceptive responses are typically associated with increased response times and higher error rates compared to truthful responses. Although the cognitive cost of lying has been assumed to be resistant to practice, it has recently been shown that people who are trained to lie can reduce this cost. In the present study (n = 42), we further explored the effects of practice on one’s ability to lie by manipulating the proportions of lie- and truth-trials in a Sheffield lie test across three phases: Baseline (50% lie, 50% truth), Training (frequent-lie group: 75% lie, 25% truth; control group: 50% lie, 50% truth; and frequent-truth group: 25% lie, 75% truth), and Test (50% lie, 50% truth). The results showed that lying became easier while participants were trained to lie more often and that lying became more difficult while participants were trained to tell the truth more often. Furthermore, these effects did carry over to the test phase, but only for the specific items that were used for the training manipulation. Hence, our study confirms that relatively little practice is enough to alter the cognitive cost of lying, although this effect does not generalize to non-practiced items.
INTRODUCTION: Borderline Intellectual Functioning (BIF) is conceptualized as the frontier that delimits “normal” intellectual functioning from intellectual disability (IQ 71-85). In spite of its magnitude, its prevalence cannot be quantified and its diagnosis has not yet been defined. OBJECTIVES: To elaborate a conceptual framework and to establish consensus guidelines. METHOD: A mixed qualitative methodology, including frame analysis and nominal group techniques, was used. The literature was extensively reviewed in evidence-based medical databases, scientific publications, and the grey literature. This information was studied and a framing document was prepared. RESULTS: Scientific publications covering BIF are scarce. The term that yields the largest number of results is “Borderline Intelligence”. The Working Group identified a number of areas in which consensus was needed and wrote a consensus document covering the conclusions of the experts and the framing document. CONCLUSIONS: It is a priority to reach an international consensus on the BIF construct and its operative criteria, as well as to develop specific tools for screening and diagnosis. It is also necessary to define criteria that enable its incidence and prevalence to be estimated. Knowing which interventions are the most efficient, and what the needs of this population are, is vital to implementing an integral model of care centred on the individual.
Over the last decade, several countries around the world developed a collective sense of doom and gloom: Their Zeitgeist could be characterized as one of decline. Paradoxically, in some countries, such as the Netherlands, this collective discontent with society seems to exist despite high levels of individual well-being. Current psychological research informs us about why individuals would feel unduly optimistic, but does not account for a collective sense of decline. The present research develops a novel operationalization of Zeitgeist, referred to as a general factor Z. We conceptualize Zeitgeist as a collective global-level evaluation of the state (and future) of society. Three studies confirmed that perceptions of the same societal problems at the personal and collective level differed strongly. Across these studies we found support for a hypothesized latent factor Z, underlying collective-level perceptions of society. This Z-factor predicted people’s interpretation of new information about society that was presented through news stories. These results provide a first step in operationalizing and (ultimately) understanding the concept of Zeitgeist: collectively shared ideas about society. Implications for policy are discussed.
We experience the world as a seamless stream of percepts. However, intriguing illusions and recent experiments suggest that the world is not continuously translated into conscious perception. Instead, perception seems to operate in a discrete manner, just like movies appear continuous although they consist of discrete images. To explain how the temporal resolution of human vision can be fast compared to sluggish conscious perception, we propose a novel conceptual framework in which features of objects, such as their color, are quasi-continuously and unconsciously analyzed with high temporal resolution. Like other features, temporal features, such as duration, are coded as quantitative labels. When unconscious processing is “completed,” all features are simultaneously rendered conscious at discrete moments in time, sometimes even hundreds of milliseconds after stimuli were presented.
Potentially modifiable lifestyle factors may influence cognitive health in later life and offer potential to reduce the risk of cognitive decline and dementia. The concept of cognitive reserve has been proposed as a mechanism to explain individual differences in rates of cognitive decline, but its potential role as a mediating pathway has seldom been explored using data from large epidemiological studies. We explored the mediating effect of cognitive reserve on the cross-sectional association between lifestyle factors and cognitive function in later life using data from a population-based cohort of healthy older people.