Kawaii (a Japanese word meaning “cute”) things are popular because they produce positive feelings. However, their effect on behavior remains unclear. In this study, three experiments were conducted to examine the effects of viewing cute images on subsequent task performance. In the first experiment, university students performed a fine motor dexterity task before and after viewing images of baby or adult animals. Performance, indexed by the number of successful trials, increased more after viewing cute images (puppies and kittens; M ± SE = 43.9 ± 10.3% improvement) than after viewing less cute images (dogs and cats; 11.9 ± 5.5% improvement). In the second experiment, this finding was replicated using a non-motor visual search task. Performance improved more after viewing cute images (15.7 ± 2.2% improvement) than after viewing less cute images (1.4 ± 2.1% improvement). Viewing images of pleasant foods was ineffective in improving performance (1.2 ± 2.1%). In the third experiment, participants performed a global-local letter task after viewing images of baby animals, adult animals, and neutral objects. In general, global features were processed faster than local features. However, this global precedence effect was reduced after viewing cute images. Results show that participants performed tasks requiring focused attention more carefully after viewing cute images. This is interpreted as the result of a narrowed attentional focus induced by the cuteness-triggered positive emotion that is associated with approach motivation and the tendency toward systematic processing. For future applications, cute objects may be used as an emotion elicitor to induce careful behavioral tendencies in specific situations, such as driving and office work.
- Journal of the American Medical Informatics Association : JAMIA
An individual’s birth month has a significant impact on the diseases they develop during their lifetime. Previous studies have revealed relationships between birth month and several diseases, including atherothrombosis, asthma, attention deficit hyperactivity disorder, and myopia, but leave most diseases unexplored. This retrospective population study systematically explores the relationship between seasonal effects at birth and lifetime disease risk for 1688 conditions.
Adults and children are spending more time interacting with media and technology and less time participating in activities in nature. This lifestyle change clearly has ramifications for our physical well-being, but what impact does this change have on cognition? Higher-order cognitive functions, including selective attention, problem solving, inhibition, and multi-tasking, are all heavily utilized in our modern technology-rich society. Attention Restoration Theory (ART) suggests that exposure to nature can restore prefrontal cortex-mediated executive processes such as these. Consistent with ART, research indicates that exposure to natural settings seems to replenish some lower-level modules of the executive attentional system. However, the impact of nature on higher-level tasks such as creative problem solving has not been explored. Here we show that four days of immersion in nature, and the corresponding disconnection from multimedia and technology, increases performance on a creative problem-solving task by a full 50% in a group of naive hikers. Our results demonstrate that there is a cognitive advantage to be realized if we spend time immersed in a natural setting. We anticipate that this advantage comes from an increase in exposure to natural stimuli that are both emotionally positive and low-arousing, and a corresponding decrease in exposure to attention-demanding technology, which regularly requires that we attend to sudden events, switch amongst tasks, maintain task goals, and inhibit irrelevant actions or cognitions. A limitation of the current research is the inability to determine whether the effects are due to an increased exposure to nature, a decreased exposure to technology, or other factors associated with spending four days immersed in nature.
Functional brain networks demonstrate significant temporal variability and dynamic reconfiguration even in the resting state. Currently, most studies investigate temporal variability of brain networks at the scale of single (micro) or whole-brain (macro) connectivity. However, the mechanism underlying time-varying properties remains unclear, as the coupling between brain network variability and neural activity is not readily apparent when analysed at either the micro- or macroscale. We propose an intermediate (meso) scale analysis and characterize temporal variability of the functional architecture associated with a particular region. This yields a topography of variability that reflects the whole brain and, most importantly, creates an analytical framework to establish the fundamental relationship between variability of regional functional architecture and its neural activity or structural connectivity. We find that temporal variability reflects the dynamical reconfiguration of a brain region into distinct functional modules at different times and may be indicative of brain flexibility and adaptability. Primary and unimodal sensory-motor cortices demonstrate low temporal variability, while transmodal areas, including heteromodal association areas and the limbic system, demonstrate high variability. In particular, the regions with highest variability, such as the hippocampus/parahippocampus, inferior and middle temporal gyrus, olfactory gyrus, and caudate, are all related to learning, suggesting that temporal variability may indicate the level of brain adaptability. With simultaneously recorded electroencephalography/functional magnetic resonance imaging and functional magnetic resonance imaging/diffusion tensor imaging data, we also find that variability of regional functional architecture is modulated by local blood oxygen level-dependent activity and α-band oscillation, and is governed by the ratio of intra- to inter-community structural connectivity.
Application of the mesoscale variability measure to multicentre datasets of three mental disorders and matched controls involving 1180 subjects reveals that the regions demonstrating extreme (i.e., highest or lowest) variability in controls are most liable to change in mental disorders. Specifically, we draw attention to the identification of diametrically opposing patterns of variability changes between schizophrenia and attention deficit hyperactivity disorder/autism. Regions of the default-mode network demonstrate lower variability in patients with schizophrenia, but higher variability in patients with autism/attention deficit hyperactivity disorder, compared with respective controls. In contrast, subcortical regions, especially the thalamus, show higher variability in schizophrenia patients, but lower variability in patients with attention deficit hyperactivity disorder. The changes in variability of these regions are also closely related to symptom scores. Our work provides insights into the dynamic organization of the resting brain and how it changes in brain disorders. The nodal variability measure may also be potentially useful as a predictor for learning and neural rehabilitation.
Amazon Mechanical Turk (AMT) is an online crowdsourcing service where anonymous online workers complete web-based tasks for small sums of money. The service has attracted attention from experimental psychologists interested in gathering human subject data more efficiently. However, relative to traditional laboratory studies, many aspects of the testing environment are not under the experimenter’s control. In this paper, we attempt to empirically evaluate the fidelity of the AMT system for use in cognitive behavioral experiments. These types of experiment differ from simple surveys in that they require multiple trials, sustained attention from participants, comprehension of complex instructions, and millisecond accuracy for response recording and stimulus presentation. We replicate a diverse body of tasks from experimental psychology, including the Stroop, Switching, Flanker, Simon, Posner Cuing, attentional blink, subliminal priming, and category learning tasks, using participants recruited via AMT. While most of the replications were qualitatively successful and validated the approach of collecting data anonymously online using a web browser, others revealed disparities between laboratory results and online results. A number of important lessons were encountered in the process of conducting these replications that should be of value to other researchers.
Attention deficit/hyperactivity disorder (ADHD) is associated with adverse outcomes and elevated societal costs. The American Academy of Pediatrics (AAP) 2011 guidelines recommend “behavior therapy” over medication as first-line treatment for children aged 4-5 years with ADHD; these recommendations are consistent with current guidelines from the American Academy of Child and Adolescent Psychiatry for younger children. CDC analyzed claims data to assess national and state-level ADHD treatment patterns among young children.
Hyperactivity is one of the core symptoms in attention deficit hyperactivity disorder (ADHD). However, it remains unclear in which way the motor system itself and its development are affected by the disorder. Movement-related potentials (MRP) can separate different stages of movement execution, from the programming of a movement to motor post-processing and memory traces. Pre-movement MRP are absent or positive during early childhood and display a developmental increase of negativity.
BACKGROUND:: Anesthesiology requires performing visually oriented procedures while monitoring auditory information about a patient’s vital signs. A concern in operating room environments is the amount of competing information and the effects that divided attention has on patient monitoring, such as detecting auditory changes in arterial oxygen saturation via pulse oximetry. METHODS:: The authors measured the impact of visual attentional load and auditory background noise on the ability of anesthesia residents to monitor the pulse oximeter auditory display in a laboratory setting. Accuracies and response times were recorded, reflecting anesthesiologists' abilities to detect changes in oxygen saturation across three levels of visual attention, in quiet and with noise. RESULTS:: Results show that visual attentional load substantially affects the ability to detect changes in oxygen saturation levels conveyed by auditory cues signaling 99 and 98% saturation. These effects are compounded by auditory noise, producing up to a 17% decline in performance. These deficits are seen both in the ability to accurately detect a change in oxygen saturation and in speed of response. CONCLUSIONS:: Most anesthesia accidents are initiated by small errors that cascade into serious events. Lack of monitor vigilance and inattention are two of the more commonly cited factors. Reducing such errors is thus a priority for improving patient safety. Specifically, efforts to reduce distractors and decrease background noise should be considered during induction and emergence, periods of especially high risk, when anesthesiologists have to attend to many tasks and are thus susceptible to error.
In studies of change blindness, observers often have the phenomenological impression that the blindness is overcome all at once, so that change detection, localization, and identification apparently occur together. Three experiments are described that explore dissociations between these processes using a discrete trial procedure in which 2 visual frames are presented sequentially with no intervening interframe interval. The results reveal that change detection and localization are essentially perfect under these conditions regardless of the number of elements in the display, which is consistent with the idea that change detection and localization are mediated by pre-attentive parallel processes. In contrast, identification accuracy for an item before it changes is generally poor, and is heavily dependent on the number of items displayed. Identification accuracy after a change is substantially better, but depends on the new item’s duration. This suggests that the change captures attention, which substantially enhances the likelihood of correctly identifying the new item. However, the results also reveal a limited capacity to identify unattended items. Specifically, we provide evidence that strongly suggests that, at least under these conditions, observers were able to identify two items without focused attention. Our results further suggest that spatial pre-cues that attract attention to an item before the change occurs simply ensure that the cued item is one of the two whose identity is encoded.
INTRODUCTION: The aetiology of attention deficit hyperactivity disorder (ADHD) is attributed to different factors: genetic, environmental, and biological (neurotransmitters: dopaminergic system). Iron is essential for the correct functioning of the dopaminergic system. Iron deficiency is common in patients with ADHD, and its correction may be useful in treatment. OBJECTIVES: To analyse a possible relationship between iron deficiency and symptoms of inattention, hyperactivity, and impulsivity in ADHD patients, and the potential benefit of iron therapy. PATIENTS AND METHODS: A prospective study was conducted on non-anaemic and cognitively normal children, newly diagnosed with ADHD according to DSM-IV criteria. Specific scales were used (SNAP-IV, ADHS) and serum ferritin was determined. Those with ferritin ≤ 30 ng/ml were treated with ferrous sulphate (4 mg/kg/day) for 3 months, with the effect of treatment subsequently quantified. RESULTS: A total of 60 patients, with a mean age of 9.02 years (range: 6-14), were analysed. The inattentive subtype was the most frequent (53.3%). Almost two-thirds (63.3%) had iron deficiency, which was more frequent in the inattentive group (38 vs 22, P<.02). The iron treatment was completed by 17 patients. The treatment was ineffective in 7 of the 8 non-inattentive subtypes, with a partial response in the remaining one. Of the 9 inattentive subtypes, the treatment achieved total control of symptoms in 5, was partially effective in another 3, and was ineffective in one patient. The probability of complete response after treatment with iron was higher in inattentive patients with ADHD (P=.02). CONCLUSIONS: Treatment with iron supplements can be an effective alternative for patients with ADHD and iron deficiency, especially the inattentive subtype.