Despite partial success, communication has remained impossible for persons suffering from complete motor paralysis but intact cognitive and emotional processing, a state called complete locked-in state (CLIS). Based on a motor learning theoretical context and on the failure of neuroelectric brain-computer interface (BCI) communication attempts in CLIS, we here report BCI communication using functional near-infrared spectroscopy (fNIRS) and an implicit attentional processing procedure. Four patients suffering from advanced amyotrophic lateral sclerosis (ALS), two of them in permanent CLIS and two entering CLIS without reliable means of communication, learned to answer personal questions with known answers and open questions, all requiring a “yes” or “no” thought, using frontocentral oxygenation changes measured with fNIRS. Three patients completed more than 46 sessions spread over several weeks, and one patient (patient W) completed 20 sessions. Online fNIRS classification of personal questions with known answers and of open questions using a linear support vector machine (SVM) resulted in an above-chance-level correct response rate of over 70%. Electroencephalographic oscillations and electrooculographic signals did not exceed the chance-level threshold for correct communication, despite occasional differences between the physiological signals representing a “yes” or “no” response. However, electroencephalogram (EEG) changes in the theta-frequency band correlated with inferior communication performance, probably because of decreased vigilance and attention. If replicated with ALS patients in CLIS, these positive results could indicate the first step towards abolition of complete locked-in states, at least for ALS.
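The online classification step described in this abstract can be illustrated with a minimal sketch: a linear SVM separating “yes” from “no” trials using per-channel oxygenation features. The channel count, the feature definition, and the synthetic data below are illustrative assumptions, not the study's actual pipeline:

```python
# Minimal sketch of "yes"/"no" classification from fNIRS features with a
# linear SVM. Channel count, features, and data are illustrative assumptions.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_channels = 200, 8          # assumed: 8 frontocentral fNIRS channels
labels = rng.integers(0, 2, n_trials)  # 1 = "yes" thought, 0 = "no" thought

# Assumed feature: mean oxygenation change per channel over the answer window,
# with a class-dependent offset standing in for the real hemodynamic effect.
features = rng.normal(size=(n_trials, n_channels))
features[labels == 1] += 0.8

clf = LinearSVC(C=1.0)
accuracy = cross_val_score(clf, features, labels, cv=5).mean()
print(f"cross-validated accuracy: {accuracy:.2f}")  # well above the 0.5 chance level
```

With a real patient, the same classifier would be trained on questions with known answers and then applied online to open questions, as the abstract describes.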
Stories of g-tummo meditators mysteriously able to dry wet sheets wrapped around their naked bodies during a frigid Himalayan ceremony have intrigued scholars and laypersons alike for a century. Study 1 was conducted in remote monasteries of eastern Tibet with expert meditators performing g-tummo practices while their axillary temperature and electroencephalographic (EEG) activity were measured. Study 2 was conducted with Western participants (a non-meditator control group) instructed to use the somatic component of the g-tummo practice (vase breathing) without utilization of meditative visualization. Reliable increases in axillary temperature from normal to slight or moderate fever zone (up to 38.3°C) were observed among meditators only during the Forceful Breath type of g-tummo meditation, accompanied by increases in alpha, beta, and gamma power. The magnitude of the temperature increases significantly correlated with the increases in alpha power during Forceful Breath meditation. The findings indicate two factors affecting the temperature increase: the somatic component (vase breathing), which causes thermogenesis, and the neurocognitive component (meditative visualization), which aids in sustaining temperature increases for longer periods. Without meditative visualization, both meditators and non-meditators were capable of using the Forceful Breath vase breathing only for a limited time, resulting in limited temperature increases within the range of normal body temperature. Overall, the results suggest that specific aspects of the g-tummo technique might help non-meditators learn how to regulate their body temperature, which has implications for improving health and regulating cognitive performance.
Flexible, wearable sensing devices can yield important information about the underlying physiology of a human subject for applications in real-time health and fitness monitoring. Despite significant progress in the fabrication of flexible biosensors that naturally comply with the epidermis, most designs measure only a small number of physical or electrophysiological parameters, and neglect the rich chemical information available from biomarkers. Here, we introduce a skin-worn wearable hybrid sensing system that offers simultaneous real-time monitoring of a biochemical (lactate) and an electrophysiological signal (electrocardiogram), for more comprehensive fitness monitoring than from physical or electrophysiological sensors alone. The two sensing modalities, comprising a three-electrode amperometric lactate biosensor and a bipolar electrocardiogram sensor, are co-fabricated on a flexible substrate and mounted on the skin. Human experiments reveal that physiochemistry and electrophysiology can be measured simultaneously with negligible cross-talk, enabling a new class of hybrid sensing devices.
- Proceedings of the National Academy of Sciences of the United States of America
Despite the fact that midday naps are characteristic of early childhood, very little is understood about the structure and function of these sleep bouts. Given that sleep benefits memory in young adults, it is possible that naps serve a similar function for young children. However, children transition from biphasic to monophasic sleep patterns in early childhood, eliminating the nap from their daily sleep schedule. As such, naps may contain mostly light sleep stages and serve little function for learning and memory during this transitional age. Lacking scientific understanding of the function of naps in early childhood, policy makers may eliminate preschool classroom nap opportunities due to increasing curriculum demands. Here we show evidence that classroom naps support learning in preschool children by enhancing memories acquired earlier in the day compared with equivalent intervals spent awake. This nap benefit is greatest for children who nap habitually, regardless of age. Performance losses when nap-deprived are not recovered during subsequent overnight sleep. Physiological recordings of naps support a role of sleep spindles in memory performance. These results suggest that distributed sleep is critical in early learning; when short-term memory stores are limited, memory consolidation must take place frequently.
We present, to our knowledge, the first demonstration that a non-invasive brain-to-brain interface (BBI) can be used to allow one human to guess what is on the mind of another human through an interactive question-and-answering paradigm similar to the “20 Questions” game. As in previous non-invasive BBI studies in humans, our interface uses electroencephalography (EEG) to detect specific patterns of brain activity from one participant (the “respondent”), and transcranial magnetic stimulation (TMS) to deliver functionally-relevant information to the brain of a second participant (the “inquirer”). Our results extend previous BBI research by (1) using stimulation of the visual cortex to convey visual stimuli that are privately experienced and consciously perceived by the inquirer; (2) exploiting real-time rather than off-line communication of information from one brain to another; and (3) employing an interactive task, in which the inquirer and respondent must exchange information bi-directionally to collaboratively solve the task. The results demonstrate that using the BBI, ten participants (five inquirer-respondent pairs) can successfully identify a “mystery item” using a true/false question-answering protocol similar to the “20 Questions” game, with high levels of accuracy that are significantly greater than a control condition in which participants were connected through a sham BBI.
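The interactive question-and-answering loop can be sketched as a toy simulation: the inquirer halves a candidate set with true/false questions answered “yes” or “no”. In the study, answers were decoded from the respondent's EEG and delivered to the inquirer via TMS-evoked phosphenes; here that channel is simulated, and the item list and questioning strategy are illustrative assumptions:

```python
# Toy sketch of the "20 Questions"-style protocol: binary search over a
# candidate set driven by simulated "yes"/"no" answers. The item list and
# questioning strategy are illustrative assumptions, not the study's stimuli.
mystery_item = "shark"
candidates = ["shark", "dolphin", "eagle", "sparrow", "cobra", "turtle", "frog", "ant"]

def respondent_answers(question_set):
    """Simulated respondent: 'yes' iff the mystery item is in the queried set."""
    return mystery_item in question_set

while len(candidates) > 1:
    half = candidates[: len(candidates) // 2]   # "Is it one of these?"
    if respondent_answers(half):
        candidates = half
    else:
        candidates = candidates[len(candidates) // 2 :]

print(f"inquirer's guess: {candidates[0]}")
```

Each iteration corresponds to one true/false exchange between inquirer and respondent, so eight items require at most three exchanges.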
To further test and explore the hypothesis that synchronous oscillatory brain activity supports interpersonally coordinated behavior during dyadic music performance, we simultaneously recorded the electroencephalogram (EEG) from the brains of each of 12 guitar duets repeatedly playing a modified Rondo in two voices by C.G. Scheidler. Indicators of phase locking and of within-brain and between-brain phase coherence were obtained from complex time-frequency signals based on the Gabor transform. Analyses were restricted to the delta (1-4 Hz) and theta (4-8 Hz) frequency bands. We found that phase locking as well as within-brain and between-brain phase-coherence connection strengths were enhanced at frontal and central electrodes during periods that put particularly high demands on musical coordination. Phase locking was modulated in relation to the experimentally assigned musical roles of leader and follower, corroborating the functional significance of synchronous oscillations in dyadic music performance. Graph theory analyses revealed within-brain and hyperbrain networks with small-worldness properties that were enhanced during musical coordination periods, and community structures encompassing electrodes from both brains (hyperbrain modules). We conclude that brain mechanisms indexed by phase locking, phase coherence, and structural properties of within-brain and hyperbrain networks support interpersonal action coordination (IAC).
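The phase-based measures above can be sketched in a few lines: band-limited analytic signals from a Gabor (complex Morlet) wavelet, then the phase-locking value (PLV) between two channels. The sampling rate, theta frequency, wavelet width, and the synthetic “coupled” signals are illustrative assumptions, not the study's parameters:

```python
# Sketch of a Gabor-transform-based phase-locking value (PLV) between two
# channels sharing a theta-band component. All parameters and the synthetic
# data are illustrative assumptions.
import numpy as np

fs = 250.0                      # assumed sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)    # 10 s of data
rng = np.random.default_rng(1)

# Two signals sharing a 6 Hz (theta) component with a fixed phase lag.
theta = 2 * np.pi * 6 * t
x = np.cos(theta) + 0.5 * rng.normal(size=t.size)
y = np.cos(theta - np.pi / 4) + 0.5 * rng.normal(size=t.size)

def gabor_phase(sig, f0, fs, cycles=5):
    """Instantaneous phase at f0 via convolution with a complex Gabor wavelet."""
    dur = cycles / f0
    wt = np.arange(-dur, dur, 1 / fs)
    sigma = cycles / (2 * np.pi * f0)
    wavelet = np.exp(2j * np.pi * f0 * wt) * np.exp(-wt**2 / (2 * sigma**2))
    analytic = np.convolve(sig, wavelet, mode="same")
    return np.angle(analytic)

# PLV: magnitude of the mean unit phasor of the phase difference (0..1).
dphi = gabor_phase(x, 6.0, fs) - gabor_phase(y, 6.0, fs)
plv = np.abs(np.mean(np.exp(1j * dphi)))
print(f"theta-band PLV: {plv:.2f}")
```

Within-brain coherence uses channel pairs from one recording; between-brain (“hyperbrain”) coherence applies the same measure to channel pairs drawn from two players' recordings.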
In recent years, a few methods have been developed to translate human EEG into music. In 2009 (PLoS ONE 4: e5915), we developed a method to generate scale-free brainwave music: the amplitude of the EEG is translated to musical pitch according to the power law that both obey, the period of an EEG waveform is translated directly into the duration of a note, and the logarithm of the average power change of the EEG is translated into musical intensity according to Fechner’s law. In this work, we propose to use a simultaneously recorded fMRI signal to control the intensity of the EEG music, so that an EEG-fMRI piece is generated by combining two different, simultaneous brain signals. Most importantly, this approach also realizes a power law for musical intensity, since the fMRI signal follows one. The EEG-fMRI music thus takes a step forward in reflecting the scale-free physiological processes of the brain.
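The mapping rules summarized above (amplitude to pitch via a power law, period to note duration, log power to intensity via Fechner's law) can be sketched as a toy function. The exponent, MIDI ranges, and scaling constants are illustrative assumptions, not the published parameters:

```python
# Toy sketch of the EEG-to-music mapping: amplitude -> pitch (power law),
# period -> duration, log power -> intensity (Fechner's law). All constants
# are illustrative assumptions.
import numpy as np

def eeg_wave_to_note(amplitude_uv, period_s, avg_power_uv2,
                     pitch_exponent=-1.0, midi_lo=36, midi_hi=96):
    # Power-law amplitude -> pitch: larger-amplitude waves map to lower
    # pitches; result is clipped into an assumed MIDI range.
    pitch = midi_hi + pitch_exponent * 12 * np.log2(max(amplitude_uv, 1e-6))
    pitch = int(np.clip(pitch, midi_lo, midi_hi))

    duration_s = period_s  # the waveform's period maps directly to note duration

    # Fechner's law: perceived intensity grows with the logarithm of the power,
    # here scaled into the MIDI velocity range 1..127.
    velocity = int(np.clip(40 + 20 * np.log10(max(avg_power_uv2, 1e-6)), 1, 127))
    return pitch, duration_s, velocity

note = eeg_wave_to_note(amplitude_uv=30.0, period_s=0.125, avg_power_uv2=250.0)
print(note)
```

In the combined approach described above, the velocity term would instead be driven by the simultaneously recorded fMRI signal.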
Newborn infants display strong nociceptive behavior in response to tissue damaging stimuli, and this is accompanied by nociceptive activity generated in subcortical and cortical areas of the brain [1, 2]. In the absence of verbal report, these nociceptive responses are used as measures of pain sensation in newborn humans, as they are in animals [3, 4]. However, many infants are raised in a physiologically stressful environment, and little is known about the effect of background levels of stress upon their pain responses. In adults, acute physiological stress causes hyperalgesia [5-7], and increased background stress increases pain [8-10], but these data cannot necessarily be extrapolated to infants. Here we have simultaneously measured nociceptive behavior, brain activity, and levels of physiological stress in a sample of 56 newborn human infants aged 36-42 weeks. Salivary cortisol (hypothalamic pituitary axis), heart rate variability (sympathetic adrenal medullary system), EEG event-related potentials (nociceptive cortical activity), and facial expression (behavior) were acquired in individual infants following a clinically required heel lance. We show that infants with higher levels of stress exhibit larger amplitude cortical nociceptive responses, but this is not reflected in their behavior. Furthermore, while nociceptive behavior and cortical activity are normally correlated, this relationship is disrupted in infants with high levels of physiological stress. Brain activity evoked by noxious stimulation is therefore enhanced by stress, but this cannot be deduced from observation of pain behavior. This may be important in the prevention of adverse effects of early repetitive pain on brain development.
The human brain has evolved for group living. Yet we know so little about how it supports dynamic group interactions that the study of real-world social exchanges has been dubbed the “dark matter of social neuroscience”. Recently, various studies have begun to approach this question by comparing brain responses of multiple individuals during a variety of (semi-naturalistic) tasks [3-15]. These experiments reveal how stimulus properties, individual differences, and contextual factors may underpin similarities and differences in neural activity across people. However, most studies to date suffer from various limitations: they often lack direct face-to-face interaction between participants, are typically limited to dyads, do not investigate social dynamics across time, and, crucially, they rarely study social behavior under naturalistic circumstances. Here we extend such experimentation drastically, beyond dyads and beyond laboratory walls, to identify neural markers of group engagement during dynamic real-world group interactions. We used portable electroencephalogram (EEG) to simultaneously record brain activity from a class of 12 high school students over the course of a semester (11 classes) during regular classroom activities (Figures 1A-1C; Supplemental Experimental Procedures, section S1). A novel analysis technique to assess group-based neural coherence demonstrates that the extent to which brain activity is synchronized across students predicts both student class engagement and social dynamics. This suggests that brain-to-brain synchrony is a possible neural marker for dynamic social interactions, likely driven by shared attention mechanisms. This study validates a promising new method to investigate the neuroscience of group interactions in ecologically natural settings.
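As a simple stand-in for the group-based coherence measure described above, one can average pairwise correlations between students' signal time courses. The synthetic data, group size, and metric choice below are illustrative assumptions rather than the study's actual analysis:

```python
# Minimal stand-in for group-level brain-to-brain synchrony: the mean pairwise
# Pearson correlation across students' signals. Data and metric choice are
# illustrative assumptions, not the study's coherence analysis.
import numpy as np

rng = np.random.default_rng(2)
n_students, n_samples = 12, 1000

# Synthetic signals sharing a common "class engagement" component.
common = rng.normal(size=n_samples)
signals = 0.6 * common + rng.normal(size=(n_students, n_samples))

def group_synchrony(signals):
    """Mean correlation over all unique student pairs (higher = more synchrony)."""
    corr = np.corrcoef(signals)
    iu = np.triu_indices_from(corr, k=1)   # off-diagonal upper triangle
    return corr[iu].mean()

sync = group_synchrony(signals)
print(f"group synchrony: {sync:.2f}")
```

Computed per class session, such a score could then be related to engagement ratings, as the abstract's coherence metric was.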
Little is known about the spread of emotions beyond dyads, yet it matters for explaining the emergence of crowd behaviors. Here, we experimentally addressed whether emotional homogeneity within a crowd might result from a cascade of local emotional transmissions, in which the perception of another’s emotional expression produces, in the observer’s face and body, sufficient information to allow for the transmission of the emotion to a third party. We reproduced a minimal element of a crowd situation and recorded the facial electromyographic activity and the skin conductance response of an individual C observing the face of an individual B watching an individual A displaying either joy or fear full-body expressions. Critically, individual B did not know that she was being watched. We show that the emotions of joy and fear displayed by A were spontaneously transmitted to C through B, even when the emotional information available in B’s face could not be explicitly recognized. These findings demonstrate that one is tuned to react to others' emotional signals and to unintentionally produce subtle but sufficient emotional cues to induce emotional states in others. This phenomenon could be the mark of a spontaneous cooperative behavior whose function is to communicate survival-value information to conspecifics.