Rhythms, or patterns in time, play a vital role in both speech and music. Proficiency in a number of rhythm skills has been linked to language ability, suggesting that certain rhythmic processes in music and language rely on overlapping resources. However, a poor understanding of how rhythm skills relate to one another has impeded progress in determining how language relies on rhythm processing. In particular, it is unknown whether all rhythm skills are linked together, forming a single broad rhythmic competence, or whether there are multiple dissociable rhythm skills. We hypothesized that beat tapping and rhythm memory/sequencing form two separate clusters of rhythm skills. This hypothesis was tested with a battery of two beat tapping and two rhythm memory tests. Here we show that tapping to a metronome and the ability to adjust to a changing tempo while tapping to a metronome are related skills. The ability to remember rhythms and to drum along to repeating rhythmic sequences are also related. However, we found no relationship between beat tapping skills and rhythm memory skills. Thus, beat tapping and rhythm memory are dissociable rhythmic aptitudes. This discovery may inform future research disambiguating how distinct rhythm competencies track with specific language functions.
Cochlear implant users show a profile of residual, yet poorly understood, musical abilities. An ability that has received little to no attention in this population is entrainment to a musical beat. We show for the first time that a heterogeneous group of cochlear implant users is able to find the beat and move their bodies in time to Latin Merengue music, especially when the music is presented in unpitched drum tones. These findings not only reveal a hidden capacity for feeling musical rhythm through the body in the deaf and hearing impaired population, but illuminate promising avenues for designing early childhood musical training that can engage implanted children in social musical activities with benefits potentially extending to non-musical domains.
- The Journal of Neuroscience: the official journal of the Society for Neuroscience
- Published over 7 years ago
Beat and meter perception, fundamental to the experience of music, refers to the perception of periodicities, lying within the frequency range of musical tempo, while listening to music. Here, we explored the spontaneous building of beat and meter, hypothesized to emerge from the selective entrainment of neuronal populations at beat and meter frequencies. The electroencephalogram (EEG) was recorded while human participants listened to rhythms consisting of short sounds alternating with silences, designed to induce a spontaneous perception of beat and meter. We found that the rhythmic stimuli elicited multiple steady-state evoked potentials (SS-EPs), observed in the EEG spectrum at frequencies corresponding to the rhythmic pattern envelope. Most importantly, the amplitudes of the SS-EPs obtained at beat and meter frequencies were selectively enhanced even though the acoustic energy was not necessarily predominant at these frequencies. Furthermore, accelerating the tempo of the rhythmic stimuli so as to move away from the range of frequencies at which beats are usually perceived impaired the selective enhancement of SS-EPs at these frequencies. The observation that beat- and meter-related SS-EPs are selectively enhanced at frequencies compatible with beat and meter perception indicates that these responses do not merely reflect the physical structure of the sound envelope but, instead, reflect the spontaneous emergence of an internal representation of beat, possibly through a mechanism of selective neuronal entrainment within a resonance frequency range. Taken together, these results suggest that musical rhythms constitute a unique context in which to gain insight into general mechanisms of entrainment, from the neuronal to the individual level.
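As an illustrative sketch of the frequency-tagging logic (not the authors' analysis pipeline), the amplitude spectrum of a periodic sound envelope can be read out at the pattern-related frequencies. The rhythm, sampling rate, and tone durations below are arbitrary assumptions:

```python
import numpy as np

fs = 100.0                      # sampling rate (Hz); illustrative
pattern_dur = 2.4               # duration of one rhythmic pattern (s)
n_rep = 50                      # number of pattern repetitions
onsets = [0.0, 0.4, 0.8, 1.6]   # sound onsets within the pattern; rest is silence

# Build one pattern as a binary envelope with 100 ms tones
one = np.zeros(int(pattern_dur * fs))
for t in onsets:
    i = int(t * fs)
    one[i:i + int(0.1 * fs)] = 1.0
env = np.tile(one, n_rep)       # whole stimulus: pattern repeated n_rep times

# Frequency tagging: amplitude spectrum of the stimulus envelope
spec = np.abs(np.fft.rfft(env)) / len(env)
freqs = np.fft.rfftfreq(len(env), 1.0 / fs)

# The pattern repetition frequency falls exactly on one FFT bin,
# because the stimulus contains an integer number of patterns
f_pattern = 1.0 / pattern_dur               # ~0.417 Hz
k = int(round(f_pattern * len(env) / fs))   # bin index of that frequency
print(f"amplitude at {freqs[k]:.3f} Hz: {spec[k]:.4f}")
```

In EEG data the same readout is applied to the recorded signal rather than the stimulus envelope; the comparison between the two spectra is what reveals the selective enhancement at beat and meter frequencies.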
The current study aims at characterizing the mechanisms that allow humans to entrain the mind and body to incoming rhythmic sensory inputs in real time. We addressed this unresolved issue by examining the relationship between covert neural processes and overt behavior in the context of musical rhythm. We measured temporal prediction abilities, sensorimotor synchronization accuracy and neural entrainment to auditory rhythms as captured using an EEG frequency-tagging approach. Importantly, movement synchronization accuracy with a rhythmic beat could be explained by the amplitude of neural activity selectively locked with the beat period when listening to the rhythmic inputs. Furthermore, stronger endogenous neural entrainment at the beat frequency was associated with superior temporal prediction abilities. Together, these results reveal a direct link between cortical and behavioral measures of rhythmic entrainment, thus providing evidence that frequency-tagged brain activity has functional relevance for beat perception and synchronization.
Mechanical oscillators are present in almost every electronic device. They mainly consist of a resonating element providing an oscillating output with a specific frequency. Their ability to maintain a stable frequency over a specified period of time is the most important parameter limiting their implementation. Historically, quartz crystals have almost exclusively been used as the resonating element, but micromechanical resonators are increasingly being considered to replace them. These resonators are easier to miniaturize and allow for monolithic integration with electronics. However, as their dimensions shrink to the microscale, most mechanical resonators exhibit nonlinearities that considerably degrade the frequency stability of the oscillator. Here we demonstrate that, by coupling two different vibrational modes through an internal resonance, it is possible to stabilize the oscillation frequency of nonlinear self-sustaining micromechanical resonators. Our findings provide a new strategy for engineering low-frequency-noise oscillators capitalizing on the intrinsic nonlinear phenomena of micromechanical resonators.
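Frequency stability of oscillators of this kind is conventionally quantified by the Allan deviation, σ_y(τ) = √(½⟨(ȳ_{k+1} − ȳ_k)²⟩), computed on fractional-frequency data averaged over gate time τ. The sketch below implements that definition on synthetic white frequency noise; it is not the measurement procedure used in the study:

```python
import numpy as np

def allan_deviation(y, tau_counts):
    """Non-overlapping Allan deviation of fractional-frequency data.

    y: fractional-frequency samples, one per base gate time.
    tau_counts: averaging factors m, so tau = m * base gate time.
    """
    y = np.asarray(y, dtype=float)
    out = {}
    for m in tau_counts:
        # Average y in non-overlapping blocks of m samples
        n = len(y) // m
        yb = y[:n * m].reshape(n, m).mean(axis=1)
        # sigma_y^2(tau) = 1/2 * mean of squared successive differences
        d = np.diff(yb)
        out[m] = np.sqrt(0.5 * np.mean(d ** 2))
    return out

# Example: white frequency noise averages down roughly as 1/sqrt(tau)
rng = np.random.default_rng(0)
adev = allan_deviation(rng.normal(0.0, 1e-6, 100_000), [1, 10, 100])
```

For a real oscillator, noise processes with different power laws (flicker, random walk) produce different slopes on a log-log Allan deviation plot, which is why the metric is the standard diagnostic for frequency stability.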
- IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control
- Published over 7 years ago
In diagnostic medicine, microbubbles are used as contrast agents to image blood flow and perfusion in large and small vessels. The small vessels (the capillaries) have diameters from a few hundred micrometers down to less than 10 μm. The effect of such microvessels surrounding the oscillating microbubbles is currently unknown, and is important for increased sensitivity in contrast diagnostics and manipulation of microbubbles for localized drug release. Here, oscillations of microbubbles in tubes with inner diameters of 25 μm and 160 μm are investigated using an ultra-high-speed camera at frame rates of ~12 million frames/s. A reduction of up to 50% in the amplitude of oscillation was observed for microbubbles in the smaller 25-μm tube, compared with those in a 160-μm tube. In the 25-μm tube, at 50 kPa, a 48% increase in the fraction of microbubbles that did not oscillate above the noise level of the system was observed, indicating increased oscillation damping. No difference was observed between the resonance frequency curves calculated for microbubbles in 25-μm and 160-μm tubes. Although previous investigators have shown the effect of microvessels on microbubble oscillation at high ultrasound pressures, the present study provides the first optical images of low-amplitude microbubble oscillations in small tubes.
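For orientation on the frequency scales involved, the linear free-field resonance of an uncoated gas bubble in liquid follows the Minnaert relation. The sketch below applies it to a hypothetical 2 μm radius bubble and deliberately ignores surface tension, the encapsulating shell, and the vessel-confinement effects that the study investigates:

```python
import math

def minnaert_frequency(radius_m, p0=101_325.0, gamma=1.4, rho=1000.0):
    """Linear resonance frequency (Hz) of a free gas bubble in liquid.

    Minnaert relation: f0 = (1 / (2*pi*R)) * sqrt(3 * gamma * p0 / rho)
    p0: ambient pressure (Pa), gamma: polytropic exponent of the gas,
    rho: liquid density (kg/m^3). Shell and confinement are neglected.
    """
    return math.sqrt(3.0 * gamma * p0 / rho) / (2.0 * math.pi * radius_m)

# A hypothetical 2 um radius bubble resonates in the low-MHz range,
# which is why such microbubbles respond strongly to diagnostic ultrasound
f0 = minnaert_frequency(2e-6)
```

The product f0·R is roughly constant (~3.3 m/s for air in water), so micron-scale bubbles land naturally in the MHz band used for contrast imaging.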
How animals precisely time behaviour over the lunar cycle is a decades-old mystery. Experiments on diverse species show this behaviour to be endogenous and under clock control, but the mechanism has remained elusive. We present new experimental and analytical techniques to test the hypotheses for the semilunar clock and show that the rhythm of foraging behaviour in the intertidal isopod, Scyphax ornatus, can be precisely shifted by manipulating the lengths of the light/dark and tidal cycles. Using light T-cycles (Tcd), the resultant semilunar beat period undergoes shifts from 14.79 days to 6.47 days under T = 23 hours (h), or to 23.29 days under T = 24.3 h. In tidal T-cycles (Tt), relative to the natural length Tt = 12.42 h, the semilunar rhythm is shifted to 24.5 days under Tt = 12.25 h and to 9.7 days under Tt = 12.65 h. The implications of this finding go beyond our model species and illustrate that longer period rhythms can be generated by shorter period clocks. Our novel analysis, in which periodic spline models are embedded within randomization tests, creates a new methodology for assessing long-period rhythms in chronobiology. Applications are far-reaching and extend to other species and rhythms, potentially including the human ovarian cycle.
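The reported period shifts are consistent with simple beat arithmetic between the light/dark cycle and the lunar-day cycle (twice the tidal period): two oscillators of periods T1 and T2 beat with period 1/|1/T1 − 1/T2|, and the semilunar rhythm corresponds to half of that beat. The sketch below reproduces the numbers quoted above; it is illustrative arithmetic only, not the authors' clock model:

```python
def semilunar_period_days(light_T_h, tidal_T_h):
    """Semilunar period (days) from the beat between the light/dark
    cycle and the lunar-day cycle (twice the tidal period).

    Beat period = 1 / |1/T1 - 1/T2|; the semilunar rhythm is half
    the beat. Inputs in hours.
    """
    lunar_day_h = 2.0 * tidal_T_h
    beat_h = 1.0 / abs(1.0 / light_T_h - 1.0 / lunar_day_h)
    return beat_h / (2.0 * 24.0)  # half the beat, converted to days

# Values reported in the study:
print(round(semilunar_period_days(24.0, 12.42), 2))   # natural: 14.79 days
print(round(semilunar_period_days(23.0, 12.42), 2))   # T = 23 h: 6.47 days
print(round(semilunar_period_days(24.3, 12.42), 2))   # T = 24.3 h: 23.29 days
print(round(semilunar_period_days(24.0, 12.25), 2))   # Tt = 12.25 h: 24.5 days
print(round(semilunar_period_days(24.0, 12.65), 2))   # Tt = 12.65 h: ~9.7 days
```

That the ~14.8-day rhythm emerges from the interference of two roughly daily cycles is exactly the sense in which "longer period rhythms can be generated by shorter period clocks."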
Entrainment of cortical rhythms to acoustic rhythms has been hypothesized to be the neural correlate of pulse and meter perception in music. Dynamic attending theory first proposed synchronization of endogenous perceptual rhythms nearly 40 years ago, but only recently has the pivotal role of neural synchrony been demonstrated. Significant progress has since been made in understanding the role of neural oscillations and the neural structures that support synchronized responses to musical rhythm. Synchronized neural activity has been observed in auditory and motor networks, and has been linked with attentional allocation and movement coordination. Here we describe a neurodynamic model that shows how self-organization of oscillations in interacting sensory and motor networks could be responsible for the formation of the pulse percept in complex rhythms. In a pulse synchronization study, we test the model’s key prediction that pulse can be perceived at a frequency for which no spectral energy is present in the amplitude envelope of the acoustic rhythm. The result shows that participants perceive the pulse at the theoretically predicted frequency. This model is one of the few consistent with neurophysiological evidence on the role of neural oscillation, and it explains a phenomenon that other computational models fail to explain. Because it is based on a canonical model, the predictions hold for an entire family of dynamical systems, not only a specific one. Thus, this model provides a theoretical link between oscillatory neurodynamics and the induction of pulse and meter in musical rhythm.
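A minimal flavor of such an oscillator model can be sketched with a single Stuart-Landau unit (the normal form of a Hopf bifurcation) driven by a rhythmic pulse train: without input the oscillation decays, while rhythmic input sustains an entrained oscillation at the stimulus rate. All parameter values below are arbitrary assumptions, and the published model involves networks of such canonical units with richer nonlinearities:

```python
import numpy as np

def simulate(alpha=-1.0, beta=-10.0, f0=2.0, drive=5.0, T=10.0, dt=1e-3):
    """Euler integration of a driven Stuart-Landau oscillator:
    dz/dt = z * (alpha + i*2*pi*f0 + beta*|z|^2) + s(t),
    where s(t) is a pulse train at the oscillator's natural frequency.
    Returns |z(t)| at every time step. Parameters are illustrative.
    """
    n = int(T / dt)
    z = 0.1 + 0j                       # small initial state
    omega = 2.0 * np.pi * f0
    amps = np.empty(n)
    for k in range(n):
        t = k * dt
        # rhythmic input: brief pulses (5% duty cycle) at frequency f0
        s = drive if (t * f0) % 1.0 < 0.05 else 0.0
        dz = z * (alpha + 1j * omega + beta * abs(z) ** 2) + s
        z += dt * dz
        amps[k] = abs(z)
    return amps

forced = simulate(drive=5.0)       # rhythmic input sustains the oscillation
unforced = simulate(drive=0.0)     # without input the oscillation dies out
```

Even this single damped unit shows the basic entrainment behavior; the paper's key prediction, a pulse percept at a frequency absent from the stimulus spectrum, requires the nonlinear mode interactions of the full network model.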
The thermoelectric voltage developed across an atomic metal junction (i.e., a nanostructure in which one or a few atoms connect two metal electrodes) in response to a temperature difference between the electrodes results from the quantum interference of electrons that pass through the junction multiple times after being scattered by the surrounding defects. Here we report successfully tuning this quantum interference, and thus controlling the magnitude and sign of the thermoelectric voltage, by applying a mechanical force that deforms the junction. The observed switching of the thermoelectric voltage is reversible and can be cycled many times. Our ab initio and semi-empirical calculations elucidate the detailed mechanism by which the quantum interference is tuned. We show that the applied strain alters the quantum phases of electrons passing through the narrowest part of the junction and hence modifies the electronic quantum interference in the device. Tuning the quantum interference causes the energies of electronic transport resonances to shift, which affects the thermoelectric voltage. These experimental and theoretical studies reveal that Au atomic junctions can be made to exhibit both positive and negative thermoelectric voltages on demand, and demonstrate the importance and tunability of the quantum interference effect in atomic-scale metal nanostructures.
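The sign-switching mechanism can be caricatured with the Mott/Landauer relation, in which the thermopower is proportional to the energy slope of the transmission at the Fermi level, S ≈ −(π²k_B²T/3e)·(d ln 𝒯/dE)|_{E_F}. Shifting a transport resonance from above to below E_F flips the slope and hence the sign of S. All numbers below (Lorentzian line shape, resonance positions, width, temperature) are hypothetical, not values from the study:

```python
import math

KB = 8.617e-5      # Boltzmann constant (eV/K)

def mott_thermopower(e_res_eV, gamma_eV=0.1, temp_K=300.0, e_f=0.0):
    """Mott-approximation thermopower (uV/K) for a single Lorentzian
    transmission resonance T(E) = Gamma^2 / ((E - E_res)^2 + Gamma^2).

    S = -(pi^2 * kB^2 * T / 3e) * d ln T(E)/dE evaluated at E_F.
    Illustrative numbers only.
    """
    # Energy slope of ln T(E) at the Fermi level for the Lorentzian
    dlnT = -2.0 * (e_f - e_res_eV) / ((e_f - e_res_eV) ** 2 + gamma_eV ** 2)
    # kB in eV/K makes the bracket come out directly in V/K; report uV/K
    return -(math.pi ** 2 / 3.0) * KB ** 2 * temp_K * dlnT * 1e6

# Moving the resonance from above to below E_F flips the sign of S
s_above = mott_thermopower(+0.2)   # resonance 0.2 eV above E_F -> S < 0
s_below = mott_thermopower(-0.2)   # resonance 0.2 eV below E_F -> S > 0
```

In the experiment, strain plays the role of sliding the resonance energy, which is why a purely mechanical knob can reverse the sign of the thermoelectric voltage.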
Perception of a regular beat in music is inferred from different types of accents. For example, increases in loudness cause intensity accents, and the grouping of time intervals in a rhythm creates temporal accents. Accents are expected to occur on the beat: when accents are "missing" on the beat, the beat is more difficult to find. However, it is unclear whether accents occurring off the beat alter beat perception similarly to missing accents on the beat. Moreover, no one has examined whether intensity accents influence beat perception more or less strongly than temporal accents, nor how musical expertise affects sensitivity to each type of accent. In two experiments, we obtained ratings of difficulty in finding the beat in rhythms with either temporal or intensity accents, which varied in the number of accents on the beat as well as the number of accents off the beat. In both experiments, the occurrence of accents on the beat facilitated beat detection more in musical experts than in musical novices. In addition, the number of accents on the beat affected beat finding more in rhythms with temporal accents than in rhythms with intensity accents. The effect of accents off the beat was much weaker than the effect of accents on the beat and appeared to depend on musical expertise, as well as on the number of accents on the beat: when many accents on the beat are missing, beat perception is already quite difficult, and adding accents off the beat may not impair it further. Overall, the different types of accents were processed qualitatively differently, depending on musical expertise. Therefore, these findings indicate the importance of designing ecologically valid stimuli when testing beat perception in musical novices, who may need different types of accent information than musical experts to be able to find a beat. Furthermore, our findings stress the importance of carefully designing rhythms for social and clinical applications of beat perception, as not all listeners treat all rhythms alike.