Concept: Rigid body
Ants can navigate over long distances between their nest and food sites using visual cues [1, 2]. Recent studies show that this capacity is undiminished when walking backward while dragging a heavy food item [3-5]. This challenges the idea that ants use egocentric visual memories of the scene for guidance [1, 2, 6]. Can ants use their visual memories of the terrestrial cues when going backward? Our results suggest that ants do not adjust their direction of travel based on the perceived scene while going backward. Instead, they maintain a straight direction using their celestial compass. This direction can be dictated by their path integrator but can also be set using terrestrial visual cues after a forward peek. If the food item is too heavy to enable body rotations, ants moving backward drop their food on occasion, rotate and walk a few steps forward, return to the food, and drag it backward in a now-corrected direction defined by terrestrial cues. Furthermore, we show that ants can maintain their direction of travel independently of their body orientation. It thus appears that egocentric retinal alignment is required for visual scene recognition, but ants can translate this acquired directional information into a holonomic frame of reference, which enables them to decouple their travel direction from their body orientation and hence navigate backward. This reveals substantial flexibility and communication between different types of navigational information: from terrestrial to celestial cues and from egocentric to holonomic directional memories.
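The path integrator mentioned above can be illustrated with a minimal sketch: each step's displacement, taken from a compass heading and an odometric distance, is summed into a global vector whose negation points home. This is an illustrative textbook model, not the ants' actual neural mechanism; the function name and step format are assumptions.

```python
import math

def integrate_path(steps):
    """Accumulate a home vector from (heading_deg, distance) steps.

    Illustrative path-integration model: each step's displacement
    (celestial-compass heading plus odometric distance) is summed;
    the home vector is the negation of the total displacement.
    """
    x = y = 0.0
    for heading_deg, dist in steps:
        x += dist * math.cos(math.radians(heading_deg))
        y += dist * math.sin(math.radians(heading_deg))
    # Heading back to the start, and the distance to cover
    home_heading = math.degrees(math.atan2(-y, -x)) % 360
    home_dist = math.hypot(x, y)
    return home_heading, home_dist

# Outbound path: 10 m east (0 deg), then 10 m north (90 deg).
heading, dist = integrate_path([(0, 10), (90, 10)])
# The home vector points back southwest (225 deg), ~14.14 m away.
```

Note that the home vector depends only on accumulated displacement, not on body orientation, which is consistent with a direction of travel that can be held while walking backward.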
The salient feature of liquid crystal elastomers and networks is the strong coupling between orientational order and mechanical strain. Orientational order can be changed by a wide variety of stimuli, including the presence of moisture. Changes in the orientation of constituents give rise to stresses and strains, which result in changes in sample shape. We have utilized this effect to build a soft cellulose-based motor driven by humidity. The motor consists of a circular loop of cellulose film, which passes over two wheels. When humid air is present near one of the wheels on one side of the film, with drier air elsewhere, rotation of the wheels results. As the wheels rotate, the humid film dries. The motor runs as long as the difference in humidity is maintained. Our cellulose liquid crystal motor thus extracts mechanical work from a difference in humidity.
Essential to spatial orientation in the natural environment is a dynamic representation of direction and distance to objects. Despite the importance of 3D spatial localization to parse objects in the environment and to guide movement, most neurophysiological investigations of sensory mapping have been limited to studies of restrained subjects, tested with 2D, artificial stimuli. Here, we show for the first time that sensory neurons in the midbrain superior colliculus (SC) of the free-flying echolocating bat encode 3D egocentric space, and that the bat’s inspection of objects in the physical environment sharpens tuning of single neurons, and shifts peak responses to represent closer distances. These findings emerged from wireless neural recordings in free-flying bats, in combination with an echo model that computes the animal’s instantaneous stimulus space. Our research reveals dynamic 3D space coding in a freely moving mammal engaged in a real-world navigation task.
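The echo model referred to above rests, at minimum, on the standard sonar ranging relation: an echo travels out and back, so target range is half the round-trip delay times the speed of sound. The sketch below shows only this basic relation; the function name and the single-speed assumption are mine, not the paper's full model.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at ~20 C (assumed constant)

def echo_delay_to_range(delay_s):
    """Target range from echo delay.

    The sonar pulse travels to the target and back, so
    range = c * delay / 2.
    """
    return SPEED_OF_SOUND * delay_s / 2

# A 5 ms echo delay corresponds to a target roughly 0.86 m away.
r = echo_delay_to_range(0.005)
```

Combining such ranges with the bat's instantaneous position and head direction is what yields an egocentric 3D stimulus space.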
Sensorimotor control in vertebrates relies on internal models. When extending an arm to reach for an object, the brain uses predictive models of both limb dynamics and target properties. Whether invertebrates use such models remains unclear. Here we examine to what extent prey interception by dragonflies (Plathemis lydia), a behaviour analogous to targeted reaching, requires internal models. By simultaneously tracking the position and orientation of a dragonfly’s head and body during flight, we provide evidence that interception steering is driven by forward and inverse models of dragonfly body dynamics and by models of prey motion. Predictive rotations of the dragonfly’s head continuously track the prey’s angular position. The head-body angles established by prey tracking appear to guide systematic rotations of the dragonfly’s body to align it with the prey’s flight path. Model-driven control thus underlies the bulk of interception steering manoeuvres, while vision is used for reactions to unexpected prey movements. These findings illuminate the computational sophistication with which insects construct behaviour.
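The predictive head tracking described above can be caricatured by the simplest possible internal model of prey motion: extrapolating the prey's angular position one sensorimotor latency ahead under a constant-angular-velocity assumption. This is a hypothetical sketch for intuition, not the dragonfly's actual model.

```python
def predict_prey_angle(angle_deg, angular_velocity_deg_s, latency_s):
    """Constant-velocity internal-model sketch (hypothetical).

    Extrapolates the prey's angular position one sensorimotor latency
    ahead, so the head rotates toward where the prey will be rather
    than where it was last seen.
    """
    return angle_deg + angular_velocity_deg_s * latency_s

# Prey at 30 deg drifting at 200 deg/s; with an assumed 30 ms latency
# the head is aimed at the predicted, not the seen, position.
predicted = predict_prey_angle(30.0, 200.0, 0.03)
```

Deviations between such predictions and the observed prey position would be exactly the "unexpected prey movements" that vision is left to correct.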
- Proceedings of the National Academy of Sciences of the United States of America
- Published about 1 year ago
Natural composites exhibit exceptional mechanical performance that often arises from complex fiber arrangements within continuous matrices. Inspired by these natural systems, we developed a rotational 3D printing method that enables spatially controlled orientation of short fibers in polymer matrices solely by varying the nozzle rotation speed relative to the printing speed. Using this method, we fabricated carbon fiber-epoxy composites composed of volume elements (voxels) with programmably defined fiber arrangements, including adjacent regions with orthogonally and helically oriented fibers that lead to nonuniform strain and failure as well as those with purely helical fiber orientations akin to natural composites that exhibit enhanced damage tolerance. Our approach broadens the design, microstructural complexity, and performance space for fiber-reinforced composites through site-specific optimization of their fiber orientation, strain, failure, and damage tolerance.
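The dependence of fiber orientation on the ratio of nozzle rotation speed to printing speed can be sketched with a simple kinematic estimate: a fiber at radial position r in the rotating nozzle acquires tangential speed ωr while translating at the print speed v, giving a helix angle θ = atan(ωr / v). The formula and parameter values below are an illustrative simplification, not the paper's calibration.

```python
import math

def helix_angle_deg(omega_rad_s, radius_m, print_speed_m_s):
    """Simplified kinematic estimate of the fiber helix angle.

    Assumes a fiber at radius `radius_m` in the nozzle moves tangentially
    at omega * r while the nozzle translates at the print speed, so
    theta = atan(omega * r / v).
    """
    return math.degrees(math.atan2(omega_rad_s * radius_m, print_speed_m_s))

# Doubling the rotation speed at a fixed print speed steepens the helix:
a1 = helix_angle_deg(10.0, 0.0005, 0.01)  # ~26.6 deg
a2 = helix_angle_deg(20.0, 0.0005, 0.01)  # 45 deg
```

This captures why varying only the rotation-to-translation speed ratio is enough to program the helix angle voxel by voxel.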
A modular molecular kit for the preparation of crystalline molecular rotors was devised from a set of stators and rotators to gain simple access to a large number of structures with different dynamic performance and physical properties. In this paper, we have accomplished this with crystalline molecular rotors self-assembled by halogen bonding of 1,4-diazabicyclo[2.2.2]octane (DABCO), acting as a rotator, and a set of five fluorine-substituted iodobenzenes that take the role of the stator. Using variable-temperature ¹H T₁ spin-lattice relaxation measurements, we have shown that all structures display ultrafast Brownian rotation with activation energies ranging from 2.4 to 4.9 kcal/mol and pre-exponential factors on the order of 1-9 × 10¹² s⁻¹. Lineshape analysis of quadrupolar echo ²H NMR measurements in selected examples indicated rotational trajectories consistent with the 3-fold or 6-fold symmetric potential of the rotator.
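The activation energies and pre-exponential factors quoted above translate into rotational jump rates through the standard Arrhenius relation k = A·exp(-Ea/RT). A quick check with values from the stated ranges (the specific A and Ea chosen below are illustrative picks, not a particular compound's fit):

```python
import math

R_KCAL = 1.987e-3  # molar gas constant in kcal/(mol*K)

def rotation_rate(pre_exp_s, ea_kcal_mol, temp_k):
    """Arrhenius estimate of the rotational jump rate: k = A * exp(-Ea/RT)."""
    return pre_exp_s * math.exp(-ea_kcal_mol / (R_KCAL * temp_k))

# With A = 5e12 s^-1 and Ea = 2.4 kcal/mol at 300 K, the rate stays in
# the tens-of-GHz regime, i.e. ultrafast Brownian rotation.
k = rotation_rate(5e12, 2.4, 300.0)
```

Even at the top of the barrier range (4.9 kcal/mol) the exponential penalty at room temperature is only a few orders of magnitude, which is why all the structures remain fast rotors.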
Gaze stabilization is an almost ubiquitous animal behaviour, one that is required to see the world clearly and without blur. Stomatopods, however, only occasionally fix their eyes on scenes or objects of interest. Almost uniquely among animals, they explore their visual environment with a series of pitch, yaw and torsional (roll) rotations of their eyes, where each eye may also move largely independently of the other. In this work, we demonstrate that the torsional rotations are used to actively enhance their ability to see the polarization of light. Both Gonodactylus smithii and Odontodactylus scyllarus rotate their eyes to align particular photoreceptors relative to the angle of polarization of a linearly polarized visual stimulus, thereby maximizing the polarization contrast between an object of interest and its background. This is the first documented example of any animal displaying dynamic polarization vision, in which the polarization information is actively maximized through rotational eye movements.
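Why torsional alignment boosts polarization contrast follows from Malus's law: a linear-polarization-sensitive receptor's signal scales with cos² of the angle between its e-vector axis and the stimulus angle of polarization (AoP). The sketch below shows only this standard physics; the function and angles are illustrative.

```python
import math

def receptor_signal(receptor_angle_deg, aop_deg):
    """Malus's-law sketch of a polarization-sensitive photoreceptor.

    Signal is proportional to cos^2 of the angle between the receptor's
    e-vector axis and the stimulus angle of polarization (AoP).
    """
    delta = math.radians(receptor_angle_deg - aop_deg)
    return math.cos(delta) ** 2

# Rolling the eye so a receptor's axis matches the AoP maximizes its
# signal, and hence the contrast against an unpolarized background:
aligned = receptor_signal(45.0, 45.0)     # maximal response, 1.0
misaligned = receptor_signal(90.0, 45.0)  # half response, 0.5
```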
The magnetoelastic effect, the change of magnetic properties caused by the elastic deformation of a magnetic material, has been proposed as an alternative approach to magnetic fields for the low-power control of magnetization states of nanoelements, since it avoids charge currents, which entail ohmic losses. Here, we have studied the effect of dynamic strain accompanying a surface acoustic wave on magnetic nanostructures in thermal equilibrium. We have developed an experimental technique based on stroboscopic X-ray microscopy that provides a pathway to the quantitative study of strain waves and magnetization at the nanoscale. We have simultaneously imaged the evolution of both strain and magnetization dynamics of nanostructures at the picosecond time scale and found that magnetization modes have a delayed response to the strain modes, adjustable by the magnetic domain configuration. Our results provide fundamental insight into magnetoelastic coupling in nanostructures and have implications for the design of strain-controlled magnetostrictive nanodevices.

Understanding the effects of local dynamic strain on magnetization may help the development of magnetic devices. Foerster et al. demonstrate stroboscopic imaging that allows the observation of both strain and magnetization dynamics in nickel when surface acoustic waves are driven in the substrate.
- Proceedings of the National Academy of Sciences of the United States of America
- Published about 3 years ago
As raw sensory data are partial, our visual system extensively fills in missing details, creating enriched percepts based on incomplete bottom-up information. Despite evidence for internally generated representations at early stages of cortical processing, it is not known whether these representations include missing information of dynamically transforming objects. Long-range apparent motion (AM) provides a unique test case because objects in AM can undergo changes both in position and in features. Using fMRI and encoding methods, we found that the “intermediate” orientation of an apparently rotating grating, never presented in the retinal input but interpolated during AM, is reconstructed in population-level, feature-selective tuning responses in the region of early visual cortex (V1) that corresponds to the retinotopic location of the AM path. This neural representation is absent when AM inducers are presented simultaneously and when AM is visually imagined. Our results demonstrate dynamic filling-in in V1 for object features that are interpolated during kinetic transformations.
Information about translations and rotations of the body is critical for complex self-motion perception during spatial navigation. However, little is known about the nature and function of their convergence in the cortex. We measured neural activity in multiple areas in the macaque parietal cortex in response to three different types of body motion applied through a motion platform: translation, rotation, and combined stimuli, i.e., curvilinear motion. We found a continuous representation of motion types in each area. In contrast to single-modality cells preferring either translation-only or rotation-only stimuli, convergent cells tend to be optimally tuned to curvilinear motion. A weighted summation model captured the data well, suggesting that translation and rotation signals are integrated subadditively in the cortex. Interestingly, variation in the activity of convergent cells parallels behavioral outputs reported in human psychophysical experiments. We conclude that representation of curvilinear self-motion perception is widely distributed in the primate sensory cortex.
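The weighted summation model described above can be written as a one-line sketch: a convergent cell's response to curvilinear motion is a weighted sum of its translation and rotation responses, and weights below 1 make the combination subadditive. The weight values here are hypothetical placeholders, not fitted parameters from the study.

```python
def convergent_response(r_trans, r_rot, w_t=0.6, w_r=0.6):
    """Weighted-summation sketch of a convergent cell.

    Response to curvilinear (combined) motion is w_t*r_trans + w_r*r_rot.
    With weights below 1 (hypothetical values), the combined response is
    subadditive: less than the sum of the single-modality responses.
    """
    return w_t * r_trans + w_r * r_rot

# Translation-only response 10, rotation-only response 8:
combined = convergent_response(10.0, 8.0)  # 10.8, well below 18
```

The sketch also shows why convergent cells can prefer curvilinear stimuli: the weighted sum exceeds either single-modality response alone even while staying below their sum.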