SciCombinator

Discover the latest and most talked-about scientific content & concepts.

Concept: Saccade

183

We often perform movements and actions on the basis of internal motivations and without any explicit instructions or cues. One common example of such behaviors is our ability to initiate movements solely on the basis of an internally generated sense of the passage of time. In order to isolate the neuronal signals responsible for such timed behaviors, we devised a task that requires nonhuman primates to move their eyes consistently at regular time intervals in the absence of any external stimulus events and without an immediate expectation of reward. Despite the lack of sensory information, we found that animals were remarkably precise and consistent in their timed behaviors, with standard deviations on the order of 100 ms. To examine the potential neural basis of this precision, we recorded from single neurons in the lateral intraparietal area (LIP), which has been implicated in the planning and execution of eye movements. In contrast to previous studies that observed a build-up of activity associated with the passage of time, we found that LIP activity decreased at a constant rate between timed movements. Moreover, the magnitude of activity was predictive of the timing of the impending movement. Interestingly, this relationship depended on eye movement direction: activity was negatively correlated with timing when the upcoming saccade was toward the neuron’s response field and positively correlated when the upcoming saccade was directed away from the response field. This suggests that LIP activity encodes timed movements in a push-pull manner, signaling both saccade initiation toward one target and prolonged fixation at the other. Thus, timed movements in this task appear to reflect competition between local populations of task-relevant neurons rather than a global timing signal.
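
As a rough illustration of the trial-by-trial analysis this push-pull result implies, the sketch below (Python, on simulated placeholder data rather than the study's recordings) correlates pre-movement firing rate with the upcoming inter-saccade interval, separately for saccades toward and away from the response field.

```python
# Hypothetical sketch: correlate pre-saccadic LIP firing rate with the
# upcoming inter-saccade interval, split by saccade direction.
# All data below are simulated placeholders, not the study's recordings.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n_trials = 200
# Simulated inter-saccade intervals (s), ~1 s with ~100 ms spread as in the abstract.
interval = rng.normal(1.0, 0.1, n_trials)
toward_rf = rng.integers(0, 2, n_trials).astype(bool)  # saccade toward response field?

# Simulated firing rates: negatively related to timing for "toward" saccades,
# positively related for "away" saccades (the push-pull pattern described).
rate = np.where(toward_rf, 40 - 15 * (interval - 1.0), 20 + 15 * (interval - 1.0))
rate = rate + rng.normal(0, 2, n_trials)

for label, mask in [("toward RF", toward_rf), ("away from RF", ~toward_rf)]:
    r, p = pearsonr(rate[mask], interval[mask])
    print(f"{label}: r = {r:.2f}, p = {p:.3g}")
```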

Concepts: Nervous system, Neuron, Brain, Action potential, Eye, Saccade, Signal, Intraparietal sulcus

176

In human vision, acuity and color sensitivity are greatest at the center of fixation and fall off rapidly as visual eccentricity increases. Humans exploit the high resolution of central vision by actively moving their eyes three to four times each second. Here we demonstrate that it is possible to classify the task that a person is engaged in from their eye movements using multivariate pattern classification. The results have important theoretical implications for computational and neural models of eye movement control. They also have important practical implications for using passively recorded eye movements to infer the cognitive state of a viewer, information that can be used as input for intelligent human-computer interfaces and related applications.
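
A minimal sketch of what such multivariate pattern classification could look like, assuming per-trial eye-movement summary features and scikit-learn; the feature set, classifier choice, and data below are illustrative placeholders rather than the authors' pipeline, so accuracy will sit at chance.

```python
# Hypothetical sketch: decode the viewing task from eye-movement features
# with a cross-validated linear classifier. Features and labels are simulated,
# so accuracy hovers at chance; real features would be fixation/saccade statistics.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_tasks = 300, 4
# One row per trial: e.g. mean fixation duration, saccade rate,
# mean saccade amplitude, spread of fixation locations (all simulated).
X = rng.normal(size=(n_trials, 4))
y = rng.integers(0, n_tasks, n_trials)  # which task the viewer performed

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} (chance = {1 / n_tasks:.2f})")
```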

Concepts: Brain, Human, Retina, Eye, Visual perception, Visual system, Saccade, Color

172

Smooth pursuit eye movements (SPEM) are needed to keep the retinal image of slowly moving objects within the fovea. Depending on the task, about 50%-80% of patients with schizophrenia have difficulties in maintaining SPEM. We designed a study that comprised different target velocities as well as testing for internal (extraretinal) guidance of SPEM in the absence of a visual target. We applied event-related fMRI, presenting four velocities (5, 10, 15, 20°/s) both with and without intervals of target blanking. 17 patients and 16 healthy participants were included. Eye movements were registered during scanning sessions. Statistical analysis included mixed ANOVAs and regression analyses of the target velocity on the Blood Oxygen Level Dependent (BOLD) signal. The main effect of group and the interaction of velocity × group revealed reduced activation in V5 and putamen but increased activation of cerebellar regions in patients. Regression analysis showed that activation in the supplementary eye field, putamen, and cerebellum was not correlated with target velocity in patients, in contrast to controls. Furthermore, activation in V5 and in the intraparietal sulcus (putative LIP) bilaterally was less strongly correlated with target velocity in patients than in controls. The altered correlation between target velocity and neural activation in the cortical network supporting SPEM (V5, SEF, LIP, putamen) implies impaired transformation of the visual motion signal into an adequate motor command in patients. Cerebellar regions seem to be involved in compensatory mechanisms, although cerebellar activity in patients was not related to target velocity.
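
The velocity effects reported here rest on per-region regressions of BOLD amplitude on target velocity; the sketch below shows that regression step on simulated placeholder data, using the group sizes from the abstract but otherwise arbitrary numbers, not the study's measurements.

```python
# Hypothetical sketch: regress region-wise BOLD amplitude on target velocity
# per subject and compare slopes between groups. Data are simulated placeholders.
import numpy as np
from scipy.stats import linregress

velocities = np.array([5.0, 10.0, 15.0, 20.0])  # target velocities in deg/s

def velocity_slope(bold_amplitudes):
    """Slope of BOLD amplitude against target velocity for one subject/region."""
    return linregress(velocities, bold_amplitudes).slope

rng = np.random.default_rng(0)
# Rows: subjects; columns: the four velocities (simulated V5 response amplitudes).
controls = 0.05 * velocities + rng.normal(0, 0.05, size=(16, 4))
patients = 0.01 * velocities + rng.normal(0, 0.05, size=(17, 4))

control_slopes = [velocity_slope(s) for s in controls]
patient_slopes = [velocity_slope(s) for s in patients]
print(f"mean velocity slope, controls: {np.mean(control_slopes):.3f}")
print(f"mean velocity slope, patients: {np.mean(patient_slopes):.3f}")
```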

Concepts: Regression analysis, Statistics, Retina, Visual system, Saccade, Velocity, Oxygen saturation, Smooth pursuit

171

Saccades are so-called ballistic movements that are executed without online visual feedback. After each saccade, the saccadic motor plan is modified in response to post-saccadic feedback through the mechanism of saccadic adaptation. The post-saccadic feedback is provided by the retinal position of the target after the saccade. If the target moves after the saccade, gaze may follow the moving target. In that case, the eyes are controlled by the pursuit system, the system that controls smooth eye movements. Although these two systems have in the past been considered mostly independent, recent lines of research point towards many interactions between them. We were interested in whether saccade amplitude adaptation is induced when the target moves smoothly after the saccade. Prior studies of saccadic adaptation have used intra-saccadic target steps as learning signals. In the present study, the intra-saccadic target step of the McLaughlin paradigm of saccadic adaptation was replaced by target movement and post-saccadic pursuit of the target. We found that saccadic adaptation occurred in this situation, a further indication of an interaction between the saccadic and pursuit systems with the aim of optimizing eye movements.
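
To make the adaptation logic concrete, here is a hedged toy simulation of error-driven saccadic gain adaptation in the spirit of the McLaughlin intra-saccadic step; the learning rate, step size, and update rule are illustrative assumptions, not the authors' model.

```python
# Hypothetical toy model: saccadic gain adapts a little on every trial in
# proportion to the post-saccadic error between landing position and target.
# Numbers (learning rate, step size) are illustrative, not fitted values.
target_ecc = 10.0       # initial target eccentricity (deg)
backstep = -2.0         # intra-saccadic target displacement (deg), McLaughlin-style
gain = 1.0              # saccadic gain (amplitude / target eccentricity)
learning_rate = 0.05

for trial in range(1, 101):
    amplitude = gain * target_ecc               # executed saccade amplitude
    post_target = target_ecc + backstep         # target position after the saccade
    error = post_target - amplitude             # post-saccadic retinal error
    gain += learning_rate * error / target_ecc  # error-driven gain update
    if trial % 25 == 0:
        print(f"trial {trial:3d}: gain = {gain:.3f}")
```

With these illustrative numbers the gain settles near 0.8, i.e. saccade amplitude shrinks toward the back-stepped target position.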

Concepts: Eye, Saccade, Eye movement, Eye tracking, Eye movement in language reading, Smooth pursuit

169

This paper introduces a model of oculomotor control during the smooth pursuit of occluded visual targets. The model is based upon active inference, in which subjects try to minimise their (proprioceptive) prediction error based upon posterior beliefs about the hidden causes of their (exteroceptive) sensory input. Our model appeals to a single principle - the minimisation of variational free energy - to provide Bayes optimal solutions to the smooth pursuit problem. At the same time, it accommodates the cardinal features of smooth pursuit of partially occluded targets that have been observed empirically in normal subjects and in schizophrenia. Specifically, we account for the ability of normal subjects to anticipate periodic target trajectories and emit pre-emptive smooth pursuit eye movements prior to the emergence of a target from behind an occluder. Furthermore, we show that a single deficit in the postsynaptic gain of prediction error units (encoding the precision of posterior beliefs) can account for several features of smooth pursuit in schizophrenia: namely, a reduction in motor gain and anticipatory eye movements during visual occlusion, a paradoxical improvement in tracking unpredicted deviations from target trajectories, and a failure to recognise and exploit regularities in the periodic motion of visual targets. This model will form the basis of subsequent (dynamic causal) models of empirical eye tracking measurements, which we hope to validate using psychopharmacology and studies of schizophrenia.
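
For orientation, one common Gaussian (Laplace-approximation) way of writing the quantity being minimised is a sum of precision-weighted prediction errors; the schematic below omits the generative model and generalised-coordinate details of the actual scheme, so it should be read as a sketch rather than the authors' equations.

```latex
% Schematic (Laplace/Gaussian) form of variational free energy: a sum of
% precision-weighted sensory and prior prediction errors, constants omitted.
% s: sensory input, \mu: posterior estimate of hidden causes, g(\mu): predicted
% input, \eta: prior expectation, \Pi_s, \Pi_\mu: precisions (inverse variances).
F(s,\mu) \;\approx\;
  \tfrac{1}{2}\,(s - g(\mu))^{\top}\,\Pi_{s}\,(s - g(\mu))
  \;+\; \tfrac{1}{2}\,(\mu - \eta)^{\top}\,\Pi_{\mu}\,(\mu - \eta),
\qquad
\dot{\mu} \;\propto\; -\,\frac{\partial F}{\partial \mu}.
```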

Concepts: Scientific method, Economics, Visual system, Saccade, Sensory system, Sense, Pupil, Smooth pursuit

139

Interactions between sensory pathways such as the visual and auditory systems are known to occur in the brain, but where they first occur is uncertain. Here, we show a multimodal interaction evident at the eardrum. Ear canal microphone measurements in humans (n = 19 ears in 16 subjects) and monkeys (n = 5 ears in 3 subjects) performing a saccadic eye movement task to visual targets indicated that the eardrum moves in conjunction with the eye movement. The eardrum motion was oscillatory and began as early as 10 ms before saccade onset in humans or with saccade onset in monkeys. These eardrum movements, which we dub eye movement-related eardrum oscillations (EMREOs), occurred in the absence of a sound stimulus. The amplitude and phase of the EMREOs depended on the direction and horizontal amplitude of the saccade. They lasted throughout the saccade and well into subsequent periods of steady fixation. We discuss the possibility that the mechanisms underlying EMREOs create eye movement-related binaural cues that may aid the brain in evaluating the relationship between visual and auditory stimulus locations as the eyes move.
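
A hedged sketch of the core alignment step implied by this analysis: epoch the ear-canal microphone signal around saccade onset and average separately by saccade direction. The sampling rate, window, and all data below are assumptions on simulated noise, not the recordings.

```python
# Hypothetical sketch: average ear-canal microphone traces aligned to saccade
# onset, separately for leftward and rightward saccades. Data are simulated.
import numpy as np

fs = 10_000                                   # assumed microphone sampling rate (Hz)
pre, post = int(0.01 * fs), int(0.05 * fs)    # 10 ms before to 50 ms after onset

rng = np.random.default_rng(0)
mic = rng.normal(0, 1e-3, 10 * fs)                      # 10 s of simulated signal
n_saccades = 120
onsets = rng.integers(fs, 9 * fs, n_saccades)           # saccade-onset samples
direction = rng.choice(["left", "right"], n_saccades)   # horizontal direction

def saccade_aligned_average(which):
    """Mean microphone trace around saccade onset for one saccade direction."""
    epochs = [mic[o - pre:o + post] for o, d in zip(onsets, direction) if d == which]
    return np.mean(epochs, axis=0)

left_avg = saccade_aligned_average("left")
right_avg = saccade_aligned_average("right")
# In real EMREO data the two averages oscillate in roughly opposite phase, so
# their correlation would be strongly negative; here they are just noise.
print("left/right correlation:", np.corrcoef(left_avg, right_avg)[0, 1])
```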

Concepts: Brain, Auditory system, Eye, Saccade, Eye movement, Ear, Sound, Hearing

60

Visual sensitivity varies across the visual field in several characteristic ways. For example, sensitivity declines sharply in peripheral (vs. foveal) vision and is typically worse in the upper (vs. lower) visual field. These variations can affect processes ranging from acuity and crowding (the deleterious effect of clutter on object recognition) to the precision of saccadic eye movements. Here we examine whether these variations can be attributed to a common source within the visual system. We first compared the size of crowding zones with the precision of saccades using an oriented clock target and two adjacent flanker elements. We report that both saccade precision and crowded-target reports vary idiosyncratically across the visual field with a strong correlation across tasks for all participants. Nevertheless, both group-level and trial-by-trial analyses reveal dissociations that exclude a common representation for the two processes. We therefore compared crowding with two measures of spatial localization: Landolt-C gap resolution and three-dot bisection. Here we observe similar idiosyncratic variations with strong interparticipant correlations across tasks despite considerably finer precision. Hierarchical regression analyses further show that variations in spatial precision account for much of the variation in crowding, including the correlation between crowding and saccades. Altogether, we demonstrate that crowding, spatial localization, and saccadic precision show clear dissociations, indicative of independent spatial representations, whilst nonetheless sharing idiosyncratic variations in spatial topology. We propose that these topological idiosyncrasies are established early in the visual system and inherited throughout later stages to affect a range of higher-level representations.
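
A minimal sketch of the hierarchical-regression logic, assuming simulated placeholder measurements: compare the variance in crowding-zone size explained before and after adding the spatial-precision predictors.

```python
# Hypothetical sketch of hierarchical regression: how much extra variance in
# crowding-zone size do spatial-precision measures explain beyond eccentricity?
# Data are simulated placeholders, not the study's measurements.
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
n = 200
eccentricity = rng.uniform(2, 10, n)
gap_resolution = 0.05 * eccentricity + rng.normal(0, 0.05, n)   # Landolt-C threshold
bisection_error = 0.04 * eccentricity + rng.normal(0, 0.05, n)  # three-dot bisection
crowding_zone = 0.4 * eccentricity + 2.0 * gap_resolution + rng.normal(0, 0.2, n)

base = np.column_stack([eccentricity])
full = np.column_stack([eccentricity, gap_resolution, bisection_error])

r2_base = LinearRegression().fit(base, crowding_zone).score(base, crowding_zone)
r2_full = LinearRegression().fit(full, crowding_zone).score(full, crowding_zone)
print(f"R2 base: {r2_base:.2f}, R2 with spatial precision: {r2_full:.2f}, "
      f"delta R2 = {r2_full - r2_base:.2f}")
```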

Concepts: Eye, Vision, Visual perception, Visual system, Saccade, Eye tracking, Visual field, Idiosyncrasy

31

Diurnal flying animals such as birds depend primarily on vision to coordinate their flight path during goal-directed flight tasks. To extract the spatial structure of the surrounding environment, birds are thought to use retinal image motion (optical flow) that is primarily induced by motion of their head. It is unclear what gaze behaviors birds perform to support visuomotor control during rapid maneuvering flight, in which they continuously switch between flight modes. To analyze this, we measured the gaze behavior of rapidly turning lovebirds in a goal-directed task: take off and fly away from a perch, turn on a dime, and fly back and land on the same perch. High-speed flight recordings revealed that rapidly turning lovebirds perform a remarkably stereotypical gaze behavior, with peak saccadic head turns up to 2700 degrees per second, as fast as insects, enabled by fast neck muscles. In between saccades, gaze orientation is held constant. By comparing saccade and wingbeat phase, we find that these super-fast saccades are coordinated with the downstroke, when the lateral visual field is occluded by the wings. Lovebirds thus maximize visual perception by overlaying behaviors that impair vision, which helps coordinate maneuvers. Before the turn, lovebirds keep a high-contrast edge in their visual midline. Similarly, before landing, the lovebirds stabilize the center of the perch in their visual midline. The perch on which the birds land swings, like a branch in the wind, and we find that the retinal size of the perch is the most parsimonious visual cue to initiate landing. Our observations show that rapidly maneuvering birds use precisely timed, stereotypic gaze behaviors consisting of rapid head turns and frontal feature stabilization, which facilitates flight control based on optical flow. Similar gaze behaviors have been reported for visually navigating humans. This finding can inspire more effective vision-based autopilots for drones.
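
As a rough illustration of the kind of preprocessing such saccade-wingbeat phase comparisons presuppose, the sketch below detects head saccades from a simulated head-yaw trace using an angular-velocity threshold; the frame rate, threshold, and signal are illustrative assumptions, not the study's tracking pipeline.

```python
# Hypothetical sketch: detect head saccades as runs of high angular velocity
# in a head-yaw trace sampled by a high-speed camera. Data are simulated.
import numpy as np

fs = 1000.0                                  # assumed camera frame rate (Hz)
t = np.arange(0, 1.0, 1 / fs)
rng = np.random.default_rng(0)

# Simulated head yaw: stable gaze interrupted by two rapid ~90 degree turns.
yaw = np.zeros_like(t)
yaw[t > 0.3] += 90.0
yaw[t > 0.7] += 90.0
yaw = np.convolve(yaw, np.ones(35) / 35, mode="same") + rng.normal(0, 0.1, t.size)

angular_velocity = np.gradient(yaw, 1 / fs)  # deg/s
threshold = 1000.0                           # deg/s, illustrative saccade criterion
is_saccade = np.abs(angular_velocity) > threshold
onsets = np.flatnonzero(np.diff(is_saccade.astype(int)) == 1)

print(f"detected {onsets.size} head saccades")
print(f"peak head angular velocity: {np.abs(angular_velocity).max():.0f} deg/s")
```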

Concepts: Retina, Vision, Visual perception, Visual system, Saccade, Flying and gliding animals, Takeoff, Turn

29

In monkeys deciding between alternative saccadic eye movements, lateral intraparietal (LIP) neurons representing each saccade fire at a rate proportional to the value of the reward expected upon its completion. This observation has been interpreted as indicating that LIP neurons encode saccadic value and that they mediate value-based decisions between saccades. Here, we show that LIP neurons representing a given saccade fire strongly not only if it will yield a large reward but also if it will incur a large penalty. This finding indicates that LIP neurons are sensitive to the motivational salience of cues. It is compatible neither with the idea that LIP neurons represent action value nor with the idea that value-based decisions take place in LIP neurons.

Concepts: Eye, Saccade, Eye movement, Eye tracking, Eye movement in language reading

28

When we make hand movements to visual targets, gaze usually leads hand position by a series of saccades to task-relevant locations. Recent research suggests that the slow smooth pursuit eye movement system may interact with the saccadic system in complex tasks, suggesting that the smooth pursuit system can receive non-retinal input. We hypothesise that a combination of saccades and smooth pursuit guides the hand movements towards a goal in a complex environment, using an internal representation of future trajectories as input to the visuomotor system. This would imply that smooth pursuit leads hand position, which is remarkable, as the general idea is that smooth pursuit is driven by retinal slip. To test this hypothesis, we designed a video-game task in which human subjects used their thumbs to move two cursors to a common goal position while avoiding stationary obstacles. We found that gaze led the cursors by a series of saccades interleaved with ocular fixation or pursuit. Smooth pursuit was correlated with neither cursor position nor cursor velocity. We conclude that a combination of fast and slow eye movements, driven by an internal goal instead of a retinal goal, led the cursor movements, and that both saccades and pursuit are driven by an internal representation of future trajectories of the hand. The lead distance of gaze relative to the hand may reflect a compromise between exploring future hand (cursor) paths and verifying that the cursors move along the desired paths.
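
A hedged sketch of the lead/lag analysis such conclusions rest on: estimate the time shift at which gaze best matches the cursor trajectory. The sampling rate and signals below are simulated assumptions; in the actual study, pursuit was notably not correlated with cursor position or velocity.

```python
# Hypothetical sketch: estimate how far gaze leads the cursor by finding the
# lag that maximises the gaze-cursor correlation. Data are simulated.
import numpy as np

fs = 250.0                                    # assumed gaze/cursor sampling rate (Hz)
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(0)

cursor = np.sin(2 * np.pi * 0.3 * t)          # smooth, simulated cursor trajectory
lead_samples = int(0.4 * fs)                  # gaze leads the cursor by ~400 ms here
gaze = np.roll(cursor, -lead_samples) + rng.normal(0, 0.1, t.size)

def best_lag(a, b, max_lag):
    """Lag (in samples) of b, relative to a, that maximises their correlation."""
    lags = list(range(-max_lag, max_lag + 1))
    corrs = [np.corrcoef(a[max_lag:-max_lag],
                         np.roll(b, k)[max_lag:-max_lag])[0, 1] for k in lags]
    return lags[int(np.argmax(corrs))]

lag = best_lag(gaze, cursor, max_lag=int(fs))  # search lags up to +/- 1 s
print(f"estimated gaze lead: {-lag / fs * 1000:.0f} ms")
```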

Concepts: Eye, Microsaccade, Saccade, Eye movement, Eye tracking, Eye movement in language reading, Fixation, Smooth pursuit