Integration of visual and vestibular cues to heading

John Butler, Max Planck Institute for Biological Cybernetics, Tübingen, Germany

Abstract
Accurate perception of one’s self-motion through the environment requires the successful integration of visual, vestibular, proprioceptive and auditory cues. We applied Maximum Likelihood Estimation (MLE) analysis to visual-alone, vestibular-alone and combined visual-vestibular linear self-motion (heading) estimation tasks. Using a two-interval forced-choice (2IFC) method of constant stimuli and fitting the resulting psychometric data with the Matlab toolbox psignifit (Wichmann and Hill, 2001), we quantified the perceptual uncertainty of heading discrimination as the standard deviation of the cumulative Gaussian fit. Our data show that when the uncertainties of visual and vestibular heading discrimination are matched in the combined-cue condition, there are two distinct classes of observers: those whose heading uncertainty is significantly reduced in the combined condition, and those whose combined heading uncertainty is significantly increased. Our results are discussed in relation to monkey behavioural and neurophysiological heading estimation data recently obtained by Angelaki and colleagues.
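Under the MLE account of cue integration, the combined-cue uncertainty is predicted from the single-cue uncertainties as sigma_comb^2 = (sigma_vis^2 * sigma_vest^2) / (sigma_vis^2 + sigma_vest^2), so matched single-cue reliabilities predict a reduction of the combined uncertainty by a factor of sqrt(2). The sketch below, written in Python rather than the authors' Matlab/psignifit pipeline, illustrates how a cumulative Gaussian fit to 2IFC constant-stimuli data yields the uncertainty estimate and how the MLE prediction is then computed; all function names and example values are hypothetical illustrations, not the study's data.

# Minimal sketch (not the authors' actual analysis pipeline) of quantifying
# heading discrimination uncertainty and comparing it to the MLE prediction.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import norm

def cumulative_gaussian(x, mu, sigma):
    """Psychometric function: probability of judging heading 'rightward'."""
    return norm.cdf(x, loc=mu, scale=sigma)

def fit_heading_sigma(headings_deg, prop_rightward):
    """Fit a cumulative Gaussian to 2IFC proportion-'rightward' data and
    return sigma, the estimate of heading discrimination uncertainty."""
    (mu, sigma), _ = curve_fit(
        cumulative_gaussian, headings_deg, prop_rightward,
        p0=[0.0, 2.0], bounds=([-10.0, 0.01], [10.0, 20.0])
    )
    return sigma

def mle_predicted_sigma(sigma_visual, sigma_vestibular):
    """Optimal (MLE) combined-cue uncertainty:
    sigma_comb^2 = sigma_vis^2 * sigma_vest^2 / (sigma_vis^2 + sigma_vest^2)."""
    return np.sqrt((sigma_visual**2 * sigma_vestibular**2) /
                   (sigma_visual**2 + sigma_vestibular**2))

# Hypothetical constant-stimuli data: heading angles (deg, negative = left)
# and the proportion of 'rightward' responses at each angle.
headings = np.array([-8, -4, -2, -1, 1, 2, 4, 8], dtype=float)
p_right = np.array([0.02, 0.15, 0.30, 0.45, 0.55, 0.70, 0.85, 0.98])
sigma_vis = fit_heading_sigma(headings, p_right)
print(f"fitted visual sigma: {sigma_vis:.2f} deg")
# With matched single-cue uncertainties, optimal integration predicts a
# combined sigma of sigma_vis / sqrt(2).
print(f"MLE-predicted combined sigma: "
      f"{mle_predicted_sigma(sigma_vis, sigma_vis):.2f} deg")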

