Integration of visual and vestibular cues to heading
Single Paper Presentation
John Butler
Max Planck Institute for Biological Cybernetics, Tübingen, Germany
Heinrich Bülthoff
Max Planck Institute for Biological Cybernetics
Stuart Smith
Psychology, University College Dublin, Ireland
Abstract ID Number: 61
Full text: Not available
Last modified: March 5, 2007
Presentation date: 07/05/2007 4:10 PM in Quad General Lecture Theatre
Abstract
Accurate perception of one's self-motion through the environment requires the successful integration of visual, vestibular, proprioceptive and auditory cues. We applied Maximum Likelihood Estimation (MLE) analysis to visual-alone, vestibular-alone and combined visual-vestibular linear self-motion (heading) estimation tasks. Using a 2IFC method of constant stimuli and fitting the resulting psychometric data with the Matlab toolbox psignifit (Wichmann and Hill, 2001), we quantified the perceptual uncertainty of heading discrimination as the standard deviation of the cumulative Gaussian fit. Our data show that when the uncertainties of visual and vestibular heading discrimination are matched in the combined-cue condition, there are two distinct classes of observers: those whose heading uncertainty is significantly reduced in the combined condition, and those whose combined heading uncertainty is significantly increased. Our results are discussed in relation to monkey behavioural and neurophysiological heading estimation data recently obtained by Angelaki and colleagues.
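To make the MLE prediction concrete: under optimal integration the combined-cue standard deviation should be smaller than either single-cue standard deviation, with the largest reduction (a factor of root 2) when the single-cue uncertainties are matched. The sketch below illustrates this prediction and a cumulative-Gaussian psychometric fit to simulated 2IFC data; it is written in Python with scipy as a stand-in for the psignifit toolbox cited in the abstract, and all stimulus levels, trial counts and sigma values are hypothetical, not the authors' data.

# Minimal sketch (hypothetical values), not the authors' analysis code.
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

# MLE prediction for the combined-cue condition:
#   sigma_comb^2 = sigma_vis^2 * sigma_vest^2 / (sigma_vis^2 + sigma_vest^2)
sigma_vis, sigma_vest = 2.0, 2.0          # deg of heading; matched single-cue SDs (hypothetical)
sigma_comb_pred = np.sqrt((sigma_vis**2 * sigma_vest**2) /
                          (sigma_vis**2 + sigma_vest**2))
print(f"Predicted combined SD: {sigma_comb_pred:.2f} deg")   # ~1.41 deg, i.e. sigma/sqrt(2)

# Cumulative-Gaussian psychometric function for a 2IFC heading task.
def psychometric(x, mu, sigma):
    """Proportion of 'rightward' responses as a function of heading angle x."""
    return norm.cdf(x, loc=mu, scale=sigma)

headings = np.array([-6, -4, -2, 0, 2, 4, 6], dtype=float)   # deg; hypothetical stimulus levels
n_trials = 40                                                 # trials per level (hypothetical)

# Simulate an observer whose true uncertainty equals the MLE prediction.
rng = np.random.default_rng(0)
p_true = psychometric(headings, 0.0, sigma_comb_pred)
p_obs = rng.binomial(n_trials, p_true) / n_trials

# Fit the cumulative Gaussian; the fitted sigma is the measured perceptual uncertainty.
(mu_hat, sigma_hat), _ = curve_fit(psychometric, headings, p_obs, p0=[0.0, 2.0])
print(f"Fitted SD (perceptual uncertainty): {sigma_hat:.2f} deg")

Comparing the fitted combined-cue sigma against this prediction is the test that separates the two observer classes described above: optimal integrators fall near the predicted reduction, while the second group shows a combined sigma above the single-cue values.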