CROSS-SENSORY INTERACTION AND ADAPTATION UNDERLYING SPATIAL LOCALIZATION
Multiple Paper Presentation
Gary Paige, Dept. of Neurobiology & Anatomy
Babak Razavi, Biomedical Engineering
William O'Neill, Dept. of Neurobiology & Anatomy

Abstract ID Number: 10
Last modified: January 14, 2006
Presentation date: 06/19/2006 2:00 PM in Hamilton Building, McNeil Theatre
Abstract
The brain uses vision and audition to construct a spatial map of the external world. The integrity of this map requires that the two senses maintain spatial calibration. This poses two key challenges. 1) Vision and audition are encoded using different mechanisms and coordinate schemes; visual images are topographically mapped directly onto the retina, whereas auditory space must be constructed centrally from interaural and spectral cues at the two ears. Spatial congruence therefore requires adaptive mechanisms that co-calibrate the two sensory modalities over time, given sufficient cross-sensory interaction. 2) The visual and auditory frames of reference shift relative to one another during eye movements. The brain must account for this to maintain spatial register and constancy, presumably by exploiting an eye-in-head signal. We will discuss both long-term adaptive and more immediate processes that ensure concordance between visual and auditory space. Discussion of the latter focuses on how eye position systematically and dynamically shifts the localization of auditory, but not visual, targets across a broad spatial field. This may reflect dynamic errors in how eye position signals are used to align visual and auditory space, or a gradual shift (adaptation) of our sense of ‘straight-ahead’ during eccentric gaze.
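The two computations the abstract describes can be made concrete with a minimal sketch, which is not the authors' model: (1) a binaural cue (here the interaural time difference, via the standard Woodworth spherical-head approximation) derived from source azimuth, and (2) a head-to-eye coordinate shift driven by an eye-in-head signal. The head radius, 1-D azimuth geometry, and example values below are illustrative assumptions.

    # Illustrative sketch only; parameters and the Woodworth approximation
    # are assumptions, not the authors' experimental model.
    import math

    SPEED_OF_SOUND = 343.0   # m/s, approximate at room temperature
    HEAD_RADIUS = 0.0875     # m, a typical adult value (assumption)

    def itd_from_azimuth(azimuth_deg: float) -> float:
        """Woodworth spherical-head approximation of the interaural time
        difference for a distant source at azimuth_deg (0 = straight ahead)."""
        a = math.radians(azimuth_deg)
        return (HEAD_RADIUS / SPEED_OF_SOUND) * (a + math.sin(a))

    def auditory_target_in_eye_frame(target_head_deg: float,
                                     eye_in_head_deg: float) -> float:
        """Shift a head-centered auditory azimuth into eye-centered
        coordinates by subtracting eye-in-head position. A bias or dynamic
        error in this eye position signal misaligns auditory and visual
        space, the kind of effect the abstract attributes to eccentric gaze."""
        return target_head_deg - eye_in_head_deg

    # Example: source 20 deg right of the head, eyes deviated 15 deg right.
    print(itd_from_azimuth(20.0))                    # ~0.000176 s ITD cue
    print(auditory_target_in_eye_frame(20.0, 15.0))  # 5 deg right of gaze

In this toy framing, a visual target needs no such correction (it is already retinal, i.e., eye-centered), which is one way to see why eye position would shift auditory but not visual localization.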
To be Presented at the Following Symposium:
Audio-visual integration subserving sensory-motor and cognitive function