CROSS-SENSORY INTERACTION AND ADAPTATION UNDERLYING SPATIAL LOCALIZATION

Gary Paige, Dept. of Neurobiology & Anatomy

Abstract
The brain uses vision and audition to construct a spatial map of the external world. The integrity of this map requires that the two senses maintain spatial calibration. This poses two key challenges. 1) Vision and audition are encoded using different mechanisms and coordinate schemes; visual images are topographically mapped directly onto the retina, whereas auditory space must be constructed centrally from interaural and spectral cues provided by the two ears. Spatial congruence therefore requires adaptive mechanisms that co-calibrate the two sensory modalities over time, given sufficient cross-sensory interaction. 2) The visual and auditory frames of reference shift relative to one another during eye movements. The brain must account for this to maintain spatial register and constancy, presumably by exploiting an eye-in-head signal. We will discuss long-term adaptive as well as more immediate processes that ensure concordance between visual and auditory space. Discussion of the latter focuses on how eye position systematically and dynamically shifts the localization of auditory, but not visual, targets across a broad spatial field. This may reflect dynamic errors in how eye position signals are used to align visual and auditory space, or a gradual shift (adaptation) of our sense of ‘straight-ahead’ during eccentric gaze.

