Visual and auditory cues for localization combine in a statistically optimal way.

Laurence Harris, Psychology, York University

Abstract
To assess how different senses contribute to our ability to localize events, we presented two stimuli separated by 0.5 s and asked subjects to judge whether the first stimulus was above or below the second while the vertical separation between them was varied by the method of constant stimuli. Stimuli were lights, sounds, or bimodal (light-plus-sound) stimuli. Lights were made harder to localize by blurring them with a Gaussian luminance profile; sounds were made harder by requiring them to be localized vertically, a dimension in which auditory resolution is poor. Each comparison was subject to sources of noise that should either sum, when the events were independent (such as the first and second stimuli), or combine in a statistically optimal, variance-minimizing way, when both cues contributed to a single estimate. The noise measured when bimodal stimuli were involved in a comparison was accurately predicted by a statistically optimal combination of the visual and auditory information, implying a multimodal localization mechanism.
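The two noise-combination rules mentioned above can be sketched numerically. Under the standard maximum-likelihood model of cue combination (an assumption here; the threshold values below are illustrative, not the study's data), the variance of a bimodal estimate is σ²VA = σ²V·σ²A / (σ²V + σ²A), while the independent noises of the first and second stimuli in a comparison simply add in variance:

```python
import math

def bimodal_sigma(sigma_v, sigma_a):
    """Standard deviation of the maximum-likelihood (optimal) combined
    estimate: variances combine so that the bimodal estimate is never
    noisier than the better single cue."""
    var = (sigma_v**2 * sigma_a**2) / (sigma_v**2 + sigma_a**2)
    return math.sqrt(var)

def comparison_sigma(sigma_first, sigma_second):
    """Noise of the above/below judgement between two stimuli: the
    independent noises of the two localization estimates sum in variance."""
    return math.sqrt(sigma_first**2 + sigma_second**2)

# Hypothetical unimodal localization noises (degrees of visual angle):
sigma_v = 2.0   # Gaussian-blurred light
sigma_a = 5.0   # vertically localized sound

# Optimal prediction for the bimodal stimulus:
sigma_va = bimodal_sigma(sigma_v, sigma_a)

# Predicted comparison noise when a bimodal stimulus is judged
# against a light presented 0.5 s later:
predicted = comparison_sigma(sigma_va, sigma_v)
```

Comparing the measured bimodal comparison noise to `predicted` (versus the larger value expected if only one cue were used) is what distinguishes optimal integration from cue switching.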
