Filling-in visual motion with sound

Aleksander Väljamäe, Division of Applied Acoustics, Chalmers Univ. of Technology, Gothenburg, Sweden

Abstract
The surrounding environment often provides fragmented, incomplete cues that our perceptual systems assemble into a unitary object or event (for example, a wild animal rushing toward you through the undergrowth of a cluttered forest). In the present study we explored the mechanisms behind the perception of discrete moving (approaching and receding) audio-visual stimuli using a changing-loudness aftereffect. Consistent with previous research on continuous motion in depth, discrete auditory and visual stimuli sampled at 12.5 Hz produced significant aftereffects both within and across sensory modalities. However, for a lower-rate visual flicker (6.25 Hz) the aftereffect became negligible. Remarkably, combining this ineffective low-rate visual flicker with high-rate (12.5 Hz) auditory flutter resulted in an aftereffect comparable to the one produced by a high-rate audio-visual adaptor, and superior to the aftereffect produced by the acoustic stimulus in isolation. Control studies showed that this interaction effect depends both on the directional congruency of the audio-visual adaptor and on the rate of auditory flutter. We suggest that the observed multisensory interaction stems from the occurrence of sound-induced illusory flashes, reported earlier by Shams, L., Kamitani, Y., & Shimojo, S. (2000). What you see is what you hear. Nature, 408, 788. Such auditory filling-in of discontinuous visual motion provides a perceptual account for multisensory compression techniques used in broadcasting and virtual reality applications.

