Investigation of auditory-visual integration in VR environments
Khoa-Van Nguyen, IRCAM
Abstract
Investigating the temporal and spatial constraints under which visual and auditory stimuli are perceived as a single percept, or as spatially coincident, has been the topic of numerous studies. However, these findings have so far been derived in extremely simplified stimulation contexts, consisting of combinations of elementary auditory and visual stimuli usually presented in dark and anechoic conditions.
The present experiment is conducted in a VR environment using a passive stereoscopic display and binaural audio rendering. Subjects indicate the point of subjective spatial alignment (PSSA) between a horizontally moving visual stimulus that crosses the direction of a stationary sound. Auditory stimuli are presented over headphones using individualized head-related transfer functions (HRTFs), and the visual stimulus is embedded in a visual background texture in order to convey visual perspective. Two types of audio stimuli are used to evaluate the influence of auditory localisation acuity on auditory-visual integration: periodic white noise bursts, which provide optimal localisation cues, and periodic 1 kHz tone bursts.
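For illustration only, the sketch below (Python with NumPy) shows one plausible way to synthesize the two burst-train stimuli; it is not part of the study's materials. All parameters (sampling rate, burst duration, repetition period, ramp length) are assumptions, and the HRTF spatialisation step is indicated only schematically with hypothetical filter names.

import numpy as np

def burst_train(fs=44100, burst_dur=0.05, period=0.2, total_dur=2.0,
                kind="noise", tone_freq=1000.0):
    """Periodic burst train: white-noise bursts (rich localisation cues)
    or 1 kHz tone bursts (impoverished localisation cues)."""
    signal = np.zeros(int(total_dur * fs))
    n_burst = int(burst_dur * fs)
    # Short raised-cosine onset/offset ramps to avoid audible clicks.
    n_ramp = int(0.005 * fs)
    env = np.ones(n_burst)
    env[:n_ramp] = 0.5 * (1 - np.cos(np.pi * np.arange(n_ramp) / n_ramp))
    env[-n_ramp:] = env[:n_ramp][::-1]
    for start in np.arange(0.0, total_dur - burst_dur, period):
        i = int(start * fs)
        if kind == "noise":
            burst = np.random.uniform(-1.0, 1.0, n_burst)
        else:
            burst = np.sin(2 * np.pi * tone_freq * np.arange(n_burst) / fs)
        signal[i:i + n_burst] = env * burst
    return signal

noise_stim = burst_train(kind="noise")  # broadband: optimal cues
tone_stim = burst_train(kind="tone")    # narrowband 1 kHz: degraded cues

# Binaural presentation would then convolve each stimulus with the
# listener's individualized HRTF pair; hrtf_l and hrtf_r are
# hypothetical placeholders here, not names from the study.
# left  = np.convolve(noise_stim, hrtf_l)
# right = np.convolve(noise_stim, hrtf_r)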
The present study will indicate whether previous findings (Lewald et al., Behavioural Brain Research, 2001) still hold in more complex audio-visual contexts such as those offered by cutting-edge VR environments.