Assessment of visuo-auditory cognition by immersion in virtual reality

Ludivine Sarlat, CNRS UPMC UMR 7593, Hôpital de la Salpêtrière, Paris, France

Abstract
The present study investigates visuo-auditory spatial cognition with a navigation task in a visual virtual environment (VE) that integrates the auditory modality.
Subjects (n=37) were equipped with a head-mounted display coupled with an electromagnetic sensor system and immersed in a virtual town. They also wore headphones, which delivered a soundscape updated in real time according to their movements in the virtual town. Their task was to explore the VE to find auditory and visual landmarks. After immersion, subjects performed a recognition task with bimodal elements and recalled the landmarks on a schematic map of the town.
Results showed that subjects performed very well in the exploration task (number of landmarks found, short time spent in the VE) and gave a relatively good report of the landmarks. They presented a high level of presence (feeling of immersion). Subjects' performance in the recognition task was quite good; errors were mainly rejections of target items rather than false recognitions.
Research on multimodal VR must develop more accurate knowledge of human spatial abilities, and VR setups must be improved, especially in the mapping between motor outflow and the multiple sensory feedbacks.
