Multisensory Integration in Virtual Environments
Single Paper Presentation
Heinrich Bülthoff
Max Planck Institute for Biological Cybernetics
Abstract ID Number: 129
Full text: Not available
Last modified: June 14, 2007
Presentation date: 07/06/2007 4:30 PM in Quad General Lecture Theatre
Abstract
Many experiments that study the mechanisms by which different senses interact in humans focus on perception. In most natural tasks, however, sensory signals are not ultimately used for perception but for action. The effects of that action are in turn sensed again, so that perception and action form complementary parts of a dynamic control loop. In our cybernetics research group at the Max Planck Institute in Tübingen, we use psychophysical, physiological, modeling, and simulation techniques to study how the brain integrates cues from different sensory modalities to perceive and act in the real world. In psychophysical studies, we have shown that humans can integrate multimodal sensory information in a statistically optimal way, weighting each cue according to its reliability. A better understanding of multimodal sensory fusion will allow us to build new virtual reality platforms in which the design effort for simulating each relevant modality (visual, auditory, haptic, vestibular, and proprioceptive) is guided by that modality's perceptual weight. In this talk we will discuss which of these modality characteristics are necessary to achieve worthwhile improvements in high-fidelity simulator design.
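The statistically optimal integration referred to above is commonly formalized as maximum-likelihood cue combination; the abstract does not spell the model out, so the following is the standard formulation rather than a detail taken from the talk. Given unbiased single-cue estimates $\hat{s}_i$ of a stimulus property $s$, each with variance $\sigma_i^2$, the fused estimate is

$$\hat{s} = \sum_i w_i \hat{s}_i, \qquad w_i = \frac{1/\sigma_i^2}{\sum_j 1/\sigma_j^2},$$

so that more reliable (lower-variance) cues receive proportionally higher weights, and the variance of the fused estimate, $\sigma^2 = \left(\sum_i 1/\sigma_i^2\right)^{-1}$, is never larger than that of the most reliable single cue.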