Modulating tactile crossed-hands deficit by way of auditory and visual capture.

Elena Azañón, GRNC - Parc Científic de Barcelona

Abstract
The representation of tactile space during perception depends on crossmodal organisation (remapping) between different sensory modalities such as somatosensation, vision and proprioception. As a consequence of this remapping process, the perceived order of two tactile events applied to different hands is often reversed when observers adopt a crossed-hands posture. This happens because information from different spatial reference frames is set in conflict. However, the way in which this remapping is carried out, and the kind of information relevant for this crossmodal organisation of tactile space, is largely unknown. To help answer these questions, we addressed whether auditory stimuli can influence tactile temporal processing, thus alleviating the crossed-hands deficit by way of auditory capture. In a second study, we attempted to modulate the crossed-hands deficit by way of visual capture of touch with rubber hands. The potential of sounds to modulate tactile perception was limited, whereas there was a clear influence of visual information about hand posture. We will discuss these results in the context of current theories of the spatial representation of touch and multisensory integration.