An identification task reveals body-relative maps linking touch to vision
Poster Presentation
Alessandra Jacomuzzi
Dipartimento di psicologia, Università di Trieste
Nicola Bruno
Dipartimento di psicologia, Università di Trieste
Abstract ID Number: 49
Full text: Not available
Last modified: March 16, 2005
Abstract
When an object is touched and then seen, the perceptual system must link the neural signals across the two sensory modalities and across time. In the traditional view of the functional architecture of cognition, links from touch to vision use abstract, modality-independent representations computed in “associative” areas. In more recent approaches, they instead exploit object structural descriptions, possibly in extrastriate areas involved in object recognition. We studied links between touch and vision with random-dot stereograms that contained a carefully calibrated amount of stereonoise. Contrary to both of the commonly accepted views, in our studies identification times for seen forms preceded by touched forms revealed haptic interference but no facilitation. Moreover, this intersensory effect occurred only when (i) the touched forms were actual surfaces rather than raised letters naming the seen forms, and (ii) the seen forms were in the same location as the touched forms relative to the participant's body. Given that perceiving forms in the stereograms required solving the stereo correspondence problem, a process generally believed to occur in the primary visual areas, these findings suggest that touch and vision can interact at an earlier level than indicated by previous studies of semantic or structural cross-modal priming.
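
As an informal illustration of the stimuli described above (not the authors' actual stimulus code), the sketch below shows one way a random-dot stereogram containing a disparity-defined form and a calibrated proportion of decorrelated dots ("stereonoise") might be generated; the function name and parameters such as noise_fraction are illustrative assumptions.

import numpy as np

def make_rds(size=200, density=0.5, disparity=4, noise_fraction=0.2, seed=None):
    # Illustrative sketch: left/right random-dot stereogram pair in which a
    # central square region is shifted horizontally in the right image
    # (binocular disparity), so the form is visible only when the images are
    # fused. A fraction of dots in the right image is then re-randomised
    # ("stereonoise"), degrading the interocular correlation that the stereo
    # correspondence process relies on.
    rng = np.random.default_rng(seed)
    left = (rng.random((size, size)) < density).astype(np.uint8)
    right = left.copy()

    # Shift a central square region leftward in the right image.
    lo, hi = size // 4, 3 * size // 4
    right[lo:hi, lo - disparity:hi - disparity] = left[lo:hi, lo:hi]
    # Fill the strip uncovered by the shift with fresh random dots.
    right[lo:hi, hi - disparity:hi] = (rng.random((hi - lo, disparity)) < density)

    # Decorrelate a calibrated fraction of dots in the right image.
    noise_mask = rng.random((size, size)) < noise_fraction
    right[noise_mask] = (rng.random(noise_mask.sum()) < density)
    return left, right

left_img, right_img = make_rds(disparity=6, noise_fraction=0.25)

In a sketch of this kind, raising noise_fraction weakens the interocular correlation available to stereo matching, which is one way the amount of stereonoise could be calibrated.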