Symposium: Speech as a window to multisensory integration processes
Multiple Paper Presentation
Salvador Soto-Faraco
Dept. Psicologia Bàsica - Facultat de Psicologia and Parc Científic de Barcelona, Universitat de Barcelona
Abstract ID Number: 20
Symposium Overview
The audiovisual integration of speech, whereby seen lip movements are bound to heard speech sounds, has frequently been regarded as a special case among multisensory processes. Yet the study of how the brain handles the multisensory speech signal has become one of the most successful test models for investigating multisensory processing in the human brain. One example is the demonstration of very early influences of visual input on areas originally thought to be devoted to auditory processing. However, less is known about the actual mechanisms that, through these early interactions, give rise to classic phenomenological findings such as the McGurk illusion. This symposium will present recent advances on the functional and neural architecture of these mechanisms, providing further insight into the general question of multisensory binding.
Papers in this Symposium: