Electrophysiological correlates of multisensory integration of ecologically valid audiovisual events

Jeroen Stekelenburg, Psychonomics Laboratory, Tilburg University

Abstract
We investigated whether the neural mechanisms underlying the integration of auditory speech sounds and visual articulatory gestures differ from those underlying audiovisual integration of ecologically valid non-speech objects. Event-related potentials (ERPs) evoked by the syllables /bi/ and /fu/ were compared with ERPs evoked by the clapping of hands and the tapping of a spoon. Experiment 1 demonstrated that both speech and non-speech stimuli showed a similar speeding up and amplitude depression of the auditory N1 when the sounds were combined with congruent visual information. Experiment 2 explored which aspect of the visual stimulus (its content, its potential to predict when the sound would occur, or both) was crucial for these effects. For speech and non-speech stimuli alike, visually congruent and incongruent information both evoked a speeding up and amplitude depression of the auditory N1, demonstrating that timing, not content, was crucial. The visually induced speeding up and amplitude depression of the auditory N1 are thus not speech-specific phenomena.
