Integration of semantically unrelated and semantically contingent object features reveals cortical hierarchy in human audio-visual object recognition
Poster Presentation
Grit Hein
Brain Imaging Center, Cognitive Neurology Unit
Oliver Doehrmann
Institute of Medical Psychology, JW Goethe University
Notger G. Müller
Brain Imaging Center, Cognitive Neurology Unit
Jochen Kaiser
Institute of Medical Psychology, JW Goethe University
Lars Muckli
Department of Neurophysiology, Max Planck Institute for Brain Research, Brain Imaging Center
Marcus J. Naumer
Institute of Medical Psychology, JW Goethe University
Abstract ID Number: 35
Full text: Not available
Last modified: March 10, 2006
Presentation date: 06/19/2006 4:00 PM in Hamilton Building, Foyer
Abstract
Object recognition often requires the neuronal integration of auditory and visual features. In many cases, for example in human-machine interfaces, these object features are abstract and semantically unrelated. Using functional magnetic resonance imaging, we investigated audio-visual integration of abstract, semantically unrelated object features. Neuronal integration sites for semantically unrelated abstract object images and sounds were compared with integration sites for animal sounds and images linked by varying degrees of semantic contingency. Our results revealed a cortical hierarchy in audio-visual object recognition. Integration in the precentral sulcus, the highest cortical integration level, was based on temporal and spatial contingencies, independent of semantic contingency. The posterior superior temporal sulcus (pSTS) preferentially integrated semantically contingent material. Strongly semantically related pairs were preferentially integrated in dorsal portions of the pSTS, whereas object pairs with a weaker semantic relationship were integrated in ventral portions of the pSTS. Auditory cortex, the lowest cortical integration level, was the most specialized, i.e., it preferentially integrated strongly semantically related object sounds and images.