Symposium: Audio-visual integration subserving sensory-motor and cognitive function
Multiple Paper Presentation
Douglas Munoz
Centre for Neuroscience Studies, Queen's University
Abstract ID Number: 9. Last modified: January 14, 2006
Presentation date: 06/19/2006 2:00 PM in Hamilton Building, McNeil Theatre
(View Schedule)
Symposium Overview
A fundamental feature of advanced animals is the ability to sense, interact with, and navigate through a complex environment. Behavioural tasks such as speech perception and covert attentional orienting require neural systems that control a variety of motor actions under the guidance of different sensory systems. Here, we propose to examine and contrast the effects of visual and auditory cross-modal processing in guiding different behaviours. Gary Paige will compare eye position effects on the acquisition of visual and auditory spatial targets. Doug Munoz will discuss factors influencing covert orienting (i.e., attention capture and inhibition of return) that are differentially affected by visual versus auditory cueing. Martin Paré will describe how eye movements are used to help collect facial information during speech perception, and how the pattern of gaze fixations is dictated by subjects' strategies rather than by properties of the stimuli. Liz Romanski will discuss how single neurons in the prefrontal cortex of non-human primates integrate communication-relevant face and vocalization stimuli. These selected topics will lead into a general discussion with the audience to identify commonalities and distinctions in how information from the auditory and visual systems combines to influence our perceptions of, and interactions with, our environment.
Papers in this Symposium: