7th Annual Meeting of the International Multisensory Research Forum

Neural basis of Bayes-optimal multisensory integration: theory and experiments
Multiple Paper Presentation

Alexandre Pouget
Brain and Cognitive Sciences, University of Rochester

     Abstract ID Number: 218
     Last modified: April 27, 2006
     Presentation date: 06/20/2006 2:00 PM in Hamilton Building, McNeil Theatre

Abstract
Humans can combine multisensory inputs near-optimally. This is quite remarkable considering that sensory inputs often come in very different formats, such as the voice of a speaker and the image of lip movements. Moreover, sensory modalities are not equally reliable, and their degree of reliability can change from one instant to the next. I will present a neural model, based on the notions of basis functions and probabilistic population codes, that can solve both of these problems optimally. The model makes two major predictions: (1) multisensory neurons should be mostly additive, and (2) their receptive fields are not necessarily in spatial correspondence. Both of these predictions are at odds with the current dogma in the multisensory integration literature, which claims that many neurons are superadditive and that the receptive fields of most multisensory neurons are in spatial alignment. I will show that, in fact, experimental data support our predictions.
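
As background for readers unfamiliar with Bayes-optimal cue combination: with a flat prior and independent Gaussian cue likelihoods, the optimal estimate weights each cue by its inverse variance, s_hat = (x_v/sigma_v^2 + x_a/sigma_a^2) / (1/sigma_v^2 + 1/sigma_a^2). The sketch below is not the model presented in the talk; it is a minimal toy example, assuming Gaussian tuning curves and independent Poisson spiking (all parameters illustrative), of why a purely additive combination of population responses can implement optimal fusion: summing the spike counts of two populations that encode the same variable yields the same posterior as multiplying the single-cue posteriors.

```python
import numpy as np

# Illustrative sketch only: Gaussian tuning, independent Poisson spiking,
# and all parameter values below are assumptions, not taken from the talk.
rng = np.random.default_rng(0)

s_true = 5.0                              # true stimulus location (deg)
s_grid = np.linspace(-40.0, 40.0, 1601)   # hypothesis space for decoding
prefs = np.linspace(-60.0, 60.0, 121)     # preferred locations of the neurons
sigma_tc = 10.0                           # tuning-curve width (deg)
g_v, g_a = 20.0, 8.0                      # gains, standing in for cue reliability

def rates(s, gain):
    """Mean spike counts of the population for stimulus location s."""
    return gain * np.exp(-(s - prefs) ** 2 / (2.0 * sigma_tc ** 2))

def log_posterior(counts, gain):
    """Log p(s | r) over s_grid for independent Poisson neurons, flat prior."""
    f = rates(s_grid[:, None], gain)      # shape (len(s_grid), n_neurons)
    logp = counts @ np.log(f).T - f.sum(axis=1)
    return logp - logp.max()              # subtract max for numerical stability

def normalize(logp):
    p = np.exp(logp)
    return p / p.sum()

r_v = rng.poisson(rates(s_true, g_v))     # "visual" population response
r_a = rng.poisson(rates(s_true, g_a))     # "auditory" population response
r_m = r_v + r_a                           # additive multisensory response

post_m = normalize(log_posterior(r_m, g_v + g_a))
# Bayes-optimal fusion: multiply the single-cue posteriors (add log posteriors).
post_opt = normalize(log_posterior(r_v, g_v) + log_posterior(r_a, g_a))

print(np.allclose(post_m, post_opt))      # True: adding spikes = optimal fusion
```

In this toy setup, the sum of the two single-cue log posteriors differs from the log posterior of the summed spike counts only by a term that does not depend on the stimulus, so the normalized posteriors coincide. This is the intuition behind the additivity prediction.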

To be Presented at the Following Symposium:
Models of multisensory integration: synthetic vs. naturalistic situations?



