4th Annual Meeting of the International Multisensory Research Forum

LIP neurons are modulated by auditory and visual predictive cues
Single Paper Presentation

Yale Cohen
Psychological and Brain Sciences & Center for Cognitive Neuroscience, Dartmouth College

Gordon Gifford
Psychological and Brain Sciences, Dartmouth College

Ian Cohen
Psychological and Brain Sciences, Dartmouth College

     Abstract ID Number: 66
     Full text: Not available
     Last modified: May 20, 2003

Abstract
The posterior parietal cortex, and specifically the lateral intraparietal area (area LIP), lies at an interface between sensation and action. If area LIP lies at such an interface, LIP activity should reflect the relationship between a stimulus and a specific task, regardless of the stimulus's modality. However, data to date suggest that LIP neurons respond in a modality-dependent fashion. To test this hypothesis further, we examined whether LIP neurons were modulated similarly by auditory and visual cues that explicitly had the same task-related meaning. Specifically, in this study, we examined how auditory and visual cues that predict, with 80% certainty, the future location of a visual target modulate LIP activity. Two rhesus monkeys were trained on the task. Regardless of whether the cue was auditory or visual, the monkeys responded faster to the target when the cue predicted its location (valid trial) than when it did not (invalid trial). Moreover, the difference in response time was the same for auditory and visual cues. We recorded 95 neurons from the two monkeys while they performed the task. LIP neurons were modulated early in the trial by the auditory or visual cues and, later in the trial, by cue validity. The responses of LIP neurons to behaviorally similar auditory and visual cues were highly correlated; on average, however, LIP neurons responded more robustly to visual than to auditory cues. These results indicate that LIP neurons respond in a modality-dependent fashion even to auditory and visual cues that convey the same task-related meaning.
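The trial structure described above (a cue that predicts the target location on 80% of trials, yielding valid and invalid trials in both modalities) can be sketched as a small simulation. This is purely illustrative: all function and field names are hypothetical, and this is not the authors' actual task code.

```python
import random

def generate_trials(n_trials, validity=0.80, seed=0):
    """Generate a hypothetical trial list for a cued-target task.

    Each trial has a cue modality (auditory or visual), a cued
    location, and a target that matches the cued location on a
    `validity` fraction of trials (valid) and appears at the other
    location otherwise (invalid). Names are illustrative only.
    """
    rng = random.Random(seed)
    locations = ("left", "right")
    trials = []
    for _ in range(n_trials):
        modality = rng.choice(("auditory", "visual"))
        cued = rng.choice(locations)
        if rng.random() < validity:
            target, valid = cued, True      # valid trial: cue predicts target
        else:
            target = "left" if cued == "right" else "right"
            valid = False                   # invalid trial: target at other location
        trials.append({"modality": modality, "cued": cued,
                       "target": target, "valid": valid})
    return trials

trials = generate_trials(1000)
valid_frac = sum(t["valid"] for t in trials) / len(trials)
print(f"valid fraction ≈ {valid_frac:.2f}")  # close to 0.80
```

With this structure, comparing response times on valid versus invalid trials, separately for each cue modality, is what yields the behavioral validity effect reported in the abstract.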

