Associative Learning in Multisensory Integration
Poster Presentation
Helen Bates
Psychology, Trinity College Dublin, Ireland
Ricky van der Zwan
Psychology, Southern Cross University, Coffs Harbour, Australia
Stuart Smith
Psychology, Trinity College Dublin, Ireland
Abstract ID Number: 38
Full text: Not available
Last modified: March 15, 2005
Abstract
A number of previous investigations have shown that perceptual learning can occur for a wide range of visual tasks, including orientation discrimination, motion direction discrimination, spatial position detection, and object recognition (for a review, see Fine and Jacobs, 2002, Journal of Vision, 2(2), 190-203). The present study investigates whether associative learning plays a key role in the perception of bistable (Ternus) patterns when auditory information is integrated with the visual display. Our approach is situated within a Bayesian framework of perception, involving model-based matching of sensory data to stored "priors", i.e. data-independent knowledge. Associative learning mechanisms are hypothesized to provide perceptual systems with "prior" estimates of signal and modality reliability, which are used to guide optimal multimodal binding. Following repeated presentation of visual information (Ternus displays) with spatiotemporally coincident auditory information (high- and low-frequency tones), our data suggest that increasing the probability of association between multimodal sources of information has a systematic effect on perceptual grouping.
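The abstract does not spell out the binding model itself; as a rough, non-authoritative sketch of the reliability-weighted ("optimal") multimodal binding it refers to, the standard Bayesian cue-combination rule can be written as below. The function name and the numerical values are hypothetical and for illustration only.

import math

def combine_cues(mu_v, sigma_v, mu_a, sigma_a):
    """Fuse a visual and an auditory estimate by reliability (inverse-variance) weighting."""
    w_v = 1.0 / sigma_v ** 2   # reliability of the visual cue
    w_a = 1.0 / sigma_a ** 2   # reliability of the auditory cue
    mu = (w_v * mu_v + w_a * mu_a) / (w_v + w_a)   # reliability-weighted fused estimate
    sigma = math.sqrt(1.0 / (w_v + w_a))           # fused uncertainty, lower than either cue alone
    return mu, sigma

# Hypothetical numbers: a noisy visual estimate is pulled toward a more reliable auditory one
print(combine_cues(mu_v=100.0, sigma_v=20.0, mu_a=110.0, sigma_a=5.0))

Under such a rule the fused percept is weighted toward the cue judged more reliable, which is one way learned "prior" reliability estimates could systematically bias grouping of a Ternus display.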