Crossmodal interactions in perception, memory and learning: on the scene and behind the scene Ladan Shams, Ph.D., UCLA Psychology Department What are the principles that govern crossmodal interactions? Comparing human observers' multisensory perception with that of a Bayesian observer, we have found that human multisensory perception is consistent with Bayesian inference, both in determining when to combine crossmodal information and how to combine it. The former problem is a type of causal inference. Causal inference, which has largely been studied in the context of cognitive reasoning, is in fact a critical problem in perception. Our Bayesian causal inference model accounts for a wide range of phenomena, including numerous multisensory illusions as well as counter-intuitive effects such as partial integration and negative gain. In accounting for both perception of…
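The causal-inference computation described in the abstract can be sketched in a few lines. This is a minimal illustration of the standard two-cue Bayesian causal inference model (common cause vs. independent causes over Gaussian likelihoods), not the talk's actual implementation; the function name, parameter names, and zero-mean spatial prior are assumptions for the sketch.

```python
import math

def gauss(x, mu, var):
    """Gaussian probability density."""
    return math.exp(-(x - mu) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def causal_inference(x_a, x_v, var_a, var_v, var_prior, p_common):
    """Two-cue Bayesian causal inference (illustrative sketch).

    x_a, x_v: auditory and visual measurements
    var_a, var_v: sensory noise variances; var_prior: variance of a
    zero-mean Gaussian prior over source location; p_common: prior
    probability that both cues share one cause.
    Returns (posterior prob. of a common cause, auditory estimate).
    """
    # Likelihood of both measurements given a single common source,
    # with the source integrated out analytically.
    var_sum = var_a * var_v + var_a * var_prior + var_v * var_prior
    like_c1 = math.exp(-((x_a - x_v) ** 2 * var_prior
                         + x_a ** 2 * var_v + x_v ** 2 * var_a)
                       / (2 * var_sum)) / (2 * math.pi * math.sqrt(var_sum))
    # Likelihood given two independent sources.
    like_c2 = (gauss(x_a, 0.0, var_a + var_prior)
               * gauss(x_v, 0.0, var_v + var_prior))
    # Posterior probability of a common cause (Bayes' rule).
    post_c1 = p_common * like_c1 / (p_common * like_c1
                                    + (1 - p_common) * like_c2)
    # Optimal auditory estimate under each causal structure.
    s_c1 = (x_a / var_a + x_v / var_v) / (1 / var_a + 1 / var_v
                                          + 1 / var_prior)
    s_c2 = (x_a / var_a) / (1 / var_a + 1 / var_prior)
    # Model averaging produces the "partial integration" the
    # abstract mentions: a weighted blend of the two estimates.
    return post_c1, post_c1 * s_c1 + (1 - post_c1) * s_c2
```

When the two measurements agree, the posterior favours a common cause and the cues are fused; when they are far apart, the posterior favours independent causes and integration breaks down gracefully rather than all at once.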
Organizers: Krish Sathian1 & Charles Spence2 1Penn State College of Medicine, Milton S. Hershey Medical Center 2Oxford University Abstract: Crossmodal correspondences constitute a very active field of study. A major issue is the nature of their relationship to language. In this symposium, we bring together a number of speakers to address this issue on the basis of recent work. Charles Spence, the symposium co-organizer, will lead off with introductory remarks. Next, Laura Speed will discuss the role of language in odor-color correspondences in synesthetes and non-synesthetes. This will be followed by a review of potential mechanisms underlying sound symbolism, by David Sidhu. Finally, Krish Sathian, the symposium organizer, will present findings on the neural basis of sound symbolic crossmodal correspondences. S3.1 Introductory remarks Charles Spence Oxford University S3.2 Language and…
This professionally guided tour, leaving from the conference centre, will transport you via air-conditioned coach to Niagara Falls for a fun-filled afternoon and evening where you can enjoy the magnificence and majesty of the Canadian Horseshoe Falls along with the natural beauty of the Niagara Parks.
Time / Activity
2:30 pm Pick-up from Chestnut Conference Centre
4:30 pm Arrive in Niagara Falls; Hornblower boat tour beneath the Falls / free time in Niagara Falls
6:30 pm Stopover in Niagara-on-the-Lake for 75 minutes. You will have time to explore the quaint shops and boutiques that line Queen Street, the main drag of Niagara-on-the-Lake, as well as nearby wineries, brew pubs and even a chocolate factory
8:30 pm Back to Niagara Falls for free time and a chance to…
Organizers: Daniel Chebat1 & Shachar Maidenbaum2 1Ariel University 2Columbia University Abstract: We perceive our surrounding environment using all of our senses in parallel, building a rich multisensory representation. This multisensory representation can be used to move through our environment and interact spatially with our surroundings. Vision is the sense best suited to assist spatial perception, but how essential is it to the process by which we navigate? And what happens when it is lacking, or unreliable? In this symposium we wish to explore different aspects of this process, the role of vision and visual experience in guiding it, and the neural correlates thereof. We have put together a strong panel of speakers who have devoted their careers to the study of perceptual and spatial learning, the processing…
Organizer: Denise Henriques, York University Abstract: The brain can estimate hand position visually, from an image on the retina, and proprioceptively, from sensors in the joints, muscles, and skin. Neither estimate is invariant: both are subject to changes in lighting, movement history, and other factors. The brain is thought to make best use of available sensory estimates by weighting, aligning, and combining them to form an integrated estimate. Multisensory integration gives us flexibility to cope with the frequent sensory perturbations we experience. For example, the brain realigns one or both sensory estimates when they become spatially mismatched, as when wearing glasses that induce optical distortions, or when forces, including gravity, are exerted on the hand. This panel will explore recent experimental and theoretical evidence to better understand how vision, proprioception,…
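The weighting-and-combining step the abstract describes is classically modelled as minimum-variance (reliability-weighted) averaging. The sketch below illustrates that standard textbook computation; the function and parameter names are illustrative and not taken from the panel.

```python
def integrate(x_vis, var_vis, x_prop, var_prop):
    """Minimum-variance combination of visual and proprioceptive
    estimates of hand position (1-D illustrative sketch).

    Each cue is weighted by its reliability (inverse variance), so the
    noisier cue contributes less; the fused estimate has lower
    variance than either cue alone.
    """
    w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_prop)
    x_hat = w_vis * x_vis + (1 - w_vis) * x_prop
    var_hat = 1 / (1 / var_vis + 1 / var_prop)
    return x_hat, var_hat
```

With equal variances the fused estimate is the simple midpoint; as one cue degrades (e.g. dim lighting raising visual variance), the estimate shifts toward the other cue, which is the flexibility to sensory perturbation the abstract refers to.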
T4.1 Rapid improvement of audiovisual simultaneity perception after short-term music training Petrini, K., Di Mauro, M., Waters, G. & Jicol, C. Department of Psychology, University of Bath Several studies have shown that the ability to detect audiovisual simultaneity is much greater in musicians than in non-musicians (e.g. Lee and Noppeney, 2011). However, the amount of training required to achieve an improvement in audiovisual simultaneity precision is still unknown. Here we examined, in two experiments, whether short-term training with a musical instrument would improve audiovisual simultaneity precision. In the first, 13 participants were trained on the drums for two hours in total, in one-hour sessions held in two separate weeks. Another group of 13 participants passively observed the trainer playing the drums. Before and after the training,…
T5.1 Electrophysiological Evidence for the Effect of Tool Use on Visuo-Tactile Integration in Near and Far Space Gherri, E., O'Dowd, A. & Forsberg, A. University of Edinburgh The representation of the body and the multisensory space near it is modulated by the active use of long tools, as suggested by neuropsychological and behavioural evidence in humans. This might suggest that the tool becomes part of the body representation, extending near space into far space. However, little is known about the underlying neural mechanisms, and recent studies have suggested that tool-mediated effects on visuo-tactile integration in far space are simply due to the salient tip of the tool, which attracts visual attention in far space. Here, we investigate whether the electrophysiological correlates of visuo-tactile integration in…
T2.1 Training-induced plasticity with a visual-to-auditory conversion system. Seeing the thunder while still hearing it. Auvray, M., Arnold, G., & Pesnot-Lerousseau, J. CNRS - ISIR William James hypothesized that, if our eyes were connected to the auditory brain areas, and our ears to the visual brain areas, we would "hear the lightning and see the thunder" [1]. Research suggests that modality-specific brain areas, such as the visual cortex, can process auditory stimuli, for instance in the case of brain alteration (e.g., rewired ferret brains) or sensory deprivation (e.g., blindness). Our study aimed to investigate this question behaviourally, using a non-invasive technique of sensory plasticity. The participants learned to use a visual-to-auditory sensory substitution device, which translates visual images recorded by a…