The relationship of crossmodal correspondences to language

Organizers: Krish Sathian¹ & Charles Spence². ¹Penn State College of Medicine, Milton S. Hershey Medical Center; ²Oxford University

Abstract: Crossmodal correspondences are the focus of a very active field of study. A major issue is the nature of their relationship to language. In this symposium, we bring together a number of speakers to address this issue on the basis of recent work. Charles Spence, the symposium co-organizer, will lead off with introductory remarks. Next, Laura Speed will discuss the role of language in odor-color correspondences in synesthetes and non-synesthetes. This will be followed by David Sidhu's review of potential mechanisms underlying sound symbolism. Finally, Krish Sathian, the symposium organizer, will present findings on the neural basis of sound-symbolic crossmodal correspondences.

S3.1 Introductory remarks
Charles Spence, Oxford University

S3.2 Language and…

The Multisensory Space – Perception, Neural representation and Navigation

Organizers: Daniel Chebat¹ & Shachar Maidenbaum². ¹Ariel University; ²Columbia University

Abstract: We perceive our surrounding environment using all of our senses in parallel, building a rich multisensory representation. This multisensory representation can be used to move through our environment and interact spatially with our surroundings. Vision is the sense best suited to spatial perception, but how essential is it to the process by which we navigate? And what happens when it is lacking or unreliable? In this symposium we explore different aspects of this process, the role of vision and visual experience in guiding it, and the neural correlates thereof. We have put together a strong panel of speakers who have devoted their careers to the study of perceptual and spatial learning, the processing…

Where is my hand? On the flexibility of multisensory spatial calibration to encode hand positions and movements

Organizer: Denise Henriques, York University

Abstract: The brain can estimate hand position visually, from an image on the retina, and proprioceptively, from sensors in the joints, muscles, and skin. Neither estimate is invariant: both are subject to changes in lighting, movement history, and other factors. The brain is thought to make the best use of the available sensory estimates by weighting, aligning, and combining them to form an integrated estimate. Multisensory integration gives us the flexibility to cope with the frequent sensory perturbations we experience; for example, the brain realigns one or both sensory estimates when they become spatially mismatched, as when wearing glasses that induce optical distortions, or when forces, including gravity, are exerted on the hand. This panel will explore recent experimental and theoretical evidence to better understand how vision, proprioception,…
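For readers unfamiliar with the weighting scheme the abstract refers to, the standard formalization is maximum-likelihood (reliability-weighted) cue combination. The LaTeX sketch below states that textbook model for visual and proprioceptive hand-position estimates; the symbols are illustrative assumptions, not notation taken from the panel.

% A minimal sketch of reliability-weighted (maximum-likelihood) cue
% combination. \hat{x}_V, \hat{x}_P are the visual and proprioceptive
% estimates of hand position; \sigma_V^2, \sigma_P^2 their variances.
\[
  \hat{x} = w_V\,\hat{x}_V + w_P\,\hat{x}_P,
  \qquad
  w_V = \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_P^2},
  \qquad
  w_P = 1 - w_V
\]
% The integrated estimate has variance
% \sigma^2 = (1/\sigma_V^2 + 1/\sigma_P^2)^{-1},
% which never exceeds either unisensory variance: this is the sense in
% which the brain "makes best use of available sensory estimates".

On this formulation, reweighting (adjusting w_V and w_P as cue reliabilities change) and realignment (shifting the estimates themselves after a spatial mismatch) are distinct responses to perturbation, a distinction the panel's framing appears to draw on.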

Music

T4.1 Rapid improvement of audiovisual simultaneity perception after short-term music training
Petrini, K., Di Mauro, M., Waters, G. & Jicol, C.
Department of Psychology, University of Bath

Several studies have shown that the ability to detect audiovisual simultaneity is markedly better in musicians than in non-musicians (e.g., Lee and Noppeney, 2011). However, the amount of training required to achieve an improvement in audiovisual simultaneity precision is still unknown. Here we examined, in two experiments, whether short-term training with a musical instrument would improve audiovisual simultaneity precision. In the first, 13 participants were trained on the drums for two hours in total, with a one-hour session in each of two separate weeks. Another group of 13 participants passively observed the trainer playing the drums. Before and after the training,…

Haptics and Body Schema

T5.1 Electrophysiological Evidence for the Effect of Tool Use on Visuo-Tactile Integration in Near and Far Space
Gherri, E., O'Dowd, A. & Forsberg, A.
University of Edinburgh

Neuropsychological and behavioural evidence in humans suggests that the representation of the body, and of the multisensory space near it, is modulated by the active use of long tools. This might indicate that the tool becomes part of the body representation, extending near space into far space. However, little is known about the underlying neural mechanisms, and recent studies have suggested that tool-mediated effects on visuo-tactile integration in far space are simply due to the salient tip of the tool, which attracts visual attention in far space. Here, we investigate whether the electrophysiological correlates of visuo-tactile integration in…

Audio-visual substitutions and illusions

T2.1 Training-induced plasticity with a visual-to-auditory conversion system: seeing the thunder while still hearing it
Auvray, M., Arnold, G., & Pesnot-Lerousseau, J.
CNRS - ISIR

William James hypothesized that, if our eyes were connected to the auditory brain areas, and our ears to the visual brain areas, we would "hear the lightning and see the thunder" [1]. Research suggests that modality-specific brain areas, such as the visual cortex, can process auditory stimuli, for instance after brain alteration (e.g., the rewired ferret brain) or sensory deprivation (e.g., blindness). The study we conducted aimed to investigate this question behaviourally, using a non-invasive technique of sensory plasticity. The participants learned to use a visual-to-auditory sensory substitution device, which translates visual images recorded by a…

Developmental Perspectives

T3.1 Quantifying the weights of multisensory influences on postural control across development
Schmuckler, M. A.
University of Toronto Scarborough

Balance control is fundamentally a multisensory process. Starting in infancy, people are sensitive to a variety of perceptual inputs for controlling balance, including the proprioceptive and kinesthetic inputs traditionally believed to control balance, along with both visual (e.g., presence versus absence of visual input, imposed optic flow) and haptic (e.g., light fingertip contact) information. Given such findings, one of the principal questions now facing posture researchers is how to quantify the weighting, and potential reweighting, of sensory inputs across varying task environments and across developmental time. Work in my laboratory over the years has explored the impact of a variety of such sensory components in different task…

Multisensory Integration & Aging

Organizer: Jeannette R. Mahoney, Albert Einstein College of Medicine

Abstract: The ability to successfully integrate simultaneous information relayed across multiple sensory systems is an integral aspect of daily functioning. Unisensory impairments have individually been linked to slower gait, functional decline, increased risk of falls, cognitive decline, and worse quality of life in the elderly. To date, however, relatively few studies have set out to determine how multisensory integration processes change with increasing age. In what follows, we will discuss recent aging work investigating: 1) the temporal binding window of integration; 2) susceptibility to the sound-induced flash illusion; and 3) differential visual-somatosensory integration effects. Our overall objective is to demonstrate the clinical-translational value of multisensory integration effects in predicting important motor outcomes such as balance and falls.

S6.1 Temporal Integration of…

Multisensory Integration and the Body

Organizer: Jared Medina, University of Delaware

Abstract: Body perception is inherently multisensory, with information from various senses combining to create a coherent sense of the body. Along with the well-explored problem of multisensory integration across different modalities, other factors also influence multisensory integration of the body. For example, information from different spatial representations, each with its own frame of reference, needs to be weighted and integrated. Furthermore, top-down information from stored representations of the body may be combined with bottom-up sensory input from different modalities, leading to perception. In this symposium, we will review recent evidence addressing how information from different spatial representations and modalities is integrated. Our first set of speakers will describe recent results highlighting how spatial representations of the body influence the processing of tactile cues and motor…

The Role of Experience in the Development of Multimodal Integration

Organizer: Daphne Maurer, McMaster University

Abstract: There are major postnatal changes in multimodal integration. This symposium will consider the role of experience in shaping those changes. First, David Lewkowicz will discuss normal human development during infancy. He will describe how the rudimentary abilities present at birth are transformed over the first year of life by experience with specific languages and specific types of faces. Second, Daphne Maurer will discuss changes in the development of audiovisual integration in patients treated for dense congenital cataracts in one or both eyes, even when vision was restored during early infancy. Those results suggest that crossmodal re-organization begins near birth but is compromised differentially by early deprivation to one versus both eyes. Third, Anu Sharma will consider a similar issue for patients deaf from…