The relationship of crossmodal correspondences to language

Organizers: Krish Sathian (Penn State College of Medicine, Milton S. Hershey Medical Center) & Charles Spence (Oxford University)  Abstract: Crossmodal correspondences constitute a very active field of study. A major issue is the nature of their relationship to language. In this symposium, we bring together a number of speakers to address this issue, based on recent work. Charles Spence, the symposium co-organizer, will lead off with introductory remarks. Next, Laura Speed will discuss the role of language in odor-color correspondences in synesthetes and non-synesthetes. This will be followed by a review of potential mechanisms underlying sound symbolism by David Sidhu. Finally, Krish Sathian, the symposium organizer, will present findings on the neural basis of sound-symbolic crossmodal correspondences.   S3.1 Introductory remarks Charles Spence Oxford University   S3.2 Language and…

The Multisensory Space – Perception, Neural representation and Navigation

Organizers: Daniel Chebat (Ariel University) & Shachar Maidenbaum (Columbia University)  Abstract: We perceive our surrounding environment using all of our senses in parallel, building a rich multisensory representation. This multisensory representation can be used to move through our environment and to interact spatially with our surroundings. Vision is the sense best suited to spatial perception, but how essential is it to the process by which we navigate? And what happens when it is lacking, or unreliable? In this symposium we wish to explore different aspects of this process, the role of vision and visual experience in guiding it, and the neural correlates thereof. We have put together a strong panel of speakers who have devoted their careers to the study of perceptual and spatial learning, the processing…

Where is my hand? On the flexibility of multisensory spatial calibration to encode hand positions and movements

Organizer: Denise Henriques, York University  Abstract: The brain can estimate hand position visually, from an image on the retina, and proprioceptively, from sensors in the joints, muscles, and skin. Neither estimate is invariant; both are subject to changes in lighting, movement history, and other factors. The brain is thought to make best use of available sensory estimates by weighting, aligning, and combining them to form an integrated estimate. Multisensory integration gives us flexibility to cope with the frequent sensory perturbations we experience. For example, we realign one or both sensory estimates when they become spatially mismatched, as when wearing glasses that induce optical distortions, or when forces are exerted on the hand, including those of gravity. This panel will explore recent experimental and theoretical evidence to better understand how vision, proprioception,…

Multisensory Integration & Aging

Organizer: Jeannette R. Mahoney, Albert Einstein College of Medicine Abstract: The ability to successfully integrate simultaneous information relayed across multiple sensory systems is an integral aspect of daily functioning. Unisensory impairments have been individually linked to slower gait, functional decline, increased risk of falls, cognitive decline, and worse quality of life in the elderly. To date, however, relatively few studies have set out to determine how multisensory integration processes change with increasing age. In what follows, we will discuss recent aging work investigating: 1) the temporal binding window of integration; 2) susceptibility to the sound-induced flash illusion; and 3) differential visual-somatosensory integration effects. Our overall objective is to demonstrate the clinical-translational value of multisensory integration effects in predicting important motor outcomes like balance and falls.   S6.1 Temporal Integration of…

Multisensory Integration and the Body

Organizer: Jared Medina, University of Delaware Abstract: Body perception is inherently multisensory, with information from various senses combining to create a coherent sense of the body. Along with the well-explored problem of multisensory integration across different modalities, other factors also influence multisensory integration of the body. For example, information from different spatial representations, each with its own frame of reference, needs to be weighted and integrated. Furthermore, top-down information from stored representations of the body may be combined with bottom-up sensory input from different modalities, leading to perception. In this symposium, we will review recent evidence addressing how information from different spatial representations and modalities is integrated. Our first set of speakers will describe recent results highlighting how spatial representations of the body influence the processing of tactile cues and motor…

The Role of Experience in the Development of Multimodal Integration

Organizer: Daphne Maurer, McMaster University Abstract: There are major postnatal changes in multimodal integration. This symposium will consider the role of experience in shaping those changes. First, David Lewkowicz will discuss normal human development during infancy. He will describe how the rudimentary abilities present at birth are transformed over the first year of life by experience with specific languages and specific types of faces. Second, Daphne Maurer will discuss changes in the development of audiovisual integration in patients treated for dense congenital cataracts in one or both eyes, even when vision was restored during early infancy. Those results suggest that crossmodal re-organization begins to occur near birth but is compromised differentially by early deprivation to one versus both eyes. Third, Anu Sharma will consider a similar issue for patients deaf from…

Progress in Vestibular Cognition

Organizer: Elisa Raffaella Ferré, Royal Holloway University of London  Abstract: The vestibular system is essential for successful interactions with the environment, providing an absolute reference for orientation and gravity. Vestibular information has traditionally been considered a cue for basic behaviours, such as balance, oculo-motor adjustments, and self-motion. However, recent studies have highlighted the fundamental role played by the vestibular system in brain functions beyond reflexes and postural adjustment. These include vestibular contributions to several aspects of cognition, including multisensory perception, spatial representation, emotion, attention, and body models. This symposium brings together international experts with their own unique interests in the vestibular system. Laurence Harris will present experimental results on vestibular-somatosensory interaction, highlighting its role in perceiving the timing of sensory events. Elisa Ferré will focus on how vestibular inputs integrate…

Recovering from Blindness: Learning to see using multisensory information

Organizers: Marc Ernst & Irene Senna, Ulm University Abstract: Would a person born blind who regained sight via some surgical intervention be able to learn to ‘see’? That is, would that individual be able to interpret the images that reach the retina and combine them with other senses in order to build a multisensory representation of the world, and to interact with the environment? Surgically treating congenitally blind individuals (e.g., those born with bilateral cataract) after extended periods of blindness provides a unique opportunity to study the development of visual skills and the ability to combine vision with other senses. For example, whether newly sighted individuals can learn to use their vision to recognize objects previously recognized only through touch, and to build a multimodal representation of objects, is still an open question. Behavioural…