Opening Keynote Lecture

Multisensory Flavour Perception: Insights for/from the Spatial Senses
Prof. Charles Spence, Head of the Crossmodal Research Laboratory, Oxford University

Food is both fundamental to our survival and fun to study. Furthermore, there is nothing that gets your brain going quite like the sight/smell of one's favourite food when hungry.[1] And, as the eminent British biologist J. Z. Young once noted, it is perhaps no coincidence that the mouth and the brain lie so close together in most species.[2] No wonder, then, that the brain rapidly estimates the energy density of potential food sources in the environment and devotes our limited attentional resources accordingly.[3] At the same time, however, it is much harder, practically speaking, to study flavour (i.e., the chemical senses) than it is to study the spatial senses of vision, hearing, and touch. This…

Developmental Perspectives

T3.1 Quantifying the weights of multisensory influences on postural control across development
Schmuckler, M. A., University of Toronto Scarborough

Balance control is fundamentally a multisensory process. Starting in infancy, people are sensitive to a variety of perceptual inputs for controlling balance, including the proprioceptive and kinesthetic inputs traditionally believed to control balance, along with both visual (e.g., presence versus absence of visual input, imposed optic flow information) and haptic (e.g., light fingertip contact) information. Given such findings, one of the principal questions now facing researchers interested in posture involves quantifying the weighting, and potential reweighting, of sensory inputs across varying task environments and across developmental time. Work in my laboratory over the years has explored the impact of a variety of such sensory components in different task…
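The abstract does not specify its model, but as an illustrative sketch of what "quantifying the weighting" of sensory inputs can mean formally, the standard maximum-likelihood cue-combination account assigns each cue a weight proportional to its relative reliability (inverse variance). A minimal Python sketch with hypothetical numbers:

```python
import numpy as np

def mle_cue_weights(variances):
    """Reliability-based weights: w_i = (1/var_i) / sum_j (1/var_j)."""
    reliabilities = 1.0 / np.asarray(variances, dtype=float)
    return reliabilities / reliabilities.sum()

def combined_estimate(estimates, variances):
    """Weighted average of single-cue estimates; the combined variance is
    1 / sum_j (1/var_j), i.e. never worse than the best single cue."""
    w = mle_cue_weights(variances)
    combined_var = 1.0 / np.sum(1.0 / np.asarray(variances, dtype=float))
    return float(np.dot(w, estimates)), combined_var

# Hypothetical single-cue estimates of body sway (degrees) and their noise levels
estimates = [1.2, 0.8, 1.0]        # e.g. visual, proprioceptive, haptic
variances = [0.04, 0.09, 0.25]     # a less noisy cue receives a larger weight
print(mle_cue_weights(variances))          # ~[0.62, 0.28, 0.10]
print(combined_estimate(estimates, variances))
```

Under this kind of account, developmental reweighting would show up as systematic changes in these weights across age or task, rather than as a change in the combination rule itself.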

Multisensory Integration & Aging

Organizer: Jeannette R. Mahoney, Albert Einstein College of Medicine

Abstract: The ability to successfully integrate simultaneous information relayed across multiple sensory systems is an integral aspect of daily functioning. Unisensory impairments have been individually linked to slower gait, functional decline, increased risk of falls, cognitive decline, and worse quality of life in the elderly. To date, however, relatively few studies have set out to determine how multisensory integration processes change with increasing age. In what follows, we will discuss recent aging work investigating: 1) the temporal binding window of integration; 2) susceptibility to the sound-induced flash illusion; and 3) differential visual-somatosensory integration effects. Our overall objective is to demonstrate the clinical-translational value of multisensory integration effects in predicting important motor outcomes such as balance and falls.

S6.1 Temporal Integration of…
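The abstract does not describe its analysis, but one common way to quantify a temporal binding window is to fit a Gaussian to the proportion of "synchronous" judgements across audio-visual onset asynchronies and report the width of the fit. A minimal sketch with hypothetical data:

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(soa, amplitude, mu, sigma):
    """Proportion of 'synchronous' responses as a function of audio-visual SOA (ms)."""
    return amplitude * np.exp(-(soa - mu) ** 2 / (2 * sigma ** 2))

# Hypothetical simultaneity-judgement data (SOA in ms; negative = audio leads)
soas = np.array([-300, -200, -100, 0, 100, 200, 300], dtype=float)
p_sync = np.array([0.15, 0.40, 0.80, 0.95, 0.85, 0.50, 0.20])

(amplitude, mu, sigma), _ = curve_fit(gaussian, soas, p_sync, p0=[1.0, 0.0, 100.0])

# One convention: report the window as the full width at half maximum (FWHM ~ 2.355 * sigma)
print(f"centre = {mu:.0f} ms, width (FWHM) = {2.355 * abs(sigma):.0f} ms")
```

An age-related widening of a window estimated this way is the kind of change the talks in this symposium address.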

Multisensory Integration and the Body

Organizer: Jared Medina, University of Delaware

Abstract: Body perception is inherently multisensory, with information from various senses combining to create a coherent sense of the body. Along with the well-explored problem of multisensory integration across different modalities, other factors also influence multisensory integration of the body. For example, information from different spatial representations, each with its own frame of reference, needs to be weighted and integrated. Furthermore, top-down information from stored representations of the body may be combined with bottom-up sensory input from different modalities, leading to perception. In this symposium, we will review recent evidence addressing how information from different spatial representations and modalities is integrated. Our first set of speakers will describe recent results highlighting how spatial representations of the body influence the processing of tactile cues and motor…

The Role of Experience in the Development of Multimodal Integration

Organizer: Daphne Maurer, McMaster University

Abstract: There are major postnatal changes in multimodal integration. This symposium will consider the role of experience in shaping those changes. First, David Lewkowicz will discuss normal human development during infancy. He will describe the transformation of the rudimentary abilities present at birth over the first year of life by experience with specific languages and specific types of faces. Second, Daphne Maurer will discuss changes in the development of audiovisual integration in patients treated for dense congenital cataracts in one or both eyes, even when vision was restored during early infancy. Those results suggest that crossmodal re-organization begins to occur near birth but is compromised differentially by early deprivation to one versus both eyes. Third, Anu Sharma will consider a similar issue for patients deaf from…

Progresses in Vestibular Cognition

Organizer: Elisa Raffaella Ferré, Royal Holloway University of London

Abstract: The vestibular system is essential for successful interactions with the environment, providing an absolute reference for orientation and gravity. Vestibular information has traditionally been considered a cue for basic behaviours, such as balance, oculo-motor adjustments, and self-motion. However, recent studies have highlighted the fundamental role played by the vestibular system in brain functions beyond reflexes and postural adjustment. These include vestibular contributions to several aspects of cognition, including multisensory perception, spatial representation, emotion, attention, and body models. This symposium brings together international experts, each with their own unique interests in the vestibular system. Laurence Harris will present experimental results on vestibular-somatosensory interaction, highlighting its role in perceiving the timing of sensory events. Elisa Ferré will focus on how vestibular inputs integrate…

Poster session 2

P2.1 Virtual Reality modulates Vestibular Brain Responses
Gallagher, M., Dowsett, R. & Ferrè, E.R., Royal Holloway University of London

Virtual reality (VR) has become increasingly popular in the past decade. Key to the user's VR experience are multimodal interactions involving all senses. However, sensory information for self-motion is often conflicting in VR: while vision signals that the user is moving in a certain direction with a certain acceleration (i.e. vection), the vestibular organs provide no cues for linear or angular acceleration. To solve this conflict, the brain might down-weight vestibular signals. Here we recorded participants' physiological responses to actual vestibular events while they were exposed to VR-induced vection. We predicted that exposure to a few minutes of linear vection would modulate vestibular processing. Vestibular-evoked myogenic potentials (VEMPs)…

Poster session 1

P1.1 The prevalence of between-hands spatial codes in a tactile Simon task
Gherri, E., & Theodoropoulos, N., University of Edinburgh

When a tactile stimulus is presented to our body, its spatial location is automatically coded, modulating behavioural performance even when space is completely task-irrelevant (the tactile Simon effect). Here we present a series of studies investigating whether multiple spatial codes are created for the location of tactile stimuli in a tactile Simon task. In the two-hands task (Exp. 1 and 3), in which stimuli were presented at one of four possible locations (left and right finger on the left and right hand), the tactile target was automatically coded according to the location of the stimulated hand (between-hands Simon effect) but not according to the location of…
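For readers unfamiliar with the measure, a Simon effect is conventionally quantified as the reaction-time cost on trials where the task-irrelevant stimulus location conflicts with the response side. A minimal sketch with hypothetical reaction times (not data from this study):

```python
import numpy as np

def simon_effect(rt_congruent, rt_incongruent):
    """Simon effect = mean RT on incongruent trials minus mean RT on congruent trials."""
    return np.mean(rt_incongruent) - np.mean(rt_congruent)

# Hypothetical reaction times (ms): stimulus location matches vs. mismatches the response side
rt_congruent = [412, 430, 398, 441, 405]
rt_incongruent = [455, 468, 447, 472, 439]
print(f"Simon effect: {simon_effect(rt_congruent, rt_incongruent):.0f} ms")
```

Computing this separately for hand-based and finger-based location codes is one way to ask, as the abstract does, which spatial codes are formed for a tactile target.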