Multisensory Integration and the Body

Organizer: Jared Medina, University of Delaware

Abstract: Body perception is inherently multisensory, with information from various senses combining to create a coherent sense of the body. Along with the well-explored problem of multisensory integration across different modalities, other factors also influence multisensory integration of the body. For example, information from different spatial representations, each with its own frame of reference, needs to be weighted and integrated. Furthermore, top-down information from stored representations of the body may be combined with bottom-up sensory input from different modalities, leading to perception. In this symposium, we will review recent evidence addressing how information from different spatial representations and modalities is integrated. Our first set of speakers will describe recent results highlighting how spatial representations of the body influence the processing of tactile cues and motor control. Our second set of speakers will describe recent results highlighting how representations of the body, and of space more generally, can be manipulated through vision, galvanic vestibular stimulation, and tool use. Collectively, our session will leverage novel technologies, brain stimulation, computational modeling, and behavior to provide an updated perspective on the interplay between body representations and multimodal cue processing.

S5.1 Feeling a touch to the hand on the foot

Stephanie Badde1,2*, Brigitte Röder2, & Tobias Heed2,3
1 Department of Psychology, New York University
2 Biological Psychology and Neuropsychology, University of Hamburg
3 Biopsychology and Cognitive Neuroscience, Bielefeld University


Where we perceive a touch presumably depends on somatotopic maps that code its location on the skin surface, as well as on parietal maps laid out to represent tactile locations in external space. However, the location of a touch is also characterized by non-spatial features such as the type of the stimulated limb. We found that healthy adults sometimes attribute brief tactile stimuli to entirely wrong limbs. These errors were used to investigate the contributions of somatotopic, external, and feature-based coding to touch localization. In each trial, two randomly selected limbs were successively stimulated, and participants indicated the perceived location of the first touch, choosing from all four limbs. Hands and feet were positioned uncrossed or crossed to disentangle the influence of somatotopically and externally coded stimulus location. Although participants were mostly accurate, they regularly reported touch at non-stimulated limbs. These phantom errors occurred preferentially at the limb homologous to the stimulated limb or at limbs on the same body side as the physical touch, with the former being much more frequent when the stimulated limb was crossed. Moreover, phantom errors did not depend on the external-spatial alignment of the stimulus and the erroneously reported limb, but on the alignment of that limb with the stimulus’s body side. The phenomenon of phantom errors indicates that touch can be perceived at a limb without any peripheral tactile input to the respective somatosensory map region or adjacent areas. Moreover, the pattern of phantom errors suggests that although body posture influences tactile localization, touch is attributed to a limb based on anatomically defined features that abstract from spatial coding.


S5.2 Canonical computations mediate cue combination in touch

Jeffrey M. Yau* & Md. Shoaibur Rahman
Department of Neuroscience, Baylor College of Medicine


Our remarkable ability to sense and manipulate objects bimanually likely requires the integration of sensory inputs that signal both the local events occurring at the fingertips (touch) and the relative locations of the hands (proprioception). Yet, the neural computations that support bimanual touch are unknown. Here, we show that tactile signals experienced on one hand systematically influence the perceived frequency of vibration cues experienced on the other hand. Moreover, the strength of bimanual interactions in the frequency domain, indexed by bias and threshold changes, varies according to the proximity of the hands: hands held farther apart interact less than hands held close together. Surprisingly, a different pattern of bimanual interactions is observed when subjects perform an analogous intensity discrimination task: distractors only attenuate the perceived intensity of a vibration cue, and the magnitude of attenuation is independent of hand location. These idiosyncratic cue combination patterns could be well explained by distinct computational models. Our collective results reveal feature-specific cue combination patterns in bimanual touch that are consistent with canonical computations such as normalization.
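For reference, the canonical normalization computation invoked here is commonly written in the following textbook form (Carandini & Heeger, 2012); the notation is illustrative and is not the specific model fit in this work:

$$
R_i \;=\; \frac{d_i^{\,n}}{\sigma^{\,n} + \sum_{j} d_j^{\,n}}
$$

where $d_i$ is the driving input to unit $i$ (e.g., the vibration signal from one hand), the sum over $j$ pools inputs across both hands, $\sigma$ is a semi-saturation constant, and $n$ sets the nonlinearity. Pooling distractor input in the denominator yields response attenuation of the kind described for the intensity task.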


S5.3 Moving with a growing body: development of visual-proprioceptive integration for hand motor control

Marie Martel* & Tobias Heed
Faculty of Psychology and Sports Science and Center of Excellence in Cognitive Interaction Technology, Bielefeld University


Spatial integration across sensory modalities supports fine-tuned motor control. During movement, both vision and proprioception signal the location of our body parts. Yet, we know little about how interactions between multisensory integration and motor control develop in humans. In children, some multisensory functions develop over a protracted period, and adult-like weighted integration that conforms to optimality principles is not evident until at least 8-10 years of age. At present, it is unclear whether a similarly protracted development occurs for the use of multisensory information in motor control.
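For context, the optimality principle referred to here is typically the reliability-weighted (maximum-likelihood) combination rule (e.g., Ernst & Banks, 2002); the notation below is illustrative rather than the authors’ own:

$$
\hat{s}_{VP} \;=\; w_V\,\hat{s}_V + w_P\,\hat{s}_P,\qquad
w_V \;=\; \frac{1/\sigma_V^2}{1/\sigma_V^2 + 1/\sigma_P^2},\quad w_P = 1 - w_V
$$

Under this rule the combined visual-proprioceptive estimate is never less precise than the better single cue, since $\sigma_{VP}^2 = \sigma_V^2\sigma_P^2/(\sigma_V^2 + \sigma_P^2)$; this is the benchmark against which children’s integration is typically assessed.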

Here, we investigated the performance of 4-10-year-old children in unimanual and bimanual motor tasks with proprioceptive, visual, and proprioceptive-visual input. Children operated an apparatus with two handles that could be moved in circles; handle positions could be displayed as cursors on a cover over the workspace. Children had to coordinate symmetrical circular movements with their two unseen hands (proprioceptive only) or additionally received cursor feedback (proprioceptive-visual). In a third condition, they coordinated one hand with the circular movement of a cursor (visual only). In two further conditions, we tested whether children could maintain an asymmetrical rhythm (two circles with one hand and three with the other) when visual feedback was veridical or was modified to appear symmetrical.

Performance during symmetrical coordination was adequate from 5 years of age onward when proprioceptive information was available, but it was not improved by adding visual information. Performance with vision alone was markedly impaired compared to the proprioceptive conditions for the younger groups. Moreover, asymmetrical rhythms were not adequately performed even with symmetrical visual feedback.

Our results reveal a lack of visual-proprioceptive integration in children that contrasts with the dedicated use of vision for motor control in adults. This finding suggests that the development of multisensory integration for motor control may continue well into adolescence.


S5.4 Influence of stored body representations on multisensory integration

Jared Medina* & Yuqi Liu
Department of Psychological and Brain Sciences, University of Delaware


Information from different sensory modalities needs to be combined to form a coherent multisensory percept. In addition to known principles of optimal integration based on the precision of each modality, prior knowledge from experience may also influence multisensory integration. We review how information from stored representations of the properties of the body also contributes to multisensory integration, using examples from the mirror box illusion. Over multiple experiments, we presented participants with variants of the mirror box illusion in which the participant’s limbs are positioned in opposing postures. After synchronous bimanual movement, participants report an illusory rotation of their hidden hand, such that they feel as if it is in the viewed (mirror) position. First, even though participants did not rotate their hands, the illusion was strongly modulated by stored knowledge regarding the biomechanical constraints of the body. Second, we examined whether there is differential weighting not only for different modalities, but also for information from different types of representations. We found more integration for congruent movements in externally-based versus motor-based frames of reference. Third, examining the time course of this illusion, we found evidence for two different mechanisms of multisensory integration of the body: immediate visual capture of perceived limb position, or a gradual shift from the proprioceptively-defined to the visually-defined limb position. We discuss how prior information from body representations can be incorporated into existing models of multisensory integration.
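One minimal way to see how prior information from body representations could enter such models is the Gaussian-Bayesian extension of reliability weighting, in which a stored prior over hand position contributes an additional precision-weighted term; the notation is illustrative and not the authors’ specific proposal:

$$
\hat{s}_{\text{post}} \;=\; \frac{\hat{s}_V/\sigma_V^2 \;+\; \hat{s}_P/\sigma_P^2 \;+\; \mu_{\text{body}}/\sigma_{\text{body}}^2}{1/\sigma_V^2 \;+\; 1/\sigma_P^2 \;+\; 1/\sigma_{\text{body}}^2}
$$

where $\mu_{\text{body}}$ and $\sigma_{\text{body}}^2$ denote the mean and variance of the stored body representation (e.g., biomechanically plausible hand postures). A tight prior over implausible postures would damp the illusory rotation, consistent with the biomechanical modulation described above.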


S5.5 Body size perception in healthy adults can be manipulated using galvanic vestibular stimulation and distorted visual exposure

Sarah D’Amour*, Deborah Alexe, Isabella Lim, & Laurence R. Harris
Centre for Vision Research, York University


The brain has an implicit, internal representation of the body that is maintained, adjusted, and updated in response to changes in body shape during growth and development. Here, we investigated how the accuracy of perceived body size may be affected when body representation is manipulated. We attempted to alter body perception either by having participants look at a distorted image of their own body for five minutes or by applying disruptive galvanic vestibular stimulation (dGVS, sum of sines). Participants were tested with the body or body parts presented from different viewpoints to see if performance changed for familiar and unfamiliar views. The Body Shape Questionnaire (Cooper et al., 1986) was also administered. Accuracy was measured using a novel psychophysical method for determining perceived body size that taps into the implicit body representation (D’Amour and Harris, 2017). The time course of visual adaptation effects was measured, and control experiments were carried out using a familiar inanimate object (e.g., a Coke can). Manipulating body representation using both visual and vestibular methods resulted in changes to perceived body size accuracy. These results provide insight into how the brain represents the body, revealing that body size perception is flexible and plastic.


S5.6 Two flavors of tool embodiment

Luke E. Miller
Lyon Neuroscience Research Center


A hallmark feature of the sense of our body is that it is incredibly flexible. Perhaps the most striking example of this is the fact that the brain readily incorporates external objects into the way it represents and controls the body, a group of phenomena collectively referred to as “embodiment”. For example, using a tool for only a few minutes extends the multisensory body representations underlying both perception (e.g., Miller et al., 2014) and action (e.g., Cardinali et al., 2009). “Tool embodiment”, as this modulation is often called, has been studied in detail over the last two decades (Martel et al., 2016). However, there is a second “flavor” of tool embodiment that has received far less attention: the user’s ability to use a tool as a functional extension of their body. Here, we investigated whether the nervous system treats a tool as a sensory extension of the body. Using a classic tactile localization paradigm, we found that users can sense where an object contacts the surface of a wooden rod with surprising accuracy, just as is possible on the skin itself. Follow-up experiments showed that: (1) this sensory ability does not require prior experience with the rod; (2) the ability to predict the vibratory dynamics of the rod is crucial; (3) users can flexibly update their sensing behavior to wield novel tools; and (4) the tool (like the body) is represented in both egocentric and tool-centered spatial reference frames. Future research in our lab will investigate how the two flavors of embodiment are related. Doing so will provide a more complete picture of how embodiment emerges from the functional coupling between technological and neural levels of information processing.



Saturday, June 16