Where is my hand? On the flexibility of multisensory spatial calibration to encode hand positions and movements

Organizer: Denise Henriques, York University

Abstract: The brain can estimate hand position visually, from an image on the retina, and proprioceptively, from sensors in the joints, muscles, and skin. Neither estimate is invariant; both are subject to changes in lighting, movement history, and other factors. The brain is thought to make the best use of available sensory estimates by weighting, aligning, and combining them to form an integrated estimate. Multisensory integration gives us the flexibility to cope with the frequent sensory perturbations we experience: for example, realigning one or both sensory estimates when they become spatially mismatched, as when wearing glasses that induce optical distortions, or when forces are exerted on the hand, including those of gravity. This panel will explore recent experimental and theoretical evidence to better understand how vision, proprioception, and even vestibular information interact and influence arm motor control. The panelists represent a broad range of approaches (theoretical, behavioral, neurophysiological), traditions (basic and computational neuroscience, psychology, kinesiology), and hand movement paradigms (2D, 3D, bimanual, unimanual). Yet their combined work emphasizes how the multiple senses differently and flexibly contribute to the encoding of hand position and movements.
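
As a rough illustration of the "weighting and combining" idea described above, the sketch below implements the standard minimum-variance (maximum-likelihood) cue-combination rule often used to model such integration. The numbers are invented placeholders, not values from any panelist's study.

```python
# Minimum-variance (maximum-likelihood) combination of a visual and a
# proprioceptive estimate of hand position. All numbers are illustrative
# placeholders, not values from any study in this panel.
x_vis, var_vis = 10.0, 1.0    # visual estimate (cm) and its variance
x_prop, var_prop = 12.0, 4.0  # proprioceptive estimate (cm) and its variance

# Each cue is weighted by its relative reliability (inverse variance).
w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_prop)
w_prop = 1 - w_vis

x_combined = w_vis * x_vis + w_prop * x_prop
var_combined = 1 / (1 / var_vis + 1 / var_prop)  # lower than either cue alone

print(f"combined estimate: {x_combined:.2f} cm (variance {var_combined:.2f})")
# -> combined estimate: 10.40 cm (variance 0.80)
```

Under this rule the combined estimate is pulled toward the more reliable cue, and its variance is always lower than that of either cue alone, which is why integration is advantageous.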

 


S7.1 Motor cortex effects of recalibrating visuo-proprioceptive estimates of hand position

Hannah J. Block*, Felipe Munoz-Rubke, Jasmine L. Mirdamadi
Indiana University


Spatial realignment of visual and proprioceptive estimates of hand position can occur in response to a perturbation. For example, when viewing the hand underwater while washing dishes, light refraction by the water shifts the visual estimate of the hand away from the proprioceptive estimate. The brain compensates for such a misalignment by realigning the visual and proprioceptive estimates of the hand. Such perceptual learning might be expected to affect the hand position estimate used in motor planning, raising the question of whether multisensory integration and motor control share a common sensorimotor map. Our recent experiments support this hypothesis. Using transcranial magnetic stimulation (TMS), we detected excitability changes in the primary motor cortex (M1) index finger representation after subjects experienced misaligned, but not veridical, visuo-proprioceptive information about the index finger (Munoz-Rubke 2017). Interestingly, subjects who realigned proprioception more than vision had decreased M1 excitability, while subjects who realigned vision more than proprioception had increased M1 excitability. This suggests a modality-specific neural mechanism, such as modulation of somatosensory cortex or dorsal-stream visual areas that impact M1. We next asked whether these effects were somatotopically focal: if misaligned information about the index finger is presented, would changes in M1 excitability be limited to that finger, or extend to the entire effector (including biceps and forearm)? The former might suggest a role for the subjects' locus of attention (on their misaligned finger position), while the latter might suggest that the brain generalizes perceptual learning to any motor representation that might be involved in finger positioning, i.e., the whole arm. Results support the first option: M1 changes were limited to the misaligned finger representation. These results suggest that visuo-proprioceptive realignment is associated with somatotopically focal physiological changes in the motor system, consistent with a common sensorimotor map for multisensory integration and motor control.


 

S7.2 Retention of implicit sensorimotor spatial recalibration

Erin K. Cressman, Stefan Maksimovic, Kristin-Marie Neville, Jean-Michel Bouchard
University of Ottawa


Sensorimotor changes are well documented following reaches with altered visual feedback of the hand. Specifically, reaches are adapted and proprioceptive estimates of felt hand position are shifted in the direction of the visual feedback experienced. While motor and sensory changes arise simultaneously, the contribution of proprioceptive recalibration to reach adaptation is unclear. Current research in our lab seeks to address this question by establishing the relationship between proprioceptive recalibration and implicit reach adaptation, once reach adaptation due to explicit knowledge and strategies has been accounted for.

In this presentation I will discuss a series of experiments in which we tracked proprioceptive recalibration and implicit reach adaptation over time and examined their retention in the short and long term. Implicit reach adaptation was promoted by manipulating the size of the visuomotor distortion introduced (i.e., the error signal experienced by participants) and the strategies provided. With respect to sensory changes, results reveal that (1) proprioceptive estimates of hand position develop over time, though initial changes occur immediately, (2) proprioceptive recalibration is limited in magnitude, such that it does not increase with additional training, and (3) retention of proprioceptive recalibration can be seen in the short term (1 day) and long term (4 days) in the form of recall and savings, respectively. Implicit reach adaptation is also shown to (1) take time to develop, and (2) be limited in magnitude, such that it does not increase with larger error signals. Finally, (3) retention of implicit reach adaptation is dependent on the extent of explicit reach adaptation experienced.

The similar time course of implicit proprioceptive recalibration and reach adaptation suggests that they are driven by the same slow process. Furthermore, the limited potential of this process to produce changes in the short and long term indicates that additional (explicit) processes are primarily responsible for reach adaptation.


 

S7.3 Where’s my hand? Afferent and efferent signals of hand position in visuomotor adaptation

Denise Henriques, Jennifer Ruttle, Shanaathanan Modchalingam, Chad Vachon and Marius ’t Hart
York University


Knowing the position of one’s limbs is essential for moving, and hence it makes sense that several signals provide information on limb position. Apart from vision, we use a predicted position given the efference copy of past movements, as well as afferent proprioceptive information. Both of these have been shown to change when we adapt our movements to altered visual feedback of the hand (i.e., a visuomotor rotation), but how much does each contribute to post-adaptation changes in where we localize our hand? By having participants localize their hand with and without an efference signal, we can tease the two contributions apart (see the sketch below). Here I will discuss our results investigating the afferent and efferent changes as a function of the size of the visual discrepancy, the type of training, and age. Furthermore, I will characterize the time course of the afferent and efferent changes by measuring them after every visuomotor training trial. In summary, we find that (1) active visuomotor training leads to changes in both efference-based predictions and proprioceptive estimates of hand location, but the change in prediction is smaller than that of perception; (2) this discrepancy is even larger in older adults due to a greater proprioceptive recalibration; (3) passive visual-proprioceptive exposure training leads only to changes in hand perception, not prediction; and (4) proprioception-based changes occur very rapidly, while efference-based contributions come about less rapidly, at about the same rate as motor changes. These findings imply that proprioceptive changes following visuomotor adaptation are separate from prediction-based changes, with a different process underlying each. This means that the plasticity in our estimates of limb position depends on multiple sources of feedback, and our brains likely take into account the peculiarities of the separate signals to arrive at a robust limb position signal.
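
A minimal sketch of the subtraction logic implied here, with invented numbers (the real experiments measure these localization shifts behaviorally):

```python
# Sketch of the subtraction logic for teasing apart afferent and efferent
# contributions to hand-localization shifts after visuomotor adaptation.
# Values are invented for illustration, not data from these experiments.

active_shift = 8.0   # localization shift (deg) after self-generated movement:
                     # reflects proprioception PLUS efference-based prediction
passive_shift = 6.0  # shift (deg) when the hand is moved passively:
                     # no efference copy, so proprioception only

proprioceptive_recalibration = passive_shift
prediction_update = active_shift - passive_shift

print(f"afferent (proprioceptive) change: {proprioceptive_recalibration:.1f} deg")
print(f"efferent (prediction) change:     {prediction_update:.1f} deg")
```

In this toy example the prediction update (2 deg) is smaller than the proprioceptive recalibration (6 deg), mirroring the pattern of results summarized above.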


 

S7.4 Models of visuo-vestibulo-proprioceptive integration for sensorimotor coordination

Joseph McIntyre (Tecnalia Research and Innovation, Ikerbasque Research Foundation), Michele Tagliabue (Université Paris Descartes)


In the last decade, the application of optimal statistical modelling to understanding sensory-motor integration has been very fruitful, allowing a much better understanding of how the CNS uses sensory information to control movements. Guided by these computational models, we have exploited experimental techniques based on virtual reality to probe the workings of the multisensory perceptual system. The combination of experimental and modelling approaches has been very effective for elucidating the origin of a number of behavioural phenomena whose causes were previously unclear.

In a body of completed and ongoing studies carried out using Earth-based models of weightlessness, we have exploited these statistical models to better understand how humans combine visual, proprioceptive and graviceptor information when aligning the hand in preparation for grasping an object. We argue for a distributed architecture for the processing of multisensory information in which error signals that drive the hand to the proper alignment are first computed separately in each sensory modality, and these individual errors are then combined in an optimal fashion (sketched in the toy example below). This contrasts with a more conventional conception in which a single, convergent representation of the target is subtracted from a single, convergent representation of the hand. In this symposium we will report our most recent results and collect feedback on experimental paradigms to be performed in the near future on board the International Space Station.
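
To make the architectural contrast concrete, here is a toy sketch of our own (not the authors' model or code). The key assumption, ours for illustration, is that re-encoding a signal into another modality's reference frame injects transformation noise; this is what makes the two schemes behave differently.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy contrast between the two architectures described above, with invented
# numbers. Assumption (ours, for illustration): re-encoding a signal into
# the other modality's reference frame adds transformation noise.

def transform(x, noise_sd=0.5):
    """Cross-modal re-encoding, modeled as adding independent noise."""
    return x + rng.normal(0.0, noise_sd)

target_vis, target_prop = 10.0, 10.0  # target encoded in each modality
hand_vis, hand_prop = 7.0, 8.0        # hand encoded in each modality
w_vis, w_prop = 0.6, 0.4              # illustrative reliability weights

# Distributed scheme: compute an error within each modality (no
# re-encoding needed), then combine the errors.
err_distributed = (w_vis * (target_vis - hand_vis)
                   + w_prop * (target_prop - hand_prop))

# Convergent scheme: build single target and hand representations in one
# (here visual) frame first, then subtract once. The proprioceptive
# signals must be re-encoded, picking up transformation noise on the way.
target_conv = w_vis * target_vis + w_prop * transform(target_prop)
hand_conv = w_vis * hand_vis + w_prop * transform(hand_prop)
err_convergent = target_conv - hand_conv

print(f"distributed error: {err_distributed:.2f}")
print(f"convergent error:  {err_convergent:.2f}  (differs by transformation noise)")
```

In a noiseless linear world the two schemes would give identical errors; they diverge only because cross-modal re-encoding is assumed to add variability, which is exactly the kind of cost that favors computing errors within each modality first.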


 

S7.5 Proprioceptive feedback utilization during visually guided movements: Impulse vs. limb-target regulation processes

Luc Tremblay, Rachel Goodman, Stephen Bested, Gerome Manson, John de Grosbois
University of Toronto


Clinical cases involving significant loss of proprioceptive input reveal extremely debilitating effects, indicating that these sensory signals are essential for performing activities of daily living. In addition, many researchers have reported on the important contribution of proprioceptive inputs to the control of voluntary movements towards visual targets. Accordingly, our laboratory has expanded its investigations of visual feedback utilization during movement execution (i.e., online control) to the proprioceptive modality. Our early studies employed tendon vibration manipulations during movement execution, which may have been contaminated by reflexive activity. Therefore, we subsequently leveraged the aftereffects of tendon vibration on Type Ia proprioceptive fibers by applying the vibration between trials. When participants performed a limb-matching task while blindfolded, adding tendon vibration between trials yielded increased joint-angle variability. Also, applying tendon vibration between upper-limb reaches towards a visual target yielded larger trajectory and endpoint variability. In contrast, between-trial tendon vibration failed to yield significant effects of the proprioceptive manipulation on limb-target regulation processes (e.g., correcting for a target jump). Altogether, our research indicates that vision and proprioception contribute significantly to goal-directed actions towards visual targets through online impulse regulation processes but not online limb-target regulation processes.


 


 


Sunday, June 17