Haptics and Body Schema

T5.1 Electrophysiological Evidence for the Effect of Tool Use on Visuo-Tactile Integration in Near and Far Space

Gherri, E., O’Dowd, A. & Forsberg, A.
University of Edinburgh                                                                                                  

Neuropsychological and behavioural evidence in humans suggests that the representation of the body and of the multisensory space near it is modulated by the active use of long tools, as if the tool becomes part of the body representation, extending near space into far space. However, little is known about the underlying neural mechanisms, and recent studies have suggested that tool-mediated effects on visuo-tactile integration in far space are simply due to the salient tip of the tool, which attracts visual attention in far space. Here, we investigate whether the electrophysiological correlates of visuo-tactile integration in near and far space are modulated by active tool use in healthy humans. ERPs elicited by visuo-tactile stimuli in near and far space were measured after short and long tool use. ERPs recorded close to the somatosensory cortex in the P100 time range were enhanced after long as compared to short tool use. No such modulation was observed over occipital areas, where effects of visual attention would be expected, ruling out a role of visual attention in these effects. This pattern of results provides the first electrophysiological evidence that the active use of long tools increases neural activity within somatosensory areas of the brain, in line with the idea of plastic changes to the representation of the body induced by tool use.

 

T5.2 Influences of Conflicting Visual Information on Body Schema and Haptic Perception of Hands

Okajima, K. & Hanxiao, L.
Yokohama National University                                                                                            

We conducted three experiments to examine how conflicting visual information modifies the body schema when the visual movement is quantitatively incoherent with the actual hand movement in a VR environment. Participants observed their CG hands through an HMD while repeatedly opening and closing their hands. We defined Gain as the ratio of the visually displayed angle between thumb and index finger to the actual angle between them. A Gain of 1 corresponds to a normal (coherent) environment, whereas a Gain other than 1 produces an abnormal (incoherent) relationship between hand motion and visual motion. First, participants repeatedly opened and closed their hands in an incoherent environment for a few minutes. They then adjusted the Gain until their hand movements felt natural. In Experiment 1, results showed that we can adapt to such a conflicting visual environment in a short period of time and build a new body schema related to the finger joints of both hands. In Experiment 2, we demonstrated that such a multimodal adaptation effect transfers between the right and left hands, suggesting that our brains contain a common mechanism for both hands that normalizes the relationship between actual hand motion and the visual motion of the fingers. Finally, in Experiment 3, we showed that the new body schema can also influence tactile perception: participants' estimates of the size of a small object held between two fingers changed after adapting to an environment in which actual hand motion and visual finger motion were incoherent. These results suggest that comfortable VR content and new VR interfaces can be developed by exploiting the plasticity of the human body schema.
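As a worked illustration of the Gain defined above (the angle symbols here are our own labels, not the authors' notation): Gain = θ_visual / θ_hand, where θ_hand is the actual thumb–index opening angle and θ_visual is the angle rendered in the HMD, so that θ_visual = Gain × θ_hand. For example, with Gain = 1.5 a physical opening of 40° would be displayed as a 60° opening, while Gain = 1 reproduces the physical angle exactly (the coherent case).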

 

T5.3 The interplay of visual and haptic cues in multisensory grasping

Volcic, R. & Camponogara, I.
New York University Abu Dhabi                                                                                            

The target of a grasping action is usually represented in visual coordinates, but it can also be specified by additional haptic cues when we grasp with one hand an object held by the other hand (e.g., reaching for a lid while grasping a jar). In such cases we can plan and execute our actions based on a combination of distance and size cues provided through both vision and haptics. Here, we investigate which of these visual and haptic signals are integrated in multisensory grasping. We contrasted unisensory visual and haptic conditions with visuo-haptic conditions in which vision was combined either with both haptic distance and haptic shape cues or with haptic distance cues only. Participants (n = 20) performed grasping movements toward five differently sized objects located at three egocentric distances. In the visual condition (V), participants had full vision of the object and the workspace. In the haptic condition (H), vision was prevented and the action was guided haptically by the other, non-grasping hand. In the visuo-haptic condition (VH-full), all visual and haptic cues were available throughout the movement. In an additional visuo-haptic condition (VH-distance), participants held a post that supported the object, instead of holding the object itself, while vision was fully available; haptics was therefore informative only about the egocentric distance of the object, not about its size. The availability of both vision and haptics (VH-full) produced faster grasping movements with considerably smaller maximum grip apertures than in the unisensory conditions. Critically, grasping movements in the VH-distance condition were indistinguishable from those in the VH-full condition. In sum, these findings show that haptic distance cues alone, in concert with visual signals, are sufficient for optimal multisensory grasping.

 

T5.4 A meaningful pairing between action and the senses

Juravle, G., Yon, D.J., Farnè, A., & Binsted, G.
Impact Team, INSERM U1028, CNRS UMR5292, Lyon Neuroscience Research Center, University Claude Bernard Lyon 

We present a series of experiments designed to test, in a naturalistic task, how the different senses inform our actions, with the specific goal of investigating meaningful pairings between sensory information and goal-directed actions. Participants were asked to report (with a movement of their choice) how many marbles they believed were in a jar based on auditory information only (Experiment 1a) or tactile information only (Experiment 2a). They also estimated marble numerosities while searching for a larger marble placed in the jar on half of the trials (Experiment 2b). A further experiment in which all senses were available was performed for both movement tasks (non-meaningful estimation only vs. meaningful search plus estimation; Experiments 3a-b), and, in separate experiments, participants estimated the number of marbles based on visual information only (Experiment 4) or while passively listening to a recording (Experiment 1b). Overall, participants significantly underestimated the number of marbles in the jar. For low numerosities, results indicated very good estimation performance for visual information, followed by touch, and lastly by audition. Meaningful action improved estimates in audition for higher numerosities, but not in touch. Nevertheless, when examining the movement profiles, meaningful searches based on tactile information only proved to be more ample and consistent, as evidenced by a higher mean amplitude difference between velocity peaks, a higher number of peaks, and a higher mean peak velocity. Importantly, as a complement to the substantial behavioural underestimation, movement variability significantly increased for larger marble numerosities. Our results demonstrate that sensory information intrinsic to the action performed guides goal-directed movement, defines the meaning of our actions, and thus informs cognition.

 
