Comparing haptic, visual, and computational similarity-based maps of novel, 3D objects
Poster Presentation
Theresa Cooke
Max Planck Institute for Biological Cybernetics
Christian Wallraven
Max Planck Institute for Biological Cybernetics
Heinrich Buelthoff
Max Planck Institute for Biological Cybernetics
Abstract ID Number: 26
Full text: Not available
Last modified: March 3, 2005
Abstract
Do similarity relationships between objects differ for vision and touch? We investigated this fundamental question using psychophysical experiments in which subjects rated similarity between objects presented either visually or haptically. The stimuli were novel, three-dimensional objects which parametrically varied in microgeometry (“texture”) and macrogeometry (“shape”). Multidimensional scaling (MDS) of the similarity data was used to reconstruct haptic and visual perceptual spaces. For both modalities, a two-dimensional perceptual space was found. Perceptual dimensions clearly corresponded to shape and texture. Interestingly, shape dominated for vision, whereas both shape and texture dominated for touch. In order to correlate these perceptual features with physical features, we extracted computational features from 3D object geometry and from 2D images. Similarity ratings were computed using these features and maps of the objects in these physical feature spaces were generated using MDS. Maps based on 2D subtraction, 2D correlation, and 3D subtraction correlated surprisingly well with visual maps. In contrast, maps based on edge detection and Gabor jets correlated poorly with both haptic and visual perceptual maps. This study presents a unique approach for quantitative analysis of visual and haptic similarity relationships, exploration of the physical basis of perceptual features, as well as perceptual validation of computational features.
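The MDS step described above takes a matrix of pairwise dissimilarity ratings and recovers low-dimensional coordinates for the objects. Below is a minimal sketch of classical (metric) MDS in NumPy, together with a hypothetical pixelwise "2D subtraction" dissimilarity of the kind the abstract mentions; the function names and the choice of mean absolute pixel difference are illustrative assumptions, not the authors' actual pipeline.

```python
import numpy as np

def classical_mds(dissim, n_dims=2):
    """Embed n objects in n_dims dimensions from an n x n dissimilarity matrix
    (classical/metric MDS via eigendecomposition of the double-centered matrix)."""
    n = dissim.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n          # centering matrix
    B = -0.5 * J @ (dissim ** 2) @ J             # double-centered Gram matrix
    eigvals, eigvecs = np.linalg.eigh(B)         # ascending eigenvalues
    order = np.argsort(eigvals)[::-1][:n_dims]   # keep the largest n_dims
    scale = np.sqrt(np.clip(eigvals[order], 0, None))
    return eigvecs[:, order] * scale             # n x n_dims coordinates

def subtraction_dissim(images):
    """Illustrative '2D subtraction' dissimilarity: mean absolute pixel
    difference between each pair of equally sized image arrays (assumption)."""
    n = len(images)
    D = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            D[i, j] = np.abs(images[i] - images[j]).mean()
    return D
```

A perceptual map is then obtained by feeding either the human similarity ratings or a feature-based dissimilarity matrix such as `subtraction_dissim` into `classical_mds` and inspecting the resulting 2D layout.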