Deep learning for tactile understanding from visual and haptic data

Title: Deep learning for tactile understanding from visual and haptic data
Publication Type: Conference Proceedings
Year of Publication: 2016
Authors: Gao, Y., Hendricks, L. A., Kuchenbecker, K. J., & Darrell, T.
Published in: IEEE International Conference on Robotics and Automation (ICRA)
Page(s): 536-543
Date Published: 05/2016
Publisher: IEEE
ISBN Number: 978-1-4673-8026-3
Accession Number: 16055322
Keywords: Convolution, Data models, Haptic interfaces, neural networks, Robots, Training, Visualization
Abstract

Robots which interact with the physical world will benefit from a fine-grained tactile understanding of objects and surfaces. Additionally, for certain tasks, robots may need to know the haptic properties of an object before touching it. To enable better tactile understanding for robots, we propose a method of classifying surfaces with haptic adjectives (e.g., compressible or smooth) from both visual and physical interaction data. Humans typically combine visual predictions and feedback from physical interactions to accurately predict haptic properties and interact with the world. Inspired by this cognitive pattern, we propose and explore a purely visual haptic prediction model. Purely visual models enable a robot to “feel” without physical interaction. Furthermore, we demonstrate that using both visual and physical interaction signals together yields more accurate haptic classification. Our models take advantage of recent advances in deep neural networks by employing a unified approach to learning features for physical interaction and visual observations. Even though we employ little domain specific knowledge, our model still achieves better results than methods based on hand-designed features.
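The abstract describes a unified deep-learning approach that fuses visual observations with physical-interaction (haptic) signals for multi-label haptic-adjective classification, but it does not give architectural details. The snippet below is only an illustrative PyTorch sketch of that kind of two-branch fusion model; the class name VisualHapticFusionNet, all layer sizes, the 24-adjective output, and the four-channel haptic input are assumptions made for illustration, not the authors' implementation.

import torch
import torch.nn as nn

class VisualHapticFusionNet(nn.Module):
    """Illustrative two-branch model: a visual branch over precomputed image
    features and a haptic branch over a physical-interaction time series are
    fused for multi-label prediction of haptic adjectives (e.g., compressible,
    smooth). Shapes and layer sizes are assumptions, not the paper's exact
    architecture."""

    def __init__(self, num_adjectives=24, haptic_channels=4):
        super().__init__()
        # Visual branch: stand-in for CNN image features (assumed 4096-d).
        self.visual_fc = nn.Sequential(nn.Linear(4096, 512), nn.ReLU())
        # Haptic branch: temporal convolutions over the raw interaction signal.
        self.haptic_conv = nn.Sequential(
            nn.Conv1d(haptic_channels, 32, kernel_size=7, stride=2), nn.ReLU(),
            nn.Conv1d(32, 64, kernel_size=5, stride=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),
        )
        self.haptic_fc = nn.Sequential(nn.Linear(64, 512), nn.ReLU())
        # Late fusion by concatenation, then one binary score per adjective.
        self.classifier = nn.Linear(512 + 512, num_adjectives)

    def forward(self, visual_feat, haptic_signal):
        # visual_feat: (B, 4096); haptic_signal: (B, haptic_channels, T)
        v = self.visual_fc(visual_feat)
        h = self.haptic_fc(self.haptic_conv(haptic_signal).squeeze(-1))
        return self.classifier(torch.cat([v, h], dim=1))  # one logit per adjective

# Multi-label training would use a per-adjective binary loss, e.g.:
#   loss = nn.BCEWithLogitsLoss()(logits, adjective_labels.float())

Dropping either branch (zeroing or omitting its features) gives the purely visual or purely haptic variant; the fused model corresponds to the combined visual-plus-physical-interaction setting the abstract reports as most accurate.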

URL: http://www.icsi.berkeley.edu/pubs/vision/deeplearningfortactile16.pdf
DOI: 10.1109/ICRA.2016.7487176
ICSI Research Group: Vision