Robotic Learning of Haptic Adjectives Through Physical Interaction

Title: Robotic Learning of Haptic Adjectives Through Physical Interaction
Publication Type: Journal Article
Year of Publication: 2015
Authors: Chu, V., McMahon, I., Riano, L., McDonald, C. G., He, Q., Martinez Perez-Tejada, J., Arrigo, M., Darrell, T., & Kuchenbecker, K. J.
Published in: Robot. Auton. Syst.
Date Published: 01/2015
Publisher: North-Holland Publishing Co.
Place Published: Amsterdam, The Netherlands
Type of Article: Journal Article
Keywords: haptics, machine learning, robotic perception, tactile sensing, time-series classification

To perform useful tasks in everyday human environments, robots must be able to both understand and communicate the sensations they experience during haptic interactions with objects. Toward this goal, we augmented the Willow Garage PR2 robot with a pair of SynTouch BioTac sensors to capture rich tactile signals during the execution of four exploratory procedures on 60 household objects. In a parallel experiment, human subjects blindly touched the same objects and selected binary haptic adjectives from a predetermined set of 25 labels. We developed several machine-learning algorithms to discover the meaning of each adjective from the robot's sensory data. The most successful algorithms were those that intelligently combine static and dynamic components of the data recorded during all four exploratory procedures. The best of our approaches produced an average adjective classification F1 score of 0.77, a score higher than that of an average human subject.

Highlights:
- We equipped a PR2 robot with a pair of BioTac tactile sensors.
- Both the robot and human subjects blindly touched sixty diverse objects.
- We calculated static and dynamic features from the haptic data felt by the robot.
- A multi-kernel SVM was used to learn the meaning of twenty-five haptic adjectives.
- The robot performed as well as the average human subject at labeling objects.
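The multi-kernel SVM approach described in the abstract can be illustrated with a minimal sketch. This is not the paper's implementation: the feature matrices, dimensions, labels, and kernel weighting below are hypothetical stand-ins, and one simple multi-kernel scheme (a weighted sum of per-view RBF kernels, one view for static features and one for dynamic features) is used with scikit-learn's precomputed-kernel SVM to predict a single binary adjective label and score it with F1.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics import f1_score
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)

# Hypothetical stand-ins: 60 objects, a static and a dynamic feature view
X_static = rng.normal(size=(60, 10))
X_dynamic = rng.normal(size=(60, 20))
# Synthetic binary adjective label (e.g. "squishy" vs. not)
y = (X_static[:, 0] + X_dynamic[:, 0] > 0).astype(int)

train = np.arange(40)
test = np.arange(40, 60)

def combined_kernel(rows, cols, w=0.5):
    """Weighted sum of per-view RBF kernels -- one simple multi-kernel scheme."""
    K_static = rbf_kernel(X_static[rows], X_static[cols])
    K_dynamic = rbf_kernel(X_dynamic[rows], X_dynamic[cols])
    return w * K_static + (1 - w) * K_dynamic

# SVC with a precomputed Gram matrix accepts any custom kernel combination
clf = SVC(kernel="precomputed")
clf.fit(combined_kernel(train, train), y[train])

# Prediction uses the kernel between test and training objects
pred = clf.predict(combined_kernel(test, train))
score = f1_score(y[test], pred)
print(f"F1 score: {score:.2f}")
```

In the paper's setting, one such binary classifier would be trained per adjective, and the reported 0.77 is the average F1 across all 25 adjectives; the kernel weight `w` (and per-view kernel parameters) would be tuned rather than fixed.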

ICSI Research Group