Robotics: Science and Systems XX
Octopi: Object Property Reasoning with Large Tactile-Language Models
Samson Yu, Lin Kelvin, Anxing Xiao, Jiafei Duan, Harold Soh

Abstract:
Physical reasoning is important for effective robot manipulation. Recent work has investigated both vision and language modalities for physical reasoning: vision can reveal information about objects in the environment, while language serves as an abstraction and communication medium for additional context. Although these works have demonstrated success on a variety of physical reasoning tasks, they are limited to physical properties that can be inferred from visual or language inputs alone. In this work, we investigate combining tactile perception with language, which enables embodied systems to obtain physical properties through interaction and to apply commonsense reasoning. We contribute a new dataset, PhysiCLeAR, which comprises both physical property reasoning tasks and annotated tactile videos obtained using a GelSight tactile sensor. We then introduce Octopi, a system that leverages both tactile representation learning and large vision-language models to predict and reason about tactile inputs with minimal language fine-tuning. Our evaluations on PhysiCLeAR show that Octopi can effectively use intermediate physical property predictions to improve its performance on various tactile-related tasks. PhysiCLeAR and Octopi are available at https://github.com/clear-nus/octopi.
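For readers unfamiliar with tactile-language architectures, the sketch below illustrates one plausible way a system like Octopi could couple a tactile encoder with a large language model: GelSight video frames are encoded, projected into the LLM's embedding space, and prepended to a text prompt so the LLM can predict physical properties and reason over them. All module names, dimensions, and the projection design here are illustrative assumptions, not the authors' actual implementation; the real code is in the linked repository.

```python
# Hypothetical sketch of an Octopi-style tactile-language pipeline.
# Module names, dimensions, and the HF-style `inputs_embeds` call are
# assumptions for illustration; see https://github.com/clear-nus/octopi
# for the authors' actual implementation.
import torch
import torch.nn as nn


class TactileLanguageModel(nn.Module):
    """Encodes a GelSight tactile video and feeds it to a (mostly frozen) LLM."""

    def __init__(self, tactile_encoder: nn.Module, llm: nn.Module,
                 encoder_dim: int = 768, llm_dim: int = 4096):
        super().__init__()
        self.tactile_encoder = tactile_encoder             # e.g. a CLIP-style visual encoder
        self.projector = nn.Linear(encoder_dim, llm_dim)   # maps tactile features into LLM token space
        self.llm = llm                                     # large language model, assumed HF-style

    def forward(self, tactile_video: torch.Tensor, prompt_embeds: torch.Tensor):
        # tactile_video: (batch, frames, channels, height, width)
        b, t = tactile_video.shape[:2]
        frames = tactile_video.flatten(0, 1)               # (b*t, c, h, w)
        feats = self.tactile_encoder(frames)               # (b*t, encoder_dim)
        tactile_tokens = self.projector(feats).view(b, t, -1)
        # Prepend the projected tactile tokens to the text prompt so the LLM
        # can first predict intermediate physical properties (e.g. hardness,
        # roughness, bumpiness) and then reason over them in natural language.
        inputs = torch.cat([tactile_tokens, prompt_embeds], dim=1)
        return self.llm(inputs_embeds=inputs)
```

A design like this keeps the LLM largely frozen and trains only the tactile encoder and projector, which is one common way to achieve the "minimal language fine-tuning" the abstract mentions.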
Bibtex:
@INPROCEEDINGS{Yu-RSS-24,
  AUTHOR    = {Samson Yu AND Lin Kelvin AND Anxing Xiao AND Jiafei Duan AND Harold Soh},
  TITLE     = {{Octopi: Object Property Reasoning with Large Tactile-Language Models}},
  BOOKTITLE = {Proceedings of Robotics: Science and Systems},
  YEAR      = {2024},
  ADDRESS   = {Delft, Netherlands},
  MONTH     = {July},
  DOI       = {10.15607/RSS.2024.XX.066}
}