Robotics: Science and Systems XVIII

Robotic Telekinesis: Learning a Robotic Hand Imitator by Watching Humans on YouTube

Aravind Sivakumar*, Kenneth Shaw*, Deepak Pathak
* These authors contributed equally

Abstract:

We build a system that enables any human to control a robot hand and arm, simply by demonstrating motions with their own hand. The robot observes the human operator via a single RGB camera and imitates their actions in real time. Human hands and robot hands differ in shape, size, and joint structure, and translating between them from a single uncalibrated camera view is a highly underconstrained problem. Moreover, the retargeted trajectories must effectively execute tasks on a physical robot, which requires them to be temporally smooth and free of self-collisions. Our key insight is that while paired human-robot correspondence data is expensive to collect, the internet contains a massive corpus of rich and diverse human hand videos. We leverage this data to train a system that understands human hands and retargets a human video stream into a robot hand-arm trajectory that is smooth, swift, safe, and semantically similar to the guiding demonstration. We demonstrate that this system enables previously untrained people to teleoperate a robot on various dexterous manipulation tasks. Our low-cost, glove-free, marker-free remote teleoperation system makes robot teaching more accessible, and we hope it can help robots learn to act autonomously in the real world. Video demos can be found at: https://robotic-telekinesis.github.io
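
To make the retargeting idea in the abstract concrete, here is a minimal per-frame optimization sketch. This is not the paper's method (which trains a network on internet video for real-time inference); it only illustrates the underlying objective: choose robot joint angles whose fingertips match the observed human fingertip targets, while penalizing deviation from the previous frame's solution so the resulting trajectory stays temporally smooth. The 2-link planar finger kinematics, the link lengths, and the weights below are all illustrative assumptions standing in for a real robot hand model.

# Minimal sketch (assumed, not the paper's implementation): hand retargeting
# as an energy minimization with a temporal-smoothness term.
import numpy as np
from scipy.optimize import minimize

LINK_LENGTHS = np.array([0.05, 0.04])  # toy 2-link finger, meters (assumed)

def fingertip(joint_angles):
    """Planar forward kinematics for one 2-joint finger (toy model)."""
    a1, a2 = joint_angles
    x = LINK_LENGTHS[0] * np.cos(a1) + LINK_LENGTHS[1] * np.cos(a1 + a2)
    y = LINK_LENGTHS[0] * np.sin(a1) + LINK_LENGTHS[1] * np.sin(a1 + a2)
    return np.array([x, y])

def retarget(human_tips, prev_q, n_fingers=4, smooth_weight=0.1):
    """Find robot joint angles whose fingertips best match the human
    fingertip targets, while staying close to the previous solution."""
    def energy(q):
        q = q.reshape(n_fingers, 2)
        # Semantic term: fingertip positions should match the human's.
        match = sum(np.sum((fingertip(q[i]) - human_tips[i]) ** 2)
                    for i in range(n_fingers))
        # Smoothness term: penalize jumps between consecutive frames.
        smooth = smooth_weight * np.sum((q.ravel() - prev_q) ** 2)
        return match + smooth
    res = minimize(energy, prev_q, method="L-BFGS-B")
    return res.x

# Example: track a stream of (noisy) human fingertip targets frame by frame.
rng = np.random.default_rng(0)
q = np.full(8, 0.3)  # 4 fingers x 2 joints
for t in range(5):
    tips = np.tile([0.06, 0.03], (4, 1)) + 0.005 * rng.standard_normal((4, 2))
    q = retarget(tips, q)
    print(f"frame {t}: joints = {np.round(q, 3)}")

Solving such an optimization per frame is too slow for real-time control, which is one motivation for amortizing it into a learned retargeting network as the abstract describes.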


Bibtex:

@INPROCEEDINGS{Sivakumar-RSS-22, 
    AUTHOR    = {Aravind Sivakumar AND Kenneth Shaw AND Deepak Pathak}, 
    TITLE     = {{Robotic Telekinesis: Learning a Robotic Hand Imitator by Watching Humans on YouTube}}, 
    BOOKTITLE = {Proceedings of Robotics: Science and Systems}, 
    YEAR      = {2022}, 
    ADDRESS   = {New York City, NY, USA}, 
    MONTH     = {June}, 
    DOI       = {10.15607/RSS.2022.XVIII.023} 
}