Robotics: Science and Systems XXI
Sketch-to-Skill: Bootstrapping Robot Learning with Human Drawn Trajectory Sketches
Peihong Yu, Amisha Bhaskar, Anukriti Singh, Zahiruddin Mahammad, Pratap Tokekar

Abstract:
Training robotic manipulation policies traditionally requires numerous demonstrations and/or environment rollouts. While recent Imitation Learning (IL) and Reinforcement Learning (RL) methods have reduced the number of required demonstrations, they still rely on expert knowledge to collect high-quality data, limiting scalability and accessibility. We propose SKETCH-TO-SKILL, a novel framework that leverages human-drawn 2D sketch trajectories to bootstrap and guide RL for robotic manipulation. Our approach extends beyond previous sketch-based methods, which focused primarily on imitation learning or policy conditioning and were limited to specific trained tasks. SKETCH-TO-SKILL employs a Sketch-to-3D Trajectory Generator that translates 2D sketches into 3D trajectories, which are then used to autonomously collect initial demonstrations. We utilize these sketch-generated demonstrations in two ways: to pre-train an initial policy through behavior cloning and to refine this policy through RL with guided exploration. Experimental results demonstrate that, using only sketch inputs, SKETCH-TO-SKILL achieves ~96% of the performance of a baseline model trained on teleoperated demonstration data, while exceeding the performance of a pure RL policy by ~170%. This makes robotic manipulation learning more accessible and potentially broadens its applications across various domains.
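The abstract describes a pipeline: 2D sketches are lifted to a 3D trajectory, the trajectory yields demonstrations, and those demonstrations pre-train a policy via behavior cloning before RL refinement. The toy sketch below illustrates the first two stages under loud assumptions: the function names, the use of two orthogonal sketch views as a stand-in for the paper's Sketch-to-3D Trajectory Generator, and the trivial linear behavior-cloning policy are all illustrative inventions, not the authors' method or API.

```python
import numpy as np

# Hypothetical illustration of the sketch-to-demonstration idea; all
# names, shapes, and the two-view lifting scheme are assumptions made
# for this toy example, not the paper's actual generator.

def sketch_to_3d_trajectory(view_a, view_b, n_points=50):
    """Lift two 2D sketch strokes into one 3D trajectory.

    Toy scheme: view A supplies x-y, view B supplies z; both strokes
    are resampled to a common arc-length parameter t in [0, 1].
    """
    t = np.linspace(0.0, 1.0, n_points)
    ta = np.linspace(0.0, 1.0, len(view_a))
    tb = np.linspace(0.0, 1.0, len(view_b))
    x = np.interp(t, ta, view_a[:, 0])
    y = np.interp(t, ta, view_a[:, 1])
    z = np.interp(t, tb, view_b[:, 1])
    return np.column_stack([x, y, z])  # shape (n_points, 3)

def behavior_cloning_pretrain(demos):
    """Fit a trivial linear policy state -> next waypoint by least squares,
    standing in for the behavior-cloning pre-training stage."""
    X = np.vstack([d[:-1] for d in demos])  # current waypoints (states)
    Y = np.vstack([d[1:] for d in demos])   # next waypoints (actions)
    W, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return W  # shape (3, 3)

# Toy strokes standing in for human-drawn sketches.
view_a = np.column_stack([np.linspace(0, 1, 10), np.linspace(0, 0.5, 10)])
view_b = np.column_stack([np.linspace(0, 1, 10), np.linspace(0, 0.8, 10)])

traj3d = sketch_to_3d_trajectory(view_a, view_b)
policy_W = behavior_cloning_pretrain([traj3d])
```

The pre-trained policy would then serve as the starting point for RL fine-tuning with guided exploration, which this toy omits.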
Bibtex:
@INPROCEEDINGS{YuP-RSS-25,
  AUTHOR    = {Peihong Yu AND Amisha Bhaskar AND Anukriti Singh AND Zahiruddin Mahammad AND Pratap Tokekar},
  TITLE     = {{Sketch-to-Skill: Bootstrapping Robot Learning with Human Drawn Trajectory Sketches}},
  BOOKTITLE = {Proceedings of Robotics: Science and Systems},
  YEAR      = {2025},
  ADDRESS   = {Los Angeles, CA, USA},
  MONTH     = {June},
  DOI       = {10.15607/RSS.2025.XXI.151}
}