CaTGrasp: Learning Category-Level Task-Relevant Grasping in Clutter from Simulation
Bowen Wen, Wenzhao Lian, Kostas Bekris, and Stefan Schaal

Publication: arXiv

Abstract: Task-relevant grasping is critical for industrial assembly, where downstream manipulation tasks constrain the set of valid grasps. Learning how to perform this task, however, is challenging, since task-relevant grasp labels are hard to define and annotate. There is also no consensus yet on proper representations for modeling task-relevant grasps or on off-the-shelf tools for performing them. This work proposes a framework to learn task-relevant grasping for industrial objects without the need for time-consuming real-world data collection or manual annotation. To achieve this, the entire framework is trained solely in simulation, including supervised training with synthetic label generation and self-supervised hand-object interaction. In the context of this framework, this paper proposes a novel, object-centric canonical representation at the category level, which allows establishing dense correspondence across object instances and transferring task-relevant grasps to novel instances. Extensive experiments on task-relevant grasping of densely cluttered industrial objects are conducted in both simulation and real-world setups, demonstrating the effectiveness of the proposed framework. Code and data are available at https://sites.google.com/view/catgrasp
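The core idea in the abstract, mapping object instances into a shared, object-centric canonical space so that grasps defined once per category can be transferred to novel instances, can be illustrated with a minimal sketch. The snippet below is not the paper's actual representation or pipeline (which relies on a learned, non-uniform canonical space and dense correspondence); the function names `to_canonical` and `transfer_grasp` and the unit-sphere normalization are hypothetical simplifications for illustration only.

```python
import numpy as np

def to_canonical(points):
    """Normalize an object point cloud into a simple canonical space
    (centered at the origin, scaled to unit extent). Returns the
    canonical points plus the centroid and scale needed to invert
    the mapping. (Simplified stand-in for a learned canonical space.)"""
    centroid = points.mean(axis=0)
    centered = points - centroid
    scale = np.linalg.norm(centered, axis=1).max()
    return centered / scale, centroid, scale

def transfer_grasp(canonical_grasp_pose, centroid, scale):
    """Map a 4x4 grasp pose defined in the canonical space back into
    the observed instance's frame: rescale the translation, keep the
    rotation unchanged."""
    pose = canonical_grasp_pose.copy()
    pose[:3, 3] = pose[:3, 3] * scale + centroid
    return pose

# Hypothetical usage: a task-relevant grasp stored in the canonical
# category space is re-targeted onto a novel instance's point cloud.
instance_points = np.random.rand(500, 3) * 0.08   # stand-in for a segmented object cloud
_, centroid, scale = to_canonical(instance_points)

canonical_grasp = np.eye(4)
canonical_grasp[:3, 3] = [0.0, 0.0, 0.5]          # grasp position in canonical coordinates
instance_grasp = transfer_grasp(canonical_grasp, centroid, scale)
print(instance_grasp[:3, 3])                       # grasp position in the instance's frame
```

In the actual framework, the instance-to-canonical mapping is learned and establishes dense per-point correspondence rather than a single similarity transform, which is what allows grasps to transfer across shape variation within a category.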

Bibtex:

@article{wencatgrasp,
  title   = {CaTGrasp: Learning Category-Level Task-Relevant Grasping in Clutter from Simulation},
  author  = {Wen, Bowen and Lian, Wenzhao and Bekris, Kostas and Schaal, Stefan},
  journal = {arXiv preprint},
  year    = {2021}
}