Contact-GraspNet: Efficient 6-DoF Grasp Generation in Cluttered Scenes
Sundermeyer, Martin, Mousavian, Arsalan, Triebel, Rudolph, and Fox, Dieter

Publication: 2021 IEEE International Conference on Robotics and Automation (ICRA)

Abstract: Grasping unseen objects in unconstrained, cluttered environments is an essential skill for autonomous robotic manipulation. Despite recent progress in full 6-DoF grasp learning, existing approaches often consist of complex sequential pipelines that possess several potential failure points and run-times unsuitable for closed-loop grasping. Therefore, we propose an end-to-end network that efficiently generates a distribution of 6-DoF parallel-jaw grasps directly from a depth recording of a scene. Our novel grasp representation treats 3D points of the recorded point cloud as potential grasp contacts. By rooting the full 6-DoF grasp pose and width in the observed point cloud, we can reduce the dimensionality of our grasp representation to 4-DoF, which greatly facilitates the learning process. Our class-agnostic approach is trained on 17 million simulated grasps and generalizes well to real-world sensor data. In a robotic grasping study of unseen objects in structured clutter we achieve over 90% success rate, cutting the failure rate in half compared to a recent state-of-the-art method.
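The abstract's key idea is anchoring the grasp translation at an observed point-cloud point so the network only needs to predict a rotation and a gripper width. A minimal sketch of what such a contact-based parameterization could look like, assuming a grasp is assembled from a contact point, a predicted approach direction, a predicted baseline (jaw) direction, and a width (the exact parameterization in the paper may differ):

```python
import numpy as np

def grasp_pose_from_contact(contact, approach, baseline, width):
    """Assemble a 6-DoF parallel-jaw grasp pose (4x4 homogeneous matrix)
    from a contact point on the point cloud plus predicted quantities.

    Illustrative sketch only: `approach` and `baseline` are hypothetical
    names for the two predicted direction vectors; the translation is
    rooted at the observed contact point, as the abstract describes.
    """
    a = np.asarray(approach, dtype=float)
    a /= np.linalg.norm(a)                      # gripper approach axis (z)
    # Orthogonalize the baseline against the approach direction
    b = np.asarray(baseline, dtype=float)
    b = b - np.dot(b, a) * a
    b /= np.linalg.norm(b)                      # jaw-closing axis (x)
    T = np.eye(4)
    T[:3, 0] = b
    T[:3, 1] = np.cross(a, b)                   # completes right-handed frame
    T[:3, 2] = a
    # Shift from the contact point to the grasp center along the jaw axis
    T[:3, 3] = np.asarray(contact, dtype=float) + 0.5 * width * b
    return T
```

Because the contact point supplies the 3-DoF translation, only the two direction vectors and the scalar width remain to be learned, which is the dimensionality reduction the abstract refers to.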

Bibtex:

@inproceedings{sundermeyer2021contactgraspnet,
  title = {Contact-GraspNet: Efficient 6-DoF Grasp Generation in Cluttered Scenes},
  author = {Sundermeyer, Martin and Mousavian, Arsalan and Triebel, Rudolph and Fox, Dieter},
  booktitle = {2021 IEEE International Conference on Robotics and Automation (ICRA)},
  year = {2021}
}