Graspness Discovery in Clutters for Fast and Accurate Grasp Detection
- URL: http://arxiv.org/abs/2406.11142v1
- Date: Mon, 17 Jun 2024 02:06:47 GMT
- Title: Graspness Discovery in Clutters for Fast and Accurate Grasp Detection
- Authors: Chenxi Wang, Hao-Shu Fang, Minghao Gou, Hongjie Fang, Jin Gao, Cewu Lu
- Abstract summary: "graspness" is a quality based on geometry cues that distinguishes graspable areas in cluttered scenes.
We develop a neural network named cascaded graspness model to approximate the searching process.
Experiments on a large-scale benchmark, GraspNet-1Billion, show that our method outperforms previous arts by a large margin.
- Score: 57.81325062171676
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Efficient and robust grasp pose detection is vital for robotic manipulation. For general 6 DoF grasping, conventional methods treat all points in a scene equally and usually adopt uniform sampling to select grasp candidates. However, we discover that ignoring where to grasp greatly harms the speed and accuracy of current grasp pose detection methods. In this paper, we propose "graspness", a quality based on geometry cues that distinguishes graspable areas in cluttered scenes. A look-ahead searching method is proposed for measuring the graspness and statistical results justify the rationality of our method. To quickly detect graspness in practice, we develop a neural network named cascaded graspness model to approximate the searching process. Extensive experiments verify the stability, generality and effectiveness of our graspness model, allowing it to be used as a plug-and-play module for different methods. A large improvement in accuracy is witnessed for various previous methods after equipping our graspness model. Moreover, we develop GSNet, an end-to-end network that incorporates our graspness model for early filtering of low-quality predictions. Experiments on a large-scale benchmark, GraspNet-1Billion, show that our method outperforms previous arts by a large margin (30+ AP) and achieves a high inference speed. The library of GSNet has been integrated into AnyGrasp, which is at https://github.com/graspnet/anygrasp_sdk.
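The early-filtering idea in the abstract can be illustrated with a minimal sketch (function and parameter names here are hypothetical, not GSNet's API; in the real pipeline the per-point graspness is predicted by a learned network, and sampling is more sophisticated): score every scene point, discard low-graspness points, and draw grasp candidates only from the surviving region.

```python
import numpy as np

def filter_by_graspness(points, graspness, threshold=0.1, num_samples=1024):
    """Keep only points whose graspness exceeds a threshold, then
    sample grasp candidates from the surviving high-quality region.

    points    : (N, 3) array of scene points
    graspness : (N,) per-point graspness scores in [0, 1]
    """
    mask = graspness > threshold
    candidates = points[mask]
    if len(candidates) == 0:
        return candidates
    # Uniform sampling over the filtered set stands in for the
    # more careful sampling a real detection pipeline would use.
    idx = np.random.choice(
        len(candidates), size=min(num_samples, len(candidates)), replace=False
    )
    return candidates[idx]

# Toy scene: 10k points with random stand-in graspness scores.
rng = np.random.default_rng(0)
pts = rng.standard_normal((10000, 3))
scores = rng.random(10000)
cands = filter_by_graspness(pts, scores, threshold=0.9, num_samples=256)
```

Because candidate sampling happens after the filter, downstream pose estimation spends its budget only on points that are likely graspable, which is the source of the speed and accuracy gains the abstract reports.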
Related papers
- Normality Learning-based Graph Anomaly Detection via Multi-Scale Contrastive Learning [61.57383634677747]
Graph anomaly detection (GAD) has attracted increasing attention in machine learning and data mining.
Here, we propose a normality learning-based GAD framework via multi-scale contrastive learning networks (NLGAD for short).
Notably, the proposed algorithm improves the detection performance (up to 5.89% AUC gain) compared with the state-of-the-art methods.
arXiv Detail & Related papers (2023-09-12T08:06:04Z)
- Calibrating the Rigged Lottery: Making All Tickets Reliable [14.353428281239665]
We propose a new sparse training method to produce sparse models with improved confidence calibration.
Our method simultaneously maintains or even improves accuracy with only a slight increase in computation and storage burden.
arXiv Detail & Related papers (2023-02-18T15:53:55Z)
- Hybrid Physical Metric For 6-DoF Grasp Pose Detection [46.84694505427047]
We propose a hybrid physical metric to generate elaborate confidence scores for 6-DoF grasp pose detection.
To learn the new confidence scores effectively, we design a multi-resolution network called Flatness Gravity Collision GraspNet.
Our method achieves 90.5% success rate in real-world cluttered scenes.
arXiv Detail & Related papers (2022-06-22T14:35:48Z)
- Effective Model Sparsification by Scheduled Grow-and-Prune Methods [73.03533268740605]
We propose a novel scheduled grow-and-prune (GaP) methodology without pre-training the dense models.
Experiments have shown that such models can match or beat the quality of highly optimized dense models at 80% sparsity on a variety of tasks.
arXiv Detail & Related papers (2021-06-18T01:03:13Z)
- GKNet: grasp keypoint network for grasp candidates detection [15.214390498300101]
This paper presents a different approach to grasp detection by treating it as keypoint detection.
The deep network detects each grasp candidate as a pair of keypoints, convertible to the grasp representation g = (x, y, w, θ)ᵀ, rather than a triplet or quartet of corner points.
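The conversion from a keypoint pair to that representation is straightforward geometry, sketched below with illustrative names (not GKNet's actual code): the grasp center is the midpoint of the two keypoints, the width is their distance, and the orientation comes from atan2.

```python
import math

def keypoints_to_grasp(p1, p2):
    """Convert a pair of 2-D grasp keypoints (gripper fingertip
    locations) into the grasp representation g = (x, y, w, theta)."""
    x = (p1[0] + p2[0]) / 2.0  # grasp center x
    y = (p1[1] + p2[1]) / 2.0  # grasp center y
    w = math.hypot(p2[0] - p1[0], p2[1] - p1[1])  # gripper opening width
    theta = math.atan2(p2[1] - p1[1], p2[0] - p1[0])  # grasp orientation
    return x, y, w, theta

g = keypoints_to_grasp((0.0, 0.0), (4.0, 0.0))  # → (2.0, 0.0, 4.0, 0.0)
```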
arXiv Detail & Related papers (2021-06-16T00:34:55Z)
- Sample and Computation Redistribution for Efficient Face Detection [137.19388513633484]
Training data sampling and computation distribution strategies are the keys to efficient and accurate face detection.
SCRFD-34GF outperforms the best competitor, TinaFace, by 3.86% (AP at hard set) while being more than 3× faster on GPUs with VGA-resolution images.
arXiv Detail & Related papers (2021-05-10T23:51:14Z)
- SuctionNet-1Billion: A Large-Scale Benchmark for Suction Grasping [47.221326169627666]
We propose a new physical model to analytically evaluate seal formation and wrench resistance of a suction grasping.
A two-step methodology is adopted to generate annotations on a large-scale dataset collected in real-world cluttered scenarios.
A standard online evaluation system is proposed to evaluate suction poses in continuous operation space.
arXiv Detail & Related papers (2021-03-23T05:02:52Z)
- Lightweight Convolutional Neural Network with Gaussian-based Grasping Representation for Robotic Grasping Detection [4.683939045230724]
Current object detectors find it difficult to strike a balance between high accuracy and fast inference speed.
We present an efficient and robust fully convolutional neural network model to perform robotic grasping pose estimation.
The network is an order of magnitude smaller than other state-of-the-art models.
arXiv Detail & Related papers (2021-01-25T16:36:53Z)
- Learning a Unified Sample Weighting Network for Object Detection [113.98404690619982]
Region sampling or weighting is critical to the success of modern region-based object detectors.
We argue that sample weighting should be data-dependent and task-dependent.
We propose a unified sample weighting network to predict a sample's task weights.
arXiv Detail & Related papers (2020-06-11T16:19:16Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this content (including all information) and is not responsible for any consequences of its use.