Narrowing the Gap: Improved Detector Training with Noisy Location
Annotations
- URL: http://arxiv.org/abs/2206.05708v1
- Date: Sun, 12 Jun 2022 10:03:01 GMT
- Title: Narrowing the Gap: Improved Detector Training with Noisy Location
Annotations
- Authors: Shaoru Wang, Jin Gao, Bing Li, Weiming Hu
- Abstract summary: In this paper, we focus on the impact of noisy location annotations on the performance of object detection approaches.
We propose a self-correction technique based on a Bayesian filter for prediction ensemble to better exploit the noisy location annotations.
- Score: 32.6077497559231
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep learning methods require massive amounts of annotated data for
optimizing parameters. For example, datasets with accurate bounding box
annotations are essential for modern object detection tasks. However, labeling
with such pixel-wise accuracy is laborious and time-consuming, and elaborate
labeling procedures, involving annotation review and acceptance testing, are
indispensable for reducing man-made noise. In this paper, we focus on the impact
of noisy location annotations on the performance of object detection approaches
and aim to, on the user side, reduce the adverse effect of the noise. First,
noticeable performance degradation is experimentally observed for both
one-stage and two-stage detectors when noise is introduced to the bounding box
annotations. For instance, our synthesized noise decreases the performance of
the FCOS detector on the COCO test split from 38.9% AP to 33.6% AP, and that of
Faster R-CNN from 37.8% AP to 33.7% AP. Second, a self-correction technique based
on a Bayesian filter for prediction ensemble is proposed to better exploit the
noisy location annotations following a Teacher-Student learning paradigm.
Experiments for both synthesized and real-world scenarios consistently
demonstrate the effectiveness of our approach, e.g., our method increases the
degraded performance of the FCOS detector from 33.6% AP to 35.6% AP on COCO.
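The abstract does not specify how the location noise is synthesized or how the Bayesian filter is formulated, so the two sketches below are illustrative only. The first assumes a simple uniform jitter proportional to each box's width and height; the function name and the noise_level parameter are hypothetical, not the paper's protocol.

```python
import random

def jitter_box(box, noise_level=0.1, rng=random):
    """Perturb an (x1, y1, x2, y2) box by uniform offsets scaled to its size.

    Hypothetical noise model for illustration only; the paper's actual
    synthesis procedure is not described in the abstract.
    """
    x1, y1, x2, y2 = box
    w, h = x2 - x1, y2 - y1
    return (
        x1 + rng.uniform(-noise_level, noise_level) * w,
        y1 + rng.uniform(-noise_level, noise_level) * h,
        x2 + rng.uniform(-noise_level, noise_level) * w,
        y2 + rng.uniform(-noise_level, noise_level) * h,
    )
```

The second sketch shows one plausible reading of a Bayesian filter for a prediction ensemble: each box coordinate is treated as a static state whose prior is the noisy annotation, and teacher predictions are fused as scalar Kalman-style observations. The prior_var and obs_var values are assumed tuning parameters, not the paper's formulation.

```python
def fuse_box(noisy_box, teacher_boxes, prior_var=25.0, obs_var=9.0):
    """Kalman-style fusion of a noisy annotation with teacher predictions.

    Illustrative sketch only: each of the four coordinates is filtered
    independently; prior_var and obs_var are hypothetical tuning values.
    """
    fused = []
    for i in range(4):
        mean, var = float(noisy_box[i]), prior_var  # prior from the noisy annotation
        for tb in teacher_boxes:                    # teacher predictions as "measurements"
            gain = var / (var + obs_var)            # Kalman gain
            mean += gain * (tb[i] - mean)           # posterior mean
            var *= (1.0 - gain)                     # posterior variance
        fused.append(mean)
    return tuple(fused)
```

In a Teacher-Student loop, such fused boxes would replace the noisy annotations as the student's regression targets; this reflects the general paradigm named in the abstract rather than the paper's exact algorithm.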
Related papers
- Noise2Score3D: Unsupervised Tweedie's Approach for Point Cloud Denoising [0.0]
Noise2Score3D learns the gradient of the underlying point cloud distribution directly from noisy data.
Our method performs inference in a single step, avoiding the iterative processes used in existing unsupervised methods.
We introduce Total Variation for Point Cloud, a criterion that allows for the estimation of unknown noise parameters.
arXiv Detail & Related papers (2025-02-24T04:23:21Z)
- SoftPatch: Unsupervised Anomaly Detection with Noisy Data [67.38948127630644]
This paper considers label-level noise in image sensory anomaly detection for the first time.
We propose a memory-based unsupervised AD method, SoftPatch, which efficiently denoises the data at the patch level.
Compared with existing methods, SoftPatch maintains a strong modeling ability of normal data and alleviates the overconfidence problem in coreset.
arXiv Detail & Related papers (2024-03-21T08:49:34Z)
- Learning Contrastive Feature Representations for Facial Action Unit Detection [13.834540490373818]
Facial action unit (AU) detection has long encountered the challenge of detecting subtle feature differences when AUs activate.
We introduce a novel contrastive learning framework aimed at AU detection that incorporates both self-supervised and supervised signals.
arXiv Detail & Related papers (2024-02-09T03:48:20Z)
- Robust Tiny Object Detection in Aerial Images amidst Label Noise [50.257696872021164]
This study addresses the issue of tiny object detection under noisy label supervision.
We propose a DeNoising Tiny Object Detector (DN-TOD), which incorporates a Class-aware Label Correction scheme.
Our method can be seamlessly integrated into both one-stage and two-stage object detection pipelines.
arXiv Detail & Related papers (2024-01-16T02:14:33Z)
- Label Noise: Correcting the Forward-Correction [0.0]
Training neural network classifiers on datasets with label noise poses a risk of overfitting them to the noisy labels.
Motivated by this, we propose imposing a lower bound on the training loss to mitigate the overfitting caused by label noise.
arXiv Detail & Related papers (2023-07-24T19:41:19Z)
- Neighborhood Collective Estimation for Noisy Label Identification and Correction [92.20697827784426]
Learning with noisy labels (LNL) aims at designing strategies to improve model performance and generalization by mitigating the effects of model overfitting to noisy labels.
Recent advances employ the predicted label distributions of individual samples to perform noise verification and noisy label correction, easily giving rise to confirmation bias.
We propose Neighborhood Collective Estimation, in which the predictive reliability of a candidate sample is re-estimated by contrasting it against its feature-space nearest neighbors.
arXiv Detail & Related papers (2022-08-05T14:47:22Z)
- Robust Meta-learning with Sampling Noise and Label Noise via Eigen-Reptile [78.1212767880785]
The meta-learner is prone to overfitting since there are only a few available samples.
When handling the data with noisy labels, the meta-learner could be extremely sensitive to label noise.
We present Eigen-Reptile (ER) that updates the meta-parameters with the main direction of historical task-specific parameters.
arXiv Detail & Related papers (2022-06-04T08:48:02Z)
- Partial Identification with Noisy Covariates: A Robust Optimization Approach [94.10051154390237]
Causal inference from observational datasets often relies on measuring and adjusting for covariates.
We show that this robust optimization approach can extend a wide range of causal adjustment methods to perform partial identification.
Across synthetic and real datasets, we find that this approach provides ATE bounds with a higher coverage probability than existing methods.
arXiv Detail & Related papers (2022-02-22T04:24:26Z)
- Noisy Annotation Refinement for Object Detection [47.066070566714984]
We propose a new problem setting of training object detectors on datasets with entangled noise in class-label and bounding-box annotations.
Our proposed method efficiently decouples the entangled noises, corrects the noisy annotations, and subsequently trains the detector using the corrected annotations.
arXiv Detail & Related papers (2021-10-20T09:39:50Z)
- Towards Noise-resistant Object Detection with Noisy Annotations [119.63458519946691]
Training deep object detectors requires a significant amount of human-annotated images with accurate object labels and bounding box coordinates.
Noisy annotations are much more easily accessible, but they could be detrimental for learning.
We address the challenging problem of training object detectors with noisy annotations, where the noise contains a mixture of label noise and bounding box noise.
arXiv Detail & Related papers (2020-03-03T01:32:16Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information (including all content) and is not responsible for any consequences of its use.