ScarceNet: Animal Pose Estimation with Scarce Annotations
- URL: http://arxiv.org/abs/2303.15023v1
- Date: Mon, 27 Mar 2023 09:15:53 GMT
- Title: ScarceNet: Animal Pose Estimation with Scarce Annotations
- Authors: Chen Li and Gim Hee Lee
- Abstract summary: ScarceNet is a pseudo label-based approach that generates artificial labels for the unlabeled images.
We evaluate our approach on the challenging AP-10K dataset, where our approach outperforms existing semi-supervised approaches by a large margin.
- Score: 74.48263583706712
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Animal pose estimation is an important but under-explored task due to the
lack of labeled data. In this paper, we tackle the task of animal pose
estimation with scarce annotations, where only a small set of labeled data and
unlabeled images are available. At the core of the solution to this problem
setting is the use of the unlabeled data to compensate for the lack of
well-labeled animal pose data. To this end, we propose the ScarceNet, a pseudo
label-based approach to generate artificial labels for the unlabeled images.
The pseudo labels, which are generated with a model trained with the small set
of labeled images, are generally noisy and can hurt the performance when
directly used for training. To solve this problem, we first use a small-loss
trick to select reliable pseudo labels. Although effective, the selection
process is improvident since numerous high-loss samples are left unused. We
further propose to identify reusable samples from the high-loss samples based
on an agreement check. Pseudo labels are re-generated to provide supervision
for those reusable samples. Lastly, we introduce a student-teacher framework to
enforce a consistency constraint since there are still samples that are neither
reliable nor reusable. By combining the reliable pseudo label selection with
the reusable sample re-labeling and the consistency constraint, we can make
full use of the unlabeled data. We evaluate our approach on the challenging
AP-10K dataset, where our approach outperforms existing semi-supervised
approaches by a large margin. We also test on the TigDog dataset, where our
approach can achieve better performance than domain adaptation based approaches
when only very few annotations are available. Our code is available at the
project website.
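The three mechanisms described in the abstract (small-loss selection, agreement-based re-labeling, and a consistency constraint for the rest) can be sketched roughly as follows. The keep ratio, pixel tolerance, sample shapes, and helper names are illustrative assumptions, not the paper's actual settings or implementation.

```python
import numpy as np

def select_reliable(losses, keep_ratio=0.3):
    """Small-loss trick: treat the lowest-loss fraction of pseudo-labeled
    samples as reliable (keep_ratio is an illustrative choice)."""
    k = max(1, int(len(losses) * keep_ratio))
    mask = np.zeros(len(losses), dtype=bool)
    mask[np.argsort(losses)[:k]] = True
    return mask

def agreement_check(preds_a, preds_b, tol=2.0):
    """Agreement check: a high-loss sample is considered reusable if
    keypoint predictions from two augmented views agree within tol pixels."""
    dists = np.linalg.norm(preds_a - preds_b, axis=-1)  # (N, K) per-keypoint distances
    return dists.mean(axis=-1) < tol

# toy run: 6 unlabeled samples, 4 keypoints, 2-D coordinates
rng = np.random.default_rng(0)
losses = np.array([0.10, 0.90, 0.20, 0.80, 0.50, 0.05])
preds_a = rng.normal(size=(6, 4, 2))
preds_b = preds_a + rng.normal(scale=1.0, size=(6, 4, 2))

reliable = select_reliable(losses)                        # pseudo labels used as-is
reusable = agreement_check(preds_a, preds_b) & ~reliable  # pseudo labels re-generated
consistency_only = ~(reliable | reusable)                 # student-teacher consistency only
```

Every unlabeled sample falls into exactly one of the three groups, which is how the approach makes full use of the unlabeled data.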
Related papers
- Pseudo-Labeling by Multi-Policy Viewfinder Network for Image Cropping [19.12798332848528]
We explore the possibility of utilizing both labeled and unlabeled data together to expand the scale of training data for image cropping models.
This idea can be implemented in a pseudo-labeling way: producing pseudo labels for unlabeled data by a teacher model and training a student model with these pseudo labels.
We propose the multi-policy viewfinder network (MPV-Net) that offers diverse refining policies to rectify the mistakes in original pseudo labels from the teacher.
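The teacher-student pseudo-labeling scheme this entry builds on can be sketched generically; the confidence threshold, toy teacher, and function names below are assumptions for illustration and do not reflect MPV-Net's refining policies.

```python
import numpy as np

def teacher_pseudo_labels(teacher, unlabeled, threshold=0.8):
    """Keep only unlabeled samples the (hypothetical) teacher model is
    confident on; the student would then train on these pseudo labels."""
    probs = teacher(unlabeled)          # (N, C) class probabilities
    conf = probs.max(axis=1)
    labels = probs.argmax(axis=1)
    keep = conf >= threshold
    return unlabeled[keep], labels[keep]

# toy teacher: fixed softmax outputs for 4 samples, 3 classes
def toy_teacher(x):
    return np.array([[0.90, 0.05, 0.05],
                     [0.40, 0.30, 0.30],
                     [0.10, 0.85, 0.05],
                     [0.34, 0.33, 0.33]])

unlabeled = np.arange(4, dtype=float).reshape(4, 1)
x_sel, y_sel = teacher_pseudo_labels(toy_teacher, unlabeled)
# only the confident samples (rows 0 and 2) survive the filter
```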
arXiv Detail & Related papers (2024-07-02T06:02:05Z)
- You can't handle the (dirty) truth: Data-centric insights improve pseudo-labeling [60.27812493442062]
We show the importance of investigating labeled data quality to improve any pseudo-labeling method.
Specifically, we introduce a novel data characterization and selection framework called DIPS to extend pseudo-labeling.
We demonstrate the applicability and impact of DIPS for various pseudo-labeling methods across an extensive range of real-world datasets.
arXiv Detail & Related papers (2024-06-19T17:58:40Z)
- Robust Assignment of Labels for Active Learning with Sparse and Noisy Annotations [0.17188280334580192]
Supervised classification algorithms are used to solve a growing number of real-life problems around the globe.
Unfortunately, acquiring good-quality annotations for many tasks is infeasible or too expensive to be done in practice.
We propose two novel annotation unification algorithms that utilize unlabeled parts of the sample space.
arXiv Detail & Related papers (2023-07-25T19:40:41Z)
- Soft Curriculum for Learning Conditional GANs with Noisy-Labeled and Uncurated Unlabeled Data [70.25049762295193]
We introduce a novel conditional image generation framework that accepts noisy-labeled and uncurated data during training.
We propose soft curriculum learning, which assigns instance-wise weights for adversarial training while assigning new labels for unlabeled data.
Our experiments show that our approach outperforms existing semi-supervised and label-noise robust methods in terms of both quantitative and qualitative performance.
arXiv Detail & Related papers (2023-07-17T08:31:59Z)
- Partial-Label Regression [54.74984751371617]
Partial-label learning is a weakly supervised learning setting that allows each training example to be annotated with a set of candidate labels.
Previous studies on partial-label learning only focused on the classification setting where candidate labels are all discrete.
In this paper, we provide the first attempt to investigate partial-label regression, where each training example is annotated with a set of real-valued candidate labels.
arXiv Detail & Related papers (2023-06-15T09:02:24Z)
- Dist-PU: Positive-Unlabeled Learning from a Label Distribution Perspective [89.5370481649529]
We propose a label distribution perspective for PU learning in this paper.
Motivated by this, we propose to pursue the label distribution consistency between predicted and ground-truth label distributions.
Experiments on three benchmark datasets validate the effectiveness of the proposed method.
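The label-distribution consistency idea can be illustrated with a toy penalty that pulls the mean predicted positive probability toward a known class prior; the prior value, the scores, and the function name are assumptions for illustration, not the paper's actual loss.

```python
import numpy as np

def dist_consistency_loss(probs, class_prior):
    """Toy label-distribution term: squared gap between the mean predicted
    positive probability and the known fraction of positives in the pool."""
    return (probs.mean() - class_prior) ** 2

# model scores on unlabeled data; assume half of the pool is truly positive
probs = np.array([0.9, 0.1, 0.8, 0.2, 0.7])
loss = dist_consistency_loss(probs, class_prior=0.5)
```

Minimizing such a term discourages the model from collapsing toward predicting everything as negative, a common failure mode in positive-unlabeled learning.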
arXiv Detail & Related papers (2022-12-06T07:38:29Z)
- Pseudo-Label Noise Suppression Techniques for Semi-Supervised Semantic Segmentation [21.163070161951868]
Semi-supervised learning (SSL) can reduce the need for large labelled datasets by incorporating unlabelled data into the training.
Current SSL approaches use an initially supervised trained model to generate predictions for unlabelled images, called pseudo-labels.
We use three mechanisms to control pseudo-label noise and errors.
arXiv Detail & Related papers (2022-10-19T09:46:27Z)
- Boosting Semi-Supervised Face Recognition with Noise Robustness [54.342992887966616]
This paper presents an effective solution to semi-supervised face recognition that is robust to the label noise arising from auto-labelling.
We develop a semi-supervised face recognition solution, named Noise Robust Learning-Labelling (NRoLL), which is based on the robust training ability empowered by GN.
arXiv Detail & Related papers (2021-05-10T14:43:11Z)
- Complementary Pseudo Labels For Unsupervised Domain Adaptation On Person Re-identification [46.17084786039097]
We propose a joint learning framework to learn better feature embeddings via high precision neighbor pseudo labels and high recall group pseudo labels.
Our method can achieve state-of-the-art performance under the unsupervised domain adaptation re-ID setting.
arXiv Detail & Related papers (2021-01-29T11:06:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.