Semi-Supervised Class-Agnostic Motion Prediction with Pseudo Label Regeneration and BEVMix
- URL: http://arxiv.org/abs/2312.08009v2
- Date: Thu, 14 Dec 2023 11:16:05 GMT
- Title: Semi-Supervised Class-Agnostic Motion Prediction with Pseudo Label Regeneration and BEVMix
- Authors: Kewei Wang, Yizheng Wu, Zhiyu Pan, Xingyi Li, Ke Xian, Zhe Wang, Zhiguo Cao, Guosheng Lin
- Abstract summary: We study the potential of semi-supervised learning for class-agnostic motion prediction.
Our framework adopts a consistency-based self-training paradigm, enabling the model to learn from unlabeled data.
Our method achieves performance comparable to weakly supervised and some fully supervised methods.
- Score: 59.55173022987071
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Class-agnostic motion prediction methods aim to comprehend motion within
open-world scenarios, holding significance for autonomous driving systems.
However, training a high-performance model in a fully-supervised manner always
requires substantial amounts of manually annotated data, which can be both
expensive and time-consuming to obtain. To address this challenge, our study
explores the potential of semi-supervised learning (SSL) for class-agnostic
motion prediction. Our SSL framework adopts a consistency-based self-training
paradigm, enabling the model to learn from unlabeled data by generating pseudo
labels through test-time inference. To improve the quality of pseudo labels, we
propose a novel motion selection and re-generation module. This module
effectively selects reliable pseudo labels and re-generates unreliable ones.
Furthermore, we propose two data augmentation strategies: temporal sampling and
BEVMix. These strategies facilitate consistency regularization in SSL.
Experiments conducted on nuScenes demonstrate that our SSL method can surpass
the self-supervised approach by a large margin while utilizing only a tiny
fraction of labeled data. Moreover, our method achieves performance comparable
to weakly supervised and some fully supervised methods. These results
highlight the ability of our method to strike a favorable balance between
annotation costs and performance. Code will be available at
https://github.com/kwwcv/SSMP.
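
The consistency-based self-training described above (pseudo labels generated by test-time inference, followed by a selection and re-generation step) can be illustrated with a minimal sketch. This is a hypothetical reconstruction, not the authors' released code: the network, the reliability rule, and the re-generation step (here, falling back to a static-motion prior) are all illustrative assumptions.

```python
# Minimal sketch of consistency-based self-training for class-agnostic
# motion prediction. All shapes, thresholds, and module names are
# illustrative assumptions, not the paper's implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyMotionNet(nn.Module):
    """Toy BEV motion predictor: (B, C, H, W) occupancy features in,
    (B, 2, H, W) per-cell motion vectors out."""
    def __init__(self, in_ch=13):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(in_ch, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 2, 3, padding=1),
        )

    def forward(self, x):
        return self.net(x)

@torch.no_grad()
def make_pseudo_labels(teacher, bev_u, thresh=0.5):
    """Generate pseudo motion labels by test-time inference, then split
    them into reliable and unreliable cells (a toy stand-in for the
    paper's motion selection and re-generation module)."""
    motion = teacher(bev_u)                    # (B, 2, H, W)
    speed = motion.norm(dim=1, keepdim=True)   # (B, 1, H, W)
    # Toy reliability rule: near-static or clearly moving cells are kept;
    # ambiguous mid-range speeds count as unreliable.
    reliable = (speed < 0.1) | (speed > thresh)
    # "Re-generate" unreliable labels; here we simply fall back to a
    # static prior (zero motion). The paper's re-generation module is
    # more sophisticated.
    motion = torch.where(reliable, motion, torch.zeros_like(motion))
    return motion, reliable.float()

def ssl_step(student, teacher, labeled, unlabeled, opt):
    """One training step: supervised loss on labeled data plus a
    reliability-masked pseudo-label loss on unlabeled data."""
    bev_l, gt = labeled
    pseudo, mask = make_pseudo_labels(teacher, unlabeled)
    loss_sup = F.smooth_l1_loss(student(bev_l), gt)
    per_cell = F.smooth_l1_loss(student(unlabeled), pseudo, reduction="none")
    loss = loss_sup + (per_cell * mask).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
    return loss.item()
```

In a full pipeline the teacher would typically be a frozen or EMA copy of the student, and the abstract's temporal sampling (feeding teacher and student different sub-sequences of past sweeps) would supply the consistency signal.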
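
BEVMix is only named in the abstract, so the sketch below is one plausible reading: a CutMix-style paste of a rectangular BEV region from one scene into another, with the motion labels mixed identically. The function name, shapes, and rectangular mixing rule are assumptions; the paper's actual mixing strategy may differ.

```python
import torch

def bevmix(bev_a, labels_a, bev_b, labels_b, ratio=0.5):
    """Hypothetical CutMix-style BEVMix: paste a random rectangle of
    scene B's BEV grid (and its labels) into scene A.
    bev_*:    (C, H, W) BEV feature/occupancy grids
    labels_*: (2, H, W) per-cell motion fields"""
    _, H, W = bev_a.shape
    h, w = int(H * ratio), int(W * ratio)
    top = torch.randint(0, H - h + 1, (1,)).item()
    left = torch.randint(0, W - w + 1, (1,)).item()
    bev, labels = bev_a.clone(), labels_a.clone()
    bev[:, top:top + h, left:left + w] = bev_b[:, top:top + h, left:left + w]
    labels[:, top:top + h, left:left + w] = labels_b[:, top:top + h, left:left + w]
    return bev, labels
```

Cutting inputs and labels with the same rectangle keeps the pseudo-label targets spatially aligned with the mixed BEV grid, which is what makes such an augmentation usable for consistency regularization.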
Related papers
- Dual-Decoupling Learning and Metric-Adaptive Thresholding for Semi-Supervised Multi-Label Learning [81.83013974171364]
Semi-supervised multi-label learning (SSMLL) is a powerful framework for leveraging unlabeled data to reduce the expensive cost of collecting precise multi-label annotations.
Unlike in standard semi-supervised learning, one cannot simply select the most probable label as the pseudo-label in SSMLL, because an instance can contain multiple semantics.
We propose a dual-perspective method to generate high-quality pseudo-labels.
arXiv Detail & Related papers (2024-07-26T09:33:53Z)
- Learning Label Refinement and Threshold Adjustment for Imbalanced Semi-Supervised Learning [6.904448748214652]
Semi-supervised learning algorithms struggle to perform well when exposed to imbalanced training data.
We introduce SEmi-supervised learning with pseudo-label optimization based on VALidation data (SEVAL).
SEVAL adapts to specific tasks with improved pseudo-label accuracy and ensures pseudo-label correctness on a per-class basis.
arXiv Detail & Related papers (2024-07-07T13:46:22Z)
- Reinforcement Learning-Guided Semi-Supervised Learning [20.599506122857328]
We propose a novel Reinforcement Learning Guided SSL method, RLGSSL, that formulates SSL as a one-armed bandit problem.
RLGSSL incorporates a carefully designed reward function that balances the use of labeled and unlabeled data to enhance generalization performance.
We demonstrate the effectiveness of RLGSSL through extensive experiments on several benchmark datasets and show that our approach achieves consistent superior performance compared to state-of-the-art SSL methods.
arXiv Detail & Related papers (2024-05-02T21:52:24Z)
- A Channel-ensemble Approach: Unbiased and Low-variance Pseudo-labels is Critical for Semi-supervised Classification [61.473485511491795]
Semi-supervised learning (SSL) is a practical challenge in computer vision.
Pseudo-label (PL) methods, e.g., FixMatch and FreeMatch, obtain the State Of The Art (SOTA) performances in SSL.
We propose a lightweight channel-based ensemble method that consolidates multiple inferior PLs into one that is theoretically guaranteed to be unbiased and low-variance.
arXiv Detail & Related papers (2024-03-27T09:49:37Z)
- Progressive Feature Adjustment for Semi-supervised Learning from Pretrained Models [39.42802115580677]
Semi-supervised learning (SSL) can leverage both labeled and unlabeled data to build a predictive model.
Recent literature suggests that naively applying state-of-the-art SSL with a pretrained model fails to unleash the full potential of training data.
We propose to use pseudo-labels from the unlabeled data to update a feature extractor that is less sensitive to incorrect labels.
arXiv Detail & Related papers (2023-09-09T01:57:14Z)
- Dash: Semi-Supervised Learning with Dynamic Thresholding [72.74339790209531]
We propose a semi-supervised learning (SSL) approach that uses unlabeled examples to train models.
Our proposed approach, Dash, adaptively selects unlabeled examples during training via a dynamically adjusted threshold.
arXiv Detail & Related papers (2021-09-01T23:52:29Z)
- Self-Tuning for Data-Efficient Deep Learning [75.34320911480008]
Self-Tuning is a novel approach to enable data-efficient deep learning.
It unifies the exploration of labeled and unlabeled data and the transfer of a pre-trained model.
It outperforms its SSL and transfer learning (TL) counterparts on five tasks by sharp margins.
arXiv Detail & Related papers (2021-02-25T14:56:19Z)
- In Defense of Pseudo-Labeling: An Uncertainty-Aware Pseudo-label Selection Framework for Semi-Supervised Learning [53.1047775185362]
Pseudo-labeling (PL) is a general SSL approach that does not rely on domain-specific data augmentations, but it performs relatively poorly in its original formulation.
We argue that PL underperforms due to the erroneous high confidence predictions from poorly calibrated models.
We propose an uncertainty-aware pseudo-label selection (UPS) framework which improves pseudo labeling accuracy by drastically reducing the amount of noise encountered in the training process.
arXiv Detail & Related papers (2021-01-15T23:29:57Z)
- ESL: Entropy-guided Self-supervised Learning for Domain Adaptation in Semantic Segmentation [35.03150829133562]
We propose Entropy-guided Self-supervised Learning, leveraging entropy as the confidence indicator for producing more accurate pseudo-labels.
On different UDA benchmarks, ESL consistently outperforms strong SSL baselines and achieves state-of-the-art results.
arXiv Detail & Related papers (2020-06-15T18:10:09Z)
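
A recurring ingredient across the papers listed above (FixMatch and FreeMatch, Dash, SEVAL, UPS, ESL) is a confidence test that decides which pseudo-labels survive training. The following is a generic sketch of that shared idea, not any single paper's method:

```python
import torch
import torch.nn.functional as F

def filter_pseudo_labels(logits, tau=0.95):
    """Keep a pseudo-label only when the model's max class probability
    exceeds a threshold tau (FixMatch-style). logits: (N, num_classes)."""
    probs = F.softmax(logits, dim=1)
    conf, pseudo = probs.max(dim=1)   # per-sample confidence and hard label
    keep = conf >= tau                # boolean selection mask
    return pseudo[keep], keep
```

The papers above differ mainly in how the threshold is set: Dash makes it dynamic over training, SEVAL learns per-class thresholds from validation data, UPS adds an uncertainty estimate on top of confidence, and ESL replaces the max-probability test with an entropy-based one.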