Exploiting Completeness and Uncertainty of Pseudo Labels for Weakly
Supervised Video Anomaly Detection
- URL: http://arxiv.org/abs/2212.04090v1
- Date: Thu, 8 Dec 2022 05:53:53 GMT
- Title: Exploiting Completeness and Uncertainty of Pseudo Labels for Weakly
Supervised Video Anomaly Detection
- Authors: Chen Zhang, Guorong Li, Yuankai Qi, Shuhui Wang, Laiyun Qing, Qingming
Huang, Ming-Hsuan Yang
- Abstract summary: Weakly supervised video anomaly detection aims to identify abnormal events in videos using only video-level labels.
Two-stage self-training methods have achieved significant improvements by self-generating pseudo labels.
We propose an enhancement framework by exploiting completeness and uncertainty properties for effective self-training.
- Score: 149.23913018423022
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Weakly supervised video anomaly detection aims to identify abnormal events in
videos using only video-level labels. Recently, two-stage self-training methods
have achieved significant improvements by self-generating pseudo labels and
self-refining anomaly scores with these labels. As the pseudo labels play a
crucial role, we propose an enhancement framework by exploiting completeness
and uncertainty properties for effective self-training. Specifically, we first
design a multi-head classification module (each head serves as a classifier)
with a diversity loss to maximize the distribution differences of predicted
pseudo labels across heads. This encourages the generated pseudo labels to
cover as many abnormal events as possible. We then devise an iterative
uncertainty pseudo label refinement strategy, which improves not only the
initial pseudo labels but also the updated ones obtained by the desired
classifier in the second stage. Extensive experimental results demonstrate that the
proposed method performs favorably against state-of-the-art approaches on the
UCF-Crime, TAD, and XD-Violence benchmark datasets.
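The abstract names two components: a multi-head classification module trained with a diversity loss that pushes the heads' pseudo-label distributions apart, and an iterative, uncertainty-guided refinement of the pseudo labels. The paper's exact architecture and loss are not reproduced here, so the following PyTorch sketch is only an illustration of the general idea; the feature dimension, the number of heads, and the use of a symmetric KL divergence as the diversity measure are assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class MultiHeadPseudoLabeler(nn.Module):
    """Illustrative multi-head snippet-level classifier (not the paper's exact design)."""

    def __init__(self, feat_dim=1024, num_heads=3):
        super().__init__()
        # Each head is an independent classifier over per-snippet features.
        self.heads = nn.ModuleList(
            [nn.Linear(feat_dim, 1) for _ in range(num_heads)]
        )

    def forward(self, feats):
        # feats: (batch, num_snippets, feat_dim)
        # Returns per-head anomaly scores of shape (num_heads, batch, num_snippets).
        return torch.stack(
            [torch.sigmoid(head(feats)).squeeze(-1) for head in self.heads]
        )

def diversity_loss(head_scores, eps=1e-8):
    """Encourage heads to produce different pseudo-label distributions.

    Implemented here as a negative pairwise symmetric KL divergence between the
    normalized temporal score distributions of each pair of heads; minimizing
    this loss maximizes their differences. This is an assumed instantiation of
    the "diversity loss" named in the abstract, not the paper's formula.
    """
    num_heads = head_scores.shape[0]
    # Normalize scores over the temporal axis so each head yields a distribution.
    dists = head_scores / (head_scores.sum(dim=-1, keepdim=True) + eps)
    loss, pairs = 0.0, 0
    for i in range(num_heads):
        for j in range(i + 1, num_heads):
            p, q = dists[i] + eps, dists[j] + eps
            kl_pq = (p * (p / q).log()).sum(dim=-1).mean()
            kl_qp = (q * (q / p).log()).sum(dim=-1).mean()
            loss = loss - 0.5 * (kl_pq + kl_qp)
            pairs += 1
    return loss / max(pairs, 1)
```

In the same spirit, the "iterative uncertainty pseudo label refinement" could be approximated by keeping only snippets whose scores agree across heads (low variance) when retraining the second-stage classifier, though the paper's actual refinement criterion may differ.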
Related papers
- Reduction-based Pseudo-label Generation for Instance-dependent Partial Label Learning [41.345794038968776]
We propose to leverage reduction-based pseudo-labels to alleviate the influence of incorrect candidate labels.
We show that reduction-based pseudo-labels exhibit greater consistency with the Bayes optimal classifier compared to pseudo-labels directly generated from the predictive model.
arXiv Detail & Related papers (2024-10-28T07:32:20Z)
- Online Multi-Label Classification under Noisy and Changing Label Distribution [9.17381554071824]
We propose an online multi-label classification algorithm for the Noisy and Changing Label Distribution (NCLD) setting.
The objective is to model label scoring and label ranking simultaneously for high accuracy, with robustness to NCLD provided by three novel designs.
arXiv Detail & Related papers (2024-10-03T11:16:43Z)
- HPL-ESS: Hybrid Pseudo-Labeling for Unsupervised Event-based Semantic Segmentation [47.271784693700845]
We propose a novel hybrid pseudo-labeling framework for unsupervised event-based semantic segmentation, HPL-ESS, to alleviate the influence of noisy pseudo labels.
Our proposed method outperforms existing state-of-the-art methods by a large margin on the DSEC-Semantic dataset.
arXiv Detail & Related papers (2024-03-25T14:02:33Z)
- Perceptual Quality-based Model Training under Annotator Label Uncertainty [15.015925663078377]
Annotators often disagree during data labeling; this disagreement can be termed annotator label uncertainty.
We introduce a novel perceptual quality-based model training framework to objectively generate multiple labels for model training.
arXiv Detail & Related papers (2024-03-15T10:52:18Z)
- Class-Distribution-Aware Pseudo Labeling for Semi-Supervised Multi-Label Learning [97.88458953075205]
Pseudo-labeling has emerged as a popular and effective approach for utilizing unlabeled data.
This paper proposes a novel solution called Class-Aware Pseudo-Labeling (CAP) that performs pseudo-labeling in a class-aware manner.
arXiv Detail & Related papers (2023-05-04T12:52:18Z)
- Dist-PU: Positive-Unlabeled Learning from a Label Distribution Perspective [89.5370481649529]
This paper takes a label distribution perspective on PU learning.
Motivated by this perspective, we pursue consistency between the predicted and ground-truth label distributions.
Experiments on three benchmark datasets validate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-12-06T07:38:29Z)
- Debiased Pseudo Labeling in Self-Training [77.83549261035277]
Deep neural networks achieve remarkable performance on a wide range of tasks with the aid of large-scale labeled datasets.
To mitigate the requirement for labeled data, self-training is widely used in both academia and industry by pseudo labeling readily available unlabeled data.
We propose Debiased, in which the generation and utilization of pseudo labels are decoupled by two independent heads.
arXiv Detail & Related papers (2022-02-15T02:14:33Z)
- Rethinking Pseudo Labels for Semi-Supervised Object Detection [84.697097472401]
We introduce certainty-aware pseudo labels tailored for object detection.
We dynamically adjust the thresholds used to generate pseudo labels and reweight the loss functions for each category to alleviate the class imbalance problem (a generic sketch of this idea appears after this list).
Our approach improves supervised baselines by up to 10% AP using only 1-10% labeled data from COCO.
arXiv Detail & Related papers (2021-06-01T01:32:03Z)
- Semi-supervised Relation Extraction via Incremental Meta Self-Training [56.633441255756075]
Semi-Supervised Relation Extraction methods aim to leverage unlabeled data in addition to learning from limited samples.
Existing self-training methods suffer from the gradual drift problem, where noisy pseudo labels on unlabeled data are incorporated during training.
We propose a method called MetaSRE, in which a Relation Label Generation Network assesses the quality of pseudo labels by (meta) learning from the successful and failed attempts of the Relation Classification Network as an additional meta-objective.
arXiv Detail & Related papers (2020-10-06T03:54:11Z)
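The "Rethinking Pseudo Labels for Semi-Supervised Object Detection" entry above mentions dynamically adjusted per-category thresholds and per-category loss reweighting. Only the abstract summary is given here, so the sketch below shows one generic way such per-class thresholds and weights could be derived, using a percentile of each class's confidence scores and inverse pseudo-label counts; this is an assumption for illustration, not that paper's actual rule.

```python
import numpy as np

def per_class_thresholds_and_weights(scores, labels, num_classes,
                                     percentile=80.0, eps=1e-6):
    """Generic per-class dynamic thresholding for pseudo labels (illustrative only).

    scores: 1-D array of predicted confidences for unlabeled samples.
    labels: 1-D array with the predicted class index for each score.
    Returns (thresholds, weights): a confidence threshold per class taken from a
    percentile of that class's score distribution, and a loss weight per class
    inversely proportional to how many pseudo labels the class receives,
    to counteract class imbalance.
    """
    thresholds = np.full(num_classes, 0.5)
    counts = np.zeros(num_classes)
    for c in range(num_classes):
        cls_scores = scores[labels == c]
        if cls_scores.size > 0:
            thresholds[c] = np.percentile(cls_scores, percentile)
            counts[c] = (cls_scores >= thresholds[c]).sum()
    # Rare classes (few surviving pseudo labels) get larger loss weights.
    weights = (counts.sum() + eps) / (num_classes * (counts + eps))
    return thresholds, weights

# Example usage with random predictions (purely illustrative):
rng = np.random.default_rng(0)
scores = rng.random(1000)
labels = rng.integers(0, 5, size=1000)
thresholds, weights = per_class_thresholds_and_weights(scores, labels, num_classes=5)
```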
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.