Uncertainty-Aware Label Refinement for Sequence Labeling
- URL: http://arxiv.org/abs/2012.10608v1
- Date: Sat, 19 Dec 2020 06:56:59 GMT
- Title: Uncertainty-Aware Label Refinement for Sequence Labeling
- Authors: Tao Gui, Jiacheng Ye, Qi Zhang, Zhengyan Li, Zichu Fei, Yeyun Gong and
Xuanjing Huang
- Abstract summary: We introduce a novel two-stage label decoding framework to model long-term label dependencies.
A base model first predicts draft labels, and then a novel two-stream self-attention model makes refinements on these draft predictions.
- Score: 47.67853514765981
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Conditional random fields (CRFs) have become ubiquitous for label
decoding in sequence labeling tasks. However, their restriction to local label
dependencies and the inefficiency of Viterbi decoding remain open problems. In this work, we
introduce a novel two-stage label decoding framework to model long-term label
dependencies, while being much more computationally efficient. A base model
first predicts draft labels, and then a novel two-stream self-attention model
makes refinements on these draft predictions based on long-range label
dependencies, enabling parallel decoding and faster prediction. In
addition, in order to mitigate the side effects of incorrect draft labels,
Bayesian neural networks are used to indicate the labels with a high
probability of being wrong, which can greatly assist in preventing error
propagation. The experimental results on three sequence labeling benchmarks
demonstrated that the proposed method not only outperformed the CRF-based
methods but also greatly accelerated the inference process.
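The uncertainty step above can be sketched with Monte Carlo dropout, one common way to realize the Bayesian neural network the abstract mentions. The paper's exact estimator is not specified in this summary, so the stochastic toy model, the entropy threshold, and every name below are illustrative assumptions:

```python
import math
import random

def flag_uncertain_draft_labels(stochastic_forward, k=200, threshold=0.5):
    """Approximate Bayesian uncertainty by running k stochastic forward
    passes (e.g. with dropout kept on at inference), averaging the
    per-token label distributions, and flagging tokens whose predictive
    entropy exceeds a threshold. Flagged draft labels are the ones the
    refinement stage should treat as likely wrong."""
    runs = [stochastic_forward() for _ in range(k)]   # k x tokens x labels
    n_tokens, n_labels = len(runs[0]), len(runs[0][0])
    # Average the label distribution for each token over the k passes.
    mean = [[sum(run[t][l] for run in runs) / k for l in range(n_labels)]
            for t in range(n_tokens)]
    # Predictive entropy of the averaged distribution: high entropy
    # means the passes disagreed, i.e. the draft label is unreliable.
    entropy = [-sum(p * math.log(p) for p in dist if p > 0) for dist in mean]
    draft = [max(range(n_labels), key=dist.__getitem__) for dist in mean]
    flags = [e > threshold for e in entropy]
    return draft, flags

# Toy stochastic "model" over two tokens and two labels: token 0 is
# stable across passes, token 1 flips with the randomly dropped features.
random.seed(0)
def toy_forward():
    flip = random.random() < 0.5
    return [[0.95, 0.05],
            [0.9, 0.1] if flip else [0.1, 0.9]]

draft, flags = flag_uncertain_draft_labels(toy_forward)
# Only token 1 is flagged for refinement; token 0's draft label stands.
```

In the paper's pipeline, the refinement model attends over the whole draft sequence in parallel; the fixed threshold here merely stands in for whatever calibrated criterion the authors use to decide which draft labels to distrust.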
Related papers
- Online Multi-Label Classification under Noisy and Changing Label Distribution [9.17381554071824]
We propose an online multi-label classification algorithm for the Noisy and Changing Label Distribution (NCLD) setting.
The objective is to simultaneously model label scoring and label ranking for high accuracy; robustness to NCLD is achieved through three novel designs.
arXiv Detail & Related papers (2024-10-03T11:16:43Z)
- Generating Unbiased Pseudo-labels via a Theoretically Guaranteed Chebyshev Constraint to Unify Semi-supervised Classification and Regression [57.17120203327993]
The threshold-to-pseudo-label (T2L) process in classification uses confidence to determine label quality.
By its nature, regression also requires unbiased methods to generate high-quality labels.
We propose a theoretically guaranteed constraint for generating unbiased labels based on Chebyshev's inequality.
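The summary above invokes Chebyshev's inequality without stating it; the paper's specific constraint on pseudo-label bias is not reproduced here, so the following is only the classical bound it builds on:

```latex
% Chebyshev's inequality: for any random variable X with mean \mu and
% finite variance \sigma^2, large deviations are provably rare:
P\bigl(\lvert X - \mu\rvert \ge k\sigma\bigr) \le \frac{1}{k^2},
\qquad k > 0.
```

Applied to pseudo-labeling, a bound of this form lets one control the probability that a generated label deviates far from the unbiased target mean, which is what makes the resulting constraint "theoretically guaranteed".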
arXiv Detail & Related papers (2023-11-03T08:39:35Z)
- All Points Matter: Entropy-Regularized Distribution Alignment for Weakly-supervised 3D Segmentation [67.30502812804271]
Pseudo-labels are widely employed in weakly supervised 3D segmentation tasks where only sparse ground-truth labels are available for learning.
We propose a novel learning strategy to regularize the generated pseudo-labels and effectively narrow the gaps between pseudo-labels and model predictions.
arXiv Detail & Related papers (2023-05-25T08:19:31Z)
- Exploiting Completeness and Uncertainty of Pseudo Labels for Weakly Supervised Video Anomaly Detection [149.23913018423022]
Weakly supervised video anomaly detection aims to identify abnormal events in videos using only video-level labels.
Two-stage self-training methods have achieved significant improvements by self-generating pseudo labels.
We propose an enhancement framework by exploiting completeness and uncertainty properties for effective self-training.
arXiv Detail & Related papers (2022-12-08T05:53:53Z)
- Dist-PU: Positive-Unlabeled Learning from a Label Distribution Perspective [89.5370481649529]
We propose a label distribution perspective for PU learning in this paper.
Motivated by this view, we pursue consistency between the predicted and ground-truth label distributions.
Experiments on three benchmark datasets validate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-12-06T07:38:29Z)
- CLS: Cross Labeling Supervision for Semi-Supervised Learning [9.929229055862491]
Cross Labeling Supervision (CLS) is a framework that generalizes the typical pseudo-labeling process.
CLS allows the creation of both pseudo and complementary labels to support both positive and negative learning.
arXiv Detail & Related papers (2022-02-17T08:09:40Z)
- Learning with Noisy Labels by Targeted Relabeling [52.0329205268734]
Crowdsourcing platforms are often used to collect datasets for training deep neural networks.
We propose an approach which reserves a fraction of annotations to explicitly relabel highly probable labeling errors.
arXiv Detail & Related papers (2021-10-15T20:37:29Z)
- Modeling Diagnostic Label Correlation for Automatic ICD Coding [37.79764232289666]
We propose a two-stage framework to improve automatic ICD coding by capturing the label correlation.
Specifically, we train a label set distribution estimator to rescore the probability of each label set candidate.
In the experiments, our proposed framework improves upon the best-performing predictors on the benchmark MIMIC datasets.
arXiv Detail & Related papers (2021-06-24T07:26:30Z)
- Enhancing Label Correlation Feedback in Multi-Label Text Classification via Multi-Task Learning [6.1538971100140145]
We introduce a novel approach with multi-task learning to enhance label correlation feedback.
We propose two auxiliary label co-occurrence prediction tasks to enhance label correlation learning.
arXiv Detail & Related papers (2021-06-06T12:26:14Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.