Label distribution learning via label correlation grid
- URL: http://arxiv.org/abs/2210.08184v1
- Date: Sat, 15 Oct 2022 03:58:15 GMT
- Title: Label distribution learning via label correlation grid
- Authors: Qimeng Guo, Zhuoran Zheng, Xiuyi Jia, Liancheng Xu
- Abstract summary: We propose a Label Correlation Grid (LCG) to model the uncertainty of label relationships.
Our network learns the LCG to accurately estimate the label distribution for each instance.
- Score: 9.340734188957727
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Label distribution learning can characterize the polysemy of an instance
through label distributions. However, some noise and uncertainty may be
introduced into the label space when processing label distribution data due to
artificial or environmental factors. To alleviate this problem, we propose a
\textbf{L}abel \textbf{C}orrelation \textbf{G}rid (LCG) to model the
uncertainty of label relationships. Specifically, we compute a covariance
matrix for the label space in the training set to represent the relationships
between labels, then model the information distribution (Gaussian distribution
function) for each element in the covariance matrix to obtain an LCG. Finally,
our network learns the LCG to accurately estimate the label distribution for
each instance. In addition, we propose a label distribution projection
algorithm as a regularization term in the model training process. Extensive
experiments verify the effectiveness of our method on several real benchmarks.
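The LCG construction described in the abstract (covariance over the label space, then a Gaussian per covariance entry) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the grid range, number of grid points, and the `sigma` spread parameter are all assumptions.

```python
import numpy as np

def build_lcg(Y, n_points=11, sigma=0.1):
    """Sketch of a Label Correlation Grid (LCG).

    Y: (n_samples, n_labels) label-distribution matrix from the training set.
    Each entry of the label covariance matrix is expanded into a Gaussian
    density evaluated on a fixed grid of points, so a network can learn the
    uncertainty of each pairwise label relationship.
    """
    # 1. Covariance matrix over the label space (n_labels x n_labels).
    cov = np.cov(Y, rowvar=False)

    # 2. For each covariance entry, center a Gaussian at that value and
    #    evaluate it on a fixed grid (range and sigma are assumptions).
    grid = np.linspace(-1.0, 1.0, n_points)
    diff = grid[None, None, :] - cov[:, :, None]
    lcg = np.exp(-diff ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))
    return lcg  # shape: (n_labels, n_labels, n_points)
```

For example, with 20 training instances over 5 labels, `build_lcg(Y)` returns a `(5, 5, 11)` tensor in which the slice `lcg[i, j]` is the discretized Gaussian modeling the relationship between labels `i` and `j`.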
Related papers
- Towards Better Performance in Incomplete LDL: Addressing Data Imbalance [48.54894491724677]
We propose Incomplete and Imbalanced Label Distribution Learning (I²LDL), a framework that simultaneously handles incomplete labels and imbalanced label distributions.
Our method decomposes the label distribution matrix into a low-rank component for frequent labels and a sparse component for rare labels, effectively capturing the structure of both head and tail labels.
arXiv Detail & Related papers (2024-10-17T14:12:57Z)
- Inaccurate Label Distribution Learning with Dependency Noise [52.08553913094809]
We introduce the Dependent Noise-based Inaccurate Label Distribution Learning (DN-ILDL) framework to tackle the challenges posed by noise in label distribution learning.
We show that DN-ILDL effectively addresses the ILDL problem and outperforms existing LDL methods.
arXiv Detail & Related papers (2024-05-26T07:58:07Z) - Contrastive Label Enhancement [13.628665406039609]
We propose Contrastive Label Enhancement (ConLE) to generate high-level features via a contrastive learning strategy.
We leverage the obtained high-level features to gain label distributions through a well-designed training strategy.
arXiv Detail & Related papers (2023-05-16T14:53:07Z)
- Class-Distribution-Aware Pseudo Labeling for Semi-Supervised Multi-Label Learning [97.88458953075205]
Pseudo-labeling has emerged as a popular and effective approach for utilizing unlabeled data.
This paper proposes a novel solution called Class-Aware Pseudo-Labeling (CAP) that performs pseudo-labeling in a class-aware manner.
arXiv Detail & Related papers (2023-05-04T12:52:18Z)
- Dist-PU: Positive-Unlabeled Learning from a Label Distribution Perspective [89.5370481649529]
We propose a label distribution perspective for PU learning in this paper.
Motivated by this, we propose to pursue consistency between the predicted and ground-truth label distributions.
Experiments on three benchmark datasets validate the effectiveness of the proposed method.
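One way to read "label distribution consistency" in a positive-unlabeled setting is to match the mean predicted positive probability to the known class prior. The sketch below illustrates that idea; the loss form and the `prior` argument are assumptions for illustration, not the paper's exact objective.

```python
import numpy as np

def label_distribution_consistency(pred_probs, prior):
    """Squared gap between the predicted positive proportion and the
    ground-truth class prior, usable as a regularization term.

    pred_probs: array of predicted probabilities of the positive class.
    prior: known (or estimated) fraction of positives in the data.
    """
    return (pred_probs.mean() - prior) ** 2
```

A batch whose average positive prediction matches the prior incurs (near-)zero penalty; a biased batch is pushed back toward the prior.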
arXiv Detail & Related papers (2022-12-06T07:38:29Z)
- Label Distribution Learning via Implicit Distribution Representation [12.402054374952485]
In this paper, we introduce the implicit distribution in the label distribution learning framework to characterize the uncertainty of each label value.
Specifically, we use deep implicit representation learning to construct a label distribution matrix with Gaussian prior constraints.
Each row component of the label distribution matrix is transformed into a standard label distribution form by using the self-attention algorithm.
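A "standard label distribution form" means each row is non-negative and sums to one. A row-wise softmax is one common way to enforce this; the sketch below shows only that normalization step, not the paper's self-attention pipeline.

```python
import numpy as np

def rows_to_label_distributions(M):
    """Map each row of a real-valued matrix to a valid label distribution
    (non-negative entries summing to 1) via a row-wise softmax."""
    z = M - M.max(axis=1, keepdims=True)  # subtract row max for stability
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)
```

Subtracting the row maximum before exponentiating avoids overflow for large logits without changing the resulting distribution.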
arXiv Detail & Related papers (2022-09-28T04:13:53Z) - Instance-Dependent Partial Label Learning [69.49681837908511]
Partial label learning is a typical weakly supervised learning problem.
Most existing approaches assume that the incorrect labels in each training example are randomly picked as the candidate labels.
In this paper, we consider instance-dependent partial label learning and assume that each example is associated with a latent label distribution constituted by the real number of each label.
arXiv Detail & Related papers (2021-10-25T12:50:26Z)
- Distribution-Aware Semantics-Oriented Pseudo-label for Imbalanced Semi-Supervised Learning [80.05441565830726]
This paper addresses imbalanced semi-supervised learning, where heavily biased pseudo-labels can harm the model performance.
Motivated by this observation, we propose a general pseudo-labeling framework to address the bias.
We term the novel pseudo-labeling framework for imbalanced SSL as Distribution-Aware Semantics-Oriented (DASO) Pseudo-label.
arXiv Detail & Related papers (2021-06-10T11:58:25Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.