Latent label distribution grid representation for modeling uncertainty
- URL: http://arxiv.org/abs/2505.21180v1
- Date: Tue, 27 May 2025 13:31:37 GMT
- Title: Latent label distribution grid representation for modeling uncertainty
- Authors: ShuNing Sun, YinSong Xiong, Yu Zhang, Zhuoran Zheng
- Abstract summary: Label Distribution Learning (LDL) has promising representation capabilities for characterizing the polysemy of an instance. We construct a Latent Label Distribution Grid (LLDG) to form a low-noise representation space.
- Score: 4.597656573673848
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Although \textbf{L}abel \textbf{D}istribution \textbf{L}earning (LDL) has promising representation capabilities for characterizing the polysemy of an instance, the complexity and high cost of label distribution annotation lead to inexactness in the construction of the label space. The existence of a large number of inexact labels generates a label space with uncertainty, which misleads the LDL algorithm into yielding incorrect decisions. To alleviate this problem, we model the uncertainty of label distributions by constructing a \textbf{L}atent \textbf{L}abel \textbf{D}istribution \textbf{G}rid (LLDG) to form a low-noise representation space. Specifically, we first construct a label correlation matrix based on the differences between labels, and then expand each value of the matrix into a vector that obeys a Gaussian distribution, thus building an LLDG to model the uncertainty of the label space. Finally, the LLDG is reconstructed by the LLDG-Mixer to generate an accurate label distribution. Note that we enforce a customized low-rank scheme on this grid, which assumes that the label relations may be noisy and performs noise reduction with the help of a Tucker reconstruction technique. Furthermore, we evaluate the effectiveness of the LLDG by treating its generation as an upstream task for object classification. Extensive experimental results show that our approach performs competitively on several benchmarks.
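The two construction steps the abstract describes (a pairwise label correlation matrix, then expanding each entry into a Gaussian-distributed vector) can be sketched in a few lines. This is a toy illustration, not the paper's implementation: the correlation measure, the Gaussian variance, and the grid depth are all assumed for demonstration, and the paper's Tucker-based low-rank reconstruction is replaced here by a simple average along the Gaussian axis, which already shrinks i.i.d. Gaussian noise by a factor of sqrt(depth).

```python
import random

def build_lldg(label_dist, depth=16, sigma=0.05, seed=0):
    """Construct a toy Latent Label Distribution Grid (LLDG).

    Step 1: build a label correlation matrix from pairwise label
    differences (here simply |y_i - y_j|; the paper's measure may differ).
    Step 2: expand each matrix entry into a `depth`-long vector sampled
    from a Gaussian centred on that entry, modeling label uncertainty.
    """
    rng = random.Random(seed)
    n = len(label_dist)
    corr = [[abs(label_dist[i] - label_dist[j]) for j in range(n)]
            for i in range(n)]
    grid = [[[rng.gauss(corr[i][j], sigma) for _ in range(depth)]
             for j in range(n)]
            for i in range(n)]
    return corr, grid

def denoise(grid):
    """Collapse the grid back to an n x n matrix by averaging along the
    Gaussian axis -- a stand-in for the Tucker reconstruction step."""
    return [[sum(cell) / len(cell) for cell in row] for row in grid]

# Toy label distribution over 4 labels (degrees sum to 1).
y = [0.5, 0.3, 0.15, 0.05]
corr, grid = build_lldg(y)
recon = denoise(grid)
err = max(abs(recon[i][j] - corr[i][j])
          for i in range(len(y)) for j in range(len(y)))
print(f"max reconstruction error: {err:.3f}")
```

Averaging recovers the original correlation matrix up to residual noise of order sigma/sqrt(depth), which is the intuition behind imposing a low-rank structure on the grid before reconstruction.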
Related papers
- Diffusion Disambiguation Models for Partial Label Learning [7.869159280247732]
Learning from ambiguous labels is a long-standing problem in practical machine learning applications.
Inspired by the remarkable performance of diffusion models in various generation tasks, this paper explores their potential to denoise ambiguous labels.
This paper proposes a diffusion disambiguation model for partial label learning (DDMP), which first uses the potential complementary information between instances and labels.
arXiv Detail & Related papers (2025-07-01T03:53:45Z)
- Label Distribution Learning with Biased Annotations by Learning Multi-Label Representation [120.97262070068224]
Multi-label learning (MLL) has gained attention for its ability to represent real-world data.
Label Distribution Learning (LDL) faces challenges in collecting accurate label distributions.
arXiv Detail & Related papers (2025-02-03T09:04:03Z)
- Reduction-based Pseudo-label Generation for Instance-dependent Partial Label Learning [41.345794038968776]
We propose to leverage reduction-based pseudo-labels to alleviate the influence of incorrect candidate labels.
We show that reduction-based pseudo-labels exhibit greater consistency with the Bayes optimal classifier compared to pseudo-labels directly generated from the predictive model.
arXiv Detail & Related papers (2024-10-28T07:32:20Z)
- Towards Better Performance in Incomplete LDL: Addressing Data Imbalance [48.54894491724677]
We propose Incomplete and Imbalance Label Distribution Learning (I²LDL), a framework that simultaneously handles incomplete labels and imbalanced label distributions.
Our method decomposes the label distribution matrix into a low-rank component for frequent labels and a sparse component for rare labels, effectively capturing the structure of both head and tail labels.
arXiv Detail & Related papers (2024-10-17T14:12:57Z)
- Inaccurate Label Distribution Learning with Dependency Noise [52.08553913094809]
We introduce the Dependent Noise-based Inaccurate Label Distribution Learning (DN-ILDL) framework to tackle the challenges posed by noise in label distribution learning.
We show that DN-ILDL effectively addresses the ILDL problem and outperforms existing LDL methods.
arXiv Detail & Related papers (2024-05-26T07:58:07Z)
- Dist-PU: Positive-Unlabeled Learning from a Label Distribution Perspective [89.5370481649529]
We propose a label distribution perspective for PU learning in this paper.
Motivated by this, we propose to pursue the label distribution consistency between predicted and ground-truth label distributions.
Experiments on three benchmark datasets validate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-12-06T07:38:29Z)
- Neighbour Consistency Guided Pseudo-Label Refinement for Unsupervised Person Re-Identification [80.98291772215154]
Unsupervised person re-identification (ReID) aims at learning discriminative identity features for person retrieval without any annotations.
Recent advances accomplish this task by leveraging clustering-based pseudo labels.
We propose a Neighbour Consistency guided Pseudo Label Refinement framework.
arXiv Detail & Related papers (2022-11-30T09:39:57Z)
- Label distribution learning via label correlation grid [9.340734188957727]
We propose a Label Correlation Grid (LCG) to model the uncertainty of label relationships.
Our network learns the LCG to accurately estimate the label distribution for each instance.
arXiv Detail & Related papers (2022-10-15T03:58:15Z)
- Leveraging Instance Features for Label Aggregation in Programmatic Weak Supervision [75.1860418333995]
Programmatic Weak Supervision (PWS) has emerged as a widespread paradigm to synthesize training labels efficiently.
The core component of PWS is the label model, which infers true labels by aggregating the outputs of multiple noisy supervision sources as labeling functions.
Existing statistical label models typically rely only on the outputs of the LFs, ignoring the instance features when modeling the underlying generative process.
arXiv Detail & Related papers (2022-10-06T07:28:53Z)
- Label Distribution Learning via Implicit Distribution Representation [12.402054374952485]
In this paper, we introduce the implicit distribution in the label distribution learning framework to characterize the uncertainty of each label value.
Specifically, we use deep implicit representation learning to construct a label distribution matrix with Gaussian prior constraints.
Each row component of the label distribution matrix is transformed into a standard label distribution form by using the self-attention algorithm.
arXiv Detail & Related papers (2022-09-28T04:13:53Z)
- Instance-Dependent Partial Label Learning [69.49681837908511]
Partial label learning is a typical weakly supervised learning problem.
Most existing approaches assume that the incorrect labels in each training example are randomly picked as the candidate labels.
In this paper, we consider instance-dependent and assume that each example is associated with a latent label distribution constituted by the real number of each label.
arXiv Detail & Related papers (2021-10-25T12:50:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.