ProPML: Probability Partial Multi-label Learning
- URL: http://arxiv.org/abs/2403.07603v1
- Date: Tue, 12 Mar 2024 12:40:23 GMT
- Title: ProPML: Probability Partial Multi-label Learning
- Authors: Łukasz Struski, Adam Pardyl, Jacek Tabor, Bartosz Zieliński
- Abstract summary: Partial Multi-label Learning (PML) is a type of weakly supervised learning where each training instance corresponds to a set of candidate labels, among which only some are true.
In this paper, we introduce ProPML, a novel probabilistic approach to this problem that extends the binary cross entropy to the PML setup.
- Score: 12.814910734614351
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Partial Multi-label Learning (PML) is a type of weakly supervised learning
where each training instance corresponds to a set of candidate labels, among
which only some are true. In this paper, we introduce ProPML, a novel
probabilistic approach to this problem that extends the binary cross entropy to
the PML setup. In contrast to existing methods, it does not require suboptimal
disambiguation and, as such, can be applied to any deep architecture.
Furthermore, experiments conducted on artificial and real-world datasets
indicate that ProPML outperforms existing approaches, especially for high noise
in a candidate set.
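The abstract describes extending binary cross entropy to the PML setup, where non-candidate labels are known negatives while the candidate set is only known to contain some true labels. The following is a minimal sketch of one plausible probabilistic extension along those lines, not the authors' exact loss: non-candidates get the usual BCE negative term, and the candidate set is penalized by the probability that all candidates are negative. The function name and formulation are illustrative assumptions.

```python
import numpy as np

def pml_bce_loss(probs, candidate_mask, eps=1e-12):
    """Sketch of a BCE-style loss for Partial Multi-label Learning.

    probs:          (L,) predicted label probabilities in (0, 1)
    candidate_mask: (L,) boolean, True for labels in the candidate set

    Non-candidate labels are certainly negative, so each receives the
    standard BCE negative term -log(1 - p). For the candidate set we
    only know that at least one label is true, so we penalize the
    probability that every candidate is negative.
    """
    probs = np.clip(probs, eps, 1.0 - eps)
    # Known negatives: standard BCE term with target 0.
    neg_term = -np.log(1.0 - probs[~candidate_mask]).sum()
    # Probability that no candidate label is positive.
    p_all_candidates_negative = np.prod(1.0 - probs[candidate_mask])
    # Penalize that event: at least one candidate should be true.
    pos_term = -np.log(1.0 - p_all_candidates_negative + eps)
    return neg_term + pos_term
```

Under this sketch, confident positives inside the candidate set and confident negatives outside it drive the loss toward zero, without ever committing to which candidate labels are the true ones.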
Related papers
- Learning with Complementary Labels Revisited: The Selected-Completely-at-Random Setting Is More Practical [66.57396042747706]
Complementary-label learning is a weakly supervised learning problem.
We propose a consistent approach that does not rely on the uniform distribution assumption.
We find that complementary-label learning can be expressed as a set of negative-unlabeled binary classification problems.
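The summary above notes that complementary-label learning can be expressed as a set of negative-unlabeled binary classification problems: a complementary label marks one class the instance does not belong to, making it a known negative for that class and unlabeled for all others. A minimal sketch of that recasting (the function name and target encoding are illustrative assumptions, not the paper's notation):

```python
import numpy as np

def to_negative_unlabeled(comp_labels, num_classes):
    """Recast complementary-label data as per-class negative-unlabeled
    (NU) binary problems.

    comp_labels: (n,) array; comp_labels[i] = k means instance i is
                 known NOT to belong to class k.

    Returns a (num_classes, n) array of binary targets per class:
     0 -> known negative for that class (its complementary label)
    -1 -> unlabeled for that class
    """
    n = len(comp_labels)
    targets = -np.ones((num_classes, n), dtype=int)  # all unlabeled
    for k in range(num_classes):
        targets[k, comp_labels == k] = 0  # known negatives for class k
    return targets
```

Each row of the result defines one NU binary problem, so any negative-unlabeled learner can be applied per class without assuming complementary labels are drawn uniformly.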
arXiv Detail & Related papers (2023-11-27T02:59:17Z)
- Can Class-Priors Help Single-Positive Multi-Label Learning? [40.312419865957224]
Single-positive multi-label learning (SPMLL) is a typical weakly supervised multi-label learning problem.
A class-priors estimator is introduced whose estimates are theoretically guaranteed to converge to the ground-truth class-priors.
Based on the estimated class-priors, an unbiased risk estimator for classification is derived, and the corresponding risk minimizer is guaranteed to approximately converge to the optimal risk minimizer on fully supervised data.
arXiv Detail & Related papers (2023-09-25T05:45:57Z)
- Robust Representation Learning for Unreliable Partial Label Learning [86.909511808373]
Partial Label Learning (PLL) is a type of weakly supervised learning where each training instance is assigned a set of candidate labels, but only one label is the ground-truth.
This setting, known as Unreliable Partial Label Learning (UPLL), introduces additional complexity due to the inherent unreliability and ambiguity of partial labels.
We propose the Unreliability-Robust Representation Learning framework (URRL), which leverages unreliability-robust contrastive learning to fortify the model against unreliable partial labels.
arXiv Detail & Related papers (2023-08-31T13:37:28Z)
- Sample-Efficient Learning of POMDPs with Multiple Observations in Hindsight [105.6882315781987]
This paper studies the sample-efficiency of learning in Partially Observable Markov Decision Processes (POMDPs)
Motivated by real-world settings such as loading in game playing, we propose an enhanced feedback model called "multiple observations in hindsight".
We show that sample-efficient learning is possible for two new subclasses of POMDPs: multi-observation revealing POMDPs and distinguishable POMDPs.
arXiv Detail & Related papers (2023-07-06T09:39:01Z)
- ProPaLL: Probabilistic Partial Label Learning [14.299728437638512]
Partial label learning is a type of weakly supervised learning, where each training instance corresponds to a set of candidate labels, among which only one is true.
In this paper, we introduce ProPaLL, a novel probabilistic approach to this problem, which has at least three advantages compared to the existing approaches.
Experiments conducted on artificial and real-world datasets indicate that ProPaLL outperforms the existing approaches.
arXiv Detail & Related papers (2022-08-21T17:47:44Z)
- A Deep Model for Partial Multi-Label Image Classification with Curriculum Based Disambiguation [42.0958430465578]
We study the partial multi-label (PML) image classification problem.
Existing PML methods typically design a disambiguation strategy to filter out noisy labels.
We propose a deep model for PML to enhance the representation and discrimination ability.
arXiv Detail & Related papers (2022-07-06T02:49:02Z)
- Risk Consistent Multi-Class Learning from Label Proportions [64.0125322353281]
This study addresses a multiclass learning from label proportions (MCLLP) setting in which training instances are provided in bags.
Most existing MCLLP methods impose bag-wise constraints on the prediction of instances or assign them pseudo-labels.
A risk-consistent method is proposed for instance classification using the empirical risk minimization framework.
arXiv Detail & Related papers (2022-03-24T03:49:04Z)
- Fast Multi-label Learning [19.104773591885884]
The goal of this paper is to provide a simple method, yet with provable guarantees, which can achieve competitive performance without a complex training process.
arXiv Detail & Related papers (2021-08-31T01:07:42Z)
- pRSL: Interpretable Multi-label Stacking by Learning Probabilistic Rules [0.0]
We present the probabilistic rule stacking (pRSL) which uses probabilistic propositional logic rules and belief propagation to combine the predictions of several underlying classifiers.
We derive algorithms for exact and approximate inference and learning, and show that pRSL reaches state-of-the-art performance on various benchmark datasets.
arXiv Detail & Related papers (2021-05-28T14:06:21Z)
- Provably Consistent Partial-Label Learning [120.4734093544867]
Partial-label learning (PLL) is a multi-class classification problem, where each training example is associated with a set of candidate labels.
In this paper, we propose the first generation model of candidate label sets, and develop two novel methods that are guaranteed to be consistent.
Experiments on benchmark and real-world datasets validate the effectiveness of the proposed generation model and two methods.
arXiv Detail & Related papers (2020-07-17T12:19:16Z)
- Progressive Identification of True Labels for Partial-Label Learning [112.94467491335611]
Partial-label learning (PLL) is a typical weakly supervised learning problem, where each training instance is equipped with a set of candidate labels among which only one is the true label.
Most existing methods are elaborately designed as constrained optimizations that must be solved in specific manners, making their computational complexity a bottleneck for scaling up to big data.
This paper proposes a novel classifier-learning framework that is flexible in both the model and the optimization algorithm.
arXiv Detail & Related papers (2020-02-19T08:35:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.