Multi-Label Contrastive Learning: A Comprehensive Study
- URL: http://arxiv.org/abs/2412.00101v2
- Date: Fri, 03 Jan 2025 13:10:09 GMT
- Title: Multi-Label Contrastive Learning: A Comprehensive Study
- Authors: Alexandre Audibert, Aurélien Gauffre, Massih-Reza Amini
- Abstract summary: Multi-label classification has emerged as a key area in both research and industry.
Applying contrastive learning to multi-label classification presents unique challenges.
We conduct an in-depth study of contrastive learning loss for multi-label classification across diverse settings.
- Abstract: Multi-label classification, which involves assigning multiple labels to a single input, has emerged as a key area in both research and industry due to its wide-ranging applications. Designing effective loss functions is crucial for optimizing deep neural networks for this task, as they significantly influence model performance and efficiency. Traditional loss functions, which often maximize likelihood under the assumption of label independence, may struggle to capture complex label relationships. Recent research has turned to supervised contrastive learning, a method that aims to create a structured representation space by bringing similar instances closer together and pushing dissimilar ones apart. Although contrastive learning offers a promising approach, applying it to multi-label classification presents unique challenges, particularly in managing label interactions and data structure. In this paper, we conduct an in-depth study of contrastive learning loss for multi-label classification across diverse settings. These include datasets with both small and large numbers of labels, datasets with varying amounts of training data, and applications in both computer vision and natural language processing. Our empirical results indicate that the promising outcomes of contrastive learning are attributable not only to the consideration of label interactions but also to the robust optimization scheme of the contrastive loss. Furthermore, while the supervised contrastive loss function faces challenges with datasets containing a small number of labels and ranking-based metrics, it demonstrates excellent performance, particularly in terms of Macro-F1, on datasets with a large number of labels.
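To make the setting concrete, below is a minimal sketch of one common way to extend supervised contrastive learning to multi-label data: treat two samples as positives whenever they share at least one label. This is an illustrative PyTorch baseline, not the exact loss the paper studies; the function name and the temperature default are assumptions.

```python
import torch
import torch.nn.functional as F

def multilabel_supcon_loss(embeddings, labels, temperature=0.07):
    # embeddings: (N, D) encoder outputs; labels: (N, C) multi-hot {0, 1}.
    n = embeddings.size(0)
    z = F.normalize(embeddings, dim=1)
    logits = z @ z.T / temperature

    # Positives: pairs sharing at least one label, excluding self-pairs.
    shared = (labels.float() @ labels.float().T) > 0
    eye = torch.eye(n, dtype=torch.bool, device=z.device)
    pos_mask = shared & ~eye

    # Standard SupCon denominator: softmax over all other samples.
    logits = logits.masked_fill(eye, float('-inf'))
    log_prob = logits - torch.logsumexp(logits, dim=1, keepdim=True)

    # Mean log-likelihood of positives; anchors with no positive are skipped.
    pos_counts = pos_mask.sum(dim=1)
    valid = pos_counts > 0
    pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0).sum(dim=1)
    return (-pos_log_prob[valid] / pos_counts[valid]).mean()
```

Variants in the literature differ mainly in how the positive set is defined and weighted, which is exactly the design space the paper's empirical study explores.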
Related papers
- Multi-Label Bayesian Active Learning with Inter-Label Relationships [3.88369051454137]
We propose a new multi-label active learning strategy to address both challenges.
Our method incorporates progressively updated positive and negative correlation matrices to capture co-occurrence and disjoint relationships.
Our strategy consistently achieves more reliable and superior performance than several established methods.
arXiv Detail & Related papers (2024-11-26T23:28:54Z)
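As a toy illustration of the co-occurrence idea in the entry above, one can estimate positive and negative label-correlation matrices from the currently labelled pool and recompute them after each acquisition round. The update rule below is an assumption for illustration, not the paper's exact formulation.

```python
import torch

def label_correlation_matrices(labels, eps=1e-8):
    # labels: (N, C) multi-hot annotations of the currently labelled pool.
    y = labels.float()
    counts = y.sum(dim=0)            # per-label frequencies
    co = y.T @ y                     # (C, C) co-occurrence counts

    # Positive correlations: empirical P(label j | label i).
    pos = co / (counts.unsqueeze(1) + eps)

    # Negative correlations: how close labels i and j are to disjoint.
    overlap = co / (torch.minimum(counts.unsqueeze(0),
                                  counts.unsqueeze(1)) + eps)
    neg = 1.0 - overlap
    return pos, neg
```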
- Similarity-Dissimilarity Loss with Supervised Contrastive Learning for Multi-label Classification [11.499489446062054]
We propose a Similarity-Dissimilarity Loss with contrastive learning for multi-label classification.
Our proposed loss effectively improves performance for all encoders under the supervised contrastive learning paradigm.
arXiv Detail & Related papers (2024-10-17T11:12:55Z)
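One plausible reading of a similarity-dissimilarity re-weighting is to scale each positive pair in a contrastive loss by the Jaccard similarity of the two label sets, so pairs sharing more labels pull together more strongly. The helper below computes those weights; substituting them for the binary positive mask in the loss sketched earlier (and normalising by the weight sum) yields one such variant. This is an assumption, not the paper's exact loss.

```python
import torch

def jaccard_weights(labels, eps=1e-8):
    # Pairwise Jaccard similarity between multi-hot label vectors:
    # |intersection| / |union|, giving weights in [0, 1].
    y = labels.float()
    inter = y @ y.T
    union = y.sum(dim=1, keepdim=True) + y.sum(dim=1) - inter
    return inter / (union + eps)
```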
- Exploring Contrastive Learning for Long-Tailed Multi-Label Text Classification [48.81069245141415]
We introduce a novel contrastive loss function for multi-label text classification.
It attains Micro-F1 scores that either match or surpass those obtained with other frequently employed loss functions.
It demonstrates a significant improvement in Macro-F1 scores across three multi-label datasets.
arXiv Detail & Related papers (2024-04-12T11:12:16Z)
- Label Dropout: Improved Deep Learning Echocardiography Segmentation Using Multiple Datasets With Domain Shift and Partial Labelling [3.2322708710124815]
We propose a novel label dropout scheme to break the link between domain characteristics and the presence or absence of labels.
We demonstrate that label dropout improves the echo segmentation Dice score by 62% and 25% on two cardiac structures when training on multiple diverse, partially labelled datasets.
arXiv Detail & Related papers (2024-03-12T16:57:56Z)
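A rough sketch of the label-dropout idea above: randomly hide some annotated classes during training so that the pattern of available labels no longer identifies the source dataset, then exclude hidden classes from the loss exactly as genuinely unannotated ones are. The dropout rate and masking scheme are illustrative assumptions.

```python
import torch

def label_dropout_mask(present, p_drop=0.3):
    # present: (C,) bool vector of which classes are annotated
    # for this sample; dropped classes are treated as unlabelled.
    keep = torch.rand(present.shape, device=present.device) >= p_drop
    return present & keep

# Per-class losses are then masked like any other missing annotation:
#   mask = label_dropout_mask(present)
#   loss = (per_class_loss * mask).sum() / mask.sum().clamp(min=1)
```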
- Learning Semantic Segmentation from Multiple Datasets with Label Shifts [101.24334184653355]
This paper proposes UniSeg, an effective approach to automatically train models across multiple datasets with differing label spaces.
Specifically, we propose two losses that account for conflicting and co-occurring labels to achieve better generalization performance in unseen domains.
arXiv Detail & Related papers (2022-02-28T18:55:19Z)
- GuidedMix-Net: Learning to Improve Pseudo Masks Using Labeled Images as Reference [153.354332374204]
We propose a novel method for semi-supervised semantic segmentation named GuidedMix-Net.
We first introduce a feature alignment objective between labeled and unlabeled data to capture potentially similar image pairs.
MITrans is shown to be a powerful knowledge module for progressively refining the features of unlabeled data.
Alongside supervised learning on labeled data, predictions for unlabeled data are learned jointly with the generated pseudo masks.
arXiv Detail & Related papers (2021-06-29T02:48:45Z)
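A loose sketch of a feature-alignment objective in the spirit of the entry above: pair each unlabelled feature with its most similar labelled feature and pull them together. GuidedMix-Net's actual pairing, mixing, and MITrans module are more involved; the function below is only a simplified stand-in.

```python
import torch
import torch.nn.functional as F

def feature_alignment_loss(f_lab, f_unl):
    # f_lab: (L, D) labelled features; f_unl: (U, D) unlabelled features.
    f_lab = F.normalize(f_lab, dim=1)
    f_unl = F.normalize(f_unl, dim=1)
    sim = f_unl @ f_lab.T            # (U, L) cosine similarities
    nearest = sim.argmax(dim=1)      # index of most similar labelled feature
    return F.mse_loss(f_unl, f_lab[nearest].detach())
```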
- PseudoSeg: Designing Pseudo Labels for Semantic Segmentation [78.35515004654553]
We present a re-design of pseudo-labeling to generate structured pseudo labels for training with unlabeled or weakly-labeled data.
We demonstrate the effectiveness of the proposed pseudo-labeling strategy in both low-data and high-data regimes.
arXiv Detail & Related papers (2020-10-19T17:59:30Z)
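The generic pseudo-labeling loop behind such methods can be sketched as: predict on unlabeled images, keep only confident pixels as pseudo labels, and mark the rest as ignore. PseudoSeg's actual design fuses several prediction sources to produce better-structured labels; the simplification and the 0.9 threshold below are assumptions.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def make_pseudo_masks(model, x_unlabeled, conf_thresh=0.9):
    # model(x) -> per-pixel logits of shape (N, C, H, W).
    probs = model(x_unlabeled).softmax(dim=1)
    conf, pseudo = probs.max(dim=1)          # (N, H, W)
    pseudo[conf < conf_thresh] = -100        # default ignore_index of CE
    return pseudo

# Training step, typically on a stronger augmentation of the same images:
#   loss = F.cross_entropy(model(x_strong), make_pseudo_masks(model, x_weak))
```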
- Learning What Makes a Difference from Counterfactual Examples and Gradient Supervision [57.14468881854616]
We propose an auxiliary training objective that improves the generalization capabilities of neural networks.
We use pairs of minimally different examples with different labels, a.k.a. counterfactual or contrasting examples, which provide a signal indicative of the task's underlying causal structure.
Models trained with this technique demonstrate improved performance on out-of-distribution test sets.
arXiv Detail & Related papers (2020-04-20T02:47:49Z)
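One simplified reading of gradient supervision on counterfactual pairs: encourage the input-gradient of the task loss at an example to point towards its minimally different, differently labelled counterpart. The sketch below implements that alignment as a cosine penalty to be added to the ordinary task loss; the names and the exact formulation are illustrative, not the paper's.

```python
import torch
import torch.nn.functional as F

def gradient_supervision_loss(model, x, x_cf, y):
    # x and x_cf: a minimally different input pair with different labels.
    x = x.clone().requires_grad_(True)
    task_loss = F.cross_entropy(model(x), y)
    grad = torch.autograd.grad(task_loss, x, create_graph=True)[0]

    # Align the gradient with the direction towards the counterfactual.
    direction = (x_cf - x).detach().flatten(1)
    cos = F.cosine_similarity(grad.flatten(1), direction, dim=1)
    return (1.0 - cos).mean()                # 0 when perfectly aligned
```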