Center Contrastive Loss for Metric Learning
- URL: http://arxiv.org/abs/2308.00458v1
- Date: Tue, 1 Aug 2023 11:22:51 GMT
- Title: Center Contrastive Loss for Metric Learning
- Authors: Bolun Cai, Pengfei Xiong, Shangxuan Tian
- Abstract summary: We propose a novel metric learning function called Center Contrastive Loss.
It maintains a class-wise center bank and compares the category centers with the query data points using a contrastive loss.
The proposed loss combines the advantages of both contrastive and classification methods.
- Score: 8.433000039153407
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Contrastive learning is a widely studied topic in metric learning. However,
sampling effective contrastive pairs remains a challenge due to factors such as
limited batch size, imbalanced data distribution, and the risk of overfitting.
In this paper, we propose a novel metric learning function called Center
Contrastive Loss, which maintains a class-wise center bank and compares the
category centers with the query data points using a contrastive loss. The
center bank is updated in real-time to boost model convergence without the need
for well-designed sample mining. The category centers are well-optimized
classification proxies to re-balance the supervisory signal of each class.
Furthermore, the proposed loss combines the advantages of both contrastive and
classification methods by reducing intra-class variations and enhancing
inter-class differences to improve the discriminative power of embeddings. Our
experimental results, as shown in Figure 1, demonstrate that a standard network
(ResNet50) trained with our loss achieves state-of-the-art performance and
faster convergence.
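The paper's implementation is not reproduced here, but the mechanism the abstract describes (a class-wise center bank updated in real time, with queries contrasted against all centers under a temperature-scaled softmax) can be sketched as follows. The EMA update rule, the cosine similarities, and the temperature and momentum values are assumptions for illustration, not details taken from the paper.

```python
import torch
import torch.nn.functional as F

class CenterContrastiveLoss(torch.nn.Module):
    """Sketch of a center contrastive loss: queries are contrasted
    against a bank of per-class centers instead of mined sample pairs."""

    def __init__(self, num_classes: int, dim: int,
                 momentum: float = 0.9, temperature: float = 0.1):
        super().__init__()
        self.momentum = momentum        # EMA factor for the center bank (assumed)
        self.temperature = temperature  # softmax temperature (assumed)
        # Center bank: one running center per class, kept outside autograd.
        self.register_buffer("centers",
                             F.normalize(torch.randn(num_classes, dim), dim=1))

    @torch.no_grad()
    def _update_centers(self, embeddings, labels):
        # Real-time update: move each seen class center toward the mean
        # of its current batch embeddings.
        for c in labels.unique():
            batch_mean = embeddings[labels == c].mean(dim=0)
            self.centers[c] = F.normalize(
                self.momentum * self.centers[c] + (1 - self.momentum) * batch_mean,
                dim=0)

    def forward(self, embeddings, labels):
        embeddings = F.normalize(embeddings, dim=1)
        self._update_centers(embeddings, labels)
        # Contrast every query against all class centers; the true class
        # center is the positive, all other centers are negatives.
        logits = embeddings @ self.centers.t() / self.temperature
        return F.cross_entropy(logits, labels)
```

Under this formulation every sample receives one positive (its class center) and C-1 negatives on every step, which is why no pair mining or large batch is needed.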
Related papers
- Covariance-corrected Whitening Alleviates Network Degeneration on Imbalanced Classification [6.197116272789107]
Class imbalance is a critical issue in image classification that significantly affects the performance of deep recognition models.
We propose a novel framework called Whitening-Net to mitigate the degenerate solutions.
In scenarios with extreme class imbalance, the batch covariance statistic exhibits significant fluctuations, impeding the convergence of the whitening operation.
arXiv Detail & Related papers (2024-08-30T10:49:33Z)
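Whitening-Net's architecture is not described in the summary above; only the whitening operation and its dependence on the batch covariance are. For reference, a generic ZCA batch-whitening step looks like the sketch below; the instability mentioned above enters through the covariance estimate, which fluctuates across small or imbalanced batches.

```python
import torch

def zca_whiten(features: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Generic ZCA whitening of a (batch, dim) feature matrix.
    Illustrative only; not the paper's Whitening-Net."""
    x = features - features.mean(dim=0, keepdim=True)
    # Batch covariance: under extreme class imbalance this statistic
    # fluctuates heavily between batches (the instability noted above).
    cov = x.t() @ x / (x.shape[0] - 1)
    eigvals, eigvecs = torch.linalg.eigh(cov)
    # ZCA transform: rotate, rescale to unit variance, rotate back.
    w = eigvecs @ torch.diag((eigvals + eps).rsqrt()) @ eigvecs.t()
    return x @ w
```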
- Anti-Collapse Loss for Deep Metric Learning Based on Coding Rate Metric [99.19559537966538]
Deep metric learning (DML) aims to learn a discriminative high-dimensional embedding space for downstream tasks like classification, clustering, and retrieval.
To maintain the structure of embedding space and avoid feature collapse, we propose a novel loss function called Anti-Collapse Loss.
Comprehensive experiments on benchmark datasets demonstrate that our proposed method outperforms existing state-of-the-art methods.
arXiv Detail & Related papers (2024-07-03T13:44:20Z)
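The coding-rate metric referenced in the title is commonly defined, in the maximal-coding-rate-reduction literature, as R(Z) = ½ logdet(I + d/(nε²) ZᵀZ); whether the Anti-Collapse Loss uses exactly this form is an assumption. A minimal sketch of a penalty built on it:

```python
import torch

def coding_rate(z: torch.Tensor, eps: float = 0.5) -> torch.Tensor:
    """Coding rate R(Z) = 1/2 * logdet(I + d/(n*eps^2) * Z^T Z) for
    row-wise embeddings Z of shape (n, d); larger means more spread."""
    n, d = z.shape
    gram = z.t() @ z * (d / (n * eps ** 2))
    return 0.5 * torch.logdet(torch.eye(d, device=z.device) + gram)

def anti_collapse_penalty(z: torch.Tensor) -> torch.Tensor:
    # Added to the main metric-learning loss: maximizing the coding
    # rate (minimizing its negative) discourages feature collapse.
    return -coding_rate(torch.nn.functional.normalize(z, dim=1))
```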
- Bayesian Learning-driven Prototypical Contrastive Loss for Class-Incremental Learning [42.14439854721613]
We propose a prototypical network with a Bayesian learning-driven contrastive loss (BLCL) tailored specifically for class-incremental learning scenarios.
Our approach dynamically adapts the balance between the cross-entropy and contrastive loss functions with a Bayesian learning technique.
arXiv Detail & Related papers (2024-05-17T19:49:02Z)
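The BLCL summary gives only the recipe: a cross-entropy term and a contrastive term with an adaptively learned balance. One hedged way to realize such a balance is uncertainty-style weighting with learnable log-variances; this particular scheme is an illustration, not the paper's Bayesian procedure.

```python
import torch

class AdaptiveDualLoss(torch.nn.Module):
    """Sketch: combine cross-entropy and a contrastive loss with a
    learnable balance, standing in for BLCL's Bayesian adaptation."""

    def __init__(self):
        super().__init__()
        # Log-variance parameters: the optimizer itself learns how much
        # weight each term deserves (Kendall-style uncertainty weighting).
        self.log_var_ce = torch.nn.Parameter(torch.zeros(()))
        self.log_var_con = torch.nn.Parameter(torch.zeros(()))

    def forward(self, ce_loss, contrastive_loss):
        # Each term is scaled by exp(-log_var); the additive log_var
        # keeps the weights from collapsing to zero.
        return (torch.exp(-self.log_var_ce) * ce_loss + self.log_var_ce
                + torch.exp(-self.log_var_con) * contrastive_loss
                + self.log_var_con)
```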
- DRoP: Distributionally Robust Pruning [11.930434318557156]
We conduct the first systematic study of the impact of data pruning on classification bias of trained models.
We propose DRoP, a distributionally robust approach to pruning and empirically demonstrate its performance on standard computer vision benchmarks.
arXiv Detail & Related papers (2024-04-08T14:55:35Z)
- Noisy Correspondence Learning with Self-Reinforcing Errors Mitigation [63.180725016463974]
Cross-modal retrieval relies on well-matched large-scale datasets that are laborious to collect in practice.
We introduce a novel noisy correspondence learning framework, namely Self-Reinforcing Errors Mitigation (SREM).
arXiv Detail & Related papers (2023-12-27T09:03:43Z)
- Balanced Classification: A Unified Framework for Long-Tailed Object Detection [74.94216414011326]
Conventional detectors suffer from performance degradation when dealing with long-tailed data due to a classification bias towards the majority head categories.
We introduce a unified framework called BAlanced CLassification (BACL), which enables adaptive rectification of inequalities caused by disparities in category distribution.
BACL consistently achieves performance improvements across various datasets with different backbones and architectures.
arXiv Detail & Related papers (2023-08-04T09:11:07Z)
- Semi-supervised Contrastive Learning with Similarity Co-calibration [72.38187308270135]
We propose a novel training strategy, termed Semi-supervised Contrastive Learning (SsCL).
SsCL combines the well-known contrastive loss in self-supervised learning with the cross entropy loss in semi-supervised learning.
We show that SsCL produces more discriminative representations and is beneficial to few-shot learning.
arXiv Detail & Related papers (2021-05-16T09:13:56Z)
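Concretely, combining the self-supervised contrastive loss with semi-supervised cross entropy amounts to a joint objective over labeled and unlabeled batches. A minimal sketch follows; the similarity co-calibration step from the title is omitted, and the weighting factor `lam` is an assumption.

```python
import torch
import torch.nn.functional as F

def nt_xent(z1, z2, temperature: float = 0.5):
    """Standard NT-Xent contrastive loss over two augmented views,
    each of shape (n, d)."""
    z = F.normalize(torch.cat([z1, z2]), dim=1)
    sim = z @ z.t() / temperature
    sim.fill_diagonal_(float("-inf"))  # a view is not its own positive
    n = z1.shape[0]
    # The positive of row i is its other view at row i + n (and vice versa).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(n)]).to(z.device)
    return F.cross_entropy(sim, targets)

def sscl_step(logits_labeled, labels, z1, z2, lam: float = 1.0):
    # Joint objective: supervised cross entropy on labeled data plus
    # self-supervised contrastive loss on augmented unlabeled views.
    return F.cross_entropy(logits_labeled, labels) + lam * nt_xent(z1, z2)
```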
- Center Prediction Loss for Re-identification [65.58923413172886]
We propose a new loss based on center predictability: a sample must be positioned in the feature space such that the location of the center of its same-class samples can be roughly predicted from it.
We show that this new loss imposes a more flexible intra-class distribution constraint while ensuring that samples from different classes are well separated.
arXiv Detail & Related papers (2021-04-30T03:57:31Z)
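A rough rendering of the center-prediction idea: a small head must predict, from a single embedding, where that sample's class center lies, and the loss penalizes the prediction error. The regression form, the MLP head, and the use of batch class means as centers are illustrative assumptions.

```python
import torch

class CenterPredictionLoss(torch.nn.Module):
    """Sketch: each sample's embedding must carry enough information
    to roughly predict the center of its own class."""

    def __init__(self, dim: int):
        super().__init__()
        # Hypothetical predictor head: embedding -> predicted class center.
        self.predictor = torch.nn.Sequential(
            torch.nn.Linear(dim, dim), torch.nn.ReLU(), torch.nn.Linear(dim, dim))

    def forward(self, embeddings, labels):
        loss = embeddings.new_zeros(())
        classes = labels.unique()
        for c in classes:
            mask = labels == c
            center = embeddings[mask].mean(dim=0).detach()  # stand-in class center
            # Penalize how far each sample's *predicted* center is from the
            # actual one; samples may sit anywhere they can predict it from.
            loss = loss + ((self.predictor(embeddings[mask]) - center) ** 2
                           ).sum(dim=1).mean()
        return loss / classes.numel()
```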
- Solving Inefficiency of Self-supervised Representation Learning [87.30876679780532]
Existing contrastive learning methods suffer from very low learning efficiency.
Under-clustering and over-clustering problems are major obstacles to learning efficiency.
We propose a novel self-supervised learning framework using a median triplet loss.
arXiv Detail & Related papers (2021-04-18T07:47:10Z)
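The median triplet loss itself is not spelled out above. One plausible reading, sketched below under that assumption, is a triplet margin loss that ranks each anchor against its median-distance negative, moderating between the hardest negative (which drives over-clustering) and a random one (under-clustering); the paper's exact selection rule may differ.

```python
import torch
import torch.nn.functional as F

def median_triplet_loss(anchor, positive, negatives, margin: float = 0.2):
    """Sketch: triplet loss using the median-distance negative per anchor.
    anchor, positive: (n, d); negatives: (n, k, d)."""
    pos_dist = F.pairwise_distance(anchor, positive)                   # (n,)
    neg_dist = torch.cdist(anchor.unsqueeze(1), negatives).squeeze(1)  # (n, k)
    # Median negative: less aggressive than the hardest negative
    # (over-clustering), more informative than a random one (under-clustering).
    med_neg = neg_dist.median(dim=1).values                            # (n,)
    return F.relu(pos_dist - med_neg + margin).mean()
```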
- Analyzing Overfitting under Class Imbalance in Neural Networks for Image Segmentation [19.259574003403998]
In image segmentation, neural networks may overfit to foreground samples from small structures.
In this study, we provide new insights on the problem of overfitting under class imbalance by inspecting the network behavior.
arXiv Detail & Related papers (2021-02-20T14:57:58Z)
- Fed-Focal Loss for imbalanced data classification in Federated Learning [2.2172881631608456]
In Federated Learning, a central server coordinates the training of a model across a network of devices.
One challenge is variable training performance when the dataset has a class imbalance.
We propose to address the class imbalance by reshaping cross-entropy loss such that it down-weights the loss assigned to well-classified examples along the lines of focal loss.
arXiv Detail & Related papers (2020-11-12T09:52:14Z)
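The reshaping described here is the standard focal-loss modulation: cross entropy is scaled by (1 - p_t)^γ, so confidently classified examples contribute little. A minimal per-example version follows; Fed-Focal's federated aggregation logic is not shown.

```python
import torch
import torch.nn.functional as F

def focal_loss(logits, targets, gamma: float = 2.0):
    """FL(p_t) = -(1 - p_t)^gamma * log(p_t): plain cross entropy scaled
    down for examples the model already classifies confidently."""
    log_pt = F.log_softmax(logits, dim=1).gather(1, targets.unsqueeze(1)).squeeze(1)
    pt = log_pt.exp()
    return (-((1 - pt) ** gamma) * log_pt).mean()
```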
This list is automatically generated from the titles and abstracts of the papers on this site.