Embedding Transfer with Label Relaxation for Improved Metric Learning
- URL: http://arxiv.org/abs/2103.14908v1
- Date: Sat, 27 Mar 2021 13:35:03 GMT
- Title: Embedding Transfer with Label Relaxation for Improved Metric Learning
- Authors: Sungyeon Kim, Dongwon Kim, Minsu Cho, Suha Kwak
- Abstract summary: We present a novel method for embedding transfer, a task of transferring knowledge of a learned embedding model to another.
Our method exploits pairwise similarities between samples in the source embedding space as the knowledge, and transfers them through a loss used for learning target embedding models.
- Score: 43.94511888670419
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper presents a novel method for embedding transfer, a task of
transferring knowledge of a learned embedding model to another. Our method
exploits pairwise similarities between samples in the source embedding space as
the knowledge, and transfers them through a loss used for learning target
embedding models. To this end, we design a new loss called relaxed contrastive
loss, which employs the pairwise similarities as relaxed labels for
inter-sample relations. Our loss provides a rich supervisory signal beyond
class equivalence, enables more important pairs to contribute more to training,
and imposes no restriction on manifolds of target embedding spaces. Experiments
on metric learning benchmarks demonstrate that our method substantially
improves performance or effectively reduces the size and output dimension of
target models. We further show that it can also be used to enhance the quality
of self-supervised representations and the performance of classification
models. In all
the experiments, our method clearly outperforms existing embedding transfer
techniques.
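To make the transfer mechanism concrete, below is a minimal PyTorch sketch of a relaxed contrastive loss in the spirit of the abstract: pairwise similarities computed in a frozen source embedding space serve as soft labels for pairs in the target space. The Gaussian-kernel similarity, the batch-mean distance normalization, and the margin value are illustrative assumptions, not necessarily the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def relaxed_contrastive_loss(target_emb, source_emb, sigma=1.0, margin=1.0):
    """Embedding transfer: pairwise similarities in the frozen source space
    act as soft (relaxed) labels for pairs in the target space."""
    # Soft labels w_ij in (0, 1] from the source embeddings (Gaussian kernel).
    with torch.no_grad():
        src_dist = torch.cdist(source_emb, source_emb)
        w = torch.exp(-src_dist.pow(2) / sigma)
    # Pairwise target distances, normalized by their batch mean so the loss
    # fixes no absolute scale on the target embedding manifold.
    tgt_dist = torch.cdist(target_emb, target_emb)
    rel = tgt_dist / (tgt_dist.mean().detach() + 1e-12)
    # Attract each pair in proportion to w; repel it, up to a margin, in
    # proportion to 1 - w. This gives richer supervision than binary
    # class-equivalence labels.
    attract = w * rel.pow(2)
    repel = (1.0 - w) * F.relu(margin - rel).pow(2)
    return (attract + repel).sum() / target_emb.size(0)
```

In a typical setup the source (teacher) model is frozen, so `source_emb = teacher(x)` receives no gradients and only the target (student) model is trained.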
Related papers
- Anti-Collapse Loss for Deep Metric Learning Based on Coding Rate Metric [99.19559537966538]
Deep metric learning (DML) aims to learn a discriminative high-dimensional embedding space for downstream tasks like classification, clustering, and retrieval.
To maintain the structure of embedding space and avoid feature collapse, we propose a novel loss function called Anti-Collapse Loss.
Comprehensive experiments on benchmark datasets demonstrate that our proposed method outperforms existing state-of-the-art methods.
arXiv Detail & Related papers (2024-07-03T13:44:20Z)
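As a rough illustration of the Anti-Collapse entry above: a coding-rate term can penalize feature collapse because collapsed (low-rank) embeddings span little volume. The rate-distortion estimate below follows the form popularized by the MCR-squared line of work; whether the paper's Anti-Collapse Loss takes exactly this form is an assumption, so treat it as a sketch.

```python
import torch

def coding_rate(z, eps=0.5):
    """Rate-distortion estimate of the volume spanned by embeddings z of
    shape (n, d); collapsed (low-rank) embeddings yield a small coding rate."""
    n, d = z.shape
    cov = (d / (n * eps ** 2)) * (z.T @ z)
    return 0.5 * torch.logdet(torch.eye(d, device=z.device) + cov)

# Hypothetical usage: subtract the (weighted) coding rate from a metric
# learning loss so training is pushed away from collapsed embeddings.
# total_loss = metric_loss - lam * coding_rate(embeddings)
```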
- Robust Transfer Learning with Unreliable Source Data [13.276850367115333]
We introduce a novel quantity called the "ambiguity level" that measures the discrepancy between the target and source regression functions.
We propose a simple transfer learning procedure, and establish a general theorem that shows how this new quantity is related to the transferability of learning.
arXiv Detail & Related papers (2023-10-06T21:50:21Z)
- Metric Learning as a Service with Covariance Embedding [7.5989847759545155]
Metric learning aims to maximize intra-class similarities and minimize inter-class similarities.
Existing models mainly rely on distance measures to obtain a separable embedding space.
We argue that to enable metric learning as a service for high-performance deep learning applications, we should also wisely deal with inter-class relationships.
arXiv Detail & Related papers (2022-11-28T10:10:59Z)
- Why Do Self-Supervised Models Transfer? Investigating the Impact of Invariance on Downstream Tasks [79.13089902898848]
Self-supervised learning is a powerful paradigm for representation learning on unlabelled images.
We show that different tasks in computer vision require features to encode different (in)variances.
arXiv Detail & Related papers (2021-11-22T18:16:35Z)
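A small sketch of the kind of measurement the invariance entry above suggests: compare a representation's response to an image and an augmented copy. The cosine-similarity probe below is a generic choice, not necessarily the paper's protocol.

```python
import torch
import torch.nn.functional as F

def invariance_score(model, x, augment):
    """Mean cosine similarity between embeddings of a batch x and its
    augmented view; higher means the representation is more invariant
    to that transformation."""
    with torch.no_grad():
        z1 = F.normalize(model(x), dim=1)
        z2 = F.normalize(model(augment(x)), dim=1)
    return (z1 * z2).sum(dim=1).mean()
```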
- Adaptive Hierarchical Similarity Metric Learning with Noisy Labels [138.41576366096137]
We propose an Adaptive Hierarchical Similarity Metric Learning method.
It considers two types of noise-insensitive information, i.e., class-wise divergence and sample-wise consistency.
Our method achieves state-of-the-art performance compared with current deep metric learning approaches.
arXiv Detail & Related papers (2021-10-29T02:12:18Z)
- Boosting Deep Transfer Learning for COVID-19 Classification [18.39034705389625]
COVID-19 classification using chest Computed Tomography (CT) has been found pragmatically useful.
It is still unknown whether there are better strategies than vanilla transfer learning for more accurate COVID-19 classification with limited CT data.
This paper devises a novel model augmentation technique that allows a considerable performance boost to transfer learning for the task.
arXiv Detail & Related papers (2021-02-16T11:15:23Z)
- Spatial Contrastive Learning for Few-Shot Classification [9.66840768820136]
We propose a novel attention-based spatial contrastive objective to learn locally discriminative and class-agnostic features.
With extensive experiments, we show that the proposed method outperforms state-of-the-art approaches.
arXiv Detail & Related papers (2020-12-26T23:39:41Z)
- Towards Accurate Knowledge Transfer via Target-awareness Representation Disentanglement [56.40587594647692]
We propose a novel transfer learning algorithm built on the idea of Target-awareness REpresentation Disentanglement (TRED).
TRED disentangles the knowledge relevant to the target task from the original source model and uses it as a regularizer during fine-tuning of the target model.
Experiments on various real-world datasets show that our method stably improves standard fine-tuning by more than 2% on average.
arXiv Detail & Related papers (2020-10-16T17:45:08Z)
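The TRED entry above describes a regularizer applied during fine-tuning. The sketch below only shows how such a feature regularizer plugs into a fine-tuning loss; the disentanglement step that would produce `relevant_feats` from the source model is the paper's contribution and is not reproduced here, so both the MSE form and the weighting are assumptions.

```python
import torch.nn.functional as F

def regularized_finetune_loss(logits, labels, student_feats, relevant_feats, lam=0.1):
    """Classification loss plus a feature regularizer that pulls the
    fine-tuned model's features toward the target-relevant part of the
    source representation (assumed to be extracted beforehand)."""
    ce = F.cross_entropy(logits, labels)
    reg = F.mse_loss(student_feats, relevant_feats.detach())
    return ce + lam * reg
```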
- Self-Supervised Prototypical Transfer Learning for Few-Shot Classification [11.96734018295146]
The self-supervised transfer learning approach ProtoTransfer outperforms state-of-the-art unsupervised meta-learning methods on few-shot tasks.
In few-shot experiments with domain shift, our approach even performs comparably to supervised methods while requiring orders of magnitude fewer labels.
arXiv Detail & Related papers (2020-06-19T19:00:11Z)
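For the ProtoTransfer entry above, the prototypical classification step common to this family of methods is easy to sketch; ProtoTransfer's actual contribution (the self-supervised pre-training of the embedder) is not shown here.

```python
import torch

def prototypical_predict(support_emb, support_labels, query_emb, n_classes):
    """Nearest-prototype few-shot classification: average the support
    embeddings of each class into a prototype, then assign every query
    to its closest prototype."""
    prototypes = torch.stack([
        support_emb[support_labels == c].mean(dim=0) for c in range(n_classes)
    ])
    dists = torch.cdist(query_emb, prototypes)  # (n_query, n_classes)
    return dists.argmin(dim=1)
```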
- Learning Diverse Representations for Fast Adaptation to Distribution Shift [78.83747601814669]
We present a method for learning multiple models, incorporating an objective that pressures each to learn a distinct way to solve the task.
We demonstrate our framework's ability to facilitate rapid adaptation to distribution shift.
arXiv Detail & Related papers (2020-06-12T12:23:50Z)
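One generic way to realize the diversity objective described in the entry above is to penalize pairwise similarity between the features of an ensemble; the squared-cosine form below is an illustrative assumption rather than the paper's objective.

```python
import torch
import torch.nn.functional as F

def diversity_penalty(feature_list):
    """Pressure an ensemble of models toward distinct solutions by
    penalizing pairwise squared cosine similarity between their features."""
    feats = [F.normalize(f, dim=1) for f in feature_list]
    loss = feats[0].new_zeros(())
    for i in range(len(feats)):
        for j in range(i + 1, len(feats)):
            loss = loss + (feats[i] * feats[j]).sum(dim=1).pow(2).mean()
    return loss
```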