Towards Improved Proxy-based Deep Metric Learning via Data-Augmented
Domain Adaptation
- URL: http://arxiv.org/abs/2401.00617v1
- Date: Mon, 1 Jan 2024 00:10:58 GMT
- Title: Towards Improved Proxy-based Deep Metric Learning via Data-Augmented
Domain Adaptation
- Authors: Li Ren, Chen Chen, Liqiang Wang, Kien Hua
- Abstract summary: We present a novel proxy-based Deep Metric Learning framework.
We propose the Data-Augmented Domain Adaptation (DADA) method to bridge the domain gap between the group of samples and proxies.
Our experiments on benchmarks, including the popular CUB-200-2011, show that our learning algorithm significantly improves the existing proxy losses.
- Score: 15.254782791542329
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep Metric Learning (DML) plays an important role in modern computer vision
research, where we learn a distance metric for a set of image representations.
Recent DML techniques utilize proxies to interact with the corresponding
image samples in the embedding space. However, existing proxy-based DML methods
focus on learning individual proxy-to-sample distances, while the overall
distribution of samples and proxies receives little attention. In this paper, we present
a novel proxy-based DML framework that focuses on aligning the sample and proxy
distributions to improve the efficiency of proxy-based DML losses.
Specifically, we propose the Data-Augmented Domain Adaptation (DADA) method to
bridge the domain gap between the group of samples and proxies. To the best of
our knowledge, we are the first to leverage domain adaptation to boost the
performance of proxy-based DML. We show that our method can be easily plugged
into existing proxy-based DML losses. Our experiments on benchmarks, including
the popular CUB-200-2011, CARS196, Stanford Online Products, and In-Shop
Clothes Retrieval, show that our learning algorithm significantly improves the
existing proxy losses and achieves superior results compared to the existing
methods.
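To make the plug-in idea concrete, below is a minimal, hypothetical PyTorch sketch (not the authors' released code): it pairs a Proxy-NCA-style loss with an RBF-kernel MMD term that aligns the distribution of sample embeddings with the distribution of learnable proxies. The class name `ProxyLossWithAlignment`, the helper `rbf_mmd`, and the hyperparameters `temperature`, `sigma`, and `align_weight` are illustrative assumptions rather than details taken from the paper.

```python
# Hypothetical sketch: a proxy-based DML loss augmented with a
# distribution-alignment regularizer (not the paper's DADA implementation).
import torch
import torch.nn as nn
import torch.nn.functional as F


def rbf_mmd(x, y, sigma=1.0):
    """Biased squared MMD between two sets of embeddings with an RBF kernel."""
    def kernel(a, b):
        d = torch.cdist(a, b).pow(2)            # pairwise squared distances
        return torch.exp(-d / (2 * sigma ** 2))
    return kernel(x, x).mean() + kernel(y, y).mean() - 2 * kernel(x, y).mean()


class ProxyLossWithAlignment(nn.Module):
    def __init__(self, num_classes, embed_dim, temperature=0.1, align_weight=1.0):
        super().__init__()
        self.proxies = nn.Parameter(torch.randn(num_classes, embed_dim))
        self.temperature = temperature
        self.align_weight = align_weight

    def forward(self, embeddings, labels):
        z = F.normalize(embeddings, dim=1)
        p = F.normalize(self.proxies, dim=1)

        # Proxy-NCA-style term: pull each sample toward its class proxy.
        logits = z @ p.t() / self.temperature
        proxy_loss = F.cross_entropy(logits, labels)

        # Alignment term: treat samples and proxies as two "domains" and
        # shrink the gap between their embedding distributions.
        align_loss = rbf_mmd(z, p)

        return proxy_loss + self.align_weight * align_loss
```

In use, the module would stand in for a standard proxy loss inside the training loop, e.g. `criterion = ProxyLossWithAlignment(num_classes=100, embed_dim=512)` followed by `loss = criterion(embeddings, labels)`; any other proxy-based loss could be substituted for the cross-entropy term while keeping the alignment regularizer.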
Related papers
- Deep Metric Learning with Soft Orthogonal Proxies [1.823505080809275]
We propose a novel approach that introduces a Soft Orthogonality (SO) constraint on proxies.
Our approach leverages Data-Efficient Image Transformer (DeiT) as an encoder to extract contextual features from images along with a DML objective.
Our evaluations demonstrate the superiority of our proposed approach over state-of-the-art methods by a significant margin.
arXiv Detail & Related papers (2023-06-22T17:22:15Z) - Robust Calibrate Proxy Loss for Deep Metric Learning [6.784952050036532]
We propose a Calibrate Proxy structure, which uses the real sample information to improve the similarity calculation in proxy-based loss.
We show that our approach can effectively improve the performance of commonly used proxy-based losses on both regular and noisy datasets.
arXiv Detail & Related papers (2023-04-06T02:43:10Z) - Exploiting Instance-based Mixed Sampling via Auxiliary Source Domain
Supervision for Domain-adaptive Action Detection [75.38704117155909]
We propose a novel domain adaptive action detection approach and a new adaptation protocol.
Self-training combined with cross-domain mixed sampling has shown remarkable performance gain in UDA context.
We name our proposed framework domain-adaptive action instance mixing (DA-AIM).
arXiv Detail & Related papers (2022-09-28T22:03:25Z) - Deep Metric Learning with Chance Constraints [6.965621436414179]
Deep metric learning (DML) aims to minimize the empirical expected loss induced by pairwise intra-/inter-class proximity violations in the embedding space.
We show that the minimizer of proxy-based DML satisfies certain chance constraints, and that the worst-case generalization of such methods can be characterized by the radius of the smallest ball around a class proxy that covers the entire domain of the corresponding class samples, suggesting that multiple proxies per class help performance.
arXiv Detail & Related papers (2022-09-19T14:50:48Z) - A Non-isotropic Probabilistic Take on Proxy-based Deep Metric Learning [49.999268109518255]
Proxy-based Deep Metric Learning learns by embedding images close to their class representatives (proxies).
In addition, proxy-based DML struggles to learn class-internal structures.
We introduce non-isotropic probabilistic proxy-based DML to address both issues.
arXiv Detail & Related papers (2022-07-08T09:34:57Z) - ProxyMix: Proxy-based Mixup Training with Label Refinery for Source-Free
Domain Adaptation [73.14508297140652]
Unsupervised domain adaptation (UDA) aims to transfer knowledge from a labeled source domain to an unlabeled target domain.
We propose an effective method named Proxy-based Mixup training with label refinery (ProxyMix).
Experiments on three 2D image and one 3D point cloud object recognition benchmarks demonstrate that ProxyMix yields state-of-the-art performance for source-free UDA tasks.
arXiv Detail & Related papers (2022-05-29T03:45:00Z) - One-Class Knowledge Distillation for Face Presentation Attack Detection [53.30584138746973]
This paper introduces a teacher-student framework to improve the cross-domain performance of face PAD with one-class domain adaptation.
Student networks are trained to mimic the teacher network and learn similar representations for genuine face samples of the target domain.
In the test phase, the similarity score between the representations of the teacher and student networks is used to distinguish attacks from genuine ones.
arXiv Detail & Related papers (2022-05-08T06:20:59Z) - Non-isotropy Regularization for Proxy-based Deep Metric Learning [78.18860829585182]
We propose non-isotropy regularization ($\mathbb{NIR}$) for proxy-based Deep Metric Learning.
This allows us to explicitly induce a non-isotropic distribution of samples around a proxy to optimize for.
Experiments highlight consistent generalization benefits of $\mathbb{NIR}$ while achieving competitive and state-of-the-art performance.
arXiv Detail & Related papers (2022-03-16T11:13:20Z) - Fewer is More: A Deep Graph Metric Learning Perspective Using Fewer
Proxies [65.92826041406802]
We propose a Proxy-based deep Graph Metric Learning approach from the perspective of graph classification.
Multiple global proxies are leveraged to collectively approximate the original data points for each class.
We design a novel reverse label propagation algorithm, by which the neighbor relationships are adjusted according to ground-truth labels.
arXiv Detail & Related papers (2020-10-26T14:52:42Z)