Deep Metric Learning with Chance Constraints
- URL: http://arxiv.org/abs/2209.09060v3
- Date: Wed, 6 Sep 2023 13:42:28 GMT
- Title: Deep Metric Learning with Chance Constraints
- Authors: Yeti Z. Gurbuz, Ogul Can and A. Aydin Alatan
- Abstract summary: Deep metric learning (DML) aims to minimize the empirical expected loss of pairwise intra-/inter-class proximity violations in the embedding space.
We show that the minimizer of proxy-based DML satisfies certain chance constraints, and that the worst-case generalization performance of proxy-based methods can be characterized by the radius of the smallest ball around a class proxy that covers the entire domain of the corresponding class samples, suggesting that multiple proxies per class help performance.
- Score: 6.965621436414179
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Deep metric learning (DML) aims to minimize the empirical expected loss of pairwise intra-/inter-class proximity violations in the embedding space. We relate DML to the feasibility problem of finite chance constraints. We show that the minimizer of proxy-based DML satisfies certain chance constraints, and that the worst-case generalization performance of proxy-based methods can be characterized by the radius of the smallest ball around a class proxy that covers the entire domain of the corresponding class samples, suggesting that multiple proxies per class help performance. To provide a scalable algorithm and to exploit more proxies, we consider the chance constraints implied by the minimizers of proxy-based DML instances and reformulate DML as finding a feasible point in the intersection of such constraints, resulting in a problem that can be approximately solved by iterative projections. Simply put, we repeatedly train a regularized proxy-based loss and re-initialize the proxies with the embeddings of deliberately selected new samples. We apply our method with four well-established DML losses and show its effectiveness with extensive evaluations on four popular DML benchmarks. Code is available at: https://github.com/yetigurbuz/ccp-dml
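To make the covering-radius characterization above concrete, the following schematic rendering uses our own notation rather than the paper's (here $f$ is the embedding, $d$ the embedding-space distance, $p_c$ a proxy of class $c$, and $\mathcal{X}_c$ that class's domain): the worst-case guarantee for class $c$ is governed by the radius of the smallest proxy-centered ball covering $\mathcal{X}_c$, and adding proxies can only shrink that radius.

```latex
\[
  r_c \;=\; \sup_{x \in \mathcal{X}_c} d\big(f(x),\, p_c\big),
  \qquad
  r_c^{(K)} \;=\; \sup_{x \in \mathcal{X}_c} \min_{1 \le k \le K} d\big(f(x),\, p_{c,k}\big) \;\le\; r_c .
\]
```

The "train a regularized proxy-based loss, then re-initialize the proxies" recipe can likewise be sketched as a plain training loop. This is a minimal illustration under our own assumptions, not the authors' implementation (see the repository linked above); `backbone`, `proxy_loss`, `make_loader`, and `select_new_samples` are hypothetical placeholders for an encoder, any proxy-based DML loss with learnable `proxies`, a data loader, and the sample-selection step, respectively.

```python
import torch

def ccp_dml_sketch(backbone, proxy_loss, make_loader, select_new_samples,
                   num_rounds=5, epochs_per_round=10, reg_weight=1e-3, lr=1e-4):
    """Alternate between (i) minimizing a regularized proxy-based loss and
    (ii) re-initializing proxies from embeddings of newly selected samples,
    mimicking iterative projections onto the implied chance-constraint sets."""
    for _ in range(num_rounds):
        # Keep this round anchored to the previous proxies via an L2 term.
        anchor = proxy_loss.proxies.detach().clone()
        opt = torch.optim.Adam(list(backbone.parameters()) +
                               list(proxy_loss.parameters()), lr=lr)
        for _ in range(epochs_per_round):
            for images, labels in make_loader():
                emb = backbone(images)                # embed the batch
                loss = proxy_loss(emb, labels)        # proxy-based proximity violations
                loss = loss + reg_weight * (proxy_loss.proxies - anchor).pow(2).sum()
                opt.zero_grad()
                loss.backward()
                opt.step()
        # Projection-like step: re-initialize proxies with the embeddings of
        # deliberately selected new samples (one sample per proxy is assumed
        # here so that the shapes match).
        with torch.no_grad():
            new_images = select_new_samples(backbone)
            proxy_loss.proxies.copy_(backbone(new_images))
    return backbone, proxy_loss
```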
Related papers
- Anti-Collapse Loss for Deep Metric Learning Based on Coding Rate Metric [99.19559537966538]
DML aims to learn a discriminative high-dimensional embedding space for downstream tasks like classification, clustering, and retrieval.
To maintain the structure of embedding space and avoid feature collapse, we propose a novel loss function called Anti-Collapse Loss.
Comprehensive experiments on benchmark datasets demonstrate that our proposed method outperforms existing state-of-the-art methods.
arXiv Detail & Related papers (2024-07-03T13:44:20Z)
- Towards Improved Proxy-based Deep Metric Learning via Data-Augmented Domain Adaptation [15.254782791542329]
We present a novel proxy-based Deep Metric Learning framework.
We propose the Data-Augmented Domain Adaptation (DADA) method to adapt the domain gap between the group of samples and proxies.
Our experiments on benchmarks, including the popular CUB-200-2011, show that our learning algorithm significantly improves the existing proxy losses.
arXiv Detail & Related papers (2024-01-01T00:10:58Z)
- Amortizing intractable inference in large language models [56.92471123778389]
We use amortized Bayesian inference to sample from intractable posterior distributions.
We empirically demonstrate that this distribution-matching paradigm of LLM fine-tuning can serve as an effective alternative to maximum-likelihood training.
As an important application, we interpret chain-of-thought reasoning as a latent variable modeling problem.
arXiv Detail & Related papers (2023-10-06T16:36:08Z)
- Deep Metric Learning with Soft Orthogonal Proxies [1.823505080809275]
We propose a novel approach that introduces a Soft Orthogonality (SO) constraint on proxies.
Our approach leverages Data-Efficient Image Transformer (DeiT) as an encoder to extract contextual features from images along with a DML objective.
Our evaluations demonstrate the superiority of our proposed approach over state-of-the-art methods by a significant margin.
arXiv Detail & Related papers (2023-06-22T17:22:15Z)
- A Non-isotropic Probabilistic Take on Proxy-based Deep Metric Learning [49.999268109518255]
Proxy-based Deep Metric Learning learns by embedding images close to their class representatives (proxies).
In addition, proxy-based DML struggles to learn class-internal structures.
We introduce non-isotropic probabilistic proxy-based DML to address both issues.
arXiv Detail & Related papers (2022-07-08T09:34:57Z)
- Non-isotropy Regularization for Proxy-based Deep Metric Learning [78.18860829585182]
We propose non-isotropy regularization ($\mathbb{NIR}$) for proxy-based Deep Metric Learning.
This allows us to explicitly induce a non-isotropic distribution of samples around a proxy to optimize for.
Experiments highlight consistent generalization benefits of $\mathbb{NIR}$ while achieving competitive and state-of-the-art performance.
arXiv Detail & Related papers (2022-03-16T11:13:20Z)
- Adaptive neighborhood Metric learning [184.95321334661898]
We propose a novel distance metric learning algorithm, named adaptive neighborhood metric learning (ANML).
ANML can be used to learn both linear and deep embeddings.
The log-exp mean function proposed in our method offers a new perspective for reviewing deep metric learning methods.
arXiv Detail & Related papers (2022-01-20T17:26:37Z)
- Masked Proxy Loss For Text-Independent Speaker Verification [27.417484680749784]
This paper proposes a Masked Proxy (MP) loss which directly incorporates both proxy-based relationships and pair-based relationships.
We further propose Multinomial Masked Proxy (MMP) loss to leverage the hardness of speaker pairs.
arXiv Detail & Related papers (2020-11-09T15:16:29Z)
- Fewer is More: A Deep Graph Metric Learning Perspective Using Fewer Proxies [65.92826041406802]
We propose a Proxy-based deep Graph Metric Learning approach from the perspective of graph classification.
Multiple global proxies are leveraged to collectively approximate the original data points for each class.
We design a novel reverse label propagation algorithm, by which the neighbor relationships are adjusted according to ground-truth labels.
arXiv Detail & Related papers (2020-10-26T14:52:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences arising from its use.