Global Proxy-based Hard Mining for Visual Place Recognition
- URL: http://arxiv.org/abs/2302.14217v1
- Date: Tue, 28 Feb 2023 00:43:13 GMT
- Title: Global Proxy-based Hard Mining for Visual Place Recognition
- Authors: Amar Ali-bey, Brahim Chaib-draa, Philippe Giguère
- Abstract summary: We introduce a new technique that performs global hard mini-batch sampling based on proxies.
To do so, we add a new end-to-end trainable branch to the network, which generates efficient place descriptors.
Our method can be used in combination with all existing pairwise and triplet loss functions with negligible additional memory and computation cost.
- Score: 3.6739949215165164
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning deep representations for visual place recognition is commonly
performed using pairwise or triplet loss functions that highly depend on the
hardness of the examples sampled at each training iteration. Existing
techniques address this by using computationally and memory-expensive offline
hard mining, which consists of identifying, at each iteration, the hardest
samples from the training set. In this paper we introduce a new technique that
performs global hard mini-batch sampling based on proxies. To do so, we add a
new end-to-end trainable branch to the network, which generates efficient place
descriptors (one proxy for each place). These proxy representations are then
used to construct a global index that encompasses the similarities between all
places in the dataset, allowing for highly informative mini-batch sampling at
each training iteration. Our method can be used in combination with all
existing pairwise and triplet loss functions with negligible additional memory
and computation cost. We run extensive ablation studies and show that our
technique brings new state-of-the-art performance on multiple large-scale
benchmarks such as Pittsburgh, Mapillary-SLS and SPED. In particular, our
method provides more than 100% relative improvement on the challenging Nordland
dataset. Our code is available at https://github.com/amaralibey/GPM
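As a rough, non-authoritative sketch of the mechanism the abstract describes (the class names, the 128-dimensional proxy size, the EMA update rule, and the batch-construction details below are assumptions for illustration only; the official implementation is at https://github.com/amaralibey/GPM):

```python
# Hedged sketch of a proxy branch plus a global similarity index used for
# hard mini-batch sampling. All names and hyper-parameters are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ProxyHead(nn.Module):
    """Small end-to-end trainable branch that maps backbone features
    to a compact, L2-normalized proxy descriptor."""
    def __init__(self, in_dim=2048, proxy_dim=128):
        super().__init__()
        self.fc = nn.Linear(in_dim, proxy_dim)

    def forward(self, feats):                      # feats: (B, in_dim)
        return F.normalize(self.fc(feats), dim=-1)

class GlobalProxyBank:
    """Keeps one proxy per place and uses it as a global similarity index."""
    def __init__(self, num_places, proxy_dim=128, momentum=0.9):
        self.proxies = F.normalize(torch.randn(num_places, proxy_dim), dim=-1)
        self.momentum = momentum

    @torch.no_grad()
    def update(self, place_ids, batch_proxies):
        # Exponential moving average of each place's proxy descriptor.
        for pid, p in zip(place_ids.tolist(), batch_proxies):
            self.proxies[pid] = F.normalize(
                self.momentum * self.proxies[pid] + (1 - self.momentum) * p,
                dim=-1)

    @torch.no_grad()
    def sample_hard_batch(self, anchor_place, places_per_batch=32):
        # Global hard mini-batch sampling: pick the places whose proxies are
        # most similar to the anchor's proxy, i.e. the most confusable places.
        sims = self.proxies @ self.proxies[anchor_place]
        sims[anchor_place] = -1.0                  # exclude the anchor itself
        hard = torch.topk(sims, places_per_batch - 1).indices
        return torch.cat([torch.tensor([anchor_place]), hard])
```

Any existing pairwise or triplet loss can consume the mini-batch of places returned by sample_hard_batch unchanged, and the only persistent state is one small proxy vector per place, which is consistent with the abstract's claim of negligible additional memory and computation cost.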
Related papers
- AANet: Aggregation and Alignment Network with Semi-hard Positive Sample Mining for Hierarchical Place Recognition [48.043749855085025]
Visual place recognition (VPR), which uses visual information to localize a robot, is one of the research hotspots in robotics.
We present a unified network capable of extracting global features for retrieving candidates via an aggregation module.
We also propose a Semi-hard Positive Sample Mining (ShPSM) strategy to select appropriate hard positive images for training more robust VPR networks.
arXiv Detail & Related papers (2023-10-08T14:46:11Z)
- Multi-Level Contrastive Learning for Dense Prediction Task [59.591755258395594]
We present Multi-Level Contrastive Learning for Dense Prediction Task (MCL), an efficient self-supervised method for learning region-level feature representation for dense prediction tasks.
Our method is motivated by the three key factors in detection: localization, scale consistency and recognition.
Our method consistently outperforms the recent state-of-the-art methods on various datasets with significant margins.
arXiv Detail & Related papers (2023-04-04T17:59:04Z)
- A Maximum Log-Likelihood Method for Imbalanced Few-Shot Learning Tasks [3.2895195535353308]
We propose a new maximum log-likelihood metric for few-shot architectures.
We demonstrate that the proposed metric achieves superior performance accuracy w.r.t. conventional similarity metrics.
We also show that our algorithm achieves state-of-the-art transductive few-shot performance when the evaluation data is imbalanced.
arXiv Detail & Related papers (2022-11-26T21:31:00Z)
- Graph Sampling Based Deep Metric Learning for Generalizable Person Re-Identification [114.56752624945142]
We argue that the most popular random sampling method, the well-known PK sampler, is neither informative nor efficient for deep metric learning.
We propose an efficient mini-batch sampling method called Graph Sampling (GS) for large-scale metric learning; a rough illustrative sketch follows this entry.
arXiv Detail & Related papers (2021-04-04T06:44:15Z)
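The entry above only names the method; as a hedged illustration of the general idea of graph-guided class sampling (a nearest-neighbor graph over class prototypes, built e.g. once per epoch, decides which classes share a mini-batch), and not the GS authors' actual code, a minimal sketch could look as follows. The function names, the prototype construction, and the value of k are assumptions:

```python
# Hedged illustration of graph-based mini-batch sampling: classes for a batch
# are chosen from a nearest-neighbor graph rather than uniformly at random.
import torch

@torch.no_grad()
def build_class_graph(class_prototypes: torch.Tensor, k: int = 8) -> torch.Tensor:
    """class_prototypes: (C, D) L2-normalized mean descriptor per class.
    Returns the indices of the k most similar classes for every class."""
    sims = class_prototypes @ class_prototypes.t()   # (C, C) cosine similarities
    sims.fill_diagonal_(float("-inf"))               # a class is not its own neighbor
    return sims.topk(k, dim=1).indices               # (C, k)

def graph_sampled_classes(anchor_class: int, neighbor_index: torch.Tensor) -> list:
    """Classes to draw images from for one mini-batch: the anchor class plus
    its nearest (most confusable) classes according to the graph."""
    return [anchor_class] + neighbor_index[anchor_class].tolist()
```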
- Learning Intra-Batch Connections for Deep Metric Learning [3.5665681694253903]
Metric learning aims to learn a function that maps samples to a lower-dimensional space where similar samples lie closer than dissimilar ones.
Most approaches rely on losses that only take the relations between pairs or triplets of samples into account.
We propose an approach based on message passing networks that takes into account all the relations in a mini-batch.
arXiv Detail & Related papers (2021-02-15T18:50:00Z)
- Exploiting Shared Representations for Personalized Federated Learning [54.65133770989836]
We propose a novel federated learning framework and algorithm for learning a shared data representation across clients and unique local heads for each client.
Our algorithm harnesses the distributed computational power across clients to perform many local updates with respect to the low-dimensional local parameters for every update of the representation.
This result is of interest beyond federated learning to a broad class of problems in which we aim to learn a shared low-dimensional representation among data distributions.
arXiv Detail & Related papers (2021-02-14T05:36:25Z)
- Fewer is More: A Deep Graph Metric Learning Perspective Using Fewer Proxies [65.92826041406802]
We propose a Proxy-based deep Graph Metric Learning approach from the perspective of graph classification.
Multiple global proxies are leveraged to collectively approximate the original data points for each class.
We design a novel reverse label propagation algorithm, by which the neighbor relationships are adjusted according to ground-truth labels.
arXiv Detail & Related papers (2020-10-26T14:52:42Z)
- Learning to Count in the Crowd from Limited Labeled Data [109.2954525909007]
We focus on reducing the annotation effort by learning to count in the crowd from a limited number of labeled samples.
Specifically, we propose a Gaussian Process-based iterative learning mechanism that involves estimation of pseudo-ground truth for the unlabeled data.
arXiv Detail & Related papers (2020-07-07T04:17:01Z)