Online Backfilling with No Regret for Large-Scale Image Retrieval
- URL: http://arxiv.org/abs/2301.03767v1
- Date: Tue, 10 Jan 2023 03:10:32 GMT
- Title: Online Backfilling with No Regret for Large-Scale Image Retrieval
- Authors: Seonguk Seo, Mustafa Gokhan Uzunbas, Bohyung Han, Sara Cao, Joena
Zhang, Taipeng Tian, Ser-Nam Lim
- Abstract summary: Backfilling is the process of re-extracting all gallery embeddings from upgraded models in image retrieval systems.
We propose an online backfilling algorithm, which enables us to achieve a progressive performance improvement during the backfilling process.
We incorporate a reverse transformation module for more effective and efficient merging, which is further enhanced by adopting a metric-compatible contrastive learning approach.
- Score: 50.162438586686356
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Backfilling is the process of re-extracting all gallery embeddings from
upgraded models in image retrieval systems. It inevitably incurs a
prohibitively large computational cost and may even entail service downtime.
Although backward-compatible learning sidesteps this challenge by tackling
query-side representations, it leads to suboptimal solutions in principle
because gallery embeddings cannot benefit from model upgrades. We address this
dilemma by introducing an online backfilling algorithm, which enables a
progressive performance improvement during the backfilling process without
sacrificing the final performance of the new model after backfilling is
complete. To this end, we first propose a simple
distance rank merge technique for online backfilling. Then, we incorporate a
reverse transformation module for more effective and efficient merging, which
is further enhanced by adopting a metric-compatible contrastive learning
approach. These two components help to make the distances of old and new models
compatible, resulting in desirable merge results during backfilling with no
extra computational overhead. Extensive experiments show the effectiveness of
our framework on four standard benchmarks in various settings.
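To illustrate the distance rank merge step at the core of this abstract, here is a minimal sketch in Python. Everything in it is an illustrative assumption (the embed_old / embed_new extractors, the gallery arrays, the L2 distance, and the top-k cutoff are hypothetical placeholders, not the authors' released code): the query is embedded by both models, the not-yet-backfilled part of the gallery is searched in the old embedding space, the already-backfilled part in the new space, and the two candidate lists are merged by distance.

```python
import numpy as np

def rank_merge(query_img, embed_old, embed_new,
               gallery_old, gallery_new, top_k=10):
    """Merge retrieval results from the old and new galleries by distance.

    gallery_old: (N_old, D) embeddings not yet backfilled (old model).
    gallery_new: (N_new, D) embeddings already backfilled (new model).
    embed_old / embed_new: feature extractors for the old / new model.
    All names here are illustrative, not from the paper's released code.
    """
    q_old = embed_old(query_img)   # query embedded by the old model
    q_new = embed_new(query_img)   # query embedded by the new model

    # L2 distances within each (model-consistent) embedding space.
    d_old = np.linalg.norm(gallery_old - q_old, axis=1)
    d_new = np.linalg.norm(gallery_new - q_new, axis=1)

    # Concatenate candidates from both galleries and sort by distance.
    dists = np.concatenate([d_old, d_new])
    ids = np.concatenate([np.arange(len(gallery_old)),               # old-gallery ids
                          len(gallery_old) + np.arange(len(gallery_new))])
    order = np.argsort(dists)[:top_k]
    return ids[order], dists[order]
```

Such a merge is only meaningful if the two distance scales are comparable; making them comparable is precisely the role the abstract assigns to the reverse transformation module and the metric-compatible contrastive learning objective.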
Related papers
- EnsIR: An Ensemble Algorithm for Image Restoration via Gaussian Mixture Models [70.60381055741391]
Image restoration involves challenges related to ill-posed problems, resulting in deviations between single-model predictions and ground truths.
Ensemble learning aims to address these deviations by combining the predictions of multiple base models.
We employ an expectation-maximization (EM)-based algorithm to estimate ensemble weights for prediction candidates.
Our algorithm is model-agnostic and training-free, allowing seamless integration and enhancement of various pre-trained image restoration models.
arXiv Detail & Related papers (2024-10-30T12:16:35Z)
- Backward-Compatible Aligned Representations via an Orthogonal Transformation Layer [20.96380700548786]
Visual retrieval systems face challenges when updating models with improved representations due to misalignment between the old and new representations.
Prior research has explored backward-compatible training methods that enable direct comparisons between new and old representations without backfilling.
In this paper, we address how to balance backward compatibility with the performance of independently trained models.
arXiv Detail & Related papers (2024-08-16T15:05:28Z)
- Any Image Restoration with Efficient Automatic Degradation Adaptation [132.81912195537433]
We propose a unified approach to joint embedding that leverages the inherent similarities across various degradations for efficient and comprehensive restoration.
Our network sets new SOTA records while reducing model complexity by approximately 82% in trainable parameters and 85% in FLOPs.
arXiv Detail & Related papers (2024-07-18T10:26:53Z)
- Unified-Width Adaptive Dynamic Network for All-In-One Image Restoration [50.81374327480445]
We introduce a novel concept positing that intricate image degradation can be represented in terms of elementary degradations.
We propose the Unified-Width Adaptive Dynamic Network (U-WADN), consisting of two pivotal components: a Width Adaptive Backbone (WAB) and a Width Selector (WS).
The proposed U-WADN achieves better performance while simultaneously reducing up to 32.3% of FLOPs and providing approximately 15.7% real-time acceleration.
arXiv Detail & Related papers (2024-01-24T04:25:12Z)
- Parameter Efficient Adaptation for Image Restoration with Heterogeneous Mixture-of-Experts [52.39959535724677]
We introduce an alternative solution to improve the generalization of image restoration models.
We propose AdaptIR, a Mixture-of-Experts (MoE) with multi-branch design to capture local, global, and channel representation bases.
Our AdaptIR achieves stable performance on single-degradation tasks and excels in hybrid-degradation tasks, fine-tuning only 0.6% of the parameters for 8 hours.
arXiv Detail & Related papers (2023-12-12T14:27:59Z)
- FastFill: Efficient Compatible Model Update [40.27741553705222]
FastFill is a compatible model update process using feature alignment and policy-based partial backfilling.
We show that previous backfilling strategies suffer from decreased performance and demonstrate the importance of both the training objective and the ordering in online partial backfilling.
arXiv Detail & Related papers (2023-03-08T18:03:51Z)
- Towards Universal Backward-Compatible Representation Learning [29.77801805854168]
Backward-compatible representation learning is introduced to support backfill-free model upgrades.
We first introduce a new problem of universal backward-compatible representation learning, covering all possible data splits in model upgrades.
We propose a simple yet effective method, dubbed Universal Backward-Compatible Training (UniBCT), with a novel structural prototype refinement algorithm.
arXiv Detail & Related papers (2022-03-03T09:23:51Z)
- Plug-and-Play Image Restoration with Deep Denoiser Prior [186.84724418955054]
We show that a denoiser can implicitly serve as the image prior for model-based methods to solve many inverse problems.
We set up a benchmark deep denoiser prior by training a highly flexible and effective CNN denoiser.
We then plug the deep denoiser prior as a modular part into a half-quadratic splitting based iterative algorithm to solve various image restoration problems.
arXiv Detail & Related papers (2020-08-31T17:18:58Z)
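The plug-and-play recipe in the entry above (a learned denoiser inserted into a half-quadratic splitting loop) can be sketched in a few lines. The snippet below is a schematic illustration under assumed interfaces: the degradation operator A, its adjoint At, the fixed step size, and the denoiser callable are hypothetical placeholders, not the implementation released with that paper.

```python
import numpy as np

def pnp_hqs_restore(y, A, At, denoiser, sigma=0.05, mu=0.5, num_iters=20):
    """Plug-and-play half-quadratic splitting (HQS), schematic form.

    Approximately solves min_x 0.5*||y - A(x)||^2 + lambda*prior(x) by
    alternating a data-fidelity update on x and a denoising update on z,
    where the prior's proximal step is replaced by a learned denoiser.
    `A`/`At` are the degradation operator and its adjoint; `denoiser(img,
    sigma)` stands in for any pretrained Gaussian denoiser (names are
    illustrative assumptions).
    """
    x = At(y)          # simple initialization from the degraded observation
    z = x.copy()
    for _ in range(num_iters):
        # Data sub-problem: gradient step on 0.5*||y - A(x)||^2 + (mu/2)*||x - z||^2.
        # (Closed-form solutions exist for many operators, e.g. deblurring in the
        #  FFT domain; a gradient step keeps this sketch operator-agnostic.)
        grad = At(A(x) - y) + mu * (x - z)
        x = x - 0.1 * grad
        # Prior sub-problem: the proximal operator is replaced by the denoiser.
        z = denoiser(x, sigma)
    return z
```

The design point this sketch is meant to show is that the prior never appears explicitly: any sufficiently strong Gaussian denoiser can be swapped into the z-update, which is what makes the prior "plug-and-play".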