FastFill: Efficient Compatible Model Update
- URL: http://arxiv.org/abs/2303.04766v1
- Date: Wed, 8 Mar 2023 18:03:51 GMT
- Title: FastFill: Efficient Compatible Model Update
- Authors: Florian Jaeckle, Fartash Faghri, Ali Farhadi, Oncel Tuzel, and Hadi
Pouransari
- Abstract summary: FastFill is a compatible model update process using feature alignment and policy-based partial backfilling.
We show that previous backfilling strategies suffer from decreased performance and demonstrate the importance of both the training objective and the ordering in online partial backfilling.
- Score: 40.27741553705222
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In many retrieval systems, the original high-dimensional data (e.g., images) is mapped to a lower-dimensional feature through a learned embedding model. The
task of retrieving the most similar data from a gallery set to a given query
data is performed through a similarity comparison on features. When the
embedding model is updated, it might produce features that are not
comparable/compatible with features already in the gallery computed with the
old model. Subsequently, all features in the gallery need to be re-computed
using the new embedding model -- a computationally expensive process called
backfilling. Recently, compatible representation learning methods have been
proposed to avoid backfilling. Despite their relative success, there is an
inherent trade-off between the new model performance and its compatibility with
the old model. In this work, we introduce FastFill: a compatible model update
process using feature alignment and policy-based partial backfilling to
promptly elevate retrieval performance. We show that previous backfilling
strategies suffer from decreased performance and demonstrate the importance of
both the training objective and the ordering in online partial backfilling. We
propose a new training method for feature alignment between old and new
embedding models using uncertainty estimation. Compared to previous works, we
obtain significantly improved backfilling results on a variety of datasets: mAP
on ImageNet (+4.4%), Places-365 (+2.7%), and VGG-Face2 (+1.3%). Further, we
demonstrate that when updating a biased model with FastFill, the minority
subgroup accuracy gap promptly vanishes with a small fraction of partial
backfilling.
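To make the setting concrete, below is a minimal NumPy sketch of retrieval over a partially backfilled gallery, with the refresh order driven by a per-item uncertainty score. All names (new_model, uncertainty, and the alignment assumption on gallery_feats) are hypothetical illustrations of the ideas above, not FastFill's published algorithm.

```python
# Minimal sketch of retrieval with online partial backfilling.
# Hypothetical names throughout (new_model, uncertainty); this illustrates
# the setting described in the abstract, not FastFill's exact method.
import numpy as np

def normalize(x):
    return x / np.linalg.norm(x, axis=-1, keepdims=True)

def retrieve(query_feat, gallery_feats, k=5):
    # Cosine similarity on L2-normalized features; higher is more similar.
    sims = gallery_feats @ query_feat
    return np.argsort(-sims)[:k]

def backfill_step(images, gallery_feats, todo, new_model, uncertainty, budget):
    """Refresh only the `budget` most uncertain gallery entries.

    `gallery_feats` starts as old features mapped into the new model's
    space by a learned alignment, so the system stays online; entries are
    overwritten with true new-model features a little at a time, and
    retrieval quality improves while backfilling is still in progress.
    """
    order = sorted(todo, key=lambda i: uncertainty[i], reverse=True)
    for i in order[:budget]:
        gallery_feats[i] = normalize(new_model(images[i]))
        todo.discard(i)
    return gallery_feats
```

Queries are embedded with the new model throughout the update; the ordering policy (here, descending uncertainty) is what the abstract argues matters for how quickly retrieval performance recovers.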
Related papers
- Backward-Compatible Aligned Representations via an Orthogonal Transformation Layer [20.96380700548786]
Visual retrieval systems face challenges when updating models with improved representations due to misalignment between the old and new representations.
Prior research has explored backward-compatible training methods that enable direct comparisons between new and old representations without backfilling.
In this paper, we address how to balance backward compatibility against the performance of independently trained models.
arXiv Detail & Related papers (2024-08-16T15:05:28Z)
- Stationary Representations: Optimally Approximating Compatibility and Implications for Improved Model Replacements [20.96380700548786]
Learning compatible representations enables the interchangeable use of semantic features as models are updated over time.
This is particularly relevant in search and retrieval systems where it is crucial to avoid reprocessing of the gallery images with the updated model.
We show that the stationary representations learned by the $d$-Simplex fixed classifier optimally approximate compatible representations according to the two inequality constraints of the formal definition of compatibility (restated below).
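For reference, the two inequality constraints usually taken as the formal definition of compatibility in this literature can be restated as follows. This is a sketch in our own notation, with encoders $\phi_{\text{old}}, \phi_{\text{new}}$, feature distance $d$, and labels $y_i$; it may differ from this paper's exact formulation.

```latex
% Standard empirical compatibility criterion from the BCT literature
% (notation is ours; d is a feature-space distance, y_i are class labels).
\phi_{\text{new}} \text{ is compatible with } \phi_{\text{old}} \iff
\begin{cases}
  d\big(\phi_{\text{new}}(x_i),\, \phi_{\text{old}}(x_j)\big) \le
  d\big(\phi_{\text{old}}(x_i),\, \phi_{\text{old}}(x_j)\big)
    & \forall\, (i,j):\ y_i = y_j,\\[4pt]
  d\big(\phi_{\text{new}}(x_i),\, \phi_{\text{old}}(x_j)\big) \ge
  d\big(\phi_{\text{old}}(x_i),\, \phi_{\text{old}}(x_j)\big)
    & \forall\, (i,j):\ y_i \ne y_j.
\end{cases}
```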
arXiv Detail & Related papers (2024-05-04T06:31:38Z)
- Lifelong Person Re-Identification with Backward-Compatibility [9.94228688034577]
Lifelong person re-identification (LReID) assumes a practical scenario where the model is sequentially trained on continuously incoming datasets.
In this paper, we address the above-mentioned problem by incorporating backward compatibility into LReID for the first time.
arXiv Detail & Related papers (2024-03-15T05:08:59Z)
- MixBCT: Towards Self-Adapting Backward-Compatible Training [66.52766344751635]
We propose MixBCT, a simple yet highly effective backward-compatible training method.
We conduct experiments on the large-scale face recognition datasets MS1Mv3 and IJB-C.
arXiv Detail & Related papers (2023-08-14T05:55:38Z)
- Online Backfilling with No Regret for Large-Scale Image Retrieval [50.162438586686356]
Backfilling is the process of re-extracting all gallery embeddings from upgraded models in image retrieval systems.
We propose an online backfilling algorithm, which enables us to achieve a progressive performance improvement during the backfilling process.
We incorporate a reverse transformation module for more effective and efficient merging, which is further enhanced by adopting a metric-compatible contrastive learning approach.
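A reverse transformation of this kind can be pictured as a small learned map from the new feature space back to the old one, so that new-model queries can be scored against gallery features that have not been backfilled yet. The PyTorch-style module below is a hypothetical sketch, not the paper's implementation:

```python
# Hypothetical sketch of a reverse transformation module: project new-space
# features back into the old feature space for comparison with old gallery
# features. Architecture and names are assumptions, not the paper's code.
import torch.nn as nn
import torch.nn.functional as F

class ReverseTransform(nn.Module):
    def __init__(self, new_dim, old_dim, hidden=512):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(new_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, old_dim),
        )

    def forward(self, new_feat):
        # Re-normalize so cosine retrieval in the old space remains valid.
        return F.normalize(self.net(new_feat), dim=-1)
```

Such a module would typically be trained so that transformed new features land near their old counterparts, e.g., with a contrastive objective, as the summary suggests.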
arXiv Detail & Related papers (2023-01-10T03:10:32Z)
- $BT^2$: Backward-compatible Training with Basis Transformation [107.37014712361788]
Retrieval systems often require recomputing the representation of every piece of data in the gallery when updating to a better representation model.
This process is known as backfilling and can be especially costly in the real world where the gallery often contains billions of samples.
Recently, researchers have proposed the idea of Backward-Compatible Training (BCT), where the new representation model is trained with an auxiliary loss to make it backward compatible with the old representation (see the sketch below).
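In its most common form, the BCT auxiliary loss classifies the new embedding with the frozen old classifier, so that new features remain usable where old ones live. The sketch below shows that formulation; the function and module names, and the weighting, are assumptions:

```python
# Sketch of a BCT-style objective (names and weighting are assumptions).
# The new backbone is trained with its own classifier plus an auxiliary
# term asking the FROZEN old classifier to still recognize new embeddings.
import torch.nn.functional as F

def bct_loss(new_backbone, new_classifier, old_classifier, images, labels, lam=1.0):
    feats = new_backbone(images)
    loss_new = F.cross_entropy(new_classifier(feats), labels)
    loss_compat = F.cross_entropy(old_classifier(feats), labels)  # compatibility term
    return loss_new + lam * loss_compat

# Done once before training, so only the new model receives gradients:
# for p in old_classifier.parameters():
#     p.requires_grad_(False)
```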
arXiv Detail & Related papers (2022-11-08T04:00:23Z)
- Hot-Refresh Model Upgrades with Regression-Alleviating Compatible Training in Image Retrieval [34.84329831602699]
Cold-refresh model upgrades can only deploy the new model after the entire gallery has been backfilled, which can take weeks or even months for massive data.
In contrast, hot-refresh model upgrades deploy the new model immediately and then gradually improve the retrieval accuracy by backfilling the gallery on-the-fly.
arXiv Detail & Related papers (2022-01-24T14:59:12Z)
- Forward Compatible Training for Representation Learning [53.300192863727226]
Backward-compatible training (BCT) modifies the training of the new model to make its representations compatible with those of the old model.
However, BCT can significantly hinder the performance of the new model.
In this work, we propose a new learning paradigm for representation learning: forward compatible training (FCT).
arXiv Detail & Related papers (2021-12-06T06:18:54Z)
- Towards Backward-Compatible Representation Learning [86.39292571306395]
We propose a way to learn visual features that are compatible with previously computed ones even when they have different dimensions.
This enables visual search systems to bypass computing new features for all previously seen images when updating the embedding models.
We propose a framework to train embedding models, called backward-compatible training (BCT), as a first step towards backward-compatible representation learning.
arXiv Detail & Related papers (2020-03-26T14:34:09Z)