Forward Compatible Training for Representation Learning
- URL: http://arxiv.org/abs/2112.02805v1
- Date: Mon, 6 Dec 2021 06:18:54 GMT
- Title: Forward Compatible Training for Representation Learning
- Authors: Vivek Ramanujan, Pavan Kumar Anasosalu Vasu, Ali Farhadi, Oncel Tuzel, Hadi Pouransari
- Abstract summary: backward compatible training (BCT) modifies training of the new model to make its representations compatible with those of the old model.
BCT can significantly hinder the performance of the new model.
In this work, we propose a new learning paradigm for representation learning: forward compatible training (FCT).
- Score: 53.300192863727226
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In visual retrieval systems, updating the embedding model requires
recomputing features for every piece of data. This expensive process is
referred to as backfilling. Recently, the idea of backward compatible training
(BCT) was proposed. To avoid the cost of backfilling, BCT modifies training of
the new model to make its representations compatible with those of the old
model. However, BCT can significantly hinder the performance of the new model.
In this work, we propose a new learning paradigm for representation learning:
forward compatible training (FCT). In FCT, when the old model is trained, we
also prepare for a future unknown version of the model. We propose learning
side-information, an auxiliary feature for each sample which facilitates future
updates of the model. To develop a powerful and flexible framework for model
compatibility, we combine side-information with a forward transformation from
old to new embeddings. Training of the new model is not modified; hence, its
accuracy is not degraded. We demonstrate significant retrieval accuracy
improvement compared to BCT for various datasets: ImageNet-1k (+18.1%),
Places-365 (+5.4%), and VGG-Face2 (+8.3%). FCT obtains model compatibility when
the new and old models are trained across different datasets, losses, and
architectures.
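To make the FCT recipe above concrete, here is a rough sketch (not the authors' implementation; all module shapes and names are hypothetical) of a forward transformation that maps a stored old embedding plus its side-information into the new embedding space, trained by simple regression on a set where both old and new features are available:

```python
import torch
import torch.nn as nn

class ForwardTransformation(nn.Module):
    """Hypothetical FCT-style transformation: maps an old gallery embedding
    plus its stored side-information into the new model's embedding space,
    so the gallery can be updated without re-running the new backbone on
    every image."""

    def __init__(self, old_dim=128, side_dim=64, new_dim=256, hidden=512):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(old_dim + side_dim, hidden),
            nn.ReLU(),
            nn.Linear(hidden, new_dim),
        )

    def forward(self, old_emb, side_info):
        # Concatenate the old embedding with the per-sample side-information
        # that was stored when the old model was deployed.
        return self.mlp(torch.cat([old_emb, side_info], dim=-1))

h = ForwardTransformation()
opt = torch.optim.Adam(h.parameters(), lr=1e-3)

old_emb = torch.randn(32, 128)   # stand-in for stored old features
side = torch.randn(32, 64)       # stand-in for stored side-information
new_emb = torch.randn(32, 256)   # stand-in for new-model features

for _ in range(5):  # a few toy regression steps
    pred = h(old_emb, side)
    loss = (1 - torch.cosine_similarity(pred, new_emb, dim=-1)).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

At update time, the gallery is backfilled by one cheap pass of this transformation over the stored (embedding, side-information) pairs instead of re-encoding every image with the new backbone.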
Related papers
- Backward-Compatible Aligned Representations via an Orthogonal Transformation Layer [20.96380700548786]
Visual retrieval systems face challenges when updating models with improved representations due to misalignment between the old and new representations.
Prior research has explored backward-compatible training methods that enable direct comparisons between new and old representations without backfilling.
In this paper, we address how to balance backward compatibility against the performance of independently trained models.
arXiv Detail & Related papers (2024-08-16T15:05:28Z)
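As a hedged sketch of the orthogonal-transformation idea above (the paper's exact parameterization and objective may differ), PyTorch's orthogonal parametrization keeps a learned square map orthogonal, so it can rotate embeddings between spaces without distorting distances:

```python
import torch
import torch.nn as nn
from torch.nn.utils.parametrizations import orthogonal

# Hypothetical sketch: a learnable orthogonal map between embedding spaces.
# Orthogonality preserves inner products and norms, so distances within the
# new space are not distorted by the alignment.
dim = 256
align = orthogonal(nn.Linear(dim, dim, bias=False))

new_emb = torch.randn(8, dim)   # stand-in new-model features
old_emb = torch.randn(8, dim)   # stand-in old-model features

aligned = align(new_emb)        # now comparable with the old gallery
compat_loss = (1 - torch.cosine_similarity(aligned, old_emb, dim=-1)).mean()

# Sanity check: the parametrized weight stays orthogonal (W @ W.T ~ I).
W = align.weight
assert torch.allclose(W @ W.T, torch.eye(dim), atol=1e-4)
```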
- Towards Cross-modal Backward-compatible Representation Learning for Vision-Language Models [44.56258991182532]
Backward-compatible Training (BT) has been proposed to ensure that the new model's embeddings align with the old model's.
This paper extends the concept of vision-only BT to the field of cross-modal retrieval.
We propose a projection module that maps the new model's embeddings to those of the old model.
arXiv Detail & Related papers (2024-05-23T15:46:35Z)
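A minimal sketch of the projection-module idea above, assuming a two-layer MLP head (the architecture and dimensions below are guesses, not the paper's): the new model's query embeddings are projected into the old space so the existing gallery remains searchable.

```python
import torch
import torch.nn as nn

class ProjectionModule(nn.Module):
    """Hypothetical projection head: new-model embeddings -> old space."""
    def __init__(self, new_dim=512, old_dim=256):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(new_dim, new_dim), nn.GELU(), nn.Linear(new_dim, old_dim)
        )

    def forward(self, x):
        return self.proj(x)

proj = ProjectionModule()
# New query embeddings, projected for search against an old gallery.
query_new = torch.randn(4, 512)
gallery_old = torch.randn(100, 256)
q = nn.functional.normalize(proj(query_new), dim=-1)
g = nn.functional.normalize(gallery_old, dim=-1)
scores = q @ g.T                        # cosine similarities, shape (4, 100)
topk = scores.topk(k=5, dim=-1).indices
```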
- MixBCT: Towards Self-Adapting Backward-Compatible Training [66.52766344751635]
We propose MixBCT, a simple yet highly effective backward-compatible training method.
We conduct experiments on the large-scale face recognition datasets MS1Mv3 and IJB-C.
arXiv Detail & Related papers (2023-08-14T05:55:38Z)
- Boundary-aware Backward-Compatible Representation via Adversarial Learning in Image Retrieval [17.995993499100017]
Backward-compatible training (BCT) improves the compatibility of two models while limiting the negative impact on retrieval performance.
We introduce AdvBCT, an Adversarial Backward-Training method with an elastic boundary constraint.
Our method outperforms other BCT methods on both compatibility and discrimination.
arXiv Detail & Related papers (2023-05-04T07:37:07Z)
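The abstract gives few details on AdvBCT's objective; as a generic stand-in for adversarial compatibility training (explicitly not AdvBCT's actual loss), a discriminator can be trained to distinguish old from new embeddings while the new encoder learns to fool it:

```python
import torch
import torch.nn as nn

# Generic adversarial-alignment sketch (assumed form): a discriminator D
# classifies embeddings as old (1) vs. new (0); the new encoder is trained
# to make its embeddings indistinguishable from old ones.
dim = 256
D = nn.Sequential(nn.Linear(dim, 128), nn.ReLU(), nn.Linear(128, 1))
new_encoder = nn.Linear(dim, dim)      # stand-in for the new backbone
d_opt = torch.optim.Adam(D.parameters(), lr=1e-4)
g_opt = torch.optim.Adam(new_encoder.parameters(), lr=1e-4)
bce = nn.BCEWithLogitsLoss()

old_emb = torch.randn(32, dim)         # frozen old-model features
inputs = torch.randn(32, dim)          # stand-in inputs to the new encoder

for _ in range(3):                     # toy alternating updates
    # 1) Discriminator step: old -> 1, new -> 0.
    new_emb = new_encoder(inputs).detach()
    d_loss = bce(D(old_emb), torch.ones(32, 1)) + \
             bce(D(new_emb), torch.zeros(32, 1))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2) Encoder step: fool D (new -> 1); in practice this term would be
    # added on top of the retrieval training loss.
    g_loss = bce(D(new_encoder(inputs)), torch.ones(32, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```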
- FastFill: Efficient Compatible Model Update [40.27741553705222]
FastFill is a compatible model update process using feature alignment and policy-based partial backfilling.
We show that previous backfilling strategies suffer from decreased performance and demonstrate the importance of both the training objective and the ordering in online partial backfilling.
arXiv Detail & Related papers (2023-03-08T18:03:51Z)
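A hedged sketch of policy-based partial backfilling (the priority policy below is a made-up stand-in, not FastFill's): gallery items are re-encoded in an order chosen by a scoring policy, so retrieval quality improves as early as possible while the backfill is still in flight.

```python
import torch

def partial_backfill(gallery_old, priority, new_encode, budget):
    """Re-encode only the `budget` highest-priority gallery items.

    gallery_old: (N, D) stored old features
    priority:    (N,) policy scores; higher = backfill sooner (assumed policy)
    new_encode:  callable producing new features (stand-in for re-running
                 the new backbone on the raw images)
    """
    order = torch.argsort(priority, descending=True)
    chosen = order[:budget]
    gallery = gallery_old.clone()
    gallery[chosen] = new_encode(gallery_old[chosen])
    return gallery, chosen

# Toy usage: prioritize items whose old features look least "typical"
# (one of many possible policies; FastFill's actual policy may differ).
N, D = 1000, 128
gallery_old = torch.randn(N, D)
center = gallery_old.mean(dim=0)
priority = (gallery_old - center).norm(dim=-1)
new_encode = lambda x: x @ torch.randn(D, D) * 0.1  # hypothetical new features
gallery, done = partial_backfill(gallery_old, priority, new_encode, budget=100)
```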
- $BT^2$: Backward-compatible Training with Basis Transformation [107.37014712361788]
Retrieval systems often require recomputing the representation of every piece of data in the gallery when updating to a better representation model.
This process is known as backfilling and can be especially costly in the real world where the gallery often contains billions of samples.
Recently, researchers have proposed the idea of Backward-Compatible Training (BCT), where the new representation model can be trained with an auxiliary loss that makes it backward compatible with the old representation.
arXiv Detail & Related papers (2022-11-08T04:00:23Z)
- Forward Compatible Few-Shot Class-Incremental Learning [71.2459746681805]
A machine learning model should recognize new classes without forgetting old ones.
Current methods handle incremental learning retrospectively.
We propose ForwArd Compatible Training (FACT) for FSCIL.
arXiv Detail & Related papers (2022-03-14T09:36:35Z)
- Learning to Reweight with Deep Interactions [104.68509759134878]
We propose an improved data reweighting algorithm, in which the student model provides its internal states to the teacher model.
Experiments on image classification with clean/noisy labels and on neural machine translation empirically demonstrate that our algorithm achieves significant improvements over previous methods.
arXiv Detail & Related papers (2020-07-09T09:06:31Z)
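A rough sketch of the teacher-student interaction described above, with assumed interfaces: the teacher consumes a piece of the student's internal state (here, just its per-example losses) and emits per-example weights for the student's update.

```python
import torch
import torch.nn as nn

# Hypothetical teacher-student reweighting step: the teacher maps each
# example's student-side signal (here, its loss value) to a weight in (0, 1).
student = nn.Linear(20, 2)
teacher = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
s_opt = torch.optim.SGD(student.parameters(), lr=0.1)

x = torch.randn(64, 20)
y = torch.randint(0, 2, (64,))            # possibly noisy labels

per_ex = nn.functional.cross_entropy(student(x), y, reduction="none")
# The teacher sees the student's internal state (per-example losses) and
# produces weights; ideally, noisy examples receive small weights.
weights = torch.sigmoid(teacher(per_ex.detach().unsqueeze(-1))).squeeze(-1)
loss = (weights.detach() * per_ex).mean()
s_opt.zero_grad(); loss.backward(); s_opt.step()
# (Training the teacher itself, e.g., against a clean validation signal,
# is omitted from this sketch.)
```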
- Towards Backward-Compatible Representation Learning [86.39292571306395]
We propose a way to learn visual features that are compatible with previously computed ones even when they have different dimensions.
This enables visual search systems to bypass computing new features for all previously seen images when updating the embedding models.
We propose a framework to train embedding models, called backward-compatible training (BCT), as a first step towards backward compatible representation learning.
arXiv Detail & Related papers (2020-03-26T14:34:09Z)
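To ground the BCT idea, here is a minimal sketch of the commonly described influence-style auxiliary loss, assuming matching embedding dimensions (the paper also handles mismatched dimensions): new embeddings are additionally required to score well under the frozen old classifier, which keeps them comparable to old features.

```python
import torch
import torch.nn as nn

# Hypothetical BCT-style training step: the new encoder is trained with its
# own classification loss plus an auxiliary loss that feeds new embeddings
# through the *frozen old classifier*, pulling them toward the old space.
num_classes, old_dim, new_dim = 10, 128, 128
new_encoder = nn.Linear(64, new_dim)
new_head = nn.Linear(new_dim, num_classes)
old_head = nn.Linear(old_dim, num_classes)   # frozen, from the old model
for p in old_head.parameters():
    p.requires_grad_(False)

opt = torch.optim.SGD(
    list(new_encoder.parameters()) + list(new_head.parameters()), lr=0.1)

x = torch.randn(32, 64)
y = torch.randint(0, num_classes, (32,))

emb = new_encoder(x)
main_loss = nn.functional.cross_entropy(new_head(emb), y)
# Auxiliary compatibility term: new embeddings must still be classified
# correctly by the old head (requires old_dim == new_dim in this sketch).
compat_loss = nn.functional.cross_entropy(old_head(emb), y)
loss = main_loss + compat_loss
opt.zero_grad(); loss.backward(); opt.step()
```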
This list is automatically generated from the titles and abstracts of the papers on this site.