Backward-Compatible Aligned Representations via an Orthogonal Transformation Layer
- URL: http://arxiv.org/abs/2408.08793v1
- Date: Fri, 16 Aug 2024 15:05:28 GMT
- Title: Backward-Compatible Aligned Representations via an Orthogonal Transformation Layer
- Authors: Simone Ricci, Niccolò Biondi, Federico Pernici, Alberto Del Bimbo
- Abstract summary: Visual retrieval systems face challenges when updating models with improved representations due to misalignment between the old and new representations.
Prior research has explored backward-compatible training methods that enable direct comparisons between new and old representations without backfilling.
In this paper, we address the open problem of balancing backward compatibility against the performance of independently trained models.
- Score: 20.96380700548786
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Visual retrieval systems face significant challenges when updating models with improved representations due to misalignment between the old and new representations. The costly and resource-intensive backfilling process involves recalculating feature vectors for images in the gallery set whenever a new model is introduced. To address this, prior research has explored backward-compatible training methods that enable direct comparisons between new and old representations without backfilling. Despite these advancements, achieving a balance between backward compatibility and the performance of independently trained models remains an open problem. In this paper, we address it by expanding the representation space with additional dimensions and learning an orthogonal transformation to achieve compatibility with old models and, at the same time, integrate new information. This transformation preserves the original feature space's geometry, ensuring that our model aligns with previous versions while also learning new data. Our Orthogonal Compatible Aligned (OCA) approach eliminates the need for re-indexing during model updates and ensures that features can be compared directly across different model updates without additional mapping functions. Experimental results on CIFAR-100 and ImageNet-1k demonstrate that our method not only maintains compatibility with previous models but also achieves state-of-the-art accuracy, outperforming several existing methods.
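The mechanism described in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch rendering of the general idea, not the authors' implementation: the class name, the zero-padding of old gallery features, and the dimensions (512 old + 128 extra) are all illustrative assumptions. The key ingredient is an orthogonally parameterized linear layer, which preserves norms and angles in the expanded feature space.

```python
# Minimal sketch (assumed form, not the paper's code) of an orthogonal
# compatible head over an expanded feature space.
import torch
import torch.nn as nn
import torch.nn.functional as F

class OrthogonalCompatibleHead(nn.Module):  # hypothetical name
    def __init__(self, old_dim: int = 512, extra_dim: int = 128):
        super().__init__()
        self.old_dim, self.extra_dim = old_dim, extra_dim
        full_dim = old_dim + extra_dim
        # The orthogonal parametrization constrains the weight to W^T W = I,
        # so the layer preserves the geometry (distances, angles) of the
        # expanded feature space.
        self.transform = nn.utils.parametrizations.orthogonal(
            nn.Linear(full_dim, full_dim, bias=False)
        )

    def forward(self, new_feats: torch.Tensor) -> torch.Tensor:
        # new_feats: (batch, old_dim + extra_dim) from the updated backbone.
        return self.transform(new_feats)

    def lift_old(self, old_feats: torch.Tensor) -> torch.Tensor:
        # Zero-pad stored old features into the expanded space so they can
        # be compared with new features without backfilling (an assumption
        # about how the two spaces are aligned, made here for illustration).
        pad = old_feats.new_zeros(old_feats.size(0), self.extra_dim)
        return torch.cat([old_feats, pad], dim=1)

# Backfill-free retrieval: compare a new query directly against old gallery
# features that were never re-extracted.
head = OrthogonalCompatibleHead()
query = head(torch.randn(1, 640))                # new-model query feature
gallery = head.lift_old(torch.randn(100, 512))   # stored old features
scores = F.cosine_similarity(query, gallery)     # (100,) similarity scores
```

The actual OCA training objective (how compatibility with the old model is enforced while the extra dimensions absorb new information) is not shown here; the sketch only captures the geometry-preserving transformation and the backfill-free comparison it enables.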
Related papers
- MixBCT: Towards Self-Adapting Backward-Compatible Training [66.52766344751635]
We propose MixBCT, a simple yet highly effective backward-compatible training method.
We conduct experiments on the large-scale face recognition datasets MS1Mv3 and IJB-C.
arXiv Detail & Related papers (2023-08-14T05:55:38Z)
- FastFill: Efficient Compatible Model Update [40.27741553705222]
FastFill is a compatible model update process using feature alignment and policy-based partial backfilling.
We show that previous backfilling strategies suffer from decreased performance and demonstrate the importance of both the training objective and the ordering in online partial backfilling.
arXiv Detail & Related papers (2023-03-08T18:03:51Z)
- Online Backfilling with No Regret for Large-Scale Image Retrieval [50.162438586686356]
Backfilling is the process of re-extracting all gallery embeddings from upgraded models in image retrieval systems.
We propose an online backfilling algorithm, which enables us to achieve a progressive performance improvement during the backfilling process.
We incorporate a reverse transformation module for more effective and efficient merging, which is further enhanced by adopting a metric-compatible contrastive learning approach.
arXiv Detail & Related papers (2023-01-10T03:10:32Z)
- $BT^2$: Backward-compatible Training with Basis Transformation [107.37014712361788]
Retrieval systems often require recomputing the representation of every piece of data in the gallery when updating to a better representation model.
This process is known as backfilling and can be especially costly in the real world, where the gallery often contains billions of samples.
Recently, researchers have proposed the idea of Backward-Compatible Training (BCT), where the new representation model is trained with an auxiliary loss to make it backward compatible with the old representation (a sketch of this idea appears after this list).
arXiv Detail & Related papers (2022-11-08T04:00:23Z)
- Towards Universal Backward-Compatible Representation Learning [29.77801805854168]
Backward-compatible representation learning is introduced to support backfill-free model upgrades.
We first introduce the new problem of universal backward-compatible representation learning, covering all possible data splits in model upgrades.
We propose a simple yet effective method, dubbed Universal Backward-Compatible Training (UniBCT), with a novel structural prototype refinement algorithm.
arXiv Detail & Related papers (2022-03-03T09:23:51Z)
- Forward Compatible Training for Representation Learning [53.300192863727226]
Backward-compatible training (BCT) modifies the training of the new model to make its representations compatible with those of the old model.
However, BCT can significantly hinder the performance of the new model.
In this work, we propose a new learning paradigm for representation learning: forward compatible training (FCT).
arXiv Detail & Related papers (2021-12-06T06:18:54Z)
- Neighborhood Consensus Contrastive Learning for Backward-Compatible Representation [46.86784621137665]
Backward-compatible representation is proposed to make "new" features compatible with "old" features.
We propose a Neighborhood Consensus Contrastive Learning (NCCL) method, which learns backward-compatible representation from a neighborhood consensus perspective.
Our method ensures backward compatibility without impairing the accuracy of the new model.
arXiv Detail & Related papers (2021-08-07T05:50:47Z)
- Towards Backward-Compatible Representation Learning [86.39292571306395]
We propose a way to learn visual features that are compatible with previously computed ones even when they have different dimensions.
This enables visual search systems to bypass computing new features for all previously seen images when updating the embedding models.
We propose a framework to train embedding models, called backward-compatible training (BCT), as a first step towards backward-compatible representation learning.
arXiv Detail & Related papers (2020-03-26T14:34:09Z)
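As a point of contrast with the orthogonal-layer approach above, the BCT papers in this list train the new model with an auxiliary loss against the old model. Below is a minimal, hypothetical sketch of one common form of that idea, an "influence"-style loss in which the old model's frozen classifier must still classify the new features; the function name and arguments are illustrative, not the reference implementation.

```python
# Sketch of a BCT-style auxiliary loss (assumed form, for illustration).
import torch.nn.functional as F

def bct_loss(new_feats, labels, new_head, old_head, lam: float = 1.0):
    # Standard classification loss through the new model's own classifier.
    loss_new = F.cross_entropy(new_head(new_feats), labels)
    # Compatibility ("influence") term: the *frozen* old classifier must
    # still classify the new features, pulling them toward the old space.
    loss_compat = F.cross_entropy(old_head(new_feats), labels)
    return loss_new + lam * loss_compat
```

The compatibility term is what allows new query features to be searched against an old, un-backfilled gallery; the accuracy it can cost the new model is precisely the compatibility-versus-performance trade-off that OCA aims to resolve.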
This list is automatically generated from the titles and abstracts of the papers on this site.