Neighborhood Consensus Contrastive Learning for Backward-Compatible Representation
- URL: http://arxiv.org/abs/2108.03372v1
- Date: Sat, 7 Aug 2021 05:50:47 GMT
- Title: Neighborhood Consensus Contrastive Learning for Backward-Compatible Representation
- Authors: Shengsen Wu, Liang Chen, Yihang Lou, Yan Bai, Tao Bai, Minghua Deng, Lingyu Duan
- Abstract summary: Backward-compatible representation is proposed to make the "new" features compatible with the "old" features.
We propose a Neighborhood Consensus Contrastive Learning (NCCL) method, which learns backward-compatible representation from a neighborhood consensus perspective.
Our method ensures backward compatibility without impairing the accuracy of the new model.
- Score: 46.86784621137665
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In object re-identification (ReID), the development of deep learning
techniques often involves model updates and re-deployment. Re-extracting the
image features of a large-scale gallery every time a new model is deployed is
prohibitively expensive. Backward-compatible representation is therefore
proposed to make the "new" features compatible with the "old" features, freeing
the update from the re-extraction process. Existing backward-compatible methods
simply impose constraints in the embedding space or the discriminative space
and ignore the intra-class variance of the old embeddings, which risks damaging
the discriminability of the new embeddings.
In this work, we propose a Neighborhood Consensus Contrastive Learning (NCCL)
method, which learns backward-compatible representation from a neighborhood
consensus perspective with both embedding structures and discriminative
knowledge. With NCCL, the new embeddings are aligned and improved with old
embeddings in a multi-cluster view. We also propose a scheme to filter out old
embeddings with low credibility, which further improves the robustness of the
compatibility. Our method ensures backward compatibility without impairing the
accuracy of the new model, and it can even improve the new model's accuracy in
most scenarios.
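The abstract stays at a high level, so the sketch below is only an illustration of what a contrastive loss with neighborhood-consensus positives could look like: each new embedding is pulled toward its own frozen old embedding and toward the old embeddings of its k nearest same-class neighbors, and pushed away from all other old embeddings. All names (`nccl_loss`, the per-batch pairing of old and new features) are assumptions, not the authors' code.

```python
# Minimal sketch of a neighborhood-consensus contrastive loss for
# backward-compatible training (hypothetical, not the authors' code).
import torch
import torch.nn.functional as F

def nccl_loss(new_emb, old_emb, labels, k=5, temperature=0.07):
    """new_emb: (B, D) trainable features from the new model.
    old_emb: (B, D) frozen features from the old model for the same images.
    Positives for each anchor: its own old embedding plus the old
    embeddings of its k most similar same-class neighbors."""
    new_emb = F.normalize(new_emb, dim=1)
    old_emb = F.normalize(old_emb, dim=1)
    sim = new_emb @ old_emb.t() / temperature                # new-to-old similarities
    same_class = labels.unsqueeze(0) == labels.unsqueeze(1)  # (B, B) label agreement

    # Neighborhood consensus: rank old embeddings within each class and keep
    # the k nearest same-class neighbors (each anchor ranks itself first,
    # since the diagonal similarity is maximal).
    old_sim = old_emb @ old_emb.t()
    neighbor_rank = old_sim.masked_fill(~same_class, float("-inf"))
    topk = neighbor_rank.topk(k=min(k, old_emb.size(0)), dim=1).indices
    pos_mask = torch.zeros_like(same_class)
    pos_mask.scatter_(1, topk, True)
    pos_mask &= same_class  # drop -inf picks from classes smaller than k

    # InfoNCE over the positive set, averaged per anchor.
    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)
    return -(log_prob * pos_mask).sum(1).div(pos_mask.sum(1).clamp(min=1)).mean()
```

For the low-credibility filtering mentioned above, one plausible proxy would be to drop anchors whose old-space neighborhood is dominated by other classes before computing the loss.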
Related papers
- Backward-Compatible Aligned Representations via an Orthogonal Transformation Layer [20.96380700548786]
Visual retrieval systems face challenges when updating models with improved representations due to misalignment between the old and new representations.
Prior research has explored backward-compatible training methods that enable direct comparisons between new and old representations without backfilling.
In this paper, we address how to balance backward compatibility with the performance of independently trained models.
arXiv Detail & Related papers (2024-08-16T15:05:28Z)
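As a rough illustration of the orthogonal-transformation idea in this entry, the sketch below constrains a linear alignment layer to stay orthogonal, so mapping new features toward the old space preserves norms and pairwise angles. The class name and the training objective are assumptions, not the paper's code.

```python
# Hypothetical sketch of an orthogonal alignment layer (not the paper's code).
import torch
from torch import nn
from torch.nn.utils import parametrizations

class OrthogonalAlign(nn.Module):
    def __init__(self, dim):
        super().__init__()
        # The parametrization keeps the weight matrix orthogonal at every
        # optimization step, so the map is distance- and angle-preserving.
        self.align = parametrizations.orthogonal(nn.Linear(dim, dim, bias=False))

    def forward(self, new_emb):
        return self.align(new_emb)

# Assumed training objective: regress transformed new features onto the
# frozen old features of the same images, e.g.
#   loss = F.mse_loss(align(new_model(x)), old_model(x).detach())
```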
- PASS++: A Dual Bias Reduction Framework for Non-Exemplar Class-Incremental Learning [49.240408681098906]
Class-incremental learning (CIL) aims to recognize new classes incrementally while maintaining the discriminability of old classes.
Most existing CIL methods are exemplar-based, i.e., storing a part of old data for retraining.
We present a simple and novel dual bias reduction framework that employs self-supervised transformation (SST) in input space and prototype augmentation (protoAug) in deep feature space.
arXiv Detail & Related papers (2024-07-19T05:03:16Z)
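The protoAug half of this entry lends itself to a short sketch: old classes are summarized by stored feature-space prototypes, and pseudo-features sampled around them stand in for old data when training the classifier. The SST input-space part is omitted, and every name here is an assumption.

```python
# Sketch of prototype augmentation for non-exemplar CIL (hypothetical names).
import torch
import torch.nn.functional as F

def proto_aug_loss(classifier, prototypes, proto_labels, radius, n_samples=64):
    """prototypes: (C_old, D) class-mean features saved after earlier tasks.
    radius: noise scale, e.g. an average intra-class feature variance."""
    idx = torch.randint(len(prototypes), (n_samples,))
    # Sample pseudo-features around old-class prototypes instead of
    # replaying stored images.
    pseudo = prototypes[idx] + radius * torch.randn(n_samples, prototypes.size(1))
    return F.cross_entropy(classifier(pseudo), proto_labels[idx])
```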
- Expandable Subspace Ensemble for Pre-Trained Model-Based Class-Incremental Learning [65.57123249246358]
We propose ExpAndable Subspace Ensemble (EASE) for PTM-based CIL.
We train a distinct lightweight adapter module for each new task, aiming to create task-specific subspaces.
Our prototype complement strategy synthesizes old classes' new features without using any old class instance.
arXiv Detail & Related papers (2024-03-18T17:58:13Z)
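A minimal sketch of the one-adapter-per-task idea described above: a frozen pre-trained backbone plus a lightweight residual adapter per task, whose outputs are concatenated into an expandable feature. The module names, bottleneck size, and the omitted prototype-complement step are all assumptions.

```python
# Hypothetical sketch of per-task adapters over a frozen pre-trained model.
import torch
from torch import nn

class BottleneckAdapter(nn.Module):
    def __init__(self, dim, hidden=64):
        super().__init__()
        self.down = nn.Linear(dim, hidden)
        self.up = nn.Linear(hidden, dim)

    def forward(self, x):
        return x + self.up(torch.relu(self.down(x)))  # residual bottleneck

class SubspaceEnsemble(nn.Module):
    def __init__(self, backbone, dim):
        super().__init__()
        self.backbone = backbone.eval()  # frozen PTM
        for p in self.backbone.parameters():
            p.requires_grad_(False)
        self.adapters = nn.ModuleList()
        self.dim = dim

    def add_task(self):
        # One new lightweight adapter (task-specific subspace) per task.
        self.adapters.append(BottleneckAdapter(self.dim))

    def forward(self, x):
        h = self.backbone(x)
        # Concatenate all task subspaces into one expandable representation.
        return torch.cat([adapter(h) for adapter in self.adapters], dim=1)
```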
- CEAT: Continual Expansion and Absorption Transformer for Non-Exemplar Class-Incremental Learning [34.59310641291726]
In real-world applications, dynamic scenarios require the models to possess the capability to learn new tasks continuously without forgetting the old knowledge.
We propose a new architecture, named Continual Expansion and Absorption Transformer (CEAT).
The model learns novel knowledge by extending expanded-fusion layers in parallel with the frozen previous parameters.
To improve the learning ability of the model, we designed a novel prototype contrastive loss to reduce the overlap between old and new classes in the feature space.
arXiv Detail & Related papers (2024-03-11T12:40:12Z)
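The prototype contrastive loss is not spelled out in this summary; a common form, sketched here purely as an assumption, treats each feature's own class prototype as the positive and every other prototype, old classes included, as a negative, which directly penalizes old/new overlap in feature space.

```python
# Hypothetical sketch of a prototype contrastive loss (not the paper's code).
import torch
import torch.nn.functional as F

def prototype_contrastive_loss(feats, labels, prototypes, temperature=0.1):
    """feats: (B, D) current-task features; prototypes: (C, D) one per class
    seen so far, indexed by label id."""
    feats = F.normalize(feats, dim=1)
    protos = F.normalize(prototypes, dim=1)
    logits = feats @ protos.t() / temperature  # similarity to every prototype
    # Cross-entropy makes the own-class prototype the positive and pushes
    # features away from all other (old and new) class prototypes.
    return F.cross_entropy(logits, labels)
```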
- MixBCT: Towards Self-Adapting Backward-Compatible Training [66.52766344751635]
We propose MixBCT, a simple yet highly effective backward-compatible training method.
We conduct experiments on the large-scale face recognition datasets MS1Mv3 and IJB-C.
arXiv Detail & Related papers (2023-08-14T05:55:38Z)
- Darwinian Model Upgrades: Model Evolving with Selective Compatibility [29.920204547961696]
BCT presents the first step towards backward-compatible model upgrades to get rid of backfilling.
We propose Darwinian Model Upgrades (DMU) which disentangle the inheritance and variation in the model evolving with selective backward compatibility and forward adaptation.
DMU effectively alleviates the new-to-new degradation and improves new-to-old compatibility, rendering a more proper model upgrading paradigm in large-scale retrieval systems.
arXiv Detail & Related papers (2022-10-13T12:28:48Z)
- Forward Compatible Few-Shot Class-Incremental Learning [71.2459746681805]
A machine learning model should recognize new classes without forgetting old ones.
Current methods handle incremental learning retrospectively.
We propose ForwArd Compatible Training (FACT) for FSCIL.
arXiv Detail & Related papers (2022-03-14T09:36:35Z)
- Towards Backward-Compatible Representation Learning [86.39292571306395]
We propose a way to learn visual features that are compatible with previously computed ones even when they have different dimensions.
This enables visual search systems to bypass computing new features for all previously seen images when updating the embedding models.
We propose a framework to train embedding models, called backward-compatible training (BCT), as a first step towards backward compatible representation learning.
arXiv Detail & Related papers (2020-03-26T14:34:09Z)
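BCT is commonly described as adding an "influence" term: the new embedding model is trained so that the frozen old classifier still classifies its features correctly, which ties the new feature space to the old one. The sketch below assumes matching feature dimensions and hypothetical names; when the dimensions differ, a learned projection on the new features would be needed before the old classifier.

```python
# Hedged sketch of a BCT-style influence loss (names are assumptions).
import torch.nn.functional as F

def bct_loss(new_feats, labels, new_classifier, old_classifier, lam=1.0):
    # Standard classification loss for the new model.
    loss_new = F.cross_entropy(new_classifier(new_feats), labels)
    # Influence term: the *frozen* old classifier must still recognize the
    # new features, keeping them comparable with old gallery features.
    loss_old = F.cross_entropy(old_classifier(new_feats), labels)
    return loss_new + lam * loss_old
```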