Triplet Contrastive Learning for Unsupervised Vehicle Re-identification
- URL: http://arxiv.org/abs/2301.09498v1
- Date: Mon, 23 Jan 2023 15:52:12 GMT
- Title: Triplet Contrastive Learning for Unsupervised Vehicle Re-identification
- Authors: Fei Shen, Xiaoyu Du, Liyan Zhang, Jinhui Tang
- Abstract summary: Part feature learning is a critical technology for fine semantic understanding in vehicle re-identification.
We propose a novel Triplet Contrastive Learning framework (TCL) which leverages cluster features to bridge the part features and global features.
- Score: 55.445358749042384
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Part feature learning is a critical technology for fine-grained semantic
understanding in vehicle re-identification. However, recent unsupervised
re-identification works exhibit serious gradient collapse issues when directly
modeling the part features and global features. To address this problem, in
this paper, we propose a novel Triplet Contrastive Learning framework (TCL)
which leverages cluster features to bridge the part features and global
features. Specifically, TCL devises three memory banks to store the features
according to their attributes and proposes a proxy contrastive loss (PCL) to
perform contrastive learning between adjacent memory banks, thus expressing the
association between part and global features as a composition of the
part-cluster and cluster-global associations. Since the cluster memory bank
deals with all the instance features, it can summarize them into a
discriminative feature representation. To deeply exploit the instance
information, TCL proposes two additional loss functions. For inter-class
instances, a hybrid contrastive loss (HCL) redefines the sample correlations by
pulling features toward their positive cluster features and pushing them away
from all negative instance features. For intra-class instances, a weighted
regularization cluster
contrastive loss (WRCCL) refines the pseudo labels by penalizing the mislabeled
images according to the instance similarity. Extensive experiments show that
TCL outperforms many state-of-the-art unsupervised vehicle re-identification
approaches. The code will be available at https://github.com/muzishen/TCL.
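Based on the abstract alone, the proxy contrastive loss between adjacent memory banks can be read as an InfoNCE-style objective applied along the part-cluster and cluster-global links. The sketch below is a minimal illustration under that assumption; the function name, tensor shapes, positive-matching scheme, and temperature are all hypothetical, not taken from the paper:

```python
import torch
import torch.nn.functional as F

def proxy_contrastive_loss(query, bank, pos_idx, temperature=0.07):
    """InfoNCE-style loss between one query feature and an adjacent memory bank.

    query:   (D,)  L2-normalized feature from one bank level.
    bank:    (K, D) L2-normalized features of the adjacent memory bank.
    pos_idx: index of the positive (matching) entry in `bank`.
    """
    logits = bank @ query / temperature              # (K,) cosine similarities
    target = torch.tensor(pos_idx)
    return F.cross_entropy(logits.unsqueeze(0), target.unsqueeze(0))

# Toy example: the cluster bank bridges part and global features.
torch.manual_seed(0)
D, K = 8, 5
part_feat    = F.normalize(torch.randn(D), dim=0)
cluster_bank = F.normalize(torch.randn(K, D), dim=1)
global_bank  = F.normalize(torch.randn(K, D), dim=1)
cluster_feat = cluster_bank[2]

# Two PCL terms: part -> cluster bank, cluster -> global bank.
loss = (proxy_contrastive_loss(part_feat, cluster_bank, pos_idx=2)
        + proxy_contrastive_loss(cluster_feat, global_bank, pos_idx=2))
print(float(loss))
```

The point of the transition through the cluster bank is that neither loss term directly contrasts part features with global features, which is the pairing the abstract identifies as causing gradient collapse.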
Related papers
- Learning Label Hierarchy with Supervised Contrastive Learning [8.488965459026678]
Supervised contrastive learning (SCL) frameworks treat each class as independent and thus consider all classes to be equally important.
This paper introduces a family of Label-Aware SCL methods (LASCL) that incorporates hierarchical information to SCL by leveraging similarities between classes.
Experiments on three datasets show that the proposed LASCL performs well on text classification tasks that distinguish a single label among multiple labels.
arXiv Detail & Related papers (2024-01-31T23:21:40Z)
- CLC: Cluster Assignment via Contrastive Representation Learning [9.631532215759256]
We propose Contrastive Learning-based Clustering (CLC), which uses contrastive learning to directly learn cluster assignment.
We achieve 53.4% accuracy on the full ImageNet dataset and outperform existing methods by large margins.
arXiv Detail & Related papers (2023-06-08T07:15:13Z)
- Dual Cluster Contrastive learning for Person Re-Identification [78.42770787790532]
We formulate a unified cluster contrastive framework, named Dual Cluster Contrastive learning (DCC)
DCC maintains two types of memory banks: individual and centroid cluster memory banks.
It can be easily applied for unsupervised or supervised person ReID.
arXiv Detail & Related papers (2021-12-09T02:43:25Z)
- Mind Your Clever Neighbours: Unsupervised Person Re-identification via Adaptive Clustering Relationship Modeling [19.532602887109668]
Unsupervised person re-identification (Re-ID) attracts increasing attention due to its potential to resolve the scalability problem of supervised Re-ID models.
Most existing unsupervised methods adopt an iterative clustering mechanism, in which the network is trained on pseudo labels generated by unsupervised clustering.
To generate high-quality pseudo-labels and mitigate the impact of clustering errors, we propose a novel clustering relationship modeling framework for unsupervised person Re-ID.
arXiv Detail & Related papers (2021-12-03T10:55:07Z)
- Learning to Detect Instance-level Salient Objects Using Complementary Image Labels [55.049347205603304]
We present the first weakly-supervised approach to the salient instance detection problem.
We propose a novel weakly-supervised network with three branches: a Saliency Detection Branch leveraging class consistency information to locate candidate objects; a Boundary Detection Branch exploiting class discrepancy information to delineate object boundaries; and a Centroid Detection Branch using subitizing information to detect salient instance centroids.
arXiv Detail & Related papers (2021-11-19T10:15:22Z)
- Neighborhood Contrastive Learning for Novel Class Discovery [79.14767688903028]
We build a new framework, named Neighborhood Contrastive Learning, to learn discriminative representations that are important to clustering performance.
We experimentally demonstrate that these two ingredients significantly contribute to clustering performance and lead our model to outperform state-of-the-art methods by a large margin.
arXiv Detail & Related papers (2021-06-20T17:34:55Z)
- You Never Cluster Alone [150.94921340034688]
We extend the mainstream contrastive learning paradigm to a cluster-level scheme, where all the data subjected to the same cluster contribute to a unified representation.
We define a set of categorical variables as clustering assignment confidence, which links the instance-level learning track with the cluster-level one.
By reparametrizing the assignment variables, TCC is trained end-to-end, requiring no alternating steps.
arXiv Detail & Related papers (2021-06-03T14:59:59Z)
- Unsupervised Feature Learning by Cross-Level Instance-Group Discrimination [68.83098015578874]
We integrate between-instance similarity into contrastive learning, not directly by instance grouping, but by cross-level discrimination.
CLD effectively brings unsupervised learning closer to natural data and real-world applications.
It sets a new state of the art on self-supervision, semi-supervision, and transfer learning benchmarks, and beats MoCo v2 and SimCLR on every reported metric.
arXiv Detail & Related papers (2020-08-09T21:13:13Z)
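Several of the papers above (e.g. DCC and the clustering-based Re-ID methods) build on a common pattern: a memory bank of cluster centroids that each query feature is contrasted against, with the assigned centroid updated by momentum. The sketch below illustrates that shared pattern only; the momentum value, temperature, and update rule are assumptions, not the recipe of any specific paper:

```python
import torch
import torch.nn.functional as F

class ClusterMemory:
    """Centroid memory bank with momentum updates, a sketch of the
    cluster-contrast pattern shared by several methods listed above."""

    def __init__(self, num_clusters, dim, momentum=0.2, temperature=0.05):
        self.centroids = F.normalize(torch.randn(num_clusters, dim), dim=1)
        self.momentum = momentum
        self.temperature = temperature

    def loss(self, feats, labels):
        # Pull each feature toward its own centroid, push away from the rest.
        logits = feats @ self.centroids.t() / self.temperature  # (B, K)
        return F.cross_entropy(logits, labels)

    @torch.no_grad()
    def update(self, feats, labels):
        # Momentum update of each assigned centroid, re-normalized to unit length.
        for f, y in zip(feats, labels):
            c = self.momentum * self.centroids[y] + (1 - self.momentum) * f
            self.centroids[y] = F.normalize(c, dim=0)

torch.manual_seed(0)
mem = ClusterMemory(num_clusters=4, dim=16)
feats = F.normalize(torch.randn(8, 16), dim=1)   # pseudo-labeled batch
labels = torch.randint(0, 4, (8,))
l = mem.update(feats, labels) or mem.loss(feats, labels)
print(float(l))
```

Keeping centroids rather than raw instances in the memory bank is what makes the objective robust to cluster-size imbalance, a design choice the DCC summary's "centroid cluster memory bank" refers to.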
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.