Convolutional Fine-Grained Classification with Self-Supervised Target
Relation Regularization
- URL: http://arxiv.org/abs/2208.01997v1
- Date: Wed, 3 Aug 2022 11:51:53 GMT
- Title: Convolutional Fine-Grained Classification with Self-Supervised Target
Relation Regularization
- Authors: Kangjun Liu, Ke Chen, Kui Jia
- Abstract summary: This paper introduces a novel target coding scheme -- dynamic target relation graphs (DTRG)
Online computation of class-level feature centers is designed to generate cross-category distance in the representation space.
The proposed target graphs can alleviate data sparsity and imbalance in representation learning.
- Score: 34.8793946023412
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Fine-grained visual classification can be addressed by deep representation
learning under supervision of manually pre-defined targets (e.g., one-hot or
the Hadamard codes). Such target coding schemes are less flexible in modeling
inter-class correlation and are also sensitive to sparse and imbalanced data
distributions. In light of this, this paper introduces a novel target
coding scheme -- dynamic target relation graphs (DTRG), which, as an auxiliary
feature regularization, is a self-generated structural output to be mapped from
input images. Specifically, online computation of class-level feature centers
is designed to generate cross-category distance in the representation space,
which can thus be depicted by a dynamic graph in a non-parametric manner.
Explicitly minimizing intra-class feature variations anchored on those
class-level centers can encourage learning of discriminative features.
Moreover, by exploiting inter-class dependency, the proposed target
graphs can alleviate data sparsity and imbalance in representation
learning. Inspired by the recent success of mixup-style data augmentation, this
paper introduces randomness into the soft construction of dynamic target relation
graphs to further explore relation diversity of target classes. Experimental
results demonstrate the effectiveness of our method on a number of diverse
benchmarks of multiple visual classification tasks, especially achieving the
state-of-the-art performance on popular fine-grained object benchmarks and
superior robustness against sparse and imbalanced data. Source codes are made
publicly available at https://github.com/AkonLau/DTRG.
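The pipeline the abstract describes, online class-level feature centers, a distance-based relation graph over those centers, an intra-class loss anchored on them, and mixup-style randomization of the graph, might be sketched roughly as follows. All function names, the EMA update, and the softmax-over-negative-distances construction are illustrative assumptions for exposition, not the authors' exact formulation (see the released code at the repository above for that):

```python
import numpy as np

def update_centers(centers, feats, labels, momentum=0.9):
    """Online (EMA) update of class-level feature centers from a mini-batch.
    The momentum form is an assumption; the paper only states the centers
    are computed online."""
    centers = centers.copy()
    for c in np.unique(labels):
        batch_mean = feats[labels == c].mean(axis=0)
        centers[c] = momentum * centers[c] + (1.0 - momentum) * batch_mean
    return centers

def target_relation_graph(centers, temperature=1.0):
    """Non-parametric dynamic graph: soft cross-category relations from
    pairwise center distances (row-wise softmax over negative distances)."""
    d = np.linalg.norm(centers[:, None, :] - centers[None, :, :], axis=-1)
    logits = -d / temperature
    np.fill_diagonal(logits, -np.inf)  # exclude self-relations
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def intra_class_loss(feats, labels, centers):
    """Mean squared distance of features to their own class centers,
    i.e. explicitly minimizing intra-class feature variation."""
    return np.mean(np.sum((feats - centers[labels]) ** 2, axis=1))

def mixup_graph(graph, alpha=0.2, rng=None):
    """One plausible way to inject mixup-style randomness into the soft
    graph construction: blend each row with a randomly permuted row."""
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    perm = rng.permutation(graph.shape[0])
    return lam * graph + (1.0 - lam) * graph[perm]
```

In this reading, the relation graph serves as a self-generated structural target that the network is trained to reproduce from input images, alongside the usual classification loss; the mixup blend preserves row-stochasticity, so the randomized graph remains a valid soft target.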
Related papers
- Graph Structure Refinement with Energy-based Contrastive Learning [56.957793274727514]
We introduce an unsupervised method that combines generative and discriminative training to learn graph structure and representation.
We propose an Energy-based Contrastive Learning (ECL) guided Graph Structure Refinement (GSR) framework, denoted as ECL-GSR.
ECL-GSR achieves faster training with fewer samples and less memory than the leading baseline, highlighting its simplicity and efficiency in downstream tasks.
arXiv Detail & Related papers (2024-12-20T04:05:09Z)
- Conditional Distribution Learning on Graphs [15.730933577970687]
We propose a conditional distribution learning (CDL) method that learns graph representations from graph-structured data for semi-supervised graph classification.
Specifically, we present an end-to-end graph representation learning model to align the conditional distributions of weakly and strongly augmented features over the original features.
arXiv Detail & Related papers (2024-11-20T07:26:36Z)
- Enhancing Fine-Grained Visual Recognition in the Low-Data Regime Through Feature Magnitude Regularization [23.78498670529746]
We introduce a regularization technique to ensure that the magnitudes of the extracted features are evenly distributed.
Despite its apparent simplicity, our approach has demonstrated significant performance improvements across various fine-grained visual recognition datasets.
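One plausible reading of "evenly distributed feature magnitudes" is an entropy-style penalty on the normalized per-sample magnitudes; the sketch below is a hypothetical illustration of that idea, not the paper's actual regularizer:

```python
import numpy as np

def magnitude_evenness_penalty(feats, eps=1e-8):
    """Hypothetical regularizer: normalize the absolute feature magnitudes
    of each sample into a distribution and penalize its entropy gap from
    uniform. The penalty is 0 when magnitudes are perfectly even and grows
    as a few dimensions dominate."""
    mags = np.abs(feats) + eps
    p = mags / mags.sum(axis=1, keepdims=True)
    entropy = -np.sum(p * np.log(p), axis=1)
    max_entropy = np.log(feats.shape[1])  # entropy of the uniform case
    return np.mean(max_entropy - entropy)
```

Such a term would typically be added to the classification loss with a small weight, discouraging representations in which a handful of feature dimensions carry all the magnitude.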
arXiv Detail & Related papers (2024-09-03T07:32:46Z)
- Deep Contrastive Graph Learning with Clustering-Oriented Guidance [61.103996105756394]
Graph Convolutional Network (GCN) has exhibited remarkable potential in improving graph-based clustering.
Existing models typically estimate an initial graph beforehand in order to apply GCN.
Deep Contrastive Graph Learning (DCGL) model is proposed for general data clustering.
arXiv Detail & Related papers (2024-02-25T07:03:37Z)
- Redundancy-Free Self-Supervised Relational Learning for Graph Clustering [13.176413653235311]
We propose a novel self-supervised deep graph clustering method named Redundancy-Free Graph Clustering (R$^2$FGC).
It extracts the attribute- and structure-level relational information from both global and local views based on an autoencoder and a graph autoencoder.
Our experiments are performed on widely used benchmark datasets to validate the superiority of our R$^2$FGC over state-of-the-art baselines.
arXiv Detail & Related papers (2023-09-09T06:18:50Z)
- Improving Deep Representation Learning via Auxiliary Learnable Target Coding [69.79343510578877]
This paper introduces a novel learnable target coding as an auxiliary regularization of deep representation learning.
Specifically, a margin-based triplet loss and a correlation consistency loss on the proposed target codes are designed to encourage more discriminative representations.
arXiv Detail & Related papers (2023-05-30T01:38:54Z)
- Dynamic Conceptional Contrastive Learning for Generalized Category Discovery [76.82327473338734]
Generalized category discovery (GCD) aims to automatically cluster partially labeled data.
Unlabeled data contain instances that are not only from known categories of the labeled data but also from novel categories.
One effective approach to GCD is applying self-supervised learning to learn discriminative representations for unlabeled data.
We propose a Dynamic Conceptional Contrastive Learning framework, which can effectively improve clustering accuracy.
arXiv Detail & Related papers (2023-03-30T14:04:39Z)
- Similarity-aware Positive Instance Sampling for Graph Contrastive Pre-training [82.68805025636165]
We propose to select positive graph instances directly from existing graphs in the training set.
Our selection is based on certain domain-specific pair-wise similarity measurements.
Besides, we develop an adaptive node-level pre-training method to dynamically mask nodes to distribute them evenly in the graph.
arXiv Detail & Related papers (2022-06-23T20:12:51Z)
- GraphCoCo: Graph Complementary Contrastive Learning [65.89743197355722]
Graph Contrastive Learning (GCL) has shown promising performance in graph representation learning (GRL) without the supervision of manual annotations.
This paper proposes an effective graph complementary contrastive learning approach named GraphCoCo to tackle the above issue.
arXiv Detail & Related papers (2022-03-24T02:58:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.