Towards Long-Tailed Recognition for Graph Classification via
Collaborative Experts
- URL: http://arxiv.org/abs/2308.16609v2
- Date: Tue, 5 Sep 2023 14:46:38 GMT
- Title: Towards Long-Tailed Recognition for Graph Classification via
Collaborative Experts
- Authors: Siyu Yi, Zhengyang Mao, Wei Ju, Yongdao Zhou, Luchen Liu, Xiao Luo,
and Ming Zhang
- Abstract summary: We propose a novel long-tailed graph-level classification framework via Collaborative Multi-expert Learning (CoMe)
To balance the contributions of head and tail classes, we first develop balanced contrastive learning from the view of representation learning.
We perform gated fusion and disentangled knowledge distillation among the multiple experts to promote collaboration within the multi-expert framework.
- Score: 10.99232053983369
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph classification, which aims to learn graph-level representations
for effective class assignment, has achieved remarkable progress, but this
progress relies heavily on high-quality datasets with balanced class
distributions. In fact, most real-world graph data naturally follows a
long-tailed distribution, where head classes contain far more samples than
tail classes. Studying graph-level classification over long-tailed data is
therefore essential, yet it remains largely unexplored. Moreover, most
existing long-tailed learning methods in vision fail to jointly optimize
representation learning and classifier training, and neglect the mining of
hard-to-classify classes. Directly applying such methods to graphs may lead
to sub-optimal performance, since models trained on graphs tend to be more
sensitive to long-tailed distributions due to graphs' complex topological
characteristics. Hence, in this paper, we propose a novel long-tailed
graph-level classification framework via Collaborative Multi-expert Learning
(CoMe) to tackle the problem. To balance the contributions of head and tail
classes, we first develop balanced contrastive learning from the view of
representation learning, and then design individual-expert classifier
training based on hard class mining. In addition, we perform gated fusion and
disentangled knowledge distillation among the multiple experts to promote
collaboration within the multi-expert framework. Comprehensive experiments on
seven widely used benchmark datasets demonstrate the superiority of our
method, CoMe, over state-of-the-art baselines.
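As a rough illustration of the balanced contrastive idea mentioned in the abstract, here is a minimal NumPy sketch of a class-averaged supervised contrastive loss, in which the denominator averages similarities within each class so that head classes cannot dominate it by sheer sample count. The function name and exact formulation are assumptions for illustration, not the paper's definition.

```python
import numpy as np

def balanced_contrastive_loss(z, y, tau=0.1):
    """Class-averaged supervised contrastive loss (illustrative sketch).

    z: (n, d) L2-normalized embeddings; y: (n,) integer class labels.
    For each anchor, positives are same-class samples; the denominator
    sums one *average* similarity per class, instead of one term per
    sample, which equalizes head and tail class contributions.
    """
    n = len(y)
    sim = np.exp(z @ z.T / tau)       # pairwise exponentiated similarity
    np.fill_diagonal(sim, 0.0)        # exclude self-similarity

    # Class-averaged denominator: mean similarity to each class, summed.
    denom = np.zeros(n)
    for c in np.unique(y):
        mask = y == c
        # Per-anchor comparison count for class c (self excluded).
        cnt = mask.sum() - mask
        denom += sim[:, mask].sum(axis=1) / np.maximum(cnt, 1)

    # Average negative log-ratio over each anchor's positives.
    loss = 0.0
    for i in range(n):
        pos = y == y[i]
        pos[i] = False
        if not pos.any():
            continue                  # singleton class: no positives
        loss += -np.mean(np.log(sim[i, pos] / denom[i]))
    return loss / n
```

With two perfectly clustered classes (identical within-class embeddings, orthogonal across classes) the loss reduces to log(1 + e^(-1/tau)) per anchor, which approaches zero as the clusters separate in similarity space.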
Related papers
- Granularity Matters in Long-Tail Learning [62.30734737735273]
We offer a novel perspective on long-tail learning, inspired by an observation: datasets with finer granularity tend to be less affected by data imbalance.
We introduce open-set auxiliary classes that are visually similar to existing ones, aiming to enhance representation learning for both head and tail classes.
To prevent the overwhelming presence of auxiliary classes from disrupting training, we introduce a neighbor-silencing loss.
arXiv Detail & Related papers (2024-10-21T13:06:21Z)
- RAHNet: Retrieval Augmented Hybrid Network for Long-tailed Graph Classification [10.806893809269074]
We propose a novel framework called Retrieval Augmented Hybrid Network (RAHNet) to jointly learn a robust feature extractor and an unbiased classifier.
In the feature extractor training stage, we develop a graph retrieval module to search for relevant graphs that directly enrich the intra-class diversity for the tail classes.
We also optimize a category-centered supervised contrastive loss to obtain discriminative representations.
arXiv Detail & Related papers (2023-08-04T14:06:44Z)
- Mastering Long-Tail Complexity on Graphs: Characterization, Learning, and Generalization [33.89914557812127]
We propose a generalization bound for long-tail classification on graphs by formulating the problem in the fashion of multi-task learning.
Our theoretical results show that the generalization performance of long-tail classification is dominated by the overall loss range and the task complexity.
Building upon the theoretical findings, we propose a novel generic framework HierTail for long-tail classification on graphs.
arXiv Detail & Related papers (2023-05-17T03:52:40Z)
- SuperDisco: Super-Class Discovery Improves Visual Recognition for the Long-Tail [69.50380510879697]
We propose SuperDisco, an algorithm that discovers super-class representations for long-tailed recognition.
We learn to construct the super-class graph to guide the representation learning to deal with long-tailed distributions.
arXiv Detail & Related papers (2023-03-31T19:51:12Z)
- Constructing Balance from Imbalance for Long-tailed Image Recognition [50.6210415377178]
The imbalance between majority (head) classes and minority (tail) classes severely skews the data-driven deep neural networks.
Previous methods tackle data imbalance from the viewpoints of data distribution, feature space, and model design.
We propose a concise paradigm by progressively adjusting label space and dividing the head classes and tail classes.
Our proposed model also provides a feature evaluation method and paves the way for long-tailed feature learning.
arXiv Detail & Related papers (2022-08-04T10:22:24Z)
- Effective and Efficient Graph Learning for Multi-view Clustering [173.8313827799077]
We propose an effective and efficient graph learning model for multi-view clustering.
Our method exploits the similarity between graphs of different views via minimization of the tensor Schatten p-norm.
The proposed algorithm is time-economical, obtains stable results, and scales well with data size.
arXiv Detail & Related papers (2021-08-15T13:14:28Z)
- Class-Balanced Distillation for Long-Tailed Visual Recognition [100.10293372607222]
Real-world imagery is often characterized by a significant imbalance of the number of images per class, leading to long-tailed distributions.
In this work, we introduce a new framework based on the key observation that a feature representation learned with instance sampling is far from optimal in a long-tailed setting.
Our main contribution is a new training method that leverages knowledge distillation to enhance feature representations.
arXiv Detail & Related papers (2021-04-12T08:21:03Z)
- Long-Tailed Recognition Using Class-Balanced Experts [128.73438243408393]
We propose an ensemble of class-balanced experts that combines the strength of diverse classifiers.
Our ensemble of class-balanced experts reaches results close to state-of-the-art and an extended ensemble establishes a new state-of-the-art on two benchmarks for long-tailed recognition.
arXiv Detail & Related papers (2020-04-07T20:57:44Z)
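The multi-expert entries above all end in some combination step over per-expert predictions. A minimal sketch of the simplest such step is (weighted) averaging of per-expert class logits; the actual combination schemes in these papers (e.g. gated fusion in CoMe) are more elaborate, and this function is an illustrative assumption, not their method.

```python
import numpy as np

def ensemble_logits(expert_logits, weights=None):
    """Combine per-expert class logits by (weighted) averaging.

    expert_logits: array-like of shape (num_experts, n_samples, n_classes).
    weights: optional (num_experts,) mixing weights; defaults to uniform.
    Returns the fused logits of shape (n_samples, n_classes).
    """
    L = np.asarray(expert_logits, dtype=float)
    if weights is None:
        weights = np.full(L.shape[0], 1.0 / L.shape[0])
    # Contract the expert axis: sum_e weights[e] * L[e]
    return np.tensordot(weights, L, axes=1)
```

A gated-fusion variant would replace the fixed `weights` with per-sample weights produced by a small learned gating network over the input features.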
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the generated content and is not responsible for any consequences arising from its use.