Every Node is Different: Dynamically Fusing Self-Supervised Tasks for
Attributed Graph Clustering
- URL: http://arxiv.org/abs/2401.06595v1
- Date: Fri, 12 Jan 2024 14:24:10 GMT
- Title: Every Node is Different: Dynamically Fusing Self-Supervised Tasks for
Attributed Graph Clustering
- Authors: Pengfei Zhu, Qian Wang, Yu Wang, Jialu Li, Qinghua Hu
- Abstract summary: We propose Dynamically Fusing Self-Supervised Learning (DyFSS) for graph clustering.
DyFSS fuses features extracted from diverse SSL tasks using distinct weights derived from a gating network.
Experiments show DyFSS outperforms state-of-the-art multi-task SSL methods by up to 8.66% on the accuracy metric.
- Score: 59.45743537594695
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Attributed graph clustering is an unsupervised task that partitions nodes
into different groups. Self-supervised learning (SSL) shows great potential in
handling this task, and some recent studies simultaneously learn multiple SSL
tasks to further boost performance. Currently, different SSL tasks are assigned
the same set of weights for all graph nodes. However, we observe that some
graph nodes whose neighbors are in different groups require significantly
different emphases on SSL tasks. In this paper, we propose to dynamically learn
the weights of SSL tasks for different nodes and fuse the embeddings learned
from different SSL tasks to boost performance. We design an innovative graph
clustering approach, namely Dynamically Fusing Self-Supervised Learning
(DyFSS). Specifically, DyFSS fuses features extracted from diverse SSL tasks
using distinct weights derived from a gating network. To effectively learn the
gating network, we design a dual-level self-supervised strategy that
incorporates pseudo labels and the graph structure. Extensive experiments on
five datasets show that DyFSS outperforms the state-of-the-art multi-task SSL
methods by up to 8.66% on the accuracy metric. The code of DyFSS is available
at: https://github.com/q086/DyFSS.
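To make the fusion step concrete, below is a minimal, hypothetical PyTorch sketch of per-node gated fusion of SSL task features in the spirit of DyFSS. It is not the authors' implementation (see the repository above for that); the module names, the dimensions, and the use of simple linear heads as stand-ins for real SSL task encoders are all illustrative assumptions.

```python
# Hypothetical sketch: per-node fusion of K SSL task embeddings via a gating
# network. Not the official DyFSS code; names and shapes are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GatedSSLFusion(nn.Module):
    """Fuse K task-specific embeddings with node-wise gating weights."""

    def __init__(self, in_dim: int, task_dim: int, num_tasks: int):
        super().__init__()
        # One projection head per SSL task (stand-ins for real SSL encoders).
        self.task_heads = nn.ModuleList(
            nn.Linear(in_dim, task_dim) for _ in range(num_tasks)
        )
        # Gating network: maps each node's feature to one weight per task,
        # so every node gets its own emphasis over the SSL tasks.
        self.gate = nn.Linear(in_dim, num_tasks)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: [num_nodes, in_dim] node features (or a shared GNN embedding).
        # Stack task-specific features: [num_nodes, num_tasks, task_dim].
        task_feats = torch.stack([head(x) for head in self.task_heads], dim=1)
        # Per-node softmax weights over tasks: [num_nodes, num_tasks].
        weights = F.softmax(self.gate(x), dim=-1)
        # Weighted sum over the task axis -> one fused embedding per node.
        fused = (weights.unsqueeze(-1) * task_feats).sum(dim=1)
        return fused


if __name__ == "__main__":
    fusion = GatedSSLFusion(in_dim=64, task_dim=32, num_tasks=5)
    x = torch.randn(100, 64)   # 100 nodes with 64-d features
    z = fusion(x)              # [100, 32] fused node embeddings
    print(z.shape)
```

Note that in the paper the gating network is additionally trained with the dual-level self-supervised strategy (pseudo labels plus graph structure); the sketch above omits that training signal and shows only the forward fusion pass.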
Related papers
- Do Neural Scaling Laws Exist on Graph Self-Supervised Learning? [9.297227372861876]
Self-supervised learning (SSL) is essential for obtaining foundation models in the NLP and CV domains by effectively leveraging knowledge in large-scale unlabeled data.
It remains a mystery whether existing graph SSL techniques can follow the scaling behavior needed to build Graph Foundation Models (GFMs) with large-scale pre-training.
This paper examines the feasibility of existing graph SSL techniques for developing GFMs and opens a new direction for graph SSL design with a new evaluation prototype.
arXiv Detail & Related papers (2024-08-20T23:45:11Z)
- DATA: Domain-Aware and Task-Aware Pre-training [94.62676913928831]
We present DATA, a simple yet effective neural architecture search (NAS) approach specialized for self-supervised learning (SSL).
Our method achieves promising results across a wide range of computation costs on downstream tasks, including image classification, object detection and semantic segmentation.
arXiv Detail & Related papers (2022-03-17T02:38:49Z)
- Sound and Visual Representation Learning with Multiple Pretraining Tasks [104.11800812671953]
Different self-supervised learning (SSL) tasks reveal different features of the data.
This work aims to combine multiple SSL tasks (Multi-SSL) so that the learned representations generalize well across downstream tasks.
Experiments on sound representations demonstrate that Multi-SSL via incremental learning (IL) of SSL tasks outperforms single SSL task models.
arXiv Detail & Related papers (2022-01-04T09:09:38Z)
- Self-Supervised Learning of Graph Neural Networks: A Unified Review [50.71341657322391]
Self-supervised learning is emerging as a new paradigm for making use of large amounts of unlabeled samples.
We provide a unified review of different ways of training graph neural networks (GNNs) using SSL.
Our treatment of SSL methods for GNNs sheds light on the similarities and differences of various methods, setting the stage for developing new methods and algorithms.
arXiv Detail & Related papers (2021-02-22T03:43:45Z)
- Self-supervised Learning on Graphs: Deep Insights and New Direction [66.78374374440467]
Self-supervised learning (SSL) aims to create domain-specific pretext tasks on unlabeled data.
There is increasing interest in generalizing deep learning to the graph domain in the form of graph neural networks (GNNs).
arXiv Detail & Related papers (2020-06-17T20:30:04Z)
- TAFSSL: Task-Adaptive Feature Sub-Space Learning for few-shot classification [50.358839666165764]
We show that the Task-Adaptive Feature Sub-Space Learning (TAFSSL) can significantly boost the performance in Few-Shot Learning scenarios.
Specifically, we show that on the challenging miniImageNet and tieredImageNet benchmarks, TAFSSL can improve the current state-of-the-art in both transductive and semi-supervised FSL settings by more than 5%.
arXiv Detail & Related papers (2020-03-14T16:59:17Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.