AdaRC: Mitigating Graph Structure Shifts during Test-Time Adaptation
- URL: http://arxiv.org/abs/2410.06976v1
- Date: Wed, 9 Oct 2024 15:15:40 GMT
- Title: AdaRC: Mitigating Graph Structure Shifts during Test-Time Adaptation
- Authors: Wenxuan Bao, Zhichen Zeng, Zhining Liu, Hanghang Tong, Jingrui He
- Abstract summary: Test-time adaptation (TTA) has attracted attention due to its ability to adapt a pre-trained model to a target domain without re-accessing the source domain.
We propose AdaRC, an innovative framework designed for effective and efficient adaptation to structure shifts in graphs.
- Score: 66.40525136929398
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Powerful as they are, graph neural networks (GNNs) are known to be vulnerable to distribution shifts. Recently, test-time adaptation (TTA) has attracted attention due to its ability to adapt a pre-trained model to a target domain without re-accessing the source domain. However, existing TTA algorithms are primarily designed for attribute shifts in vision tasks, where samples are independent. These methods perform poorly on graph data that experience structure shifts, where node connectivity differs between source and target graphs. We attribute this performance gap to the distinct impact of node attribute shifts versus graph structure shifts: the latter significantly degrades the quality of node representations and blurs the boundaries between different node categories. To address structure shifts in graphs, we propose AdaRC, an innovative framework designed for effective and efficient adaptation to structure shifts by adjusting the hop-aggregation parameters in GNNs. To enhance the representation quality, we design a prediction-informed clustering loss to encourage the formation of distinct clusters for different node categories. Additionally, AdaRC seamlessly integrates with existing TTA algorithms, allowing it to handle attribute shifts effectively while improving overall performance under combined structure and attribute shifts. We validate the effectiveness of AdaRC on both synthetic and real-world datasets, demonstrating its robustness across various combinations of structure and attribute shifts.
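To make the two ingredients above concrete, here is a minimal sketch of the adaptation loop in PyTorch. It assumes a GPR-GNN-style backbone that exposes its hop-aggregation weights as `model.gamma` and its penultimate node representations via a `model.representations(...)` hook; both names, and the exact form of the clustering loss, are illustrative assumptions rather than AdaRC's actual API.

```python
import torch
import torch.nn.functional as F

def prediction_informed_clustering_loss(z, probs, eps=1e-8):
    """Pull each node representation toward the centroid of its (softly)
    predicted class, sharpening cluster boundaries.
    z: [N, d] node representations; probs: [N, C] softmax predictions."""
    weights = probs / (probs.sum(dim=0, keepdim=True) + eps)  # [N, C]
    centroids = weights.t() @ z                               # [C, d] soft class centroids
    dist = torch.cdist(z, centroids).pow(2)                   # [N, C] squared distances
    return (probs * dist).sum(dim=1).mean()

def adapt_hop_weights(model, x, adj, steps=10, lr=1e-2):
    """Test-time adaptation: update ONLY the hop-aggregation weights
    (assumed to be exposed as `model.gamma`); the rest of the
    pre-trained GNN stays frozen."""
    opt = torch.optim.Adam([model.gamma], lr=lr)
    for _ in range(steps):
        z = model.representations(x, adj)       # assumed hook: penultimate features
        probs = F.softmax(model(x, adj), dim=1)
        loss = prediction_informed_clustering_loss(z, probs)
        opt.zero_grad()
        loss.backward()
        opt.step()
    return model
```

Because only the hop-aggregation weights are touched, a loop like this stays cheap and can run alongside an attribute-shift TTA method that adapts other parts of the model.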
Related papers
- A Pure Transformer Pretraining Framework on Text-attributed Graphs [50.833130854272774]
We introduce a feature-centric pretraining perspective by treating graph structure as a prior.
Our framework, Graph Sequence Pretraining with Transformer (GSPT), samples node contexts through random walks.
GSPT can be easily adapted to both node classification and link prediction, demonstrating promising empirical success on various datasets.
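Of this summary, the context-sampling step is concrete enough to sketch. Below is a minimal, hypothetical version of sampling node contexts with uniform random walks over an adjacency list; each walk becomes a token sequence that a standard Transformer could be pretrained on. Function names are illustrative, not GSPT's actual API.

```python
import random

def random_walk(adj, start, length, rng=random):
    """Uniform random walk over an adjacency list (dict: node -> neighbors)."""
    walk = [start]
    for _ in range(length - 1):
        neighbors = adj[walk[-1]]
        if not neighbors:            # dead end: stop the walk early
            break
        walk.append(rng.choice(neighbors))
    return walk

def sample_node_contexts(adj, walks_per_node=4, walk_length=16):
    """Sample token sequences (node-id walks) for Transformer pretraining."""
    return [random_walk(adj, node, walk_length)
            for node in adj
            for _ in range(walks_per_node)]
```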
arXiv Detail & Related papers (2024-06-19T22:30:08Z)
- AGHINT: Attribute-Guided Representation Learning on Heterogeneous Information Networks with Transformer [4.01252998015631]
We investigate the impact of inter-node attribute disparities on the performance of HGNNs within a benchmark task.
We propose a novel Attribute-Guided heterogeneous Information Networks representation learning model with Transformer (AGHINT)
AGHINT transcends the constraints of the original graph structure by directly integrating higher-order similar neighbor features into the learning process.
arXiv Detail & Related papers (2024-04-16T10:30:48Z)
- Learning Invariant Representations of Graph Neural Networks via Cluster Generalization [58.68231635082891]
Graph neural networks (GNNs) have become increasingly popular in modeling graph-structured data.
In this paper, we experimentally find that the performance of GNNs drops significantly when structure shifts occur.
We propose the Cluster Information Transfer (CIT) mechanism, which can learn invariant representations for GNNs.
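The summary leaves the transfer mechanism abstract. One plausible realization of "cluster information transfer", consistent with the goal of making representations invariant to cluster-specific structure, is to normalize a node's representation by its own cluster's statistics and re-color it with another cluster's. The sketch below is this hypothetical reading, with all names illustrative.

```python
import torch

def cluster_stats(z, cluster_id, num_clusters):
    """Per-cluster mean and std of node representations
    (assumes every cluster is non-empty)."""
    mu = torch.stack([z[cluster_id == k].mean(dim=0) for k in range(num_clusters)])
    sigma = torch.stack([z[cluster_id == k].std(dim=0) for k in range(num_clusters)])
    return mu, sigma

def transfer_cluster(z, cluster_id, target_cluster, mu, sigma, eps=1e-6):
    """Re-style node representations: strip their own cluster's statistics,
    apply the target cluster's. z: [N, d]; cluster_id, target_cluster: [N] long."""
    z_norm = (z - mu[cluster_id]) / (sigma[cluster_id] + eps)
    return z_norm * sigma[target_cluster] + mu[target_cluster]
```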
arXiv Detail & Related papers (2024-03-06T10:36:56Z)
- DGNN: Decoupled Graph Neural Networks with Structural Consistency between Attribute and Graph Embedding Representations [62.04558318166396]
Graph neural networks (GNNs) demonstrate a robust capability for representation learning on graphs with complex structures.
A novel GNN framework, dubbed Decoupled Graph Neural Networks (DGNN), is introduced to obtain a more comprehensive embedding representation of nodes.
Experimental results on several graph benchmark datasets verify DGNN's superiority on the node classification task.
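As a rough illustration of what "decoupled with structural consistency" could look like, the sketch below embeds attributes with a plain MLP, embeds structure by propagating features over a normalized adjacency, and adds a per-node consistency term between the two views. This is a guess at the design pattern, not DGNN's actual architecture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DecoupledGNN(nn.Module):
    """Hypothetical decoupled design: separate attribute and structure
    encoders, tied by a structural-consistency term."""
    def __init__(self, in_dim, hid_dim, num_classes):
        super().__init__()
        self.attr_enc = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.graph_enc = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.head = nn.Linear(2 * hid_dim, num_classes)

    def forward(self, x, adj_norm):
        z_attr = self.attr_enc(x)               # attribute view: ignores edges
        z_graph = self.graph_enc(adj_norm @ x)  # structure view: propagated features
        logits = self.head(torch.cat([z_attr, z_graph], dim=1))
        # consistency: the two views of the same node should agree
        consistency = 1 - F.cosine_similarity(z_attr, z_graph, dim=1).mean()
        return logits, consistency
```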
arXiv Detail & Related papers (2024-01-28T06:43:13Z)
- Structural Re-weighting Improves Graph Domain Adaptation [13.019371337183202]
This work examines the different impacts of distribution shifts caused by either graph structure or node attributes.
A novel approach, called structural reweighting (StruRW), is proposed to address this issue and is tested on synthetic graphs, four benchmark datasets, and a new application in high energy physics.
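The reweighting idea can be sketched compactly: estimate how often each class pair is connected in the source graph versus the target graph (using pseudo-labels on the target), then scale each source edge by the target-to-source ratio for its class pair. The estimator below is a simplified, hypothetical version of that recipe, not StruRW's exact implementation.

```python
import numpy as np

def strurw_edge_weights(edges_src, labels_src, edges_tgt, pseudo_tgt,
                        num_classes, eps=1e-8):
    """Scale each source edge by the target/source ratio of its
    class-pair edge frequency. Target labels are pseudo-labels."""
    def block_freq(edges, labels):
        freq = np.zeros((num_classes, num_classes))
        for u, v in edges:
            freq[labels[u], labels[v]] += 1.0
        return freq / max(freq.sum(), 1.0)

    ratio = (block_freq(edges_tgt, pseudo_tgt) + eps) \
          / (block_freq(edges_src, labels_src) + eps)
    return np.array([ratio[labels_src[u], labels_src[v]] for u, v in edges_src])
```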
arXiv Detail & Related papers (2023-06-05T20:11:30Z)
- ASGNN: Graph Neural Networks with Adaptive Structure [41.83813812538167]
We propose a novel interpretable message passing scheme with adaptive structure (ASMP) to defend against adversarial attacks on graph structure.
ASMP is adaptive in the sense that message passing in different layers can be carried out over dynamically adjusted graphs.
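A minimal sketch of message passing over per-layer adjusted graphs is shown below: each layer down-weights edges whose endpoints look dissimilar under the current features before aggregating, which is one simple way to realize "dynamically adjusted graphs". The weighting rule is an illustrative assumption, not ASMP's actual scheme.

```python
import torch
import torch.nn.functional as F

def adaptive_message_passing(x, edge_index, num_layers=2, tau=0.5):
    """Per-layer graph adjustment: down-weight edges whose endpoints are
    dissimilar under the *current* features, then mean-aggregate.
    x: [N, d] features; edge_index: [2, E] long tensor."""
    src, dst = edge_index
    h = x
    for _ in range(num_layers):
        sim = F.cosine_similarity(h[src], h[dst], dim=1)  # per-edge similarity
        w = torch.sigmoid(sim / tau)                       # adjusted edge weights
        msg = h[src] * w.unsqueeze(1)
        agg = torch.zeros_like(h).index_add_(0, dst, msg)
        deg = torch.zeros(h.size(0), device=h.device).index_add_(0, dst, w)
        h = agg / deg.clamp(min=1e-6).unsqueeze(1)
    return h
```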
arXiv Detail & Related papers (2022-10-03T15:10:40Z)
- Interpolation-based Correlation Reduction Network for Semi-Supervised Graph Learning [49.94816548023729]
We propose a novel graph contrastive learning method, termed Interpolation-based Correlation Reduction Network (ICRN).
In our method, we improve the discriminative capability of the latent feature by enlarging the margin of decision boundaries.
By combining the two settings, we extract rich supervision information from both the abundant unlabeled nodes and the rare yet valuable labeled nodes for discriminative representation learning.
arXiv Detail & Related papers (2022-06-06T14:26:34Z)
- Variational Co-embedding Learning for Attributed Network Clustering [30.7006907516984]
Recent works on attributed network clustering utilize graph convolution to obtain node embeddings and simultaneously perform clustering assignments in the embedding space.
We propose a variational co-embedding learning model for attributed network clustering (ANC).
ANC is composed of dual variational auto-encoders to simultaneously embed nodes and attributes.
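A compact skeleton of the dual-VAE idea: one Gaussian encoder embeds nodes (rows of the attribute matrix), another embeds attributes (columns), and the node-attribute matrix is reconstructed from the inner product of the two latent spaces. This is a sketch of the general co-embedding pattern under assumed binary attributes, not ANC's exact objective.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GaussianEncoder(nn.Module):
    """Variational encoder with the usual reparameterization trick."""
    def __init__(self, in_dim, lat_dim):
        super().__init__()
        self.mu = nn.Linear(in_dim, lat_dim)
        self.logvar = nn.Linear(in_dim, lat_dim)

    def forward(self, x):
        mu, logvar = self.mu(x), self.logvar(x)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(dim=1).mean()
        return z, kl

class DualVAE(nn.Module):
    """Co-embed nodes (rows of X) and attributes (columns of X); reconstruct
    the binary node-attribute matrix from their inner product."""
    def __init__(self, num_nodes, num_attrs, lat_dim=32):
        super().__init__()
        self.node_enc = GaussianEncoder(num_attrs, lat_dim)
        self.attr_enc = GaussianEncoder(num_nodes, lat_dim)

    def forward(self, x):                      # x: [num_nodes, num_attrs] in {0, 1}
        z_node, kl_n = self.node_enc(x)
        z_attr, kl_a = self.attr_enc(x.t())
        x_hat = torch.sigmoid(z_node @ z_attr.t())
        recon = F.binary_cross_entropy(x_hat, x)
        return recon + kl_n + kl_a             # training loss (negative ELBO style)
```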
arXiv Detail & Related papers (2021-04-15T08:11:47Z)
- Self-Challenging Improves Cross-Domain Generalization [81.99554996975372]
Convolutional neural networks (CNNs) conduct image classification by activating dominant features that correlate with labels.
We introduce a simple training heuristic, Representation Self-Challenging (RSC), that significantly improves the generalization of CNNs to out-of-domain data.
RSC iteratively challenges the dominant features activated on the training data and forces the network to activate the remaining features that correlate with labels.
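The muting step is concrete enough to sketch: take the gradient of the true-class score with respect to the penultimate features, zero out the top-percentile (most "dominant") dimensions, and compute the training loss on what remains. The sketch below simplifies the paper's batch-level gating and spatial variants.

```python
import torch
import torch.nn.functional as F

def rsc_step(features, labels, classifier, drop_pct=33.0):
    """One self-challenging step. `features`: [B, d] penultimate
    activations (must require grad); `classifier`: the final linear head."""
    logits = classifier(features)
    score = logits.gather(1, labels.unsqueeze(1)).sum()   # true-class scores
    grads = torch.autograd.grad(score, features, retain_graph=True)[0]
    # Mute the drop_pct% of dimensions with the largest gradients.
    thresh = torch.quantile(grads, 1 - drop_pct / 100.0, dim=1, keepdim=True)
    mask = (grads < thresh).float()                       # 0 = muted dominant feature
    return F.cross_entropy(classifier(features * mask), labels)
```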
arXiv Detail & Related papers (2020-07-05T21:42:26Z)