Finding Diverse and Predictable Subgraphs for Graph Domain
Generalization
- URL: http://arxiv.org/abs/2206.09345v1
- Date: Sun, 19 Jun 2022 07:57:56 GMT
- Title: Finding Diverse and Predictable Subgraphs for Graph Domain
Generalization
- Authors: Junchi Yu, Jian Liang, Ran He
- Abstract summary: This paper focuses on out-of-distribution generalization on graphs where performance drops due to the unseen distribution shift.
We propose a new graph domain generalization framework, dubbed DPS, by constructing multiple populations from the source domains.
Experiments on both node-level and graph-level benchmarks show that the proposed DPS achieves impressive performance on various graph domain generalization tasks.
- Score: 88.32356432272356
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This paper focuses on out-of-distribution generalization on graphs where
performance drops due to the unseen distribution shift. Previous graph domain
generalization works always resort to learning an invariant predictor among
different source domains. However, they assume sufficient source domains are
available during training, posing huge challenges for realistic applications.
By contrast, we propose a new graph domain generalization framework, dubbed
DPS, by constructing multiple populations from the source domains.
Specifically, DPS aims to discover multiple Diverse and Predictable Subgraphs
with a set of generators; that is, the subgraphs differ from each other, yet
all of them share the same semantics as the input graph. These generated
source domains are exploited to learn an equi-predictive graph neural network
(GNN) across domains, which is expected to generalize well to unseen target
domains. DPS is model-agnostic and can be incorporated with various GNN
backbones. Extensive experiments on both node-level and graph-level benchmarks
show that the proposed DPS achieves impressive performance on various graph
domain generalization tasks.
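A minimal, hypothetical sketch of the training idea described in the abstract, written in plain PyTorch: a set of generators produces soft edge masks (diverse subgraph views) of the input graph, a shared GNN classifies every view, and a variance penalty on the per-view losses pushes the network toward equal predictive risk across the generated domains. The module names, the soft edge-mask parameterization, and the variance-based equi-predictive penalty are illustrative assumptions, not the authors' exact formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgeMaskGenerator(nn.Module):
    """Scores each edge of the input graph; a sigmoid score acts as a soft subgraph mask."""
    def __init__(self, feat_dim, hidden_dim=32):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(2 * feat_dim, hidden_dim), nn.ReLU(), nn.Linear(hidden_dim, 1)
        )

    def forward(self, x, edge_index):
        src, dst = edge_index                                   # edge_index: (2, num_edges) long tensor
        edge_feat = torch.cat([x[src], x[dst]], dim=-1)
        return torch.sigmoid(self.mlp(edge_feat)).squeeze(-1)   # (num_edges,) soft mask

class SimpleGNN(nn.Module):
    """One round of masked mean aggregation followed by a linear classifier."""
    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.lin = nn.Linear(2 * feat_dim, num_classes)

    def forward(self, x, edge_index, edge_mask):
        src, dst = edge_index
        agg = torch.zeros_like(x)
        agg.index_add_(0, dst, x[src] * edge_mask.unsqueeze(-1))   # mask-weighted message sum
        deg = torch.zeros(x.size(0), device=x.device)
        deg.index_add_(0, dst, edge_mask)
        agg = agg / deg.clamp(min=1.0).unsqueeze(-1)               # mean over (soft) neighbours
        return self.lin(torch.cat([x, agg], dim=-1))

def dps_style_step(x, edge_index, labels, generators, gnn, optimizer, lam=1.0):
    """One training step over K generated subgraph views (assumes K >= 2)."""
    per_view_losses = []
    for gen in generators:
        mask = gen(x, edge_index)              # a generated subgraph, as a soft edge mask
        logits = gnn(x, edge_index, mask)      # the shared GNN predicts on that view
        per_view_losses.append(F.cross_entropy(logits, labels))
    per_view_losses = torch.stack(per_view_losses)
    # Mean risk plus a variance penalty: an assumed form of the "equi-predictive" objective.
    loss = per_view_losses.mean() + lam * per_view_losses.var()
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

In practice one optimizer would cover both the generators and the shared GNN (e.g. torch.optim.Adam over all their parameters), and a diversity term penalizing overlap between the generated masks would typically be added so the subgraph views do not collapse onto each other; both are omitted here for brevity.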
Related papers
- Subgraph Aggregation for Out-of-Distribution Generalization on Graphs [29.884717215947745]
Out-of-distribution (OOD) generalization in Graph Neural Networks (GNNs) has gained significant attention.
We propose a novel framework, SubGraph Aggregation (SuGAr), designed to learn a diverse set of subgraphs.
Experiments on both synthetic and real-world datasets demonstrate that SuGAr outperforms state-of-the-art methods.
arXiv Detail & Related papers (2024-10-29T16:54:37Z)
- Text-Free Multi-domain Graph Pre-training: Toward Graph Foundation Models [33.2696184519275]
We propose MDGPT, a text-free Multi-Domain Graph Pre-Training and adaptation framework.
First, we propose a set of domain tokens to align features across source domains for synergistic pre-training.
Second, we propose dual prompts, consisting of a unifying prompt and a mixing prompt, to further adapt to the target domain with unified multi-domain knowledge.
arXiv Detail & Related papers (2024-05-22T19:06:39Z)
- Rethinking Propagation for Unsupervised Graph Domain Adaptation [17.443218657417454]
Unsupervised Graph Domain Adaptation (UGDA) aims to transfer knowledge from a labelled source graph to an unlabelled target graph.
We propose a simple yet effective approach called A2GNN for graph domain adaptation.
arXiv Detail & Related papers (2024-02-08T13:24:57Z)
- One for All: Towards Training One Graph Model for All Classification Tasks [61.656962278497225]
A unified model for various graph tasks remains underexplored, primarily due to the challenges unique to the graph learning domain.
We propose One for All (OFA), the first general framework that can use a single graph model to address the above challenges.
OFA performs well across different tasks, making it the first general-purpose, cross-domain classification model on graphs.
arXiv Detail & Related papers (2023-09-29T21:15:26Z)
- Zero-shot Domain Adaptation of Heterogeneous Graphs via Knowledge Transfer Networks [72.82524864001691]
Heterogeneous graph neural networks (HGNNs) have shown superior performance as powerful representation learning techniques.
However, there is no direct way to learn using labels rooted at different node types.
In this work, we propose a novel domain adaptation method, Knowledge Transfer Networks for HGNNs (HGNN-KTN).
arXiv Detail & Related papers (2022-03-03T21:00:23Z)
- Source Free Unsupervised Graph Domain Adaptation [60.901775859601685]
Unsupervised Graph Domain Adaptation (UGDA) shows its practical value in reducing the labeling cost for node classification.
Most existing UGDA methods heavily rely on the labeled graph in the source domain.
In some real-world scenarios, the source graph is inaccessible because of privacy issues.
We propose a novel scenario named Source Free Unsupervised Graph Domain Adaptation (SFUGDA).
arXiv Detail & Related papers (2021-12-02T03:18:18Z)
- Graph Classification by Mixture of Diverse Experts [67.33716357951235]
We present GraphDIVE, a framework leveraging a mixture of diverse experts for imbalanced graph classification.
With a divide-and-conquer principle, GraphDIVE employs a gating network to partition an imbalanced graph dataset into several subsets (a generic gating sketch follows this list).
Experiments on real-world imbalanced graph datasets demonstrate the effectiveness of GraphDIVE.
arXiv Detail & Related papers (2021-03-29T14:03:03Z)
- Cross-Domain Facial Expression Recognition: A Unified Evaluation Benchmark and Adversarial Graph Learning [85.6386289476598]
We develop a novel adversarial graph representation adaptation (AGRA) framework for cross-domain holistic-local feature co-adaptation.
We conduct extensive and fair evaluations on several popular benchmarks and show that the proposed AGRA framework outperforms previous state-of-the-art methods.
arXiv Detail & Related papers (2020-08-03T15:00:31Z)
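As referenced in the GraphDIVE entry above, below is a generic mixture-of-experts sketch of that gating idea in plain PyTorch: a gating network softly assigns each pooled graph embedding to a set of expert classifiers, and the prediction is the gate-weighted combination of the expert outputs. This is a standard, illustrative MoE layer, not GraphDIVE's exact architecture; all names and dimensions are assumptions.

import torch
import torch.nn as nn

class MixtureOfExperts(nn.Module):
    """Gate-weighted mixture of linear expert classifiers over graph embeddings."""
    def __init__(self, in_dim, num_classes, num_experts=4):
        super().__init__()
        self.gate = nn.Linear(in_dim, num_experts)
        self.experts = nn.ModuleList(
            nn.Linear(in_dim, num_classes) for _ in range(num_experts)
        )

    def forward(self, graph_emb):
        # graph_emb: (batch, in_dim) pooled graph-level embeddings from any GNN backbone
        weights = torch.softmax(self.gate(graph_emb), dim=-1)        # (batch, num_experts)
        expert_logits = torch.stack(
            [expert(graph_emb) for expert in self.experts], dim=1
        )                                                            # (batch, num_experts, num_classes)
        return (weights.unsqueeze(-1) * expert_logits).sum(dim=1)    # (batch, num_classes)

A hard partition of the dataset, as in a divide-and-conquer setup, would instead route each graph to its highest-weight expert (weights.argmax(dim=-1)) and train that expert on the resulting subset.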