Finding the Missing-half: Graph Complementary Learning for
Homophily-prone and Heterophily-prone Graphs
- URL: http://arxiv.org/abs/2306.07608v1
- Date: Tue, 13 Jun 2023 08:06:10 GMT
- Title: Finding the Missing-half: Graph Complementary Learning for
Homophily-prone and Heterophily-prone Graphs
- Authors: Yizhen Zheng, He Zhang, Vincent CS Lee, Yu Zheng, Xiao Wang, Shirui
Pan
- Abstract summary: Graphs with homophily-prone edges tend to connect nodes with the same class.
Heterophily-prone edges tend to build relationships between nodes with different classes.
Existing GNNs only take the original graph during training.
- Score: 48.79929516665371
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Real-world graphs generally have only one kind of tendency in their
connections. These connections are either homophily-prone or heterophily-prone.
While graphs with homophily-prone edges tend to connect nodes with the same
class (i.e., intra-class nodes), heterophily-prone edges tend to build
relationships between nodes with different classes (i.e., inter-class nodes).
Existing GNNs only take the original graph during training. The problem with
this approach is that it ignores the "missing-half" structural information,
that is, heterophily-prone topology for homophily-prone graphs and
homophily-prone topology for heterophily-prone graphs. In our paper,
we introduce Graph cOmplementAry Learning, namely GOAL, which consists of two
components: graph complementation and complemented graph convolution. The first
component finds the missing-half structural information for a given graph to
complement it. The complemented graph consists of two topology sets, one
homophily-prone and one heterophily-prone. In the latter component, to handle
complemented graphs, we design a new graph convolution from the perspective of
optimisation. Experimental results show that GOAL consistently outperforms
all baselines on eight real-world datasets.
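As a rough sketch of the two components above, the snippet below builds a complementary edge set from node-feature similarity (nearest neighbours for a homophily-prone view, farthest neighbours for a heterophily-prone view) and then runs one propagation step over both views, smoothing over the homophily-prone edges and sharpening over the heterophily-prone ones. The kNN heuristic, the fixed mixing weight, and all function names are illustrative assumptions, not the formulation used in GOAL.

```python
import numpy as np

def knn_complement(x, k, mode="homophily"):
    """Build a complementary adjacency from feature (dis)similarity.

    mode="homophily": connect each node to its k most similar nodes.
    mode="heterophily": connect each node to its k least similar nodes.
    This kNN heuristic is an illustrative assumption, not GOAL's procedure.
    """
    sim = x @ x.T / (np.linalg.norm(x, axis=1, keepdims=True)
                     * np.linalg.norm(x, axis=1) + 1e-8)
    np.fill_diagonal(sim, -np.inf if mode == "homophily" else np.inf)
    order = np.argsort(-sim, axis=1) if mode == "homophily" else np.argsort(sim, axis=1)
    a = np.zeros_like(sim)
    rows = np.repeat(np.arange(x.shape[0]), k)
    a[rows, order[:, :k].ravel()] = 1.0
    return np.maximum(a, a.T)  # symmetrise

def norm_adj(a):
    """Symmetrically normalised adjacency with self-loops: D^-1/2 (A+I) D^-1/2."""
    a = a + np.eye(a.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(1))
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def complemented_convolution(x, a_homo, a_hetero, alpha=0.5):
    """One propagation step over both views: a low-pass (smoothing) pass on the
    homophily-prone edges plus a high-pass (sharpening) pass on the
    heterophily-prone edges. The fixed weight alpha is an assumption."""
    low = norm_adj(a_homo) @ x        # pull intra-class neighbours together
    high = x - norm_adj(a_hetero) @ x # push inter-class neighbours apart
    return alpha * low + (1.0 - alpha) * high

# toy usage: a homophily-prone graph complemented with a heterophily-prone view
rng = np.random.default_rng(0)
x = rng.normal(size=(6, 4))
a_original = knn_complement(x, k=2, mode="homophily")     # stand-in for the given graph
a_missing_half = knn_complement(x, k=2, mode="heterophily")
h = complemented_convolution(x, a_original, a_missing_half)
print(h.shape)  # (6, 4)
```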
Related papers
- Robust Graph Structure Learning under Heterophily [12.557639223778722]
We propose a novel robust graph structure learning method to obtain a high-quality graph from heterophilic data for downstream tasks.
We first apply a high-pass filter to make each node more distinctive from its neighbors by encoding structure information into the node features.
Then, we learn a robust graph with an adaptive norm characterizing different levels of noise.
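A minimal sketch of the high-pass filtering step described above, assuming a symmetrically normalised adjacency with self-loops; the adaptive-norm graph learning stage is omitted and the exact filter used in the paper may differ:

```python
import numpy as np

def high_pass_filter(x, a):
    """High-pass filtering: subtract the (normalised) neighbourhood average from
    each node's features, so nodes become more distinctive from their neighbours.
    Assumes a is a symmetric adjacency matrix; details differ from the paper."""
    a = a + np.eye(a.shape[0])                  # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(1))
    a_norm = a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return x - a_norm @ x                       # (I - A_norm) x

# toy usage on a 4-node path graph
a = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
x = np.arange(8, dtype=float).reshape(4, 2)
print(high_pass_filter(x, a))
```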
arXiv Detail & Related papers (2024-03-06T12:29:13Z)
- Homophily-Related: Adaptive Hybrid Graph Filter for Multi-View Graph Clustering [29.17784041837907]
We propose the Adaptive Hybrid Graph Filter for Multi-View Graph Clustering (AHGFC).
AHGFC learns the node embedding based on the graph joint aggregation matrix.
Experimental results show that our proposed model performs well on six datasets containing homophilous and heterophilous graphs.
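As a loose illustration of a hybrid graph filter, the sketch below blends low-pass and high-pass filtered signals with a per-node gate; in AHGFC the mixing is learned from the graph joint aggregation matrix, which is not reproduced here, so the gate and the plain normalised adjacency are assumptions:

```python
import numpy as np

def adaptive_hybrid_filter(x, a, gate):
    """Blend low-pass and high-pass filtered signals with a per-node gate in
    [0, 1]. In AHGFC the mixing is learned; here `gate` is supplied directly
    and a plain normalised adjacency stands in for the joint aggregation matrix."""
    a = a + np.eye(a.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(1))
    a_norm = a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    low, high = a_norm @ x, x - a_norm @ x
    return gate[:, None] * low + (1.0 - gate[:, None]) * high

# toy usage: nodes with gate near 1 are treated as homophilous, near 0 as heterophilous
a = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
x = np.eye(3)
print(adaptive_hybrid_filter(x, a, gate=np.array([0.9, 0.2, 0.5])))
```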
arXiv Detail & Related papers (2024-01-05T07:27:29Z)
- Structure-free Graph Condensation: From Large-scale Graphs to Condensed Graph-free Data [91.27527985415007]
Existing graph condensation methods rely on the joint optimization of nodes and structures in the condensed graph.
We advocate a new Structure-Free Graph Condensation paradigm, named SFGC, to distill a large-scale graph into a small-scale graph node set.
arXiv Detail & Related papers (2023-06-05T07:53:52Z)
- Beyond Homophily: Reconstructing Structure for Graph-agnostic Clustering [15.764819403555512]
It is impossible to identify a graph as homophilic or heterophilic before a suitable GNN model can be found.
We propose a novel graph clustering method, which contains three key components: graph reconstruction, a mixed filter, and a dual graph clustering network.
Our method outperforms other approaches on heterophilic graphs.
arXiv Detail & Related papers (2023-05-03T01:49:01Z)
- Break the Wall Between Homophily and Heterophily for Graph Representation Learning [25.445073413243925]
Homophily and heterophily are intrinsic properties of graphs that describe whether two linked nodes share similar properties.
This work identifies three graph features, including the ego node feature, the aggregated node feature, and the graph structure feature, that are essential for graph representation learning.
It proposes a new GNN model called OGNN that extracts all three graph features and adaptively fuses them to achieve generalizability across the whole spectrum of homophily.
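A compact sketch of the adaptive-fusion idea: three candidate representations (the ego feature, the aggregated neighbour feature, and a structure-derived feature) are combined with softmax weights. The structure feature and the fusion logits below are illustrative stand-ins, not OGNN's actual design:

```python
import numpy as np

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def fuse_three_features(x, a, logits=(0.0, 0.0, 0.0)):
    """Adaptively fuse the ego feature, the aggregated neighbour feature, and a
    structure feature with softmax weights. `logits` stands in for learned
    fusion scores."""
    deg = a.sum(1, keepdims=True) + 1e-8
    ego = x                                   # ego node feature
    agg = (a / deg) @ x                       # mean-aggregated neighbour feature
    rng = np.random.default_rng(0)
    proj = rng.normal(size=(a.shape[0], x.shape[1])) / np.sqrt(a.shape[0])
    struct = (a / deg) @ proj                 # projected adjacency rows as a structure feature
    w = softmax(np.asarray(logits))
    return w[0] * ego + w[1] * agg + w[2] * struct

# toy usage on a 3-node triangle graph
a = np.ones((3, 3)) - np.eye(3)
x = np.arange(6, dtype=float).reshape(3, 2)
print(fuse_three_features(x, a))
```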
arXiv Detail & Related papers (2022-10-08T19:37:03Z)
- G-Mixup: Graph Data Augmentation for Graph Classification [55.63157775049443]
Mixup has shown superiority in improving the generalization and robustness of neural networks by interpolating features and labels between two random samples.
We propose $\mathcal{G}$-Mixup to augment graphs for graph classification by interpolating the generator (i.e., graphon) of different classes of graphs.
Experiments show that $\mathcal{G}$-Mixup substantially improves the generalization and robustness of GNNs.
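A rough sketch of the graphon-interpolation idea: estimate a step-function graphon per class by averaging degree-sorted adjacency matrices on a common grid, mix two class graphons, and sample a synthetic graph from the mixture. The degree-sorting alignment and fixed grid resolution are simplifying assumptions, not G-Mixup's estimator:

```python
import numpy as np

def estimate_graphon(adjs, resolution=10):
    """Estimate a step-function graphon by degree-sorting each adjacency matrix,
    binning it onto a common grid, and averaging over the class."""
    grid = np.zeros((resolution, resolution))
    for a in adjs:
        order = np.argsort(-a.sum(1))
        a = a[np.ix_(order, order)]
        idx = np.arange(a.shape[0]) * resolution // a.shape[0]
        block = np.zeros((resolution, resolution))
        counts = np.zeros((resolution, resolution))
        np.add.at(block, (idx[:, None], idx[None, :]), a)
        np.add.at(counts, (idx[:, None], idx[None, :]), 1.0)
        grid += block / np.maximum(counts, 1.0)
    return grid / len(adjs)

def gmixup(graphon_a, graphon_b, lam, n_nodes, rng):
    """Interpolate two class-level graphons and sample a synthetic graph."""
    w = lam * graphon_a + (1.0 - lam) * graphon_b
    u = rng.integers(0, w.shape[0], size=n_nodes)     # latent positions on the grid
    p = w[np.ix_(u, u)]                               # edge probabilities
    a = (rng.random((n_nodes, n_nodes)) < p).astype(float)
    a = np.triu(a, 1)
    return a + a.T                                    # undirected, no self-loops

# toy usage: two "classes" of random graphs mixed with lambda = 0.5
rng = np.random.default_rng(0)
class_a = [(rng.random((12, 12)) < 0.6).astype(float) for _ in range(5)]
class_b = [(rng.random((12, 12)) < 0.1).astype(float) for _ in range(5)]
mixed = gmixup(estimate_graphon(class_a), estimate_graphon(class_b), lam=0.5, n_nodes=12, rng=rng)
print(mixed.sum() / 2)  # number of edges in the synthetic graph
```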
arXiv Detail & Related papers (2022-02-15T04:09:44Z)
- Learning on heterogeneous graphs using high-order relations [37.64632406923687]
We propose an approach for learning on heterogeneous graphs without using meta-paths.
We decompose a heterogeneous graph into different homogeneous relation-type graphs, which are then combined to create higher-order relation-type representations.
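As a small illustration of composing relation-type graphs into higher-order relations, the sketch below multiplies per-relation adjacency matrices of a toy heterogeneous graph (assumed author-paper and paper-venue relations) to obtain author-venue and author-author relations; the actual construction and weighting in the paper may differ:

```python
import numpy as np

# bipartite relation adjacencies of a toy heterogeneous graph
# (assumed shapes: 3 authors x 4 papers, 4 papers x 2 venues)
author_paper = np.array([[1, 1, 0, 0],
                         [0, 1, 1, 0],
                         [0, 0, 0, 1]], dtype=float)
paper_venue = np.array([[1, 0],
                        [0, 1],
                        [1, 0],
                        [0, 1]], dtype=float)

# a higher-order relation is obtained by composing first-order relations:
# author -> paper -> venue, with entries counting connecting paths
author_venue = author_paper @ paper_venue
# author -> paper -> author (co-authorship-style relation), self-relations removed
author_author = author_paper @ author_paper.T
np.fill_diagonal(author_author, 0.0)

print(author_venue)
print(author_author)
```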
arXiv Detail & Related papers (2021-03-29T12:02:47Z)
- Factorizable Graph Convolutional Networks [90.59836684458905]
We introduce a novel graph convolutional network (GCN) that explicitly disentangles intertwined relations encoded in a graph.
FactorGCN takes a simple graph as input, and disentangles it into several factorized graphs.
We evaluate the proposed FactorGCN both qualitatively and quantitatively on synthetic and real-world datasets.
arXiv Detail & Related papers (2020-10-12T03:01:40Z)
- Multilevel Graph Matching Networks for Deep Graph Similarity Learning [79.3213351477689]
We propose a multi-level graph matching network (MGMN) framework for computing the graph similarity between any pair of graph-structured objects.
To compensate for the lack of standard benchmark datasets, we have created and collected a set of datasets for both the graph-graph classification and graph-graph regression tasks.
Comprehensive experiments demonstrate that MGMN consistently outperforms state-of-the-art baseline models on both the graph-graph classification and graph-graph regression tasks.
arXiv Detail & Related papers (2020-07-08T19:48:19Z)
- Graph Pooling with Node Proximity for Hierarchical Representation Learning [80.62181998314547]
We propose a novel graph pooling strategy that leverages node proximity to improve the hierarchical representation learning of graph data with their multi-hop topology.
Results show that the proposed graph pooling strategy is able to achieve state-of-the-art performance on a collection of public graph classification benchmark datasets.
arXiv Detail & Related papers (2020-06-19T13:09:44Z)