Simplified Graph Convolution with Heterophily
- URL: http://arxiv.org/abs/2202.04139v1
- Date: Tue, 8 Feb 2022 20:52:08 GMT
- Title: Simplified Graph Convolution with Heterophily
- Authors: Sudhanshu Chanpuriya and Cameron Musco
- Abstract summary: We show that Simple Graph Convolution (SGC) is ineffective for heterophilous (i.e., non-homophilous) graphs.
We propose Adaptive Simple Graph Convolution (ASGC), which we show can adapt to both homophilous and heterophilous graph structure.
- Score: 25.7577503312319
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph convolutional networks (GCNs) (Kipf & Welling, 2017) attempt to extend
the success of deep learning in modeling image and text data to graphs.
However, like other deep models, GCNs comprise repeated nonlinear
transformations of inputs and are therefore time and memory intensive to train.
Recent work has shown that a much simpler and faster model, Simple Graph
Convolution (SGC) (Wu et al., 2019), is competitive with GCNs in common graph
machine learning benchmarks. The use of graph data in SGC implicitly assumes
the common but not universal graph characteristic of homophily, wherein nodes
link to nodes which are similar. Here we show that SGC is indeed ineffective
for heterophilous (i.e., non-homophilous) graphs via experiments on synthetic
and real-world datasets. We propose Adaptive Simple Graph Convolution (ASGC),
which we show can adapt to both homophilous and heterophilous graph structure.
Like SGC, ASGC is not a deep model, and hence is fast, scalable, and
interpretable. We find that our non-deep method often outperforms
state-of-the-art deep models at node classification on a benchmark of
real-world datasets. The SGC paper questioned whether the complexity of graph
neural networks is warranted for common graph problems involving homophilous
networks; our results suggest that this question is still open even for more
complicated problems involving heterophilous networks.
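As a concrete illustration of the models discussed in the abstract, below is a minimal sketch (not the authors' code) of SGC-style feature propagation together with a hedged, ASGC-inspired adaptive filter. Only the SGC rule, propagating features K times with the normalized adjacency S = D^{-1/2}(A + I)D^{-1/2} before a linear classifier, follows the cited formulation directly; the dense-adjacency assumption, the function names, and the least-squares adaptive filter are illustrative simplifications that may differ from the paper's exact method.
```python
# Minimal sketch: SGC propagation and a hedged, ASGC-inspired adaptive filter.
# Dense matrices are used for clarity; real graphs would use sparse operations.
import numpy as np

def normalized_adjacency(A: np.ndarray) -> np.ndarray:
    """Symmetrically normalized adjacency with self-loops: S = D^-1/2 (A + I) D^-1/2."""
    A_tilde = A + np.eye(A.shape[0])
    d = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_tilde @ D_inv_sqrt

def sgc_features(A: np.ndarray, X: np.ndarray, K: int = 2) -> np.ndarray:
    """SGC (Wu et al., 2019): propagate features K times, then use a linear classifier."""
    S = normalized_adjacency(A)
    X_prop = X.astype(float)
    for _ in range(K):
        X_prop = S @ X_prop  # repeated low-pass filtering; implicitly assumes homophily
    return X_prop

def adaptive_filter_features(A: np.ndarray, X: np.ndarray, K: int = 2) -> np.ndarray:
    """Hedged sketch of an ASGC-style adaptive filter (illustrative, not the paper's
    exact formulation): for each feature column, fit coefficients of a degree-K
    polynomial in S by least squares so the filtered signal reconstructs the feature.
    On heterophilous graphs the fitted coefficients can take alternating signs,
    giving high-pass rather than low-pass behavior."""
    S = normalized_adjacency(A)
    out = np.zeros(X.shape, dtype=float)
    for j in range(X.shape[1]):
        x = X[:, j].astype(float)
        cols, cur = [], x
        for _ in range(K):
            cur = S @ cur
            cols.append(cur)          # columns S^1 x, ..., S^K x
        B = np.stack(cols, axis=1)    # design matrix of shape (n, K)
        alpha, *_ = np.linalg.lstsq(B, x, rcond=None)
        out[:, j] = B @ alpha         # adaptively filtered feature
    return out
```
Either output can be fed to an ordinary linear classifier (e.g., logistic regression), which is what keeps SGC-style models fast, scalable, and interpretable.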
Related papers
- Spectral Greedy Coresets for Graph Neural Networks [61.24300262316091]
The ubiquity of large-scale graphs in node-classification tasks hinders the real-world applications of Graph Neural Networks (GNNs).
This paper studies graph coresets for GNNs and avoids the interdependence issue by selecting ego-graphs based on their spectral embeddings.
Our spectral greedy graph coreset (SGGC) scales to graphs with millions of nodes, obviates the need for model pre-training, and applies to low-homophily graphs.
arXiv Detail & Related papers (2024-05-27T17:52:12Z)
- Structure-free Graph Condensation: From Large-scale Graphs to Condensed Graph-free Data [91.27527985415007]
Existing graph condensation methods rely on the joint optimization of nodes and structures in the condensed graph.
We advocate a new Structure-Free Graph Condensation paradigm, named SFGC, to distill a large-scale graph into a small-scale graph node set.
arXiv Detail & Related papers (2023-06-05T07:53:52Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Beyond Homophily: Reconstructing Structure for Graph-agnostic Clustering [15.764819403555512]
A graph generally cannot be identified as homophilic or heterophilic before a suitable GNN model is chosen.
We propose a novel graph clustering method with three key components: graph reconstruction, a mixed filter, and a dual graph clustering network.
Our method outperforms others on heterophilic graphs.
arXiv Detail & Related papers (2023-05-03T01:49:01Z)
- GCNH: A Simple Method For Representation Learning On Heterophilous Graphs [4.051099980410583]
Graph Neural Networks (GNNs) are well-suited for learning on homophilous graphs.
Recent works have proposed extensions to standard GNN architectures to improve performance on heterophilous graphs.
We propose GCN for Heterophily (GCNH), a simple yet effective GNN architecture applicable to both heterophilous and homophilous scenarios.
arXiv Detail & Related papers (2023-04-21T11:26:24Z)
- Make Heterophily Graphs Better Fit GNN: A Graph Rewiring Approach [43.41163711340362]
We propose a method named Deep Heterophily Graph Rewiring (DHGR) to rewire graphs by adding homophilic edges and pruning heterophilic edges.
To the best of our knowledge, it is the first work studying graph rewiring for heterophily graphs.
arXiv Detail & Related papers (2022-09-17T06:55:21Z)
- AnchorGAE: General Data Clustering via $O(n)$ Bipartite Graph Convolution [79.44066256794187]
We show how to convert a non-graph dataset into a graph by introducing a generative graph model, which is used to build graph convolution networks (GCNs).
A bipartite graph constructed by anchors is updated dynamically to exploit the high-level information behind data.
We theoretically prove that this simple update leads to degeneration, and we design a specific strategy accordingly.
arXiv Detail & Related papers (2021-11-12T07:08:13Z)
- Is Homophily a Necessity for Graph Neural Networks? [50.959340355849896]
Graph neural networks (GNNs) have shown great prowess in learning representations suitable for numerous graph-based machine learning tasks.
GNNs are widely believed to work well due to the homophily assumption ("like attracts like") and to fail to generalize to heterophilous graphs where dissimilar nodes connect (a minimal homophily-ratio sketch appears after this list).
Recent works design new architectures to overcome such heterophily-related limitations, citing poor baseline performance and improvements from new architectures on a few heterophilous graph benchmark datasets as evidence for this notion.
In our experiments, we empirically find that standard graph convolutional networks (GCNs) can actually achieve better performance than
arXiv Detail & Related papers (2021-06-11T02:44:00Z)
- R-GSN: The Relation-based Graph Similar Network for Heterogeneous Graph [5.2848965435713575]
This paper proposes a general heterogeneous message-passing paradigm and designs R-GSN, which does not require meta-paths.
Experiments show that our R-GSN algorithm achieves state-of-the-art performance on the ogbn-mag large-scale heterogeneous graph dataset.
arXiv Detail & Related papers (2021-03-14T09:25:36Z)
- Graph Highway Networks [77.38665506495553]
Graph Convolution Networks (GCN) are widely used in learning graph representations due to their effectiveness and efficiency.
They suffer from the notorious over-smoothing problem, in which the learned representations converge to nearly identical vectors when many layers are stacked.
We propose Graph Highway Networks (GHNet) which utilize gating units to balance the trade-off between homogeneity and heterogeneity in the GCN learning process.
arXiv Detail & Related papers (2020-04-09T16:26:43Z)
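Many of the entries above, like the abstract, turn on whether a graph is homophilous or heterophilous. As a point of reference, below is a minimal sketch of the commonly used edge homophily ratio, the fraction of edges whose endpoints share a label; the function name and the toy example are illustrative and not taken from any of the listed papers.
```python
# Minimal sketch: edge homophily ratio. Values near 1 indicate homophily,
# values near 0 indicate heterophily. Illustrative helper only.
import numpy as np

def edge_homophily(edge_index: np.ndarray, labels: np.ndarray) -> float:
    """edge_index: array of shape (2, num_edges); labels: array of shape (num_nodes,)."""
    src, dst = edge_index
    return float(np.mean(labels[src] == labels[dst]))

# Example: a 4-cycle whose neighboring nodes always have different labels (heterophilous).
edges = np.array([[0, 1, 2, 3],
                  [1, 2, 3, 0]])
labels = np.array([0, 1, 0, 1])
print(edge_homophily(edges, labels))  # 0.0
```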
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents and is not responsible for any consequences arising from its use.