$p$-Laplacian Based Graph Neural Networks
- URL: http://arxiv.org/abs/2111.07337v1
- Date: Sun, 14 Nov 2021 13:16:28 GMT
- Title: $p$-Laplacian Based Graph Neural Networks
- Authors: Guoji Fu and Peilin Zhao and Yatao Bian
- Abstract summary: Graph neural networks (GNNs) have demonstrated superior performance for semi-supervised node classification on graphs.
We propose a new $p$-Laplacian based GNN model, termed $^p$GNN, whose message passing mechanism is derived from a discrete regularization framework.
We show that the new message passing mechanism works simultaneously as low-pass and high-pass filters, thus making $^p$GNNs effective on both homophilic and heterophilic graphs.
- Score: 27.747195341003263
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) have demonstrated superior performance for
semi-supervised node classification on graphs, as a result of their ability to
exploit node features and topological information simultaneously. However, most
GNNs implicitly assume that the labels of nodes and their neighbors in a graph
are the same or consistent, which does not hold in heterophilic graphs, where
the labels of linked nodes are likely to differ. Hence, when the topology is
non-informative for label prediction, ordinary GNNs may work significantly
worse than simply applying multi-layer perceptrons (MLPs) on each node. To
tackle the above problem, we propose a new $p$-Laplacian based GNN model,
termed $^p$GNN, whose message passing mechanism is derived from a discrete
regularization framework and could be theoretically explained as an
approximation of a polynomial graph filter defined on the spectral domain of
$p$-Laplacians. The spectral analysis shows that the new message passing
mechanism works simultaneously as low-pass and high-pass filters, thus making
$^p$GNNs effective on both homophilic and heterophilic graphs. Empirical
studies on real-world and synthetic datasets validate our findings and
demonstrate that $^p$GNNs significantly outperform several state-of-the-art GNN
architectures on heterophilic benchmarks while achieving competitive
performance on homophilic benchmarks. Moreover, $^p$GNNs can adaptively learn
aggregation weights and are robust to noisy edges.
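As an illustration of the message passing mechanism described above, the following is a minimal sketch of a p-Laplacian style propagation step in PyTorch. It only mirrors the core idea stated in the abstract (edges are reweighted by the graph-gradient norm raised to the power p - 2, and the aggregated signal is mixed with the input features); the function name `p_laplacian_step`, the hyperparameter `mu`, and all other implementation details are illustrative assumptions rather than the authors' reference $^p$GNN implementation.

```python
# Minimal, illustrative sketch of a p-Laplacian style message-passing step.
# Not the authors' reference $^p$GNN code; `p_laplacian_step` and `mu` are
# assumptions made for exposition.
import torch


def p_laplacian_step(F, X, edge_index, p=1.5, mu=0.1, eps=1e-6):
    """One p-Laplacian propagation iteration (illustrative).

    F          : (N, d) current node representations
    X          : (N, d) input features, e.g. MLP-transformed node features
    edge_index : (2, E) undirected edges with both directions listed
    """
    N = F.size(0)
    src, dst = edge_index

    # Node degrees from the edge list (clamped to avoid division by zero).
    deg = torch.zeros(N, device=F.device).index_add_(
        0, src, torch.ones(src.size(0), device=F.device)).clamp(min=1.0)

    # Graph gradient on each edge, with symmetric degree normalization.
    grad = (F[src] / deg[src].sqrt().unsqueeze(-1)
            - F[dst] / deg[dst].sqrt().unsqueeze(-1))

    # Edge reweighting by ||gradient||^(p-2): for p < 2, edges whose endpoints
    # disagree get larger weight (high-pass behavior, useful under heterophily);
    # p = 2 recovers plain normalized aggregation (low-pass behavior).
    w = (grad.norm(dim=-1) + eps).pow(p - 2)

    # Per-node combination coefficients, in the spirit of the discrete
    # regularization view mentioned in the abstract.
    w_sum = torch.zeros(N, device=F.device).index_add_(0, src, w)
    denom = w_sum / deg + 2.0 * mu / p
    alpha = (1.0 / denom).unsqueeze(-1)
    beta = ((2.0 * mu / p) / denom).unsqueeze(-1)

    # Aggregate neighbors with the reweighted, symmetrically normalized edges.
    norm_w = w / (deg[src].sqrt() * deg[dst].sqrt())
    agg = torch.zeros_like(F).index_add_(0, src, norm_w.unsqueeze(-1) * F[dst])

    # Smoothed neighborhood signal plus a residual connection to the inputs.
    return alpha * agg + beta * X
```

A full model along these lines would apply an MLP to the raw features to obtain X, run this step for a fixed number of iterations starting from F = X, and feed the result to a classifier; the values of p and mu here are placeholders, not values taken from the paper.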
Related papers
- Spatio-Spectral Graph Neural Networks [50.277959544420455]
We propose Spatio-Spectral Graph Neural Networks (S$^2$GNNs)
S$^2$GNNs combine spatially and spectrally parametrized graph filters.
We show that S$^2$GNNs vanquish over-squashing and yield strictly tighter approximation-theoretic error bounds than MPGNNs.
arXiv Detail & Related papers (2024-05-29T14:28:08Z) - Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from Heterogeneous Graph Benchmark (HGB) and Open Graph Benchmark (OGB)
arXiv Detail & Related papers (2023-05-18T07:27:18Z) - Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Exploiting Neighbor Effect: Conv-Agnostic GNNs Framework for Graphs with Heterophily [58.76759997223951]
We propose a new metric based on von Neumann entropy to re-examine the heterophily problem of GNNs.
We also propose a Conv-Agnostic GNN framework (CAGNNs) to enhance the performance of most GNNs on heterophily datasets.
arXiv Detail & Related papers (2022-03-19T14:26:43Z) - Graph Neural Networks for Graphs with Heterophily: A Survey [98.45621222357397]
We provide a comprehensive review of graph neural networks (GNNs) for heterophilic graphs.
Specifically, we propose a systematic taxonomy that essentially governs existing heterophilic GNN models.
We discuss the correlation between graph heterophily and various graph research domains, aiming to facilitate the development of more effective GNNs.
arXiv Detail & Related papers (2022-02-14T23:07:47Z) - On Local Aggregation in Heterophilic Graphs [11.100606980915144]
We show that properly tuned classical GNNs and multi-layer perceptrons match or exceed the accuracy of recent long-range aggregation methods on heterophilic graphs.
We propose the Neighborhood Information Content (NIC), a novel information-theoretic graph metric.
arXiv Detail & Related papers (2021-06-06T19:12:31Z) - Higher-Order Attribute-Enhancing Heterogeneous Graph Neural Networks [67.25782890241496]
We propose a higher-order Attribute-Enhancing Graph Neural Network (HAEGNN) for heterogeneous network representation learning.
HAEGNN simultaneously incorporates meta-paths and meta-graphs for rich, heterogeneous semantics.
It shows superior performance over state-of-the-art methods in node classification, node clustering, and visualization.
arXiv Detail & Related papers (2021-04-16T04:56:38Z) - Beyond Low-Pass Filters: Adaptive Feature Propagation on Graphs [6.018995094882323]
Graph neural networks (GNNs) have been extensively studied for prediction tasks on graphs.
Most GNNs assume local homophily, i.e., strong similarities in local neighborhoods.
We propose a flexible GNN model that is capable of handling any graph without being restricted by its underlying homophily.
arXiv Detail & Related papers (2021-03-26T00:35:36Z) - Adaptive Universal Generalized PageRank Graph Neural Network [36.850433364139924]
Graph neural networks (GNNs) are designed to exploit both node features and graph topology as sources of evidence, but they do not optimally trade off their utility.
We introduce a new Generalized PageRank (GPR) GNN architecture that adaptively learns the GPR weights.
GPR-GNN offers significant performance improvement compared to existing techniques on both synthetic and benchmark data.
arXiv Detail & Related papers (2020-06-14T19:27:39Z)
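The GPR-GNN entry above only states that the Generalized PageRank (GPR) weights are learned adaptively. As a rough illustration, the sketch below shows the commonly described GPR combination Z = sum_k gamma_k * A_hat^k * H0 with learnable weights gamma_k; the function name `gpr_propagate` and all implementation details are assumptions for exposition, not the reference GPR-GNN code.

```python
# Hedged sketch of Generalized PageRank (GPR) style propagation with learnable
# depth weights; illustrative only, not the reference GPR-GNN implementation.
import torch


def gpr_propagate(A_hat, H0, gammas):
    """Combine several propagation depths with learnable GPR weights.

    A_hat  : (N, N) normalized adjacency matrix (e.g. with self-loops)
    H0     : (N, d) hidden features, e.g. the output of an MLP on raw features
    gammas : (K + 1,) learnable weights, one per propagation depth
    """
    out = gammas[0] * H0
    H = H0
    for gamma in gammas[1:]:
        H = A_hat @ H          # one more hop of propagation
        out = out + gamma * H  # weighted accumulation across depths
    return out
```

In a trainable model, gammas would typically be a torch.nn.Parameter optimized end-to-end; allowing the weights to take arbitrary signs is what lets the resulting filter adapt between low-pass and high-pass behavior depending on the data.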
This list is automatically generated from the titles and abstracts of the papers on this site.