How Curvature Enhance the Adaptation Power of Framelet GCNs
- URL: http://arxiv.org/abs/2307.09768v1
- Date: Wed, 19 Jul 2023 06:05:33 GMT
- Title: How Curvature Enhance the Adaptation Power of Framelet GCNs
- Authors: Dai Shi, Yi Guo, Zhiqi Shao, Junbin Gao
- Abstract summary: Graph neural networks (GNNs) have been demonstrated to be powerful in modeling graph-structured data.
This paper introduces a new approach to enhancing GNNs with discrete graph Ricci curvature.
We show that our curvature-based GNN model outperforms state-of-the-art baselines on both homophilic and heterophilic graph datasets.
- Score: 27.831929635701886
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) have been demonstrated to be powerful in
modeling graph-structured data. However, despite many successful cases of applying
GNNs to various graph classification and prediction tasks, whether the graph
geometrical information has been fully exploited to enhance the learning
performance of GNNs is not yet well understood. This paper introduces a new
approach to enhancing GNNs with discrete graph Ricci curvature. Specifically, the
graph Ricci curvature, defined on the edges of a graph, measures how difficult it is
for information to transit along an edge from one node to another, based on their
neighborhoods. Motivated by the geometric analogy of Ricci curvature in the
graph setting, we prove that by inserting the curvature information through
carefully designed transformation functions $\zeta$, several known
computational issues in GNNs, such as over-smoothing, can be alleviated in our
proposed model. Furthermore, we verify that edges with highly positive Ricci
curvature (i.e., $\kappa_{i,j} \approx 1$) are preferably dropped to enhance the
model's adaptation to heterophilic graphs, and we propose a curvature-based graph
edge-drop algorithm accordingly. Comprehensive experiments show that our
curvature-based GNN model outperforms state-of-the-art baselines on both
homophilic and heterophilic graph datasets, indicating the effectiveness of
incorporating graph geometric information into GNNs.
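The abstract describes two computational ingredients: an edge-level discrete Ricci curvature, and a rule that drops edges whose curvature satisfies $\kappa_{i,j} \approx 1$. The sketch below illustrates one plausible way to compute Ollivier-Ricci curvature (whose values are bounded above by 1, consistent with the $\kappa_{i,j} \approx 1$ statement) and to drop high-curvature edges. It is not the authors' implementation: the lazy-walk mass `ALPHA`, the drop threshold `TAU`, the `zeta_reweight` stand-in, and all function names are illustrative assumptions, and the framelet GCN itself is not reproduced.

```python
# Minimal, self-contained sketch (not the authors' implementation) of
# edge-level Ollivier-Ricci curvature and a kappa-close-to-1 edge drop.
# ALPHA, TAU, and the zeta-style reweighting below are illustrative
# assumptions rather than values or formulas taken from the paper.
import networkx as nx
import numpy as np
from scipy.optimize import linprog

ALPHA = 0.5   # mass kept at the source node of the lazy random walk (assumption)
TAU = 0.9     # edges with curvature above TAU are treated as kappa ~ 1 (assumption)


def node_measure(G, x, alpha=ALPHA):
    """Probability measure mu_x: mass alpha at x, the rest spread over neighbors."""
    nbrs = list(G.neighbors(x))
    support = [x] + nbrs
    mass = np.array([alpha] + [(1.0 - alpha) / len(nbrs)] * len(nbrs))
    return support, mass


def wasserstein_1(G, mu_x, mu_y):
    """Exact W1 distance between two finitely supported measures via a transport LP."""
    sx, mx = mu_x
    sy, my = mu_y
    # Ground cost: shortest-path distances between the two supports.
    cost = np.array([[nx.shortest_path_length(G, u, v) for v in sy] for u in sx],
                    dtype=float)
    n, m = cost.shape
    A_eq, b_eq = [], []
    for i in range(n):                      # row sums of the plan equal mu_x
        row = np.zeros(n * m)
        row[i * m:(i + 1) * m] = 1.0
        A_eq.append(row)
        b_eq.append(mx[i])
    for j in range(m):                      # column sums of the plan equal mu_y
        col = np.zeros(n * m)
        col[j::m] = 1.0
        A_eq.append(col)
        b_eq.append(my[j])
    res = linprog(cost.ravel(), A_eq=np.array(A_eq), b_eq=np.array(b_eq),
                  bounds=(0, None), method="highs")
    return res.fun


def ollivier_ricci(G, i, j):
    """kappa_{i,j} = 1 - W1(mu_i, mu_j) / d(i, j), with d(i, j) = 1 for an edge."""
    return 1.0 - wasserstein_1(G, node_measure(G, i), node_measure(G, j))


def zeta_reweight(kappa, scale=1.0):
    """Hypothetical zeta-style map from curvature to a positive edge weight.
    The paper studies carefully designed transformations zeta; this shifted
    exponential is only an illustrative stand-in, not the paper's choice."""
    return float(np.exp(scale * kappa))


def drop_high_curvature_edges(G, tau=TAU):
    """Return a copy of G with very positively curved edges (kappa > tau) removed."""
    H = G.copy()
    for i, j in G.edges():
        if ollivier_ricci(G, i, j) > tau:   # curvature measured on the original graph
            H.remove_edge(i, j)
    return H


if __name__ == "__main__":
    G = nx.karate_club_graph()
    kappas = {e: ollivier_ricci(G, *e) for e in G.edges()}
    print("most positively curved edges:",
          sorted(kappas, key=kappas.get, reverse=True)[:5])
    print("example zeta weight:", zeta_reweight(next(iter(kappas.values()))))
    print("edges before / after drop:",
          G.number_of_edges(), drop_high_curvature_edges(G).number_of_edges())
```

In practice one would precompute pairwise shortest-path distances and use a dedicated optimal-transport solver instead of the per-edge LP; the exact formulation here is only meant to keep the example self-contained.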
Related papers
- Generalization of Geometric Graph Neural Networks [84.01980526069075]
We study the generalization capabilities of geometric graph neural networks (GNNs).
We prove a generalization gap between the optimal empirical risk and the optimal statistical risk of this GNN.
The most important observation is that the generalization capability can be realized with one large graph instead of being limited to the size of the graph as in previous results.
arXiv Detail & Related papers (2024-09-08T18:55:57Z)
- A Differential Geometric View and Explainability of GNN on Evolving Graphs [15.228139478280747]
Graphs are ubiquitous in social networks and biochemistry, where Graph Neural Networks (GNN) are the state-of-the-art models for prediction.
We propose a smooth parameterization of the GNN predicted distributions using axiomatic attribution.
Experiments on node classification, link prediction, and graph classification tasks with evolving graphs demonstrate the better sparsity, faithfulness, and intuitiveness of the proposed method.
arXiv Detail & Related papers (2024-03-11T04:26:18Z)
- Design Your Own Universe: A Physics-Informed Agnostic Method for Enhancing Graph Neural Networks [34.16727363891593]
We propose a model-agnostic enhancement framework for Graph Neural Networks (GNNs).
This framework enriches the graph structure by introducing additional nodes and rewiring connections with both positive and negative weights.
We theoretically verify that GNNs enhanced through our approach can effectively circumvent the over-smoothing issue and exhibit robustness against over-squashing.
Empirical validations on benchmarks for homophilic, heterophilic graphs, and long-term graph datasets show that GNNs enhanced by our method significantly outperform their original counterparts.
arXiv Detail & Related papers (2024-01-26T00:47:43Z)
- DeepRicci: Self-supervised Graph Structure-Feature Co-Refinement for Alleviating Over-squashing [72.70197960100677]
Graph Structure Learning (GSL) plays an important role in boosting Graph Neural Networks (GNNs) with a refined graph.
GSL solutions usually focus on structure refinement with task-specific supervision (i.e., node classification) or overlook the inherent weakness of GNNs themselves.
We propose to study self-supervised graph structure-feature co-refinement for effectively alleviating the issue of over-squashing in typical GNNs.
arXiv Detail & Related papers (2024-01-23T14:06:08Z)
- FoSR: First-order spectral rewiring for addressing oversquashing in GNNs [0.0]
Graph neural networks (GNNs) are able to leverage the structure of graph data by passing messages along the edges of the graph.
We propose a computationally efficient algorithm that prevents oversquashing by systematically adding edges to the graph.
We find experimentally that our algorithm outperforms existing graph rewiring methods in several graph classification tasks.
arXiv Detail & Related papers (2022-10-21T07:58:03Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs with adequate ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate a significant and consistent performance improvement in model quality with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Curvature Graph Neural Network [8.477559786537919]
We introduce discrete graph curvature (the Ricci curvature) to quantify the strength of the structural connection between pairs of nodes.
We propose Curvature Graph Neural Network (CGNN), which effectively improves the adaptive locality ability of GNNs.
The experimental results on synthetic datasets show that CGNN effectively exploits the topology structure information.
arXiv Detail & Related papers (2021-06-30T00:56:03Z)
- Training Robust Graph Neural Networks with Topology Adaptive Edge Dropping [116.26579152942162]
Graph neural networks (GNNs) are processing architectures that exploit graph structural information to model representations from network data.
Despite their success, GNNs suffer from sub-optimal generalization performance given limited training data.
This paper proposes Topology Adaptive Edge Dropping to improve generalization performance and learn robust GNN models.
arXiv Detail & Related papers (2021-06-05T13:20:36Z)
- XGNN: Towards Model-Level Explanations of Graph Neural Networks [113.51160387804484]
Graph neural networks (GNNs) learn node features by aggregating and combining neighbor information.
GNNs are mostly treated as black-boxes and lack human intelligible explanations.
We propose a novel approach, known as XGNN, to interpret GNNs at the model-level.
arXiv Detail & Related papers (2020-06-03T23:52:43Z)