Higher-order Graph Convolutional Network with Flower-Petals Laplacians on Simplicial Complexes
- URL: http://arxiv.org/abs/2309.12971v2
- Date: Thu, 18 Jan 2024 06:57:18 GMT
- Title: Higher-order Graph Convolutional Network with Flower-Petals Laplacians on Simplicial Complexes
- Authors: Yiming Huang, Yujie Zeng, Qiang Wu, Linyuan Lü
- Abstract summary: We present a higher-order Flower-Petals (FP) model, incorporating FP Laplacians into simplicial complexes (SCs).
We also introduce a Higher-order Graph Convolutional Network (HiGCN) grounded in FP Laplacians, capable of discerning intrinsic features across varying topological scales.
HiGCN accomplishes state-of-the-art performance on a range of graph tasks and provides a scalable and flexible solution to explore higher-order interactions in graphs.
- Score: 5.838832985156104
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite the recent successes of vanilla Graph Neural Networks (GNNs) on
various tasks, their foundation on pairwise networks inherently limits their
capacity to discern latent higher-order interactions in complex systems. To
bridge this capability gap, we propose a novel approach exploiting the rich
mathematical theory of simplicial complexes (SCs) - a robust tool for modeling
higher-order interactions. Current SC-based GNNs are burdened by high
complexity and rigidity, and quantifying higher-order interaction strengths
remains challenging. Innovatively, we present a higher-order Flower-Petals (FP)
model, incorporating FP Laplacians into SCs. Further, we introduce a
Higher-order Graph Convolutional Network (HiGCN) grounded in FP Laplacians,
capable of discerning intrinsic features across varying topological scales. By
employing learnable graph filters, a parameter group within each FP Laplacian
domain, we can identify diverse patterns where the filters' weights serve as a
quantifiable measure of higher-order interaction strengths. The theoretical
underpinnings of HiGCN's advanced expressiveness are rigorously demonstrated.
Additionally, our empirical investigations reveal that the proposed model
accomplishes state-of-the-art performance on a range of graph tasks and
provides a scalable and flexible solution to explore higher-order interactions
in graphs. Codes and datasets are available at
https://github.com/Yiminghh/HiGCN.
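To make the filtering mechanism concrete, the layer described in the abstract can plausibly be read as computing something like $Z = \sum_p \sum_{k=0}^{K} \theta_{p,k} L_p^k X W$, where $L_p$ is the FP Laplacian of petal order $p$ and the learned coefficients $\theta_{p,k}$ gauge the strength of order-$p$ interactions. Below is a minimal, hypothetical PyTorch sketch of such a layer; the class name, the `fp_laplacians` argument, and the dense-matrix setup are illustrative assumptions, not taken from the authors' repository.

```python
import torch
import torch.nn as nn


class HiGCNLayerSketch(nn.Module):
    """Illustrative HiGCN-style convolution (a sketch, not the authors' code).

    Assumes one pre-computed, normalized Flower-Petals (FP) Laplacian per
    petal order p. Each order gets its own learnable polynomial filter
    theta[p]; the magnitudes of the learned theta[p] can be read as a rough
    measure of order-p interaction strength.
    """

    def __init__(self, in_dim, out_dim, num_orders, K=3):
        super().__init__()
        self.K = K
        # One (K + 1)-coefficient polynomial filter per petal order.
        self.theta = nn.Parameter(torch.full((num_orders, K + 1), 1.0 / (K + 1)))
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x, fp_laplacians):
        # x: [N, in_dim] node features; fp_laplacians: list of [N, N] tensors.
        out = torch.zeros_like(x)
        for p, lap in enumerate(fp_laplacians):
            h = x
            z = self.theta[p, 0] * h
            for k in range(1, self.K + 1):
                h = lap @ h                    # one more propagation step in order p
                z = z + self.theta[p, k] * h
            out = out + z
        return self.lin(out)


# Toy usage: two petal orders, random row-stochastic stand-ins for Laplacians.
N, F_in = 5, 8
x = torch.randn(N, F_in)
laps = [torch.eye(N) - torch.rand(N, N).softmax(dim=1) for _ in range(2)]
layer = HiGCNLayerSketch(F_in, out_dim=16, num_orders=2)
print(layer(x, laps).shape)  # torch.Size([5, 16])
```

In practice, sparse Laplacians and proper normalization would be needed for scalability; consult the linked repository for the actual implementation.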
Related papers
- Efficient Topology-aware Data Augmentation for High-Degree Graph Neural Networks [2.7523980737007414]
We propose TADA, an efficient and effective front-mounted data augmentation framework for graph neural networks (GNNs) on high-degree graphs (HDGs).
Under the hood, TADA includes two key modules: (i) feature expansion with structure embeddings, and (ii) topology- and attribute-aware graph sparsification.
TADA considerably improves the predictive performance of mainstream GNN models on 8 real homophilic/heterophilic HDGs in terms of node classification.
arXiv Detail & Related papers (2024-06-08T14:14:19Z)
- AGNN: Alternating Graph-Regularized Neural Networks to Alleviate Over-Smoothing [29.618952407794776]
We propose an Alternating Graph-regularized Neural Network (AGNN) composed of a Graph Convolutional Layer (GCL) and a Graph Embedding Layer (GEL).
GEL is derived from a graph-regularized optimization containing a Laplacian embedding term, which can alleviate the over-smoothing problem.
AGNN is evaluated in extensive experiments, including performance comparisons with multi-layer and multi-order graph neural networks.
arXiv Detail & Related papers (2023-04-14T09:20:03Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- ACE-HGNN: Adaptive Curvature Exploration Hyperbolic Graph Neural Network [72.16255675586089]
We propose an Adaptive Curvature Exploration Hyperbolic Graph Neural Network named ACE-HGNN to adaptively learn the optimal curvature according to the input graph and downstream tasks.
Experiments on multiple real-world graph datasets demonstrate a significant and consistent performance improvement in model quality with competitive performance and good generalization ability.
arXiv Detail & Related papers (2021-10-15T07:18:57Z)
- Generalization of graph network inferences in higher-order graphical models [5.33024001730262]
Probabilistic graphical models provide a powerful tool to describe complex statistical structure.
However, inferences such as marginalization are intractable for general graphs.
We define the Recurrent Factor Graph Neural Network (RF-GNN) to achieve fast approximate inference on graphical models that involve many-variable interactions.
arXiv Detail & Related papers (2021-07-12T20:51:27Z)
- Parameterized Hypercomplex Graph Neural Networks for Graph Classification [1.1852406625172216]
We develop graph neural networks that leverage the properties of hypercomplex feature transformation.
In particular, in our proposed class of models, the multiplication rule specifying the algebra itself is inferred from the data during training.
We test our proposed hypercomplex GNN on several open graph benchmark datasets and show that our models reach state-of-the-art performance.
arXiv Detail & Related papers (2021-03-30T18:01:06Z)
- Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks [3.0017241250121383]
We propose Message Passing Simplicial Networks (MPSNs), models that perform message passing on simplicial complexes (SCs); see the sketch after this list.
We empirically support our theoretical claims by showing that MPSNs can distinguish challenging strongly regular graphs for which GNNs fail.
arXiv Detail & Related papers (2021-03-04T18:35:24Z)
- Counting Substructures with Higher-Order Graph Neural Networks: Possibility and Impossibility Results [58.277290855841976]
We study the tradeoffs between computational cost and expressive power of Graph Neural Networks (GNNs).
We show that a new model can count subgraphs of size $k$, thereby overcoming a known limitation of low-order GNNs.
In several cases, the proposed algorithm can greatly reduce computational complexity compared to the existing higher-order $k$-GNNs.
arXiv Detail & Related papers (2020-12-06T03:42:54Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Improving Graph Neural Network Expressivity via Subgraph Isomorphism Counting [63.04999833264299]
"Graph Substructure Networks" (GSN) is a topologically-aware message passing scheme based on substructure encoding.
We show that it is strictly more expressive than the Weisfeiler-Leman (WL) graph isomorphism test.
We perform an extensive evaluation on graph classification and regression tasks and obtain state-of-the-art results in diverse real-world settings.
arXiv Detail & Related papers (2020-06-16T15:30:31Z)
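The MPSN entry above describes message passing on simplicial complexes; a toy sketch of that idea, under the assumption of dense incidence matrices relating k-simplices to their faces and cofaces (names like `MPSNSketchLayer` and `B_k` are illustrative, not from the paper), might look as follows.

```python
import torch
import torch.nn as nn


class MPSNSketchLayer(nn.Module):
    """Toy simplicial message passing in the spirit of MPSNs (not the authors' code).

    h_k holds features on k-simplices; B_k is the [n_{k-1}, n_k] incidence
    matrix between k-simplices and their (k-1)-faces. Boundary and coboundary
    messages get separate learnable maps.
    """

    def __init__(self, dim):
        super().__init__()
        self.w_self = nn.Linear(dim, dim)
        self.w_down = nn.Linear(dim, dim)  # messages from (k-1)-faces
        self.w_up = nn.Linear(dim, dim)    # messages from (k+1)-cofaces

    def forward(self, h_k, h_km1, h_kp1, B_k, B_kp1):
        m_down = B_k.t().abs() @ h_km1     # aggregate face features onto k-simplices
        m_up = B_kp1.abs() @ h_kp1         # aggregate coface features onto k-simplices
        return torch.relu(self.w_self(h_k) + self.w_down(m_down) + self.w_up(m_up))
```

This only shows the flavor of simplicial message passing; full MPSNs also aggregate over lower- and upper-adjacent simplices.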
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.