Path Integral Based Convolution and Pooling for Heterogeneous Graph
Neural Networks
- URL: http://arxiv.org/abs/2302.13399v1
- Date: Sun, 26 Feb 2023 20:05:23 GMT
- Title: Path Integral Based Convolution and Pooling for Heterogeneous Graph
Neural Networks
- Authors: Lingjie Kong and Yun Liao
- Abstract summary: Graph neural networks (GNNs) extend deep learning to graph-structured datasets.
As with convolutional neural networks (CNNs) for image prediction, convolutional and pooling layers are the foundation of GNN success on graph prediction tasks.
- Score: 2.5889737226898437
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) extend deep learning to graph-structured
datasets. As with convolutional neural networks (CNNs) for image prediction,
convolutional and pooling layers are the foundation of GNN success on graph
prediction tasks. The original PAN paper proposes a path integral based graph
neural network for graph prediction. Specifically, its convolution operation
involves every path linking the message sender and receiver, with learnable
weights depending on the path length; this corresponds to the maximal entropy
random walk. It further generalizes this convolution operation through a new
transition matrix, the maximal entropy transition (MET) matrix. Because the
diagonal entries of the MET matrix are directly related to subgraph
centrality, the matrix provides a natural mechanism for pooling based on
centrality scores. The original PAN paper, however, considers only node
features. We extend its capability to handle complex heterogeneous graphs
with both node and edge features.
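To make the mechanism concrete, here is a minimal numpy sketch of a PAN-style convolution and centrality-based pooling. It is an illustration under assumptions, not the authors' implementation: the path-length weights are passed in as fixed numbers rather than learned, and the heterogeneous node/edge-feature extension is not shown.

```python
import numpy as np

def met_matrix(A, weights):
    """MET-style operator: a weighted sum of adjacency powers (one term
    per path length), row-normalized to act as a transition matrix."""
    M = np.zeros_like(A, dtype=float)
    P = np.eye(A.shape[0])
    for w in weights:          # weights[n] weighs paths of length n
        M += w * P
        P = P @ A              # paths one hop longer
    return M / np.clip(M.sum(axis=1, keepdims=True), 1e-12, None)

def pan_conv(A, X, W, weights):
    """PAN-style convolution: aggregate features over all paths up to
    length len(weights) - 1, then apply a linear map W."""
    return met_matrix(A, weights) @ X @ W

def pan_pool(A, X, weights, k):
    """Centrality-based pooling: the diagonal of the unnormalized MET
    matrix approximates subgraph centrality; keep the top-k nodes."""
    M = np.zeros_like(A, dtype=float)
    P = np.eye(A.shape[0])
    for w in weights:
        M += w * P
        P = P @ A
    keep = np.argsort(-np.diag(M))[:k]     # highest-centrality nodes
    return A[np.ix_(keep, keep)], X[keep]

# Toy usage: 3-node graph, paths of length 0..2 with decaying weights.
A = np.array([[0, 1, 1], [1, 0, 0], [1, 0, 0]], dtype=float)
X, W = np.eye(3), np.ones((3, 2))
H = pan_conv(A, X, W, weights=[1.0, 0.5, 0.25])
A_pooled, X_pooled = pan_pool(A, X, weights=[1.0, 0.5, 0.25], k=2)
```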
Related papers
- Improving Graph Neural Networks by Learning Continuous Edge Directions [0.0]
Graph Neural Networks (GNNs) traditionally employ a message-passing mechanism that resembles diffusion over undirected graphs.
Our key insight is to assign fuzzy edge directions to the edges of a graph so that features can preferentially flow in one direction between nodes.
We propose a general framework, called Continuous Edge Direction (CoED) GNN, for learning on graphs with fuzzy edges.
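As a rough sketch of this idea (the parameter names and the exact update rule here are illustrative assumptions, not CoED's actual layer), each edge carries a continuous orientation in (0, 1) that splits how strongly features flow in each direction:

```python
import numpy as np

def fuzzy_direction_step(edges, X, theta, W_fwd, W_bwd):
    """One message-passing step with fuzzy edge directions.
    theta[e] in (0, 1) softly orients edge e = (i, j):
    theta near 1 sends features mostly i -> j, near 0 mostly j -> i."""
    out = np.zeros((X.shape[0], W_fwd.shape[1]))
    for (i, j), t in zip(edges, theta):
        out[j] += t * (X[i] @ W_fwd)          # forward share of the edge
        out[i] += (1.0 - t) * (X[j] @ W_bwd)  # backward share
    return out
```

In a trainable model, theta would be learned jointly with the weight matrices, so gradient descent can discover useful edge orientations.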
arXiv Detail & Related papers (2024-10-18T01:34:35Z)
- Generalization of Geometric Graph Neural Networks [84.01980526069075]
We study the generalization capabilities of geometric graph neural networks (GNNs).
We prove a generalization gap between the optimal empirical risk and the optimal statistical risk of this GNN.
The most important observation is that the generalization capability can be realized with a single large graph, rather than being limited by the graph size as in previous results.
arXiv Detail & Related papers (2024-09-08T18:55:57Z)
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Degree-based stratification of nodes in Graph Neural Networks [66.17149106033126]
We modify the Graph Neural Network (GNN) architecture so that the weight matrices are learned separately for the nodes in each degree group.
This simple-to-implement modification seems to improve performance across datasets and GNN methods.
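A minimal sketch of the stratification idea (the grouping rule and names are assumptions for illustration):

```python
import numpy as np

def degree_stratified_layer(A, X, Ws, group_of):
    """GNN layer in which nodes in each degree group g use their own
    weight matrix Ws[g]; group_of maps a node degree to a group index."""
    deg = A.sum(axis=1).astype(int)
    agg = A @ X                                  # plain neighborhood sum
    out = np.zeros((X.shape[0], Ws[0].shape[1]))
    for v in range(X.shape[0]):
        out[v] = agg[v] @ Ws[group_of(deg[v])]
    return np.maximum(out, 0.0)                  # ReLU

# e.g. two strata, low- vs. high-degree nodes (the threshold is arbitrary):
# out = degree_stratified_layer(A, X, [W_low, W_high], lambda d: int(d >= 5))
```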
arXiv Detail & Related papers (2023-12-16T14:09:23Z)
- Seq-HGNN: Learning Sequential Node Representation on Heterogeneous Graph [57.2953563124339]
We propose a novel heterogeneous graph neural network with sequential node representation, namely Seq-HGNN.
We conduct extensive experiments on four widely used datasets from the Heterogeneous Graph Benchmark (HGB) and the Open Graph Benchmark (OGB).
arXiv Detail & Related papers (2023-05-18T07:27:18Z)
- Neural Link Prediction with Walk Pooling [31.12613408446031]
We propose a link prediction method based on a new pooling scheme called WalkPool.
It combines the expressivity of topological algorithms with the feature-learning ability of neural networks.
It outperforms state-of-the-art methods on all common link prediction benchmarks.
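Loosely, WalkPool featurizes a candidate link through walk statistics of its enclosing subgraph, computed with the link present and absent; the paper additionally learns edge weights before taking walks, which this simplified, unweighted sketch omits:

```python
import numpy as np

def walk_features(A, u, v, K=4):
    """Walk-based features for candidate link (u, v): entries and traces
    of A^k, for k = 2..K, with the link toggled on and off."""
    feats = []
    for link in (1.0, 0.0):                   # with, then without, the edge
        B = A.copy()
        B[u, v] = B[v, u] = link
        P = B.copy()
        for _ in range(2, K + 1):
            P = P @ B                         # walks one step longer
            feats += [P[u, v], np.trace(P)]   # u-v walks and closed walks
    return np.array(feats)                    # fed to a classifier
```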
arXiv Detail & Related papers (2021-10-08T20:52:12Z)
- Simplicial Convolutional Neural Networks [36.078200422283835]
Recently, signal processing and neural networks have been extended to process and learn from data on graphs.
We propose a simplicial convolutional neural network (SCNN) architecture to learn from data defined on simplices.
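For edge signals (1-simplices), a filter of this kind is commonly written as two polynomials, one in the lower and one in the upper Hodge Laplacian; below is a minimal sketch under that standard formulation (the paper's exact parameterization may differ):

```python
import numpy as np

def simplicial_conv(x, B1, B2, c0, a, b):
    """Filter an edge signal x with separate polynomials in the lower
    Laplacian L_low = B1^T B1 and upper Laplacian L_up = B2 B2^T, where
    B1 (nodes x edges) and B2 (edges x triangles) are incidence matrices."""
    L_low, L_up = B1.T @ B1, B2 @ B2.T
    y = c0 * x                           # identity (degree-0) term
    xl, xu = x.copy(), x.copy()
    for ap, bq in zip(a, b):
        xl, xu = L_low @ xl, L_up @ xu   # raise the polynomial degree
        y = y + ap * xl + bq * xu
    return y
```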
arXiv Detail & Related papers (2021-10-06T08:52:55Z)
- RaWaNet: Enriching Graph Neural Network Input via Random Walks on Graphs [0.0]
Graph neural networks (GNNs) have gained increasing popularity and have shown very promising results for data that are represented by graphs.
We propose random walk based processing of the graphs using three selected walk lengths: (regular) walks of length 1 and 2, and a fractional walk of length $\gamma \in (0,1)$, in order to capture different local and global dynamics on the graph.
We test our method on various molecular datasets by passing the processed node features to the network in order to perform several classification and regression tasks.
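Walks of length 1 and 2 are just powers of the adjacency matrix; the fractional walk needs a matrix power $A^\gamma$. One simple way to realize it, sketched below, is an eigendecomposition of the symmetric normalized adjacency with negative eigenvalues clipped at zero so the real fractional power is well defined (a simplification; the paper's exact operator may differ):

```python
import numpy as np

def walk_operators(A, gamma=0.5):
    """Walk operators of length 1, 2, and fractional gamma in (0, 1)."""
    d = np.maximum(A.sum(axis=1), 1e-12)
    Dinv = np.diag(1.0 / np.sqrt(d))
    S = Dinv @ A @ Dinv                    # symmetric normalized adjacency
    lam, V = np.linalg.eigh(S)
    S_gamma = V @ np.diag(np.clip(lam, 0.0, None) ** gamma) @ V.T
    return A, A @ A, S_gamma

def enriched_features(A, X, gamma=0.5):
    """Concatenate node features propagated by each walk operator,
    then pass the result to a downstream network."""
    W1, W2, Wg = walk_operators(A, gamma)
    return np.concatenate([W1 @ X, W2 @ X, Wg @ X], axis=1)
```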
arXiv Detail & Related papers (2021-09-15T20:04:01Z)
- Graph Neural Networks: Architectures, Stability and Transferability [176.3960927323358]
Graph Neural Networks (GNNs) are information processing architectures for signals supported on graphs.
They are generalizations of convolutional neural networks (CNNs) in which individual layers contain banks of graph convolutional filters.
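In this formalism a graph convolutional filter is a polynomial in a graph shift operator $S$ (e.g. the adjacency or Laplacian matrix), $y = \sum_k h_k S^k x$, and a GNN layer applies a bank of such filters followed by a pointwise nonlinearity. A minimal sketch:

```python
import numpy as np

def graph_filter(S, x, h):
    """Polynomial graph filter: y = sum_k h[k] * S^k @ x."""
    y, z = np.zeros_like(x, dtype=float), x.astype(float)
    for hk in h:
        y += hk * z
        z = S @ z          # shift the signal one more hop
    return y

def gnn_layer(S, X, H):
    """One layer as a filter bank: H[f][g] holds the polynomial taps
    mapping input feature g to output feature f."""
    out = [sum(graph_filter(S, X[:, g], H[f][g]) for g in range(X.shape[1]))
           for f in range(len(H))]
    return np.maximum(np.stack(out, axis=1), 0.0)   # pointwise ReLU
```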
arXiv Detail & Related papers (2020-08-04T18:57:36Z)
- Path Integral Based Convolution and Pooling for Graph Neural Networks [12.801534458657592]
We propose path integral based graph neural networks (PAN) for classification and regression tasks on graphs.
PAN provides a versatile framework that can be tailored for different graph data with varying sizes and structures.
Experimental results show that PAN achieves state-of-the-art performance on various graph classification/regression tasks.
arXiv Detail & Related papers (2020-06-29T16:20:33Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
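As a quick numeric check of the permutation equivariance property stated above: relabeling the nodes and then filtering gives the same output as filtering first and then relabeling. The sketch reuses the polynomial graph filter form from the previous entry:

```python
import numpy as np

def graph_filter(S, x, h):
    """Polynomial graph filter y = sum_k h[k] * S^k @ x."""
    y, z = np.zeros_like(x, dtype=float), x.astype(float)
    for hk in h:
        y += hk * z
        z = S @ z
    return y

rng = np.random.default_rng(0)
N, h = 6, [0.5, 0.3, 0.2]
A = rng.random((N, N)); A = (A + A.T) / 2   # symmetric shift operator
x = rng.random(N)
P = np.eye(N)[rng.permutation(N)]           # permutation matrix
lhs = graph_filter(P @ A @ P.T, P @ x, h)   # permute, then filter
rhs = P @ graph_filter(A, x, h)             # filter, then permute
assert np.allclose(lhs, rhs)                # permutation equivariance holds
```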
arXiv Detail & Related papers (2020-03-08T13:02:15Z)