Path Integral Based Convolution and Pooling for Graph Neural Networks
- URL: http://arxiv.org/abs/2006.16811v2
- Date: Wed, 8 Jul 2020 15:03:01 GMT
- Title: Path Integral Based Convolution and Pooling for Graph Neural Networks
- Authors: Zheng Ma, Junyu Xuan, Yu Guang Wang, Ming Li, Pietro Lio
- Abstract summary: We propose a path integral based graph neural network (PAN) for classification and regression tasks on graphs.
PAN provides a versatile framework that can be tailored for different graph data with varying sizes and structures.
Experimental results show that PAN achieves state-of-the-art performance on various graph classification/regression tasks.
- Score: 12.801534458657592
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Graph neural networks (GNNs) extend the functionality of traditional neural
networks to graph-structured data. Similar to CNNs, an optimized design of
graph convolution and pooling is key to success. Borrowing ideas from physics,
we propose a path integral based graph neural network (PAN) for classification
and regression tasks on graphs. Specifically, we consider a convolution
operation that involves every path linking the message sender and receiver with
learnable weights depending on the path length, which corresponds to the
maximal entropy random walk. It generalizes the graph Laplacian to a new
transition matrix we call maximal entropy transition (MET) matrix derived from
a path integral formalism. Importantly, the diagonal entries of the MET matrix
are directly related to the subgraph centrality, thus providing a natural and
adaptive pooling mechanism. PAN provides a versatile framework that can be
tailored for different graph data with varying sizes and structures. We can
view most existing GNN architectures as special cases of PAN. Experimental
results show that PAN achieves state-of-the-art performance on various graph
classification/regression tasks, including a new benchmark dataset from
statistical mechanics we propose to boost applications of GNN in physical
sciences.
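The construction above can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the Boltzmann-style weight exp(-nE/T) on length-n paths and the symmetric normalization are assumptions standing in for the paper's learnable, length-dependent weights and exact MET normalization, and the cutoff L stands in for the paper's maximal path length.

```python
import numpy as np

def met_matrix(A, L=3, E=1.0, T=1.0):
    """MET-style operator (sketch): sum adjacency powers A^n (all paths of
    length n) up to cutoff L, weighting length-n paths by a Boltzmann-like
    factor exp(-n*E/T); in PAN these weights are learnable."""
    N = A.shape[0]
    M = np.zeros((N, N))
    A_n = np.eye(N)                    # A^0: zero-length paths (node itself)
    for n in range(L + 1):
        M += np.exp(-n * E / T) * A_n  # longer paths contribute less
        A_n = A_n @ A                  # next adjacency power
    d = M.sum(axis=1)                  # degree-like row normalizer (>= 1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ M @ D_inv_sqrt

def pan_conv(X, A, W, L=3):
    """One PAN-style convolution: X' = M X W."""
    return met_matrix(A, L) @ X @ W

def pan_pool_scores(A, L=3, E=1.0, T=1.0):
    """Diagonal of the unnormalized weighted power sum: a truncated
    subgraph-centrality score for ranking nodes during pooling."""
    N = A.shape[0]
    scores = np.zeros(N)
    A_n = np.eye(N)
    for n in range(L + 1):
        scores += np.exp(-n * E / T) * np.diag(A_n)  # closed walks of length n
        A_n = A_n @ A
    return scores
```

A pooling layer would then keep, for example, the top-k nodes ranked by pan_pool_scores and restrict the adjacency and feature matrices to those nodes, mirroring how the diagonal of the MET matrix supplies an adaptive pooling mechanism.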
Related papers
- Scalable Graph Compressed Convolutions [68.85227170390864]
We propose a differentiable method that applies permutations to calibrate input graphs for Euclidean convolution.
Based on the graph calibration, we propose the Compressed Convolution Network (CoCN) for hierarchical graph representation learning.
arXiv Detail & Related papers (2024-07-26T03:14:13Z)
- Connectivity Optimized Nested Graph Networks for Crystal Structures [1.1470070927586016]
Graph neural networks (GNNs) have been applied to a large variety of applications in materials science and chemistry.
We show that our suggested models systematically improve state-of-the-art results across all tasks within the MatBench benchmark.
arXiv Detail & Related papers (2023-02-27T19:26:48Z)
- Path Integral Based Convolution and Pooling for Heterogeneous Graph Neural Networks [2.5889737226898437]
Graph neural networks (GNNs) extend deep learning to graph-structured datasets.
As with convolutional neural networks (CNNs) used for image prediction, convolutional and pooling layers are the foundation of GNNs' success on graph prediction tasks.
arXiv Detail & Related papers (2023-02-26T20:05:23Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that enables homogeneous GNNs to handle heterogeneous graphs effectively.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid Scattering Networks [11.857894213975644]
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters to leverage structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
arXiv Detail & Related papers (2022-01-22T00:47:41Z)
- Graph Kernel Neural Networks [53.91024360329517]
We propose to use graph kernels, i.e. kernel functions that compute an inner product on graphs, to extend the standard convolution operator to the graph domain.
This allows us to define an entirely structural model that does not require computing the embedding of the input graph.
Our architecture allows plugging in any type of graph kernel and has the added benefit of providing some interpretability.
arXiv Detail & Related papers (2021-12-14T14:48:08Z)
- Adaptive Kernel Graph Neural Network [21.863238974404474]
Graph neural networks (GNNs) have demonstrated great success in representation learning for graph-structured data.
In this paper, we propose a novel framework, the Adaptive Kernel Graph Neural Network (AKGNN).
AKGNN is the first attempt to learn to adapt to the optimal graph kernel in a unified manner.
Experiments conducted on established benchmark datasets demonstrate the outstanding performance of the proposed AKGNN.
arXiv Detail & Related papers (2021-12-08T20:23:58Z)
- Spectral Graph Convolutional Networks With Lifting-based Adaptive Graph Wavelets [81.63035727821145]
Spectral graph convolutional networks (SGCNs) have been attracting increasing attention in graph representation learning.
We propose a novel class of spectral graph convolutional networks that implement graph convolutions with adaptive graph wavelets.
arXiv Detail & Related papers (2021-08-03T17:57:53Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- Optimal Transport Graph Neural Networks [31.191844909335963]
Current graph neural network (GNN) architectures naively average or sum node embeddings into an aggregated graph representation.
We introduce OT-GNN, a model that computes graph embeddings using parametric prototypes.
arXiv Detail & Related papers (2020-06-08T14:57:39Z)
- Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology; a small numerical check of the equivariance property appears after this list.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
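The permutation equivariance noted in the last entry above is easy to verify numerically. The sketch below is an illustration, not code from any of the listed papers: it applies a polynomial graph filter H(A) = sum_k h_k A^k and checks that relabeling the nodes before filtering gives the same result as relabeling afterwards.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 6
A = rng.integers(0, 2, size=(N, N))
A = np.triu(A, 1)
A = (A + A.T).astype(float)            # random undirected adjacency matrix
x = rng.normal(size=N)                 # a graph signal (one value per node)
h = [0.5, 1.0, -0.3]                   # arbitrary filter taps h_k

def graph_filter(A, x, h):
    """Polynomial graph filter: H(A) x = sum_k h[k] * A^k x."""
    y = np.zeros_like(x)
    A_k = np.eye(len(x))
    for hk in h:
        y += hk * (A_k @ x)
        A_k = A_k @ A
    return y

P = np.eye(N)[rng.permutation(N)]      # random permutation matrix
# Filtering the relabeled graph and signal equals relabeling the
# filter output: H(P A P^T) (P x) == P H(A) x.
assert np.allclose(graph_filter(P @ A @ P.T, P @ x, h),
                   P @ graph_filter(A, x, h))
print("permutation equivariance verified")
```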