Connectivity Optimized Nested Graph Networks for Crystal Structures
- URL: http://arxiv.org/abs/2302.14102v2
- Date: Wed, 9 Aug 2023 15:05:42 GMT
- Title: Connectivity Optimized Nested Graph Networks for Crystal Structures
- Authors: Robin Ruff, Patrick Reiser, Jan Stühmer, Pascal Friederich
- Abstract summary: Graph neural networks (GNNs) have been applied to a large variety of applications in materials science and chemistry.
We show that our suggested models systematically improve state-of-the-art results across all tasks within the MatBench benchmark.
- Score: 1.1470070927586016
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) have been applied to a large variety of
applications in materials science and chemistry. Here, we recapitulate the
graph construction for crystalline (periodic) materials and investigate its
impact on GNN model performance. We suggest the asymmetric unit cell as a
representation that reduces the number of atoms by exploiting all symmetries of
the system. This substantially reduces the computational cost, and thus the time
needed to train large graph neural networks, without any loss in accuracy. Furthermore,
with a simple but systematically built GNN architecture based on message
passing and line graph templates, we introduce a general architecture (Nested
Graph Network, NGN) that is applicable to a wide range of tasks. We show that
our suggested models systematically improve state-of-the-art results across all
tasks within the MatBench benchmark. Further analysis shows that optimized
connectivity and deeper message functions are responsible for the improvement.
Asymmetric unit cells and connectivity optimization can be generally applied to
(crystal) graph networks, while our suggested nested graph framework will open
new ways of systematic comparison of GNN architectures.
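To make the graph-construction step concrete, the following is a minimal, hypothetical sketch of how a periodic radius graph for a crystal and its line graph (the structure underlying nested/line-graph message passing) can be built. It is an illustration under simplifying assumptions (a single 3x3x3 supercell scan, directed edges, no symmetry reduction), not the authors' implementation, and the function names are invented for this example.

```python
import itertools
import numpy as np

def crystal_graph(lattice, frac_coords, cutoff):
    """Periodic radius graph: directed edges (i, j, cell_offset) for all
    atom pairs closer than `cutoff`, scanning the 26 neighboring cells."""
    cart = frac_coords @ lattice  # Cartesian positions of the unit-cell atoms
    edges = []
    for i, j in itertools.product(range(len(cart)), repeat=2):
        for off in itertools.product((-1, 0, 1), repeat=3):
            if i == j and off == (0, 0, 0):
                continue  # an atom is not its own neighbor
            vec = cart[j] + np.array(off) @ lattice - cart[i]
            if np.linalg.norm(vec) < cutoff:
                edges.append((i, j, off))
    return edges

def line_graph(edges):
    """Line-graph adjacency: edge b follows edge a when a ends at the atom
    where b starts; the immediate back-tracking pair is excluded.  Each
    pair corresponds to a bond angle usable by angle-aware message passing."""
    pairs = []
    for a, (i, j, off_a) in enumerate(edges):
        for b, (k, l, off_b) in enumerate(edges):
            reverse = (k, l) == (j, i) and off_b == tuple(-o for o in off_a)
            if k == j and not reverse:
                pairs.append((a, b))
    return pairs

# Simple cubic lattice with one atom: 6 periodic neighbor bonds at distance 1,
# and 5 non-backtracking continuations per bond in the line graph.
edges = crystal_graph(np.eye(3), np.zeros((1, 3)), cutoff=1.1)
print(len(edges))              # 6
print(len(line_graph(edges)))  # 30
```

Note that all six edges here connect the atom to its own periodic images, which is why the cell offset must be part of the edge identity; a message-passing layer would then update edge features on this graph and angle features on its line graph.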
Related papers
- Ensemble Learning for Graph Neural Networks [28.3650473174488]
Graph Neural Networks (GNNs) have shown success in various fields for learning from graph-structured data.
This paper investigates the application of ensemble learning techniques to improve the performance and robustness of GNNs.
arXiv Detail & Related papers (2023-10-22T03:55:13Z)
- A Comprehensive Study on Large-Scale Graph Training: Benchmarking and Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensembling training manner, named EnGCN, to address the existing issues.
Our proposed method has achieved new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z)
- Relation Embedding based Graph Neural Networks for Handling Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that equips homogeneous GNNs to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z)
- Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2022-08-15T22:30:07Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) can embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth and detailed study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid Scattering Networks [11.857894213975644]
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters to leverage structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
arXiv Detail & Related papers (2022-01-22T00:47:41Z)
- Adaptive Kernel Graph Neural Network [21.863238974404474]
Graph neural networks (GNNs) have demonstrated great success in representation learning for graph-structured data.
In this paper, we propose a novel framework, the Adaptive Kernel Graph Neural Network (AKGNN).
AKGNN is the first attempt to learn to adapt to the optimal graph kernel in a unified manner.
Experiments on established benchmark datasets demonstrate the strong performance of the proposed AKGNN.
arXiv Detail & Related papers (2021-12-08T20:23:58Z)
- Directed Acyclic Graph Neural Networks [9.420935957200518]
We focus on a special, yet widely used, type of graph -- DAGs -- and inject a stronger inductive bias -- partial ordering -- into the neural network design.
We propose the directed acyclic graph neural network (DAGNN), an architecture that processes information according to the flow defined by the partial order.
arXiv Detail & Related papers (2021-01-20T04:50:16Z)
- Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z)
- A Unified View on Graph Neural Networks as Graph Signal Denoising [49.980783124401555]
Graph Neural Networks (GNNs) have risen to prominence in learning representations for graph structured data.
In this work, we establish mathematically that the aggregation processes in a group of representative GNN models can be regarded as solving a graph denoising problem.
We instantiate a novel GNN model, ADA-UGNN, derived from UGNN, to handle graphs with adaptive smoothness across nodes.
arXiv Detail & Related papers (2020-10-05T04:57:18Z)
- Path Integral Based Convolution and Pooling for Graph Neural Networks [12.801534458657592]
We propose a path integral based graph neural network (PAN) for classification and regression tasks on graphs.
PAN provides a versatile framework that can be tailored for different graph data with varying sizes and structures.
Experimental results show that PAN achieves state-of-the-art performance on various graph classification/regression tasks.
arXiv Detail & Related papers (2020-06-29T16:20:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.