Connectivity Optimized Nested Graph Networks for Crystal Structures
- URL: http://arxiv.org/abs/2302.14102v2
- Date: Wed, 9 Aug 2023 15:05:42 GMT
- Title: Connectivity Optimized Nested Graph Networks for Crystal Structures
- Authors: Robin Ruff, Patrick Reiser, Jan Stühmer, Pascal Friederich
- Abstract summary: Graph neural networks (GNNs) have been applied to a large variety of applications in materials science and chemistry.
We show that our suggested models systematically improve state-of-the-art results across all tasks within the MatBench benchmark.
- Score: 1.1470070927586016
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Graph neural networks (GNNs) have been applied to a large variety of
applications in materials science and chemistry. Here, we recapitulate the
graph construction for crystalline (periodic) materials and investigate its
impact on GNN model performance. We suggest the asymmetric unit cell as a
representation that reduces the number of atoms by exploiting all symmetries of
the system. This substantially reduces the computational cost, and thus the time
needed to train large graph neural networks, without any loss in accuracy. Furthermore,
with a simple but systematically built GNN architecture based on message
passing and line graph templates, we introduce a general architecture (Nested
Graph Network, NGN) that is applicable to a wide range of tasks. We show that
our suggested models systematically improve state-of-the-art results across all
tasks within the MatBench benchmark. Further analysis shows that optimized
connectivity and deeper message functions are responsible for the improvement.
Asymmetric unit cells and connectivity optimization can be applied to (crystal)
graph networks in general, while our suggested nested graph framework opens new
ways of systematically comparing GNN architectures.
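A minimal sketch of the asymmetric-unit-cell reduction described in the abstract (illustrative only, not the authors' code; the use of pymatgen and the NaCl example are assumptions):

    # Reduce a crystal to one representative atom per symmetry orbit and keep
    # the orbit multiplicity, so graph nodes cover only the asymmetric unit.
    from pymatgen.core import Lattice, Structure
    from pymatgen.symmetry.analyzer import SpacegroupAnalyzer

    # Conventional rock-salt NaCl cell: 8 atoms in the unit cell.
    lattice = Lattice.cubic(5.64)
    species = ["Na"] * 4 + ["Cl"] * 4
    coords = [
        [0.0, 0.0, 0.0], [0.5, 0.5, 0.0], [0.5, 0.0, 0.5], [0.0, 0.5, 0.5],
        [0.5, 0.0, 0.0], [0.0, 0.5, 0.0], [0.0, 0.0, 0.5], [0.5, 0.5, 0.5],
    ]
    structure = Structure(lattice, species, coords)

    sym_structure = SpacegroupAnalyzer(structure).get_symmetrized_structure()

    # One representative site per orbit; the multiplicity can be stored as a
    # node attribute so pooled predictions still reflect the full unit cell.
    asym_unit = [
        (orbit[0].species_string, orbit[0].frac_coords, len(orbit))
        for orbit in sym_structure.equivalent_sites
    ]
    print(len(structure), "->", len(asym_unit))  # 8 atoms -> 2 symmetry-inequivalent atoms

In such a reduction, neighbor edges would still be built from the full periodic environment of each representative atom, so only symmetry-redundant copies of the message-passing computation are removed.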
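Likewise, a schematic sketch of one nested message-passing step in the spirit of the Nested Graph Network (a line-graph update on bond pairs feeding an atom-graph update); the layer structure, feature dimensions, and PyTorch implementation are assumptions for illustration, not the paper's exact architecture:

    # One nested message-passing layer: an inner update on the line graph
    # (pairs of bonds sharing an atom, i.e. angles) refines bond features,
    # then an outer update on the atom graph aggregates bonds into atoms.
    import torch
    import torch.nn as nn

    class NestedMPLayer(nn.Module):
        def __init__(self, dim: int):
            super().__init__()
            # "Deeper" message functions: two-layer MLPs instead of single linear maps.
            self.angle_msg = nn.Sequential(nn.Linear(3 * dim, dim), nn.SiLU(), nn.Linear(dim, dim))
            self.bond_msg = nn.Sequential(nn.Linear(3 * dim, dim), nn.SiLU(), nn.Linear(dim, dim))

        def forward(self, h_atom, h_bond, h_angle, bond_index, angle_index):
            # bond_index:  (2, n_bonds)  atom indices (i, j) of each directed bond
            # angle_index: (2, n_angles) bond indices (k, l) of each pair of bonds sharing an atom
            src_b, dst_b = bond_index
            src_a, dst_a = angle_index

            # Inner step on the line graph: angle messages update bond features.
            m_angle = self.angle_msg(torch.cat([h_bond[src_a], h_bond[dst_a], h_angle], dim=-1))
            h_bond = h_bond + torch.zeros_like(h_bond).index_add_(0, dst_a, m_angle)

            # Outer step on the atom graph: bond messages update atom features.
            m_bond = self.bond_msg(torch.cat([h_atom[src_b], h_atom[dst_b], h_bond], dim=-1))
            h_atom = h_atom + torch.zeros_like(h_atom).index_add_(0, dst_b, m_bond)
            return h_atom, h_bond

Stacking a few such layers and pooling h_atom (optionally weighted by the orbit multiplicities from the previous sketch) would give a crystal-level prediction; the actual NGN templates are described in the paper itself.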
Related papers
- Exact Computation of Any-Order Shapley Interactions for Graph Neural Networks [53.10674067060148]
Shapley Interactions (SIs) quantify node contributions and interactions among multiple nodes.
By exploiting the GNN architecture, we show that the structure of interactions in node embeddings is preserved for graph prediction.
We introduce GraphSHAP-IQ, an efficient approach to compute any-order SIs exactly.
arXiv Detail & Related papers (2025-01-28T13:37:44Z) - Revisiting Graph Neural Networks on Graph-level Tasks: Comprehensive Experiments, Analysis, and Improvements [54.006506479865344]
We propose a unified evaluation framework for graph-level Graph Neural Networks (GNNs).
This framework provides a standardized setting to evaluate GNNs across diverse datasets.
We also propose a novel GNN model with enhanced expressivity and generalization capabilities.
arXiv Detail & Related papers (2025-01-01T08:48:53Z) - GIMS: Image Matching System Based on Adaptive Graph Construction and Graph Neural Network [7.711922592226936]
We introduce an innovative adaptive graph construction method that utilizes a filtering mechanism based on distance and dynamic threshold similarity.
We also combine the global awareness capabilities of Transformers to enhance the model's representation of graph structures.
Our system achieves an average improvement of 3.8x-40.3x in overall matching performance.
arXiv Detail & Related papers (2024-12-24T07:05:55Z) - A Comprehensive Study on Large-Scale Graph Training: Benchmarking and
Rethinking [124.21408098724551]
Large-scale graph training is a notoriously challenging problem for graph neural networks (GNNs).
We present a new ensemble training scheme, named EnGCN, to address the existing issues.
Our proposed method achieves new state-of-the-art (SOTA) performance on large-scale datasets.
arXiv Detail & Related papers (2022-10-14T03:43:05Z) - Relation Embedding based Graph Neural Networks for Handling
Heterogeneous Graph [58.99478502486377]
We propose a simple yet efficient framework that gives homogeneous GNNs sufficient ability to handle heterogeneous graphs.
Specifically, we propose Relation Embedding based Graph Neural Networks (RE-GNNs), which employ only one parameter per relation to embed the importance of edge type relations and self-loop connections.
arXiv Detail & Related papers (2022-09-23T05:24:18Z) - Learnable Filters for Geometric Scattering Modules [64.03877398967282]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2022-08-15T22:30:07Z) - Overcoming Oversmoothness in Graph Convolutional Networks via Hybrid
Scattering Networks [11.857894213975644]
We propose a hybrid graph neural network (GNN) framework that combines traditional GCN filters with band-pass filters defined via the geometric scattering transform.
Our theoretical results establish the complementary benefits of the scattering filters to leverage structural information from the graph, while our experiments show the benefits of our method on various learning tasks.
arXiv Detail & Related papers (2022-01-22T00:47:41Z) - Adaptive Kernel Graph Neural Network [21.863238974404474]
Graph neural networks (GNNs) have demonstrated great success in representation learning for graph-structured data.
In this paper, we propose a novel framework, namely the Adaptive Kernel Graph Neural Network (AKGNN).
AKGNN makes a first attempt at learning to adapt to the optimal graph kernel in a unified manner.
Experiments on established benchmark datasets demonstrate the strong performance of the proposed AKGNN.
arXiv Detail & Related papers (2021-12-08T20:23:58Z) - Directed Acyclic Graph Neural Networks [9.420935957200518]
We focus on a special, yet widely used, type of graphs -- DAGs -- and inject a stronger inductive bias -- partial ordering -- into the neural network design.
We propose the directed acyclic graph relational neural network, DAGNN, an architecture that processes information according to the flow defined by the partial order.
arXiv Detail & Related papers (2021-01-20T04:50:16Z) - Data-Driven Learning of Geometric Scattering Networks [74.3283600072357]
We propose a new graph neural network (GNN) module based on relaxations of recently proposed geometric scattering transforms.
Our learnable geometric scattering (LEGS) module enables adaptive tuning of the wavelets to encourage band-pass features to emerge in learned representations.
arXiv Detail & Related papers (2020-10-06T01:20:27Z) - Path Integral Based Convolution and Pooling for Graph Neural Networks [12.801534458657592]
We propose a path integral based graph neural network (PAN) for classification and regression tasks on graphs.
PAN provides a versatile framework that can be tailored for different graph data with varying sizes and structures.
Experimental results show that PAN achieves state-of-the-art performance on various graph classification/regression tasks.
arXiv Detail & Related papers (2020-06-29T16:20:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.