Scalable deeper graph neural networks for high-performance materials
property prediction
- URL: http://arxiv.org/abs/2109.12283v1
- Date: Sat, 25 Sep 2021 05:58:04 GMT
- Title: Scalable deeper graph neural networks for high-performance materials
property prediction
- Authors: Sadman Sadeed Omee, Steph-Yves Louis, Nihang Fu, Lai Wei, Sourin Dey,
Rongzhi Dong, Qinyang Li, Jianjun Hu
- Abstract summary: We propose a novel graph attention neural network model, DeeperGATGNN, with differentiable group normalization and skip-connections.
Our work shows that to cope with the high complexity of mapping crystal material structures to their properties, large-scale, very deep graph neural networks are needed to achieve robust performance.
- Score: 1.9129213267332026
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Machine learning (ML) based materials discovery has emerged as one
of the most promising approaches for breakthroughs in materials science. While
heuristic knowledge-based descriptors have been combined with ML algorithms to
achieve good performance, the complexity of the physicochemical mechanisms
makes it imperative to exploit representation learning from either compositions
or structures to build highly effective materials machine learning models.
Among these methods, graph neural networks have shown the best performance
through their capability to learn high-level features from crystal structures.
However, all of these models are unable to scale up due to the over-smoothing
issue of their message-passing GNN architecture. Here we propose a novel graph
attention neural network model, DeeperGATGNN, with differentiable group
normalization and skip-connections, which makes it possible to train very deep
graph neural network models (e.g. 30 layers, compared to 3-9 layers in
previous works). Through systematic benchmark studies over six benchmark
datasets for energy and band gap predictions, we show that our scalable
DeeperGATGNN model needs little costly hyper-parameter tuning across datasets
and achieves state-of-the-art prediction performance on five of the six
properties, with up to 10% improvement. Our work shows that to cope with the
high complexity of mapping crystal material structures to their properties,
large-scale, very deep graph neural networks are needed to achieve robust
performance.
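As a rough illustration of the mechanism the abstract describes, the stabilizing pattern of wrapping each message-passing layer in a skip connection plus differentiable group normalization (DGN, Zhou et al., NeurIPS 2020) can be sketched in plain PyTorch. This is not the authors' implementation (DeeperGATGNN builds on GATGNN's augmented graph attention layers); the placeholder mean-aggregation convolution, layer sizes, group count, and skip weight below are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DifferentiableGroupNorm(nn.Module):
    """Softly assign nodes to groups, normalize per group, add back residually."""

    def __init__(self, dim, num_groups=10, skip_weight=0.005):
        super().__init__()
        self.assign = nn.Linear(dim, num_groups)   # soft cluster assignment
        self.norms = nn.ModuleList([nn.BatchNorm1d(dim) for _ in range(num_groups)])
        self.skip_weight = skip_weight

    def forward(self, x):                          # x: [num_nodes, dim]
        s = torch.softmax(self.assign(x), dim=1)   # [num_nodes, num_groups]
        grouped = sum(norm(s[:, g:g + 1] * x) for g, norm in enumerate(self.norms))
        return x + self.skip_weight * grouped


class GNNBlock(nn.Module):
    """Placeholder mean-aggregation layer wrapped with DGN and a skip connection."""

    def __init__(self, dim):
        super().__init__()
        self.lin = nn.Linear(dim, dim)
        self.dgn = DifferentiableGroupNorm(dim)

    def forward(self, x, adj):                     # adj: row-normalized [n, n]
        h = F.softplus(self.lin(adj @ x))          # aggregate neighbor features
        return x + self.dgn(h)                     # skip connection fights over-smoothing


class DeepGNN(nn.Module):
    """A 30-block stack, versus the 3-9 layers of earlier materials GNNs."""

    def __init__(self, dim=64, depth=30):
        super().__init__()
        self.blocks = nn.ModuleList([GNNBlock(dim) for _ in range(depth)])
        self.readout = nn.Linear(dim, 1)           # e.g. a formation-energy head

    def forward(self, x, adj):
        for block in self.blocks:
            x = block(x, adj)
        return self.readout(x.mean(dim=0))         # mean-pool over atoms


n = 12                                             # atoms in one toy crystal graph
adj = torch.rand(n, n)
adj = adj / adj.sum(dim=1, keepdim=True)           # row-normalize adjacency
print(DeepGNN()(torch.randn(n, 64), adj))          # one scalar property prediction
```

The key point is that each block's output is added back to its input, so node identities and gradients survive a 30-layer stack instead of being washed out by over-smoothing.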
Related papers
- Enhancing material property prediction with ensemble deep graph convolutional networks [9.470117608423957]
Recent efforts have focused on employing advanced ML algorithms, including deep learning-based graph neural networks, for property prediction.
Our research provides an in-depth evaluation of ensemble strategies in deep learning-based graph neural networks, specifically targeting material property prediction tasks.
By testing the Crystal Graph Convolutional Neural Network (CGCNN) and its multitask version, MT-CGCNN, we demonstrated that ensemble techniques, especially prediction averaging, substantially improve precision beyond traditional metrics.
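The core of prediction averaging is small enough to sketch directly. The snippet below is a minimal illustration, not the paper's code; `models` is assumed to hold independently trained CGCNN-style predictors that map a batched crystal graph to a property tensor:

```python
import torch


@torch.no_grad()
def ensemble_predict(models, graph_batch):
    """Average per-model predictions; the spread doubles as an uncertainty proxy."""
    preds = torch.stack([model(graph_batch) for model in models])  # [n_models, batch]
    return preds.mean(dim=0), preds.std(dim=0)
```

Averaging cancels much of the per-model variance, which is where the reported precision gains come from.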
arXiv Detail & Related papers (2024-07-26T16:12:06Z)
- Towards Scalable and Versatile Weight Space Learning [51.78426981947659]
This paper introduces the SANE approach to weight-space learning.
Our method extends the idea of hyper-representations towards sequential processing of subsets of neural network weights.
arXiv Detail & Related papers (2024-06-14T13:12:07Z)
- Visual Prompting Upgrades Neural Network Sparsification: A Data-Model Perspective [64.04617968947697]
We introduce a novel data-model co-design perspective to promote superior weight sparsity.
Specifically, customized visual prompts are mounted to upgrade neural network sparsification in our proposed VPNs framework.
arXiv Detail & Related papers (2023-12-03T13:50:24Z)
- Simple and Efficient Heterogeneous Graph Neural Network [55.56564522532328]
Heterogeneous graph neural networks (HGNNs) have powerful capability to embed rich structural and semantic information of a heterogeneous graph into node representations.
Existing HGNNs inherit many mechanisms from graph neural networks (GNNs) over homogeneous graphs, especially the attention mechanism and the multi-layer structure.
This paper conducts an in-depth study of these mechanisms and proposes the Simple and Efficient Heterogeneous Graph Neural Network (SeHGNN).
arXiv Detail & Related papers (2022-07-06T10:01:46Z)
- Simplifying approach to Node Classification in Graph Neural Networks [7.057970273958933]
We decouple the node feature aggregation step from the depth of the graph neural network, and empirically analyze how different aggregated features contribute to prediction performance.
We show that not all features generated via aggregation steps are useful, and often using these less informative features can be detrimental to the performance of the GNN model.
We present a simple and shallow model, Feature Selection Graph Neural Network (FSGNN), and show empirically that the proposed model achieves comparable or even higher accuracy than state-of-the-art GNN models.
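The decoupling idea can be sketched compactly: aggregation is precomputed once outside the network, so the model stays shallow no matter how many hops it sees, and a learned softmax weighting down-weights uninformative hop features. The snippet below is a simplified illustration under those assumptions, not the paper's implementation:

```python
import torch
import torch.nn as nn


def precompute_hops(x, adj, num_hops):
    """Aggregate once, up front: [x, Ax, A^2 x, ...] -- no deep network needed."""
    feats = [x]
    for _ in range(num_hops - 1):
        feats.append(adj @ feats[-1])
    return feats


class FSGNNSketch(nn.Module):
    def __init__(self, in_dim, hidden, num_hops):
        super().__init__()
        self.proj = nn.ModuleList([nn.Linear(in_dim, hidden) for _ in range(num_hops)])
        self.hop_logits = nn.Parameter(torch.zeros(num_hops))  # soft feature selection
        self.out = nn.Linear(hidden * num_hops, 1)

    def forward(self, hop_feats):          # list of [n, in_dim] tensors, one per hop
        w = torch.softmax(self.hop_logits, dim=0)
        h = torch.cat([w[k] * lin(x) for k, (lin, x) in
                       enumerate(zip(self.proj, hop_feats))], dim=1)
        return self.out(torch.relu(h))
```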
arXiv Detail & Related papers (2021-11-12T14:53:22Z)
- Gone Fishing: Neural Active Learning with Fisher Embeddings [55.08537975896764]
There is an increasing need for active learning algorithms that are compatible with deep neural networks.
This article introduces BAIT, a practical, tractable, and high-performing active learning algorithm for neural networks.
arXiv Detail & Related papers (2021-06-17T17:26:31Z)
- Improving Graph Neural Networks with Simple Architecture Design [7.057970273958933]
We introduce several key design strategies for graph neural networks.
We present a simple and shallow model, Feature Selection Graph Neural Network (FSGNN).
We show that the proposed model outperforms other state-of-the-art GNN models and achieves up to 64% improvement in accuracy on node classification tasks.
arXiv Detail & Related papers (2021-05-17T06:46:01Z)
- PredRNN: A Recurrent Neural Network for Spatiotemporal Predictive Learning [109.84770951839289]
We present PredRNN, a new recurrent network for learning visual dynamics from historical context.
We show that our approach obtains highly competitive results on three standard datasets.
arXiv Detail & Related papers (2021-03-17T08:28:30Z)
- Evolutionary Architecture Search for Graph Neural Networks [23.691915813153496]
We propose a novel AutoML framework based on the evolution of individual models in a large graph neural network (GNN) architecture space.
To the best of our knowledge, this is the first work to introduce and evaluate evolutionary architecture search for GNN models.
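As a toy illustration of what evolving individual models in an architecture space means in practice, the loop below mutates GNN hyperparameter dictionaries and keeps the fittest. The search space and the `evaluate` callback (assumed to train a candidate and return validation accuracy) are illustrative stand-ins, not the paper's framework:

```python
import random

SEARCH_SPACE = {
    "num_layers": [2, 3, 4, 6],
    "hidden_dim": [32, 64, 128],
    "aggregator": ["mean", "max", "sum"],
    "activation": ["relu", "elu"],
}


def mutate(arch):
    """Copy a parent and resample one randomly chosen gene."""
    child = dict(arch)
    gene = random.choice(list(SEARCH_SPACE))
    child[gene] = random.choice(SEARCH_SPACE[gene])
    return child


def evolve(evaluate, population_size=20, generations=100, tournament=5):
    population = [{k: random.choice(v) for k, v in SEARCH_SPACE.items()}
                  for _ in range(population_size)]
    scores = [evaluate(a) for a in population]
    for _ in range(generations):
        # Tournament selection: mutate the best of a random sample...
        parent = max(random.sample(range(population_size), tournament),
                     key=lambda i: scores[i])
        child = mutate(population[parent])
        # ...and replace the current weakest individual with the child.
        weakest = min(range(population_size), key=lambda i: scores[i])
        population[weakest], scores[weakest] = child, evaluate(child)
    best = max(range(population_size), key=lambda i: scores[i])
    return population[best], scores[best]
```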
arXiv Detail & Related papers (2020-09-21T22:11:53Z)
- A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
arXiv Detail & Related papers (2020-05-14T09:02:33Z)
- Global Attention based Graph Convolutional Neural Networks for Improved Materials Property Prediction [8.371766047183739]
We develop a novel model, GATGNN, for predicting inorganic material properties based on graph neural networks.
We show that our method is able to both outperform the previous models' predictions and provide insight into the crystallization of the material.
arXiv Detail & Related papers (2020-03-11T07:43:14Z)