Interpretable Ensemble Learning for Materials Property Prediction with
Classical Interatomic Potentials: Carbon as an Example
- URL: http://arxiv.org/abs/2308.10818v1
- Date: Mon, 24 Jul 2023 19:10:13 GMT
- Title: Interpretable Ensemble Learning for Materials Property Prediction with
Classical Interatomic Potentials: Carbon as an Example
- Authors: Xinyu Jiang, Haofan Sun, Kamal Choudhary, Houlong Zhuang, and Qiong
Nian
- Abstract summary: Machine learning (ML) is widely used to explore crystal materials and predict their properties.
We propose an approach based on ensemble learning consisting of regression trees to predict formation energy and elastic constants.
Instead of using any descriptor, the model takes as inputs the properties calculated by molecular dynamics with 9 different classical interatomic potentials.
- Score: 3.848961327213375
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Machine learning (ML) is widely used to explore crystal materials and predict
their properties. However, the training is time-consuming for deep-learning
models, and the regression process is a black box that is hard to interpret.
Also, the preprocessing step that transforms a crystal structure into an ML input,
called a descriptor, needs to be designed carefully. To efficiently predict
important properties of materials, we propose an approach based on ensemble
learning consisting of regression trees to predict formation energy and elastic
constants based on small-size datasets of carbon allotropes as an example.
Instead of using any descriptor, the model takes as inputs the properties
calculated by molecular dynamics with 9 different classical interatomic potentials. Overall,
the results from ensemble learning are more accurate than those from the
classical interatomic potentials, and the ensemble can single out the
relatively accurate properties among the 9 classical potentials as criteria
for predicting the final properties.
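The workflow the abstract describes can be illustrated with a minimal, hypothetical sketch: each structure is represented by the property values predicted for it by several classical potentials, and an ensemble of regression trees maps these to a reference value. The data below is synthetic stand-in data, and scikit-learn's RandomForestRegressor is one common tree-ensemble choice; the paper's actual dataset, potentials, and ensemble variant are not reproduced here.

```python
# Sketch: predict a reference formation energy (e.g. DFT) for carbon
# allotropes from the estimates of several classical interatomic potentials,
# using an ensemble of regression trees. All data here is synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_structures, n_potentials = 120, 9

# Synthetic stand-in: "true" formation energies (eV/atom) plus 9
# classical-potential estimates, each with its own bias and noise.
e_true = rng.uniform(-9.0, -6.0, size=n_structures)
bias = rng.normal(0.0, 0.3, size=n_potentials)
X = (e_true[:, None] + bias[None, :]
     + rng.normal(0.0, 0.1, (n_structures, n_potentials)))

X_tr, X_te, y_tr, y_te = train_test_split(
    X, e_true, test_size=0.25, random_state=0)

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X_tr, y_tr)

mae = np.mean(np.abs(model.predict(X_te) - y_te))
print(f"test MAE: {mae:.3f} eV/atom")
# Feature importances show which potentials the ensemble relies on,
# which is where the interpretability of the approach comes from.
print("potential importances:", np.round(model.feature_importances_, 3))
```

Inspecting `feature_importances_` is one simple way a tree ensemble exposes which input potentials drive its predictions, mirroring the interpretability claim above.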
Related papers
- Extrapolative ML Models for Copolymers [1.901715290314837]
Machine learning models have been progressively used for predicting materials properties.
These models are inherently interpolative, and their efficacy for searching for candidates outside a material's known property range is unresolved.
Here, we determine the relationship between the extrapolation ability of an ML model, the size and range of its training dataset, and its learning approach.
arXiv Detail & Related papers (2024-09-15T11:02:01Z)
- Balancing Molecular Information and Empirical Data in the Prediction of Physico-Chemical Properties [8.649679686652648]
We propose a general method for combining molecular descriptors with representation learning.
The proposed hybrid model exploits chemical structure information using graph neural networks.
It automatically detects cases where structure-based predictions are unreliable, in which case it corrects them by representation-learning based predictions.
arXiv Detail & Related papers (2024-06-12T10:51:00Z)
- Predicting Properties of Periodic Systems from Cluster Data: A Case Study of Liquid Water [0.6562256987706128]
We show that local, atom-centred descriptors for machine-learned potentials enable the prediction of bulk properties from cluster model training data.
We demonstrate such transferability by studying structural and dynamical properties of bulk liquid water with density functional theory.
arXiv Detail & Related papers (2023-12-03T14:37:27Z)
- Accurate machine learning force fields via experimental and simulation data fusion [0.0]
Machine Learning (ML)-based force fields are attracting ever-increasing interest due to their capacity to reach the scales of classical interatomic potentials while retaining quantum-level accuracy.
Here we leverage both Density Functional Theory (DFT) calculations and experimentally measured mechanical properties and lattice parameters to train an ML potential of titanium.
We demonstrate that the fused data learning strategy can concurrently satisfy all target objectives, thus resulting in a molecular model of higher accuracy compared to models trained on single-source data.
arXiv Detail & Related papers (2023-08-17T18:22:19Z)
- Curvature-informed multi-task learning for graph networks [56.155331323304]
State-of-the-art graph neural networks attempt to predict multiple properties simultaneously.
We investigate a potential explanation for this phenomenon: the curvature of each property's loss surface significantly varies, leading to inefficient learning.
arXiv Detail & Related papers (2022-08-02T18:18:41Z)
- A Machine Learning Method for Material Property Prediction: Example Polymer Compatibility [39.364776649251944]
We present a brand-new and general machine learning method for material property prediction.
As a representative example, polymer compatibility is chosen to demonstrate the effectiveness of our method.
arXiv Detail & Related papers (2022-02-28T05:48:05Z)
- Prediction of liquid fuel properties using machine learning models with Gaussian processes and probabilistic conditional generative learning [56.67751936864119]
The present work aims to construct cheap-to-compute machine learning (ML) models to act as closure equations for predicting the physical properties of alternative fuels.
Those models can be trained using the database from MD simulations and/or experimental measurements in a data-fusion-fidelity approach.
The results show that ML models can accurately predict the fuel properties over a wide range of pressure and temperature conditions.
arXiv Detail & Related papers (2021-10-18T14:43:50Z)
- A Trainable Optimal Transport Embedding for Feature Aggregation and its Relationship to Attention [96.77554122595578]
We introduce a parametrized representation of fixed size, which embeds and then aggregates elements from a given input set according to the optimal transport plan between the set and a trainable reference.
Our approach scales to large datasets and allows end-to-end training of the reference, while also providing a simple unsupervised learning mechanism with small computational cost.
arXiv Detail & Related papers (2020-06-22T08:35:58Z)
- Graph Neural Network for Hamiltonian-Based Material Property Prediction [56.94118357003096]
We present and compare several different graph convolution networks that are able to predict the band gap for inorganic materials.
The models are developed to incorporate two different features: the information of each orbital itself and the interaction between each other.
The results show that our model achieves promising prediction accuracy under cross-validation.
arXiv Detail & Related papers (2020-05-27T13:32:10Z)
- Parsimonious neural networks learn interpretable physical laws [77.34726150561087]
We propose parsimonious neural networks (PNNs) that combine neural networks with evolutionary optimization to find models that balance accuracy with parsimony.
The power and versatility of the approach is demonstrated by developing models for classical mechanics and for predicting the melting temperature of materials from fundamental properties.
arXiv Detail & Related papers (2020-05-08T16:15:47Z)
- Explainable Deep Relational Networks for Predicting Compound-Protein Affinities and Contacts [80.69440684790925]
DeepRelations is a physics-inspired deep relational network with intrinsically explainable architecture.
It shows superior interpretability to the state-of-the-art.
It boosts the AUPRC of contact prediction by 9.5-, 16.9-, 19.3- and 5.7-fold on the test, compound-unique, protein-unique, and both-unique sets, respectively.
arXiv Detail & Related papers (2019-12-29T00:14:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.