Gaussian Moments as Physically Inspired Molecular Descriptors for
Accurate and Scalable Machine Learning Potentials
- URL: http://arxiv.org/abs/2109.07421v1
- Date: Wed, 15 Sep 2021 16:46:46 GMT
- Title: Gaussian Moments as Physically Inspired Molecular Descriptors for
Accurate and Scalable Machine Learning Potentials
- Authors: Viktor Zaverkin and Johannes Kästner
- Abstract summary: We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Machine learning techniques allow a direct mapping of atomic positions and
nuclear charges to the potential energy surface with almost ab-initio accuracy
and the computational efficiency of empirical potentials. In this work we
propose a machine learning method for constructing high-dimensional potential
energy surfaces based on feed-forward neural networks. As input to the neural
network we propose an extendable invariant local molecular descriptor
constructed from geometric moments. Their formulation via pairwise distance
vectors and tensor contractions allows a very efficient implementation on
graphics processing units (GPUs). The atomic species is encoded in the
molecular descriptor, which allows a single neural network to be trained for
all atomic species in the data set. We demonstrate that the
accuracy of the developed approach in representing both chemical and
configurational spaces is comparable to that of several established machine
learning models. Due to their high accuracy and efficiency, the proposed
machine-learned potentials can be used for further tasks such as the
optimization of molecular geometries, the calculation of rate constants, or
molecular dynamics.
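The following is a minimal, self-contained sketch (plain NumPy, a single atomic species, arbitrary basis parameters) of the general construction described in the abstract: Gaussian radial functions are combined with tensor products of pairwise unit direction vectors, the resulting geometric moments are contracted into rotation-invariant scalars, and a small feed-forward network maps the invariants to atomic energy contributions. The specific contractions, network shape, and function names are illustrative assumptions, not the paper's exact formulation.
```python
import numpy as np

def gaussian_moment_descriptor(positions, center, n_radial=5, r_cut=4.0, sigma=0.5):
    """Rotationally invariant features for one atom, built from pairwise
    distance vectors: Gaussian radial weights times tensor products of unit
    direction vectors, contracted to scalars.  Illustrative only."""
    r_vec = positions - center                      # pairwise distance vectors
    dist = np.linalg.norm(r_vec, axis=1)
    mask = (dist > 1e-8) & (dist < r_cut)           # drop the atom itself, apply cutoff
    r_vec, dist = r_vec[mask], dist[mask]
    unit = r_vec / dist[:, None]                    # unit direction vectors

    mus = np.linspace(0.5, r_cut, n_radial)
    g = np.exp(-(dist[:, None] - mus) ** 2 / (2 * sigma ** 2))   # (n_nbr, n_radial)

    # Geometric moments of tensor rank 0, 1 and 2
    m0 = g.sum(axis=0)                                           # (n_radial,)
    m1 = np.einsum('ns,na->sa', g, unit)                         # (n_radial, 3)
    m2 = np.einsum('ns,na,nb->sab', g, unit, unit)               # (n_radial, 3, 3)

    # Tensor contractions -> rotation-invariant scalars
    inv1 = m0
    inv2 = np.einsum('sa,sa->s', m1, m1)         # |m1|^2 per radial channel
    inv3 = np.einsum('sab,sab->s', m2, m2)       # Frobenius norm^2 of m2
    return np.concatenate([inv1, inv2, inv3])

def atomic_energy(descriptor, weights):
    """Toy feed-forward network (one hidden layer) mapping a descriptor to an
    atomic energy contribution; the weights are random placeholders here."""
    w1, b1, w2 = weights
    return float(np.tanh(descriptor @ w1 + b1) @ w2)

# Tiny demo: the total energy of a random 6-atom cluster is invariant under rotation.
rng = np.random.default_rng(0)
pos = rng.normal(scale=1.5, size=(6, 3))
d = gaussian_moment_descriptor(pos, pos[0]).size
weights = (rng.normal(size=(d, 16)), np.zeros(16), rng.normal(size=16))

def total_energy(positions):
    return sum(atomic_energy(gaussian_moment_descriptor(positions, c), weights)
               for c in positions)

theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0, 0, 1]])
print(np.isclose(total_energy(pos), total_energy(pos @ R.T)))   # True
```
Because the invariants depend only on distances and on contracted direction tensors, the predicted energy is unchanged under global rotations, which the final check verifies. In the actual method, the atomic species also enters the descriptor, and forces for geometry optimization or molecular dynamics are obtained, as is standard for such potentials, by differentiating the network output with respect to the atomic positions.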
Related papers
- Universal neural network potentials as descriptors: Towards scalable chemical property prediction using quantum and classical computers [0.0]
We present a versatile approach that uses the intermediate information of a universal neural network potential as a general-purpose descriptor for chemical property prediction.
We show that transfer learning with graph neural network potentials such as M3GNet and MACE achieves accuracy comparable to state-of-the-art methods for predicting NMR chemical shifts (a generic sketch of this feature-reuse pattern follows this entry).
arXiv Detail & Related papers (2024-02-28T15:57:22Z)
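The entry above describes reusing the intermediate per-atom features of a pretrained universal neural network potential as a general-purpose molecular descriptor, with only a lightweight head trained for the downstream property. The sketch below illustrates that pattern generically in NumPy: the `PretrainedPotential` class, its `atomic_features` method, and the placeholder data are assumptions made for illustration and do not correspond to the actual M3GNet or MACE interfaces.
```python
import numpy as np

class PretrainedPotential:
    """Hypothetical stand-in for a pretrained universal NN potential.
    Real models (e.g. M3GNet, MACE) expose learned per-atom features; the
    interface and the 'features' below are placeholders for illustration."""

    def __init__(self, feature_dim=32, seed=0):
        rng = np.random.default_rng(seed)
        self.proj = rng.normal(size=(3, feature_dim))  # fixed random projection

    def atomic_features(self, positions):
        # Simple geometric statistics per atom, passed through a fixed
        # projection and nonlinearity, standing in for hidden activations.
        d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
        stats = np.stack([d.mean(axis=1), d.std(axis=1),
                          (d < 3.0).sum(axis=1).astype(float)], axis=1)
        return np.tanh(stats @ self.proj)              # (n_atoms, feature_dim)

def molecular_descriptor(model, positions):
    # Pool frozen per-atom features into one fixed-size molecular descriptor.
    return model.atomic_features(positions).mean(axis=0)

# Transfer-learning step: keep the "potential" frozen, fit only a ridge head.
model = PretrainedPotential()
rng = np.random.default_rng(1)
molecules = [rng.normal(size=(rng.integers(4, 12), 3)) for _ in range(50)]
targets = rng.normal(size=50)                          # placeholder property values

X = np.stack([molecular_descriptor(model, m) for m in molecules])
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ targets)
print("predictions for first 3 molecules:", X[:3] @ w)
```
The appeal of this pattern is that the expensive representation learning is done once, so each new property needs only a small amount of labelled data to fit the head.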
- QH9: A Quantum Hamiltonian Prediction Benchmark for QM9 Molecules [69.25826391912368]
We generate a new Quantum Hamiltonian dataset, named QH9, to provide precise Hamiltonian matrices for 999 or 2998 molecular dynamics trajectories.
We show that current machine learning models have the capacity to predict Hamiltonian matrices for arbitrary molecules.
arXiv Detail & Related papers (2023-06-15T23:39:07Z)
- Reliable machine learning potentials based on artificial neural network
for graphene [2.115174610040722]
The special 2D structure of graphene enables it to exhibit a wide range of peculiar material properties.
Molecular dynamics (MD) simulations are widely adopted for understanding the microscopic origins of these unique properties.
An artificial neural network based interatomic potential has been developed for graphene to represent the potential energy surface.
arXiv Detail & Related papers (2023-06-12T17:12:08Z)
- Implicit Geometry and Interaction Embeddings Improve Few-Shot Molecular
Property Prediction [53.06671763877109]
We develop molecular embeddings that encode complex molecular characteristics to improve the performance of few-shot molecular property prediction.
Our approach leverages large amounts of synthetic data, namely the results of molecular docking calculations.
On multiple molecular property prediction benchmarks, training from the embedding space substantially improves Multi-Task, MAML, and Prototypical Network few-shot learning performance.
arXiv Detail & Related papers (2023-02-04T01:32:40Z)
- Molecular Geometry-aware Transformer for accurate 3D Atomic System
modeling [51.83761266429285]
We propose a novel Transformer architecture that takes nodes (atoms) and edges (bonds and nonbonding atom pairs) as inputs and models the interactions among them.
Moleformer achieves state-of-the-art results on the initial-state-to-relaxed-energy prediction task of OC20 and is very competitive on QM9 for predicting quantum chemical properties.
arXiv Detail & Related papers (2023-02-02T03:49:57Z)
- Neural network enhanced measurement efficiency for molecular
groundstates [63.36515347329037]
We adapt common neural network models to learn complex groundstate wavefunctions for several molecular qubit Hamiltonians.
We find that using a neural network model provides a robust improvement over using single-copy measurement outcomes alone to reconstruct observables.
arXiv Detail & Related papers (2022-06-30T17:45:05Z)
- GeoMol: Torsional Geometric Generation of Molecular 3D Conformer
Ensembles [60.12186997181117]
Prediction of a molecule's 3D conformer ensemble from its molecular graph plays a key role in cheminformatics and drug discovery.
Existing generative models have several drawbacks, including a lack of modeling of important molecular geometry elements.
We propose GeoMol, an end-to-end, non-autoregressive and SE(3)-invariant machine learning approach to generate 3D conformers.
arXiv Detail & Related papers (2021-06-08T14:17:59Z)
- Solving the electronic Schrödinger equation for multiple nuclear
geometries with weight-sharing deep neural networks [4.1201966507589995]
We introduce a weight-sharing constraint when optimizing neural network-based models for different molecular geometries.
We find that this technique can accelerate optimization when considering sets of nuclear geometries of the same molecule by an order of magnitude.
arXiv Detail & Related papers (2021-05-18T08:23:09Z)
- Multi-task learning for electronic structure to predict and explore
molecular potential energy surfaces [39.228041052681526]
We refine the OrbNet model to accurately predict energy, forces, and other response properties for molecules.
The model is end-to-end differentiable due to the derivation of analytic gradients for all electronic structure terms (a toy gradient-check sketch follows this entry).
It is shown to be transferable across chemical space due to the use of domain-specific features.
arXiv Detail & Related papers (2020-11-05T06:48:46Z)
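The multi-task OrbNet entry above stresses end-to-end differentiability: once the predicted energy is an analytic function of the atomic positions, forces and other response properties follow from its gradients. The toy NumPy sketch below (a harmonic pair energy standing in for the learned model; not OrbNet's actual features or architecture) shows this pattern and verifies the analytic force expression against finite differences.
```python
import numpy as np

def energy(positions, eps=1.0, r0=1.5):
    """Toy analytic energy: a sum of harmonic pair terms, standing in for a
    learned, differentiable potential energy surface."""
    d = np.linalg.norm(positions[:, None, :] - positions[None, :, :], axis=-1)
    iu = np.triu_indices(len(positions), k=1)
    return 0.5 * eps * np.sum((d[iu] - r0) ** 2)

def forces(positions, eps=1.0, r0=1.5):
    """Analytic forces F = -dE/dR for the toy energy above."""
    diff = positions[:, None, :] - positions[None, :, :]     # R_i - R_j
    d = np.linalg.norm(diff, axis=-1)
    np.fill_diagonal(d, 1.0)                                 # avoid division by zero
    coeff = eps * (d - r0) / d
    np.fill_diagonal(coeff, 0.0)                             # no self-interaction
    return -np.einsum('ij,ijk->ik', coeff, diff)

# Verify the analytic gradient against central finite differences.
rng = np.random.default_rng(0)
pos = rng.normal(size=(5, 3))
num = np.zeros_like(pos)
h = 1e-5
for i in range(pos.shape[0]):
    for k in range(3):
        p = pos.copy(); p[i, k] += h; e_plus = energy(p)
        p[i, k] -= 2 * h;             e_minus = energy(p)
        num[i, k] = -(e_plus - e_minus) / (2 * h)
print(np.allclose(forces(pos), num, atol=1e-6))              # expected: True
```
In a learned model the same forces are usually produced by automatic differentiation rather than a hand-derived expression, which is exactly what end-to-end differentiability buys.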
- End-to-End Differentiable Molecular Mechanics Force Field Construction [0.5269923665485903]
We propose an alternative approach that uses graph neural networks to perceive chemical environments.
The entire process is modular and end-to-end differentiable with respect to model parameters.
We show that this approach is not only sufficient to reproduce legacy atom types, but that it can also learn to accurately reproduce and extend existing molecular mechanics force fields.
arXiv Detail & Related papers (2020-10-02T20:59:46Z)
- Graph Neural Network for Hamiltonian-Based Material Property Prediction [56.94118357003096]
We present and compare several different graph convolution networks that are able to predict the band gap for inorganic materials.
The models are developed to incorporate two different features: the information of each orbital itself and the interactions between orbitals.
The results show that our models achieve promising prediction accuracy under cross-validation.
arXiv Detail & Related papers (2020-05-27T13:32:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.