Broadening the Scope of Neural Network Potentials through Direct Inclusion of Additional Molecular Attributes
- URL: http://arxiv.org/abs/2403.15073v2
- Date: Fri, 07 Feb 2025 08:58:26 GMT
- Title: Broadening the Scope of Neural Network Potentials through Direct Inclusion of Additional Molecular Attributes
- Authors: Guillem Simeon, Antonio Mirarchi, Raul P. Pelaez, Raimondas Galvelis, Gianni De Fabritiis
- Abstract summary: We demonstrate the importance of including additional electronic attributes in neural network potential representations.
We show that this modification resolves the input degeneracy issues stemming from the use of atomic numbers and positions alone.
This is accomplished without tailored strategies or the inclusion of physics-based energy terms.
- Score: 2.679689033125693
- Abstract: Most state-of-the-art neural network potentials do not account for molecular attributes other than atomic numbers and positions, which limits their range of applicability by design. In this work, we demonstrate the importance of including additional electronic attributes in neural network potential representations with a minimal architectural change to TensorNet, a state-of-the-art equivariant model based on Cartesian rank-2 tensor representations. By performing experiments on both custom-made and public benchmarking datasets, we show that this modification resolves the input degeneracy issues stemming from the use of atomic numbers and positions alone, while enhancing the model's predictive accuracy across diverse chemical systems with different charge or spin states. This is accomplished without tailored strategies or the inclusion of physics-based energy terms, while maintaining efficiency and accuracy. These findings should encourage researchers to train and use models incorporating these additional representations.
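The core idea of the abstract can be sketched in a few lines: feed molecule-level attributes such as total charge and spin state into the per-atom embedding, so that systems sharing atomic numbers and positions no longer collapse to the same representation. This is a minimal illustrative sketch, not the authors' TensorNet implementation; all names, shapes, and the modulation scheme are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
DIM = 8

# Learned lookup tables (randomly initialized here for illustration)
# mapping atomic number, total charge, and spin multiplicity to features.
element_table = rng.normal(size=(100, DIM))
charge_weights = rng.normal(size=(1, DIM))
spin_weights = rng.normal(size=(1, DIM))

def embed(atomic_numbers, total_charge, spin_multiplicity):
    """Per-atom embeddings modulated by molecule-level attributes."""
    z = element_table[atomic_numbers]          # (n_atoms, DIM)
    q = total_charge * charge_weights          # (1, DIM)
    s = spin_multiplicity * spin_weights       # (1, DIM)
    # Broadcasting the charge/spin features onto every atom lifts the
    # degeneracy between systems that differ only in electronic state.
    return z * (1.0 + q + s)

atoms = np.array([8, 1, 1])                    # a water-like fragment
neutral = embed(atoms, 0.0, 1.0)
cation = embed(atoms, +1.0, 2.0)
print(neutral.shape)                           # (3, 8)
print(np.allclose(neutral, cation))            # False: degeneracy lifted
```

Without the charge and spin terms, the two calls would return identical embeddings, which is exactly the input degeneracy the paper addresses.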
Related papers
- The Importance of Being Scalable: Improving the Speed and Accuracy of Neural Network Interatomic Potentials Across Chemical Domains [4.340917737559795]
We study scaling in Neural Network Interatomic Potentials (NNIPs)
NNIPs act as surrogate models for ab initio quantum mechanical calculations.
We develop an NNIP architecture designed for scaling: the Efficiently Scaled Attention Interatomic Potential (EScAIP)
arXiv Detail & Related papers (2024-10-31T17:35:57Z)
- Higher-Rank Irreducible Cartesian Tensors for Equivariant Message Passing [23.754664894759234]
Atomistic simulations are crucial for advancing the chemical sciences.
Machine-learned interatomic potentials achieve accuracy on par with ab initio and first-principles methods at a fraction of their computational cost.
arXiv Detail & Related papers (2024-05-23T07:31:20Z)
- Understanding Self-attention Mechanism via Dynamical System Perspective [58.024376086269015]
Self-attention mechanism (SAM) is widely used in various fields of artificial intelligence.
We show that the intrinsic stiffness phenomenon (SP) found in high-precision solutions of ordinary differential equations (ODEs) also widely exists in high-performance neural networks (NNs).
We show that SAM also acts as a stiffness-aware step-size adaptor that can enhance the model's representational ability by measuring the intrinsic SP.
arXiv Detail & Related papers (2023-08-19T08:17:41Z)
- TensorNet: Cartesian Tensor Representations for Efficient Learning of Molecular Potentials [4.169915659794567]
We introduce TensorNet, an innovative O(3)-equivariant message-passing neural network architecture.
By using tensor atomic embeddings, feature mixing is simplified through matrix product operations.
Accurate prediction of vector and tensor molecular quantities, in addition to potential energies and forces, is possible.
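The matrix-product feature mixing mentioned above has a simple numerical illustration: polynomials of a rank-2 Cartesian tensor, built from matrix products, commute with rotations, so O(3) equivariance is preserved without spherical harmonics. This is a hedged sketch of the general principle, not the exact TensorNet update rule.

```python
import numpy as np

rng = np.random.default_rng(1)

def mix(x):
    """Feature mixing by matrix products: for a rank-2 Cartesian tensor
    X, polynomials such as X @ X + X rotate covariantly, i.e.
    mix(R X R^T) == R mix(X) R^T for any rotation R."""
    return x @ x + x

# A random 3x3 rank-2 tensor feature and an orthogonal matrix from QR.
x = rng.normal(size=(3, 3))
r, _ = np.linalg.qr(rng.normal(size=(3, 3)))

rotated_then_mixed = mix(r @ x @ r.T)
mixed_then_rotated = r @ mix(x) @ r.T
print(np.allclose(rotated_then_mixed, mixed_then_rotated))  # True
```

The identity holds because R^T R = I cancels between the factors of the matrix product, which is what makes matrix-product mixing cheap compared with explicit irreducible-representation algebra.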
arXiv Detail & Related papers (2023-06-10T16:41:18Z)
- Graph neural networks for the prediction of molecular structure-property relationships [59.11160990637615]
Graph neural networks (GNNs) are a machine learning method that works directly on the molecular graph.
GNNs make it possible to learn properties in an end-to-end fashion, thereby avoiding the need for hand-crafted informative descriptors.
We describe the fundamentals of GNNs and demonstrate the application of GNNs via two examples for molecular property prediction.
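The end-to-end GNN workflow the entry describes boils down to repeated neighborhood aggregation followed by a pooled readout. Below is a minimal one-layer sketch on a toy molecular graph; the weight shapes, ReLU nonlinearity, and sum-pooling readout are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(2)
DIM = 4
W_self = rng.normal(size=(DIM, DIM))   # transform for the atom's own features
W_msg = rng.normal(size=(DIM, DIM))    # transform for aggregated neighbor messages
w_out = rng.normal(size=DIM)           # readout weights for a scalar property

def message_pass(features, adjacency):
    """One round: h_i' = relu(W_self h_i + W_msg sum_j A_ij h_j)."""
    messages = adjacency @ features @ W_msg.T
    updated = features @ W_self.T + messages
    return np.maximum(updated, 0.0)    # ReLU nonlinearity

# Water-like graph: atom 0 (O) bonded to atoms 1 and 2 (H).
adjacency = np.array([[0, 1, 1],
                      [1, 0, 0],
                      [1, 0, 0]], dtype=float)
features = rng.normal(size=(3, DIM))
h1 = message_pass(features, adjacency)
property_pred = h1.sum(axis=0) @ w_out   # permutation-invariant readout
print(h1.shape)                          # (3, 4)
```

Because the readout sums over atoms, the prediction is invariant to atom ordering, which is the key structural prior GNNs bring to molecular property prediction.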
arXiv Detail & Related papers (2022-07-25T11:30:44Z)
- Neural network enhanced measurement efficiency for molecular groundstates [63.36515347329037]
We adapt common neural network models to learn complex groundstate wavefunctions for several molecular qubit Hamiltonians.
We find that using a neural network model provides a robust improvement over using single-copy measurement outcomes alone to reconstruct observables.
arXiv Detail & Related papers (2022-06-30T17:45:05Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials [0.0]
We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
arXiv Detail & Related papers (2021-09-15T16:46:46Z)
- Multi-task learning for electronic structure to predict and explore molecular potential energy surfaces [39.228041052681526]
We refine the OrbNet model to accurately predict energy, forces, and other response properties for molecules.
The model is end-to-end differentiable due to the derivation of analytic gradients for all electronic structure terms.
It is shown to be transferable across chemical space due to the use of domain-specific features.
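The end-to-end differentiability highlighted in this entry rests on a general pattern: once the energy is a differentiable function of atomic positions, forces follow as F = -dE/dR. The sketch below uses a toy pair potential in place of the learned energy and checks the analytic gradient against a finite difference; it is illustrative only, not the OrbNet model or its analytic gradients.

```python
import numpy as np

def energy(pos):
    """Toy energy: sum of 1/r^2 pair repulsions over all atom pairs."""
    diff = pos[:, None, :] - pos[None, :, :]
    r2 = (diff ** 2).sum(-1)
    iu = np.triu_indices(len(pos), k=1)
    return (1.0 / r2[iu]).sum()

def forces(pos):
    """Analytic forces F_i = -dE/dR_i for the pair energy above."""
    diff = pos[:, None, :] - pos[None, :, :]
    r2 = (diff ** 2).sum(-1)
    np.fill_diagonal(r2, np.inf)               # exclude self-interaction
    # d(1/r_ij^2)/dR_i = -2 (R_i - R_j) / r_ij^4, so the force is +2 diff / r^4.
    return (2.0 * diff / (r2 ** 2)[..., None]).sum(axis=1)

pos = np.array([[0.0, 0.0, 0.0],
                [1.0, 0.0, 0.0],
                [0.0, 1.5, 0.0]])
f = forces(pos)

# Finite-difference check of one force component.
eps = 1e-6
shifted = pos.copy()
shifted[0, 0] += eps
num = -(energy(shifted) - energy(pos)) / eps
print(np.isclose(f[0, 0], num, atol=1e-4))     # True
```

In a neural network potential the same relation is obtained by automatic differentiation of the model output with respect to positions, which is what makes energies, forces, and other response properties consistent by construction.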
arXiv Detail & Related papers (2020-11-05T06:48:46Z)
- Graph Neural Network for Hamiltonian-Based Material Property Prediction [56.94118357003096]
We present and compare several different graph convolution networks that are able to predict the band gap for inorganic materials.
The models are developed to incorporate two different features: the information of each orbital itself and the interactions between orbitals.
The results show that our models achieve promising prediction accuracy under cross-validation.
arXiv Detail & Related papers (2020-05-27T13:32:10Z)
- Multi-View Graph Neural Networks for Molecular Property Prediction [67.54644592806876]
We present Multi-View Graph Neural Network (MV-GNN), a multi-view message passing architecture.
In MV-GNN, we introduce a shared self-attentive readout component and disagreement loss to stabilize the training process.
We further boost the expressive power of MV-GNN by proposing a cross-dependent message passing scheme.
arXiv Detail & Related papers (2020-05-17T04:46:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.