TorchMD-NET: Equivariant Transformers for Neural Network based Molecular Potentials
- URL: http://arxiv.org/abs/2202.02541v1
- Date: Sat, 5 Feb 2022 12:53:40 GMT
- Title: TorchMD-NET: Equivariant Transformers for Neural Network based Molecular Potentials
- Authors: Philipp Thölke and Gianni De Fabritiis
- Abstract summary: We propose TorchMD-NET, a novel equivariant transformer (ET) architecture that outperforms the state of the art on MD17, ANI-1, and many QM9 targets in both accuracy and computational efficiency.
- Score: 2.538209532048867
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The prediction of quantum mechanical properties is historically plagued by a
trade-off between accuracy and speed. Machine learning potentials have
previously shown great success in this domain, reaching increasingly better
accuracy while maintaining computational efficiency comparable with classical
force fields. In this work we propose TorchMD-NET, a novel equivariant
transformer (ET) architecture that outperforms the state of the art on MD17,
ANI-1, and many QM9 targets in both accuracy and computational efficiency. Through an
extensive attention weight analysis, we gain valuable insights into the black
box predictor and show differences in the learned representation of conformers
versus conformations sampled from molecular dynamics or normal modes.
Furthermore, we highlight the importance of datasets including off-equilibrium
conformations for the evaluation of molecular potentials.
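The core idea behind the ET architecture is self-attention over atoms in which the attention scores are modulated by interatomic distances, so that only nearby atoms exchange information and the output is invariant under rotations and translations. The following is a minimal, framework-free sketch of that idea in NumPy; it is an illustration of distance-filtered attention, not the actual TorchMD-NET implementation (function names, the cutoff radius, and the random weight matrices are all illustrative assumptions):

```python
import numpy as np

def cosine_cutoff(d, r_cut=5.0):
    # Smooth distance cutoff: 1 at d = 0, decaying to 0 at d >= r_cut.
    return np.where(d < r_cut, 0.5 * (np.cos(np.pi * d / r_cut) + 1.0), 0.0)

def softmax(s):
    # Numerically stable row-wise softmax.
    e = np.exp(s - s.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distance_filtered_attention(x, pos, r_cut=5.0, seed=0):
    """Toy self-attention over atoms.

    x:   (n_atoms, d_feat) rotation-invariant atomic features
    pos: (n_atoms, 3) Cartesian coordinates

    Raw attention weights are damped by a smooth cutoff of the
    interatomic distance, so distant atoms do not attend to each other.
    Because only distances (not raw coordinates) enter the weights,
    the output features are invariant to rigid rotations/translations.
    """
    n, d_feat = x.shape
    rng = np.random.default_rng(seed)
    Wq = rng.standard_normal((d_feat, d_feat)) / np.sqrt(d_feat)
    Wk = rng.standard_normal((d_feat, d_feat)) / np.sqrt(d_feat)
    Wv = rng.standard_normal((d_feat, d_feat)) / np.sqrt(d_feat)
    q, k, v = x @ Wq, x @ Wk, x @ Wv
    scores = q @ k.T / np.sqrt(d_feat)                    # (n, n)
    dist = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
    weights = softmax(scores) * cosine_cutoff(dist, r_cut)
    return weights @ v                                    # (n, d_feat)
```

The real model additionally uses learned radial basis expansions of the distances and equivariant vector features; this sketch only shows why the attention weights themselves stay invariant to rigid motions of the molecule.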
Related papers
- Equivariant Masked Position Prediction for Efficient Molecular Representation [6.761418610103767]
Graph neural networks (GNNs) have shown considerable promise in computational chemistry.
We introduce a novel self-supervised approach termed Equivariant Masked Position Prediction (EMPP)
EMPP formulates a nuanced position prediction task that is more well-defined and enhances the learning of quantum mechanical features.
arXiv Detail & Related papers (2025-02-12T08:39:26Z)
- Excited-state nonadiabatic dynamics in explicit solvent using machine learned interatomic potentials [0.602276990341246]
We use FieldSchNet to replace QM/MM electrostatic embedding with its ML/MM counterpart for nonadiabatic excited state trajectories.
Our results demonstrate that the ML/MM model reproduces the electronic kinetics and structural rearrangements of QM/MM surface hopping reference simulations.
arXiv Detail & Related papers (2025-01-28T14:14:43Z)
- Broadening the Scope of Neural Network Potentials through Direct Inclusion of Additional Molecular Attributes [2.679689033125693]
We demonstrate the importance of including additional electronic attributes in neural network potential representations.
We show that this modification resolves the input degeneracy issues stemming from the use of atomic numbers and positions alone.
This is accomplished without tailored strategies or the inclusion of physics-based energy terms.
arXiv Detail & Related papers (2024-03-22T09:54:04Z)
- A Multi-Grained Symmetric Differential Equation Model for Learning Protein-Ligand Binding Dynamics [73.35846234413611]
In drug discovery, molecular dynamics (MD) simulation provides a powerful tool for predicting binding affinities, estimating transport properties, and exploring pocket sites.
We propose NeuralMD, the first machine learning (ML) surrogate that can facilitate numerical MD and provide accurate simulations in protein-ligand binding dynamics.
We demonstrate the efficiency and effectiveness of NeuralMD, achieving over 1000× speedup compared to standard numerical MD simulations.
arXiv Detail & Related papers (2024-01-26T09:35:17Z)
- QKSAN: A Quantum Kernel Self-Attention Network [53.96779043113156]
A Quantum Kernel Self-Attention Mechanism (QKSAM) is introduced to combine the data representation merit of Quantum Kernel Methods (QKM) with the efficient information extraction capability of SAM.
A Quantum Kernel Self-Attention Network (QKSAN) framework is proposed based on QKSAM, which ingeniously incorporates the Deferred Measurement Principle (DMP) and conditional measurement techniques.
Four QKSAN sub-models are deployed on PennyLane and IBM Qiskit platforms to perform binary classification on MNIST and Fashion MNIST.
arXiv Detail & Related papers (2023-08-25T15:08:19Z)
- QH9: A Quantum Hamiltonian Prediction Benchmark for QM9 Molecules [69.25826391912368]
We generate a new Quantum Hamiltonian dataset, named QH9, to provide precise Hamiltonian matrices for 999 or 2998 molecular dynamics trajectories.
We show that current machine learning models have the capacity to predict Hamiltonian matrices for arbitrary molecules.
arXiv Detail & Related papers (2023-06-15T23:39:07Z)
- Physics-informed machine learning with differentiable programming for heterogeneous underground reservoir pressure management [64.17887333976593]
Avoiding over-pressurization in subsurface reservoirs is critical for applications like CO2 sequestration and wastewater injection.
Managing these pressures by controlling injection/extraction is challenging because of the complex heterogeneity of the subsurface.
We use differentiable programming with a full-physics model and machine learning to determine the fluid extraction rates that prevent over-pressurization.
arXiv Detail & Related papers (2022-06-21T20:38:13Z)
- Quantum-tailored machine-learning characterization of a superconducting qubit [50.591267188664666]
We develop an approach to characterize the dynamics of a quantum device and learn device parameters.
This approach outperforms physics-agnostic recurrent neural networks trained on numerically generated and experimental data.
This demonstration shows how leveraging domain knowledge improves the accuracy and efficiency of this characterization task.
arXiv Detail & Related papers (2021-06-24T15:58:57Z)
- A Universal Framework for Featurization of Atomistic Systems [0.0]
Reactive force fields based on physics or machine learning can be used to bridge the gap in time and length scales.
We introduce the Gaussian multi-pole (GMP) featurization scheme that utilizes physically-relevant multi-pole expansions of the electron density around atoms.
We demonstrate that GMP-based models can achieve chemical accuracy for the QM9 dataset, and their accuracy remains reasonable even when extrapolating to new elements.
arXiv Detail & Related papers (2021-02-04T03:11:00Z)
- Multi-task learning for electronic structure to predict and explore molecular potential energy surfaces [39.228041052681526]
We refine the OrbNet model to accurately predict energy, forces, and other response properties for molecules.
The model is end-to-end differentiable due to the derivation of analytic gradients for all electronic structure terms.
It is shown to be transferable across chemical space due to the use of domain-specific features.
arXiv Detail & Related papers (2020-11-05T06:48:46Z)
- Benchmarking adaptive variational quantum eigensolvers [63.277656713454284]
We benchmark the accuracy of VQE and ADAPT-VQE to calculate the electronic ground states and potential energy curves.
We find both methods provide good estimates of the energy and ground state.
Gradient-based optimization is more economical and delivers superior performance compared with analogous simulations carried out with gradient-free methods.
arXiv Detail & Related papers (2020-11-02T19:52:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.