Solving the electronic Schr\"odinger equation for multiple nuclear geometries with weight-sharing deep neural networks
- URL: http://arxiv.org/abs/2105.08351v1
- Date: Tue, 18 May 2021 08:23:09 GMT
- Title: Solving the electronic Schr\"odinger equation for multiple nuclear geometries with weight-sharing deep neural networks
- Authors: Michael Scherbela, Rafael Reisenhofer, Leon Gerard, Philipp Marquetand and Philipp Grohs
- Abstract summary: We introduce a weight-sharing constraint when optimizing neural network-based models for different molecular geometries.
We find that this technique can accelerate optimization when considering sets of nuclear geometries of the same molecule by an order of magnitude.
- Score: 4.1201966507589995
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurate numerical solutions for the Schr\"odinger equation are of utmost
importance in quantum chemistry. However, the computational cost of current
high-accuracy methods scales poorly with the number of interacting particles.
Combining Monte Carlo methods with unsupervised training of neural networks has
recently been proposed as a promising approach to overcome the curse of
dimensionality in this setting and to obtain accurate wavefunctions for
individual molecules at a moderately scaling computational cost. These methods
currently do not exploit the regularity exhibited by wavefunctions with respect
to their molecular geometries. Inspired by recent successful applications of
deep transfer learning in machine translation and computer vision tasks, we
attempt to leverage this regularity by introducing a weight-sharing constraint
when optimizing neural network-based models for different molecular geometries.
That is, we restrict the optimization process such that up to 95 percent of
weights in a neural network model are in fact equal across varying molecular
geometries. We find that this technique can accelerate optimization when
considering sets of nuclear geometries of the same molecule by an order of
magnitude and that it opens a promising route towards pre-trained neural
network wavefunctions that yield high accuracy even across different molecules.
Related papers
- Neural Pfaffians: Solving Many Many-Electron Schrödinger Equations [58.130170155147205]
Neural wave functions have achieved unprecedented accuracy in approximating the ground state of many-electron systems, though at a high computational cost.
Recent works proposed amortizing the cost by learning generalized wave functions across different structures and compounds instead of solving each problem independently.
This work tackles the problem by defining overparametrized, fully learnable neural wave functions suitable for generalization across molecules.
arXiv Detail & Related papers (2024-05-23T16:30:51Z)
- Towards a Foundation Model for Neural Network Wavefunctions [5.145741425164946]
We propose a novel neural network ansatz, which maps uncorrelated, computationally cheap Hartree-Fock orbitals to correlated, high-accuracy neural network orbitals.
This ansatz is inherently capable of learning a single wavefunction across multiple compounds and geometries.
arXiv Detail & Related papers (2023-03-17T16:03:10Z)
- Towards Neural Variational Monte Carlo That Scales Linearly with System Size [67.09349921751341]
Quantum many-body problems are central to demystifying some exotic quantum phenomena, e.g., high-temperature superconductors.
The combination of neural networks (NN) for representing quantum states, and the Variational Monte Carlo (VMC) algorithm, has been shown to be a promising method for solving such problems.
We propose a NN architecture called Vector-Quantized Neural Quantum States (VQ-NQS) that utilizes vector-quantization techniques to leverage redundancies in the local-energy calculations of the VMC algorithm.
arXiv Detail & Related papers (2022-12-21T19:00:04Z)
- A Self-Attention Ansatz for Ab-initio Quantum Chemistry [3.4161707164978137]
We present a novel neural network architecture using self-attention, the Wavefunction Transformer (Psiformer).
We show that the Psiformer can be used as a drop-in replacement for other neural networks, often dramatically improving the accuracy of the calculations.
This demonstrates that self-attention networks can learn complex quantum mechanical correlations between electrons, and are a promising route to reaching unprecedented accuracy in chemical calculations on larger systems.
arXiv Detail & Related papers (2022-11-24T15:38:55Z)
- Solving the nuclear pairing model with neural network quantum states [58.720142291102135]
We present a variational Monte Carlo method that solves the nuclear many-body problem in the occupation number formalism.
A memory-efficient version of the reconfiguration algorithm is developed to train the network by minimizing the expectation value of the Hamiltonian.
arXiv Detail & Related papers (2022-11-09T00:18:01Z)
- Neural network enhanced measurement efficiency for molecular groundstates [63.36515347329037]
We adapt common neural network models to learn complex groundstate wavefunctions for several molecular qubit Hamiltonians.
We find that using a neural network model provides a robust improvement over using single-copy measurement outcomes alone to reconstruct observables.
arXiv Detail & Related papers (2022-06-30T17:45:05Z)
- Ab-Initio Potential Energy Surfaces by Pairing GNNs with Neural Wave Functions [2.61072980439312]
In this work, we combine a Graph Neural Network (GNN) with a neural wave function to simultaneously solve the Schr\"odinger equation for multiple geometries via VMC.
Compared to existing state-of-the-art networks, our Potential Energy Surface Network (PESNet) speeds up training for multiple geometries by up to 40 times while matching or surpassing their accuracy.
arXiv Detail & Related papers (2021-10-11T07:58:31Z)
- Autoregressive neural-network wavefunctions for ab initio quantum chemistry [3.5987961950527287]
We parameterise the electronic wavefunction with a novel autoregressive neural network (ARN).
This allows us to perform electronic structure calculations on molecules with up to 30 spin-orbitals.
arXiv Detail & Related papers (2021-09-26T13:44:41Z)
- Gaussian Moments as Physically Inspired Molecular Descriptors for Accurate and Scalable Machine Learning Potentials [0.0]
We propose a machine learning method for constructing high-dimensional potential energy surfaces based on feed-forward neural networks.
The accuracy of the developed approach in representing both chemical and configurational spaces is comparable to that of several established machine learning models.
arXiv Detail & Related papers (2021-09-15T16:46:46Z)
- Variational Monte Carlo calculations of $\mathbf{A\leq 4}$ nuclei with an artificial neural-network correlator ansatz [62.997667081978825]
We introduce a neural-network quantum state ansatz to model the ground-state wave function of light nuclei.
We compute the binding energies and point-nucleon densities of $A\leq 4$ nuclei as emerging from a leading-order pionless effective field theory Hamiltonian.
arXiv Detail & Related papers (2020-07-28T14:52:28Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences arising from its use.