Sampling-free Inference for Ab-Initio Potential Energy Surface Networks
- URL: http://arxiv.org/abs/2205.14962v1
- Date: Mon, 30 May 2022 10:00:59 GMT
- Title: Sampling-free Inference for Ab-Initio Potential Energy Surface Networks
- Authors: Nicholas Gao, Stephan Günnemann
- Abstract summary: A potential energy surface network (PESNet) has been proposed to reduce training time by solving the Schrödinger equation for many geometries simultaneously.
Here, we address the inference shortcomings by proposing the Potential learning from ab-initio Networks (PlaNet) framework to simultaneously train a surrogate model that avoids expensive Monte-Carlo integration.
In this way, we can accurately model high-resolution multi-dimensional energy surfaces that previously would have been unobtainable via neural wave functions.
- Score: 2.088583843514496
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Obtaining the energy of molecular systems typically requires solving the
associated Schrödinger equation. Unfortunately, analytical solutions only
exist for single-electron systems, and accurate approximate solutions are
expensive. In recent work, the potential energy surface network (PESNet) has
been proposed to reduce training time by solving the Schrödinger equation for
many geometries simultaneously. While training significantly faster, inference
still required numerical integration, limiting the evaluation to a few
geometries. Here, we address the inference shortcomings by proposing the
Potential learning from ab-initio Networks (PlaNet) framework to simultaneously
train a surrogate model that avoids expensive Monte-Carlo integration and,
thus, reduces inference time from minutes or even hours to milliseconds. In
this way, we can accurately model high-resolution multi-dimensional energy
surfaces that previously would have been unobtainable via neural wave
functions. Finally, we present PESNet++, an architectural improvement to
PESNet, that reduces errors by up to 39% and provides new state-of-the-art
results for neural wave functions across all systems evaluated.
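To make the inference speed-up concrete, here is a minimal sketch of the surrogate idea in PyTorch: during wave-function training, noisy Monte-Carlo energy estimates for the visited geometries are regressed by a cheap geometry-to-energy model, which can then be evaluated densely in milliseconds. The module names, architecture, and training loop below are illustrative assumptions, not the PlaNet implementation.

```python
# Illustrative sketch (not the PlaNet code): fit a cheap surrogate E_phi(geometry)
# to Monte-Carlo energy estimates produced during VMC training, then use it for
# sampling-free inference over a dense grid of geometries.
import torch
import torch.nn as nn

class EnergySurrogate(nn.Module):
    """Maps a flattened geometry descriptor to a scalar energy estimate."""
    def __init__(self, n_coords: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_coords, hidden), nn.SiLU(),
            nn.Linear(hidden, hidden), nn.SiLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, geometry: torch.Tensor) -> torch.Tensor:
        return self.net(geometry).squeeze(-1)

surrogate = EnergySurrogate(n_coords=3)              # toy 3-dimensional geometry descriptor
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

def surrogate_step(geometries: torch.Tensor, mc_energies: torch.Tensor) -> float:
    """One regression step on noisy Monte-Carlo energy estimates from VMC training."""
    opt.zero_grad()
    loss = torch.mean((surrogate(geometries) - mc_energies) ** 2)
    loss.backward()
    opt.step()
    return loss.item()

# Inference: scanning the energy surface costs only forward passes, no sampling.
grid = torch.linspace(0.5, 3.0, 1000).unsqueeze(-1).repeat(1, 3)
with torch.no_grad():
    energies = surrogate(grid)
```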
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to a 48% performance gain on the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Solving partial differential equations with sampled neural networks [1.8590821261905535]
Approximation of solutions to partial differential equations (PDE) is an important problem in computational science and engineering.
We discuss how sampling the hidden weights and biases of the ansatz network from data-agnostic and data-dependent probability distributions allows us to progress on both challenges.
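As a rough illustration of what sampling the hidden parameters means, the snippet below builds a one-hidden-layer ansatz whose weights and biases are drawn from a data-agnostic distribution, with only the linear output layer fitted by least squares; the distributions, sizes, and toy regression target are illustrative assumptions (for a PDE, one would fit residual and boundary conditions instead).

```python
# Illustrative sketch of a sampled network: hidden weights/biases are drawn at
# random rather than trained, and only the linear output layer is fitted.
import numpy as np

rng = np.random.default_rng(0)

def sampled_network_fit(x: np.ndarray, y: np.ndarray, n_hidden: int = 200, scale: float = 3.0):
    """Fit y ~ f(x) with randomly sampled tanh features; returns a predictor."""
    W = rng.normal(0.0, scale, size=(x.shape[1], n_hidden))   # data-agnostic sampling
    b = rng.uniform(-np.pi, np.pi, size=n_hidden)
    phi = np.tanh(x @ W + b)
    c, *_ = np.linalg.lstsq(phi, y, rcond=None)               # convex output-layer fit
    return lambda x_new: np.tanh(x_new @ W + b) @ c

x = np.linspace(0.0, 1.0, 200)[:, None]
f = sampled_network_fit(x, np.sin(2 * np.pi * x[:, 0]))
print(np.max(np.abs(f(x) - np.sin(2 * np.pi * x[:, 0]))))     # small training error
```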
arXiv Detail & Related papers (2024-05-31T14:24:39Z)
- Good Lattice Training: Physics-Informed Neural Networks Accelerated by Number Theory [7.462336024223669]
We propose a new technique called good lattice training (GLT) for PINNs.
GLT offers a set of collocation points that are effective even with a small number of points and for multi-dimensional spaces.
Our experiments demonstrate that GLT requires 2--20 times fewer collocation points than uniformly random sampling or Latin hypercube sampling.
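For intuition, number-theoretic collocation sets of this kind can be generated with a rank-1 (Korobov-style) lattice rule, as in the sketch below; the generating vector and point count are illustrative choices, not values from the paper.

```python
# Illustrative rank-1 lattice ("good lattice points") as low-discrepancy
# collocation points for a PINN loss; generating vector chosen for the demo.
import numpy as np

def rank1_lattice(n_points: int, gen_vector) -> np.ndarray:
    """Return n_points lattice points in the unit cube [0, 1)^d."""
    z = np.asarray(gen_vector, dtype=np.int64)
    i = np.arange(n_points)[:, None]
    return np.mod(i * z[None, :] / n_points, 1.0)

pts = rank1_lattice(144, gen_vector=[1, 89])   # 2-D Fibonacci-type lattice
print(pts.shape)                               # (144, 2)
```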
arXiv Detail & Related papers (2023-07-26T00:01:21Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning tasks into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
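As a rough picture of the spatial part of such a decomposition, the sketch below splits a fine 2-D field into interleaved coarse sub-fields that could be learned in parallel and losslessly reassembled; the staggering factor and field size are illustrative, and the actual NeuralStagger training scheme is not reproduced here.

```python
# Illustrative spatial staggering: a fine field is decomposed into interleaved
# coarse sub-fields (candidate coarser-resolution subtasks) and reassembled.
import numpy as np

def stagger(field: np.ndarray, s: int):
    """Split an (H, W) field into s*s sub-fields of shape (H//s, W//s)."""
    return [field[i::s, j::s] for i in range(s) for j in range(s)]

def unstagger(subfields, s: int) -> np.ndarray:
    h, w = subfields[0].shape
    out = np.empty((h * s, w * s), dtype=subfields[0].dtype)
    for k, sub in enumerate(subfields):
        i, j = divmod(k, s)
        out[i::s, j::s] = sub
    return out

fine = np.random.rand(64, 64)
parts = stagger(fine, s=2)                       # four 32x32 subtasks
assert np.allclose(unstagger(parts, s=2), fine)  # decomposition is lossless
```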
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
- LordNet: An Efficient Neural Network for Learning to Solve Parametric Partial Differential Equations without Simulated Data [47.49194807524502]
We propose LordNet, a tunable and efficient neural network for modeling entanglements.
Experiments on solving Poisson's equation and the (2D and 3D) Navier-Stokes equations demonstrate that long-range entanglements can be modeled well by LordNet.
arXiv Detail & Related papers (2022-06-19T14:41:08Z)
- Ab-Initio Potential Energy Surfaces by Pairing GNNs with Neural Wave Functions [2.61072980439312]
In this work, we combine a Graph Neural Network (GNN) with a neural wave function to simultaneously solve the Schrödinger equation for multiple geometries via variational Monte Carlo (VMC).
Compared to existing state-of-the-art networks, our Potential Energy Surface Network (PESNet) speeds up training for multiple geometries by up to 40 times while matching or surpassing their accuracy.
arXiv Detail & Related papers (2021-10-11T07:58:31Z)
- PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving Spatiotemporal PDEs [8.220908558735884]
Partial differential equations (PDEs) play a fundamental role in modeling and simulating problems across a wide range of disciplines.
Recent advances in deep learning have shown the great potential of physics-informed neural networks (NNs) to solve PDEs as a basis for data-driven inverse analysis.
We propose novel physics-informed convolutional-recurrent learning architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled data.
arXiv Detail & Related papers (2021-06-26T22:22:19Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
- AutoInt: Automatic Integration for Fast Neural Volume Rendering [51.46232518888791]
We propose a new framework for learning efficient, closed-form solutions to integrals using implicit neural representation networks.
We demonstrate a greater than 10x improvement in computational requirements, enabling fast neural volume rendering.
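The core automatic-integration trick can be illustrated in a few lines: train the derivative of a network to match the integrand, then recover any definite integral from two evaluations of the antiderivative network. The toy integrand, sizes, and optimizer below are assumptions; the full AutoInt pipeline (grad networks, volume-rendering integrals) is not reproduced.

```python
# Illustrative automatic integration: fit dF/dt to an integrand f, then
# integrate via F(b) - F(a) with no quadrature at inference time.
import torch
import torch.nn as nn

F = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                  nn.Linear(64, 64), nn.Tanh(),
                  nn.Linear(64, 1))
opt = torch.optim.Adam(F.parameters(), lr=1e-3)
f = lambda t: torch.sin(3.0 * t)                 # toy integrand

for _ in range(2000):
    t = torch.rand(256, 1, requires_grad=True)
    dF = torch.autograd.grad(F(t).sum(), t, create_graph=True)[0]
    loss = torch.mean((dF - f(t)) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()

a, b = torch.zeros(1, 1), torch.ones(1, 1)
with torch.no_grad():
    integral = (F(b) - F(a)).item()              # ~ (1 - cos(3)) / 3
```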
arXiv Detail & Related papers (2020-12-03T05:46:10Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, known as physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions as well as state-of-the-art numerical solvers, such as spectral solvers.
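For readers unfamiliar with PINNs, the sketch below shows the basic ingredient shared by these solvers, a PDE-residual loss evaluated with automatic differentiation, on a 1-D Poisson toy problem; it is a generic PINN, not the GatedPINN architecture, and all hyperparameters are illustrative.

```python
# Generic PINN sketch (not GatedPINN): solve u''(x) = -sin(pi x) on [0, 1]
# with u(0) = u(1) = 0 by minimizing the PDE residual plus boundary terms.
import math
import torch
import torch.nn as nn

u = nn.Sequential(nn.Linear(1, 64), nn.Tanh(),
                  nn.Linear(64, 64), nn.Tanh(),
                  nn.Linear(64, 1))
opt = torch.optim.Adam(u.parameters(), lr=1e-3)
source = lambda x: -torch.sin(math.pi * x)

for _ in range(3000):
    x = torch.rand(128, 1, requires_grad=True)                   # collocation points
    ux = torch.autograd.grad(u(x).sum(), x, create_graph=True)[0]
    uxx = torch.autograd.grad(ux.sum(), x, create_graph=True)[0]
    residual = torch.mean((uxx - source(x)) ** 2)
    boundary = (u(torch.zeros(1, 1)) ** 2 + u(torch.ones(1, 1)) ** 2).squeeze()
    loss = residual + boundary
    opt.zero_grad(); loss.backward(); opt.step()
# Analytic solution for comparison: u(x) = sin(pi x) / pi**2
```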
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- EikoNet: Solving the Eikonal equation with Deep Neural Networks [6.735657356113614]
We propose EikoNet, a deep learning approach to solving the Eikonal equation.
Our grid-free approach allows for rapid determination of the travel time between any two points within a continuous 3D domain.
The developed approach has important applications to earthquake hypocenter inversion, ray multi-pathing, and tomographic modeling.
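A minimal version of such a grid-free travel-time network looks roughly as follows: a network takes a source-receiver pair and is trained so that the gradient norm of its output matches the local slowness 1/v, i.e. the eikonal residual; the toy velocity model, network sizes, and the absence of EikoNet's factored travel-time formulation are all simplifying assumptions.

```python
# Illustrative eikonal-residual training: |grad_xr T(xs, xr)| should equal the
# slowness 1/v(xr); velocity model and network here are toy choices.
import torch
import torch.nn as nn

T = nn.Sequential(nn.Linear(6, 128), nn.ELU(),
                  nn.Linear(128, 128), nn.ELU(),
                  nn.Linear(128, 1))
opt = torch.optim.Adam(T.parameters(), lr=1e-3)
velocity = lambda x: 2.0 + 0.5 * x[:, 2:3]       # toy v(z): speed increases with depth

for _ in range(2000):
    xs = torch.rand(256, 3)                       # source points
    xr = torch.rand(256, 3, requires_grad=True)   # receiver points
    t = T(torch.cat([xs, xr], dim=-1))
    grad_t = torch.autograd.grad(t.sum(), xr, create_graph=True)[0]
    loss = torch.mean((grad_t.norm(dim=-1, keepdim=True) - 1.0 / velocity(xr)) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()
```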
arXiv Detail & Related papers (2020-03-25T02:31:53Z)