Simulations of state-of-the-art fermionic neural network wave functions with diffusion Monte Carlo
- URL: http://arxiv.org/abs/2103.12570v2
- Date: Wed, 24 Mar 2021 19:13:18 GMT
- Title: Simulations of state-of-the-art fermionic neural network wave functions with diffusion Monte Carlo
- Authors: Max Wilson, Nicholas Gao, Filip Wudarski, Eleanor Rieffel and Norm M. Tubman
- Abstract summary: We introduce several modifications to the network (Fermi Net) and optimization method (Kronecker Factored Approximate Curvature) that reduce the number of required resources while maintaining or improving the modelling performance.
The Diffusion Monte Carlo results exceed or match state-of-the-art performance for all systems investigated.
- Score: 2.039924457892648
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recently developed neural network-based \emph{ab-initio} solutions (Pfau et al., arXiv:1909.02487v2) for finding ground states of fermionic systems can
generate state-of-the-art results on a broad class of systems. In this work, we
improve the results for this Ansatz with Diffusion Monte Carlo. Additionally,
we introduce several modifications to the network (Fermi Net) and optimization
method (Kronecker Factored Approximate Curvature) that reduce the number of
required resources while maintaining or improving the modelling performance. In
terms of the model, we remove redundant computations and alter the way data is
handled in the permutation equivariant function. The Diffusion Monte Carlo
results exceed or match state-of-the-art performance for all systems
investigated: atomic systems Be-Ne, and the carbon cation C$^+$.
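As a rough sketch of the method the abstract builds on, the toy diffusion Monte Carlo loop below projects a deliberately imperfect one-dimensional harmonic-oscillator trial state toward the ground state. The trial function, the parameter `alpha`, and the simple multinomial branching are illustrative stand-ins for the paper's Fermi Net ansatz and production DMC machinery, and the Metropolis accept/reject correction is omitted for brevity.

```python
# Minimal importance-sampled DMC loop: drift-diffusion moves plus
# weight-based branching. The toy trial function psi_T(x) = exp(-alpha x^2)
# stands in for the paper's Fermi Net (an illustrative assumption).
import numpy as np

rng = np.random.default_rng(0)
alpha = 0.6  # deliberately imperfect variational parameter

def drift(x):         # quantum force 2 * d/dx ln psi_T
    return -4.0 * alpha * x

def local_energy(x):  # E_L = -psi_T'' / (2 psi_T) + x^2 / 2
    return alpha - 2.0 * alpha**2 * x**2 + 0.5 * x**2

tau, n_steps = 0.01, 2000
walkers = rng.normal(size=512)        # initial walker ensemble
e_ref = local_energy(walkers).mean()  # reference energy offset

for _ in range(n_steps):
    # Langevin move: drift along the quantum force plus Gaussian diffusion.
    new = walkers + tau * drift(walkers) + np.sqrt(tau) * rng.normal(size=walkers.size)
    # Branching weight from the averaged local energies.
    w = np.exp(-tau * (0.5 * (local_energy(walkers) + local_energy(new)) - e_ref))
    # Simple multinomial branching: resample walkers proportionally to weight.
    walkers = new[rng.choice(walkers.size, size=walkers.size, p=w / w.sum())]
    e_ref = local_energy(walkers).mean()

print("DMC energy estimate:", local_energy(walkers).mean())  # exact: 0.5
```

Starting from alpha = 0.6, the estimate drifts toward the exact ground-state value of 0.5, which is the sense in which DMC "improves the results for this Ansatz" beyond what the variational optimization alone achieves.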
Related papers
- von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z)
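For context on the entry above, a plain random-walk Metropolis sampler targeting a von Mises density on the circle looks as follows; this is a generic baseline, not the paper's Stratonovich-like augmentation, and `mu` and `kappa` are arbitrary choices.

```python
# Baseline random-walk Metropolis sampler for a von Mises target on the circle.
import numpy as np

rng = np.random.default_rng(1)
mu, kappa = 0.3, 4.0

def log_p(theta):  # unnormalized von Mises log-density
    return kappa * np.cos(theta - mu)

theta, samples = 0.0, []
for _ in range(20000):
    # Wrapped Gaussian proposal keeps the state in [-pi, pi).
    prop = (theta + rng.normal(scale=0.5) + np.pi) % (2.0 * np.pi) - np.pi
    if np.log(rng.uniform()) < log_p(prop) - log_p(theta):
        theta = prop
    samples.append(theta)

# The circular mean of the draws should approach mu.
print(np.angle(np.mean(np.exp(1j * np.array(samples)))))
```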
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC performs both parameter estimation and particle proposal adaptation efficiently and entirely on-the-fly.
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
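A minimal bootstrap particle filter conveys the sequential Monte Carlo core that VSMC builds on; the linear-Gaussian model and noise scales below are illustrative assumptions, and the variational and online components are omitted.

```python
# Plain bootstrap particle filter on a simulated linear-Gaussian state-space model.
import numpy as np

rng = np.random.default_rng(2)
T, N = 50, 1000

# Simulate x_t = 0.9 x_{t-1} + process noise, y_t = x_t + observation noise.
x = np.zeros(T)
for t in range(1, T):
    x[t] = 0.9 * x[t - 1] + rng.normal(scale=0.5)
y = x + rng.normal(scale=0.3, size=T)

particles = rng.normal(size=N)
for t in range(T):
    particles = 0.9 * particles + rng.normal(scale=0.5, size=N)  # propagate
    logw = -0.5 * ((y[t] - particles) / 0.3) ** 2                # weight by likelihood
    w = np.exp(logw - logw.max())
    particles = particles[rng.choice(N, size=N, p=w / w.sum())]  # resample

print("filtered mean at final step:", particles.mean(), "truth:", x[-1])
```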
- A Bayesian Take on Gaussian Process Networks [1.7188280334580197]
This work implements Monte Carlo and Markov Chain Monte Carlo methods to sample from the posterior distribution of network structures.
We show that our method outperforms state-of-the-art algorithms in recovering the graphical structure of the network.
arXiv Detail & Related papers (2023-06-20T08:38:31Z)
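A heavily simplified sketch of structure MCMC for the entry above: Metropolis moves that toggle single edges, scored here by a placeholder sparsity prior rather than the Gaussian process marginal likelihood a faithful implementation would use.

```python
# Edge-flip Metropolis over directed graph structures with a stand-in score.
import numpy as np

rng = np.random.default_rng(3)
n = 5
adj = np.zeros((n, n), dtype=bool)  # adjacency matrix of the current graph

def log_score(a):
    # Hypothetical placeholder favouring sparse graphs; a real implementation
    # would score each structure against the observed data.
    return -2.0 * a.sum()

for _ in range(5000):
    i, j = rng.choice(n, size=2, replace=False)  # pick a candidate edge
    prop = adj.copy()
    prop[i, j] = ~prop[i, j]                     # toggle it
    if np.log(rng.uniform()) < log_score(prop) - log_score(adj):
        adj = prop

print("edges in the last sampled structure:", int(adj.sum()))
```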
- Cheap and Deterministic Inference for Deep State-Space Models of Interacting Dynamical Systems [38.23826389188657]
We present a deep state-space model which employs graph neural networks to model the underlying interacting dynamical system.
The predictive distribution is multimodal and has the form of a Gaussian mixture model, where the moments of the Gaussian components can be computed via deterministic moment matching rules.
Our moment matching scheme can be exploited for sample-free inference, leading to more efficient and stable training compared to Monte Carlo alternatives.
arXiv Detail & Related papers (2023-05-02T20:30:23Z)
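The deterministic moment-matching rule the summary mentions can be illustrated in one dimension by collapsing a Gaussian mixture to its matched mean and variance; the mixture parameters below are arbitrary.

```python
# Deterministic moment matching: collapse a 1D Gaussian mixture into a
# single Gaussian with the same first two moments.
import numpy as np

w = np.array([0.2, 0.5, 0.3])     # mixture weights
mu = np.array([-1.0, 0.0, 2.0])   # component means
var = np.array([0.5, 1.0, 0.25])  # component variances

m = np.sum(w * mu)                    # matched mean
v = np.sum(w * (var + mu**2)) - m**2  # matched variance (law of total variance)
print("matched mean:", m, "matched variance:", v)
```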
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
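A minimal sketch of the probabilistic representation behind such solvers: for the heat equation, the solution is an expectation over Brownian particles, which a Monte Carlo average estimates directly. The initial condition and evaluation point are arbitrary choices for the check.

```python
# Feynman-Kac illustration: for u_t = u_xx / 2, u(x, t) = E[g(x + W_t)]
# over Brownian particles, checked against the exact answer for g = sin.
import numpy as np

rng = np.random.default_rng(4)
g = lambda x: np.sin(x)  # initial condition u(x, 0)
x0, t, n = 0.7, 0.5, 200000

u_mc = g(x0 + np.sqrt(t) * rng.normal(size=n)).mean()  # particle ensemble average
u_exact = np.exp(-0.5 * t) * np.sin(x0)                # heat semigroup acting on sin
print("Monte Carlo:", u_mc, "exact:", u_exact)
```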
- Synergy between deep neural networks and the variational Monte Carlo method for small $^4He_N$ clusters [0.0]
We introduce a neural network-based approach for modeling wave functions that satisfy Bose-Einstein statistics.
Applying this model to small $^4He_N$ clusters, we accurately predict ground state energies, pair density functions, and two-body contact parameters.
arXiv Detail & Related papers (2023-02-01T17:09:13Z)
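A toy DeepSets-style construction shows how summing per-particle features yields the exchange symmetry that Bose-Einstein statistics requires; the random weights below are illustrative, not the paper's architecture.

```python
# Permutation-invariant ansatz sketch: pooling per-particle embeddings
# makes log psi symmetric under particle exchange.
import numpy as np

rng = np.random.default_rng(5)
W1 = rng.normal(size=(1, 8))
W2 = rng.normal(size=(8, 1))

def log_psi(coords):
    h = np.tanh(coords[:, None] @ W1)  # per-particle embeddings
    pooled = h.sum(axis=0)             # permutation-invariant pooling
    return (pooled @ W2).item()

x = rng.normal(size=4)                 # four particle coordinates
print(log_psi(x), log_psi(x[::-1]))    # identical under permutation
```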
- Training neural networks using Metropolis Monte Carlo and an adaptive variant [0.0]
We study the zero-temperature Metropolis Monte Carlo algorithm as a tool for training a neural network by minimizing a loss function.
We find that, as expected on theoretical grounds and shown empirically by other authors, Metropolis Monte Carlo can train a neural net with an accuracy comparable to that of gradient descent.
arXiv Detail & Related papers (2022-05-16T01:01:55Z)
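A minimal sketch of the zero-temperature scheme on a linear least-squares stand-in for a neural network: propose a Gaussian weight perturbation and accept only moves that do not increase the loss.

```python
# Zero-temperature Metropolis Monte Carlo training on a toy model.
import numpy as np

rng = np.random.default_rng(6)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + 0.1 * rng.normal(size=100)

loss = lambda w: np.mean((X @ w - y) ** 2)
w, cur = np.zeros(3), loss(np.zeros(3))

for _ in range(5000):
    prop = w + 0.05 * rng.normal(size=3)  # Gaussian weight perturbation
    new = loss(prop)
    if new <= cur:                        # zero temperature: downhill moves only
        w, cur = prop, new

print("learned weights:", w, "final loss:", cur)
```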
- Equivariant vector field network for many-body system modeling [65.22203086172019]
The Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
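A small numerical check illustrates the equivariance property such layers are built around: channel-wise mixing of vector features commutes with a global rotation. The mixing matrix is a random toy, not EVFN's learned basis.

```python
# Rotation-equivariance check for a linear vector-feature layer.
import numpy as np

rng = np.random.default_rng(7)
V = rng.normal(size=(5, 3))  # five input vectors in R^3
W = rng.normal(size=(5, 5))  # channel-mixing weights

def vector_layer(V):
    return W @ V             # acts on channels, never on spatial axes

Q, _ = np.linalg.qr(rng.normal(size=(3, 3)))  # random orthogonal matrix
lhs = vector_layer(V @ Q.T)   # rotate inputs, then apply the layer
rhs = vector_layer(V) @ Q.T   # apply the layer, then rotate outputs
print(np.allclose(lhs, rhs))  # True: the layer is equivariant
```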
- Determinant-free fermionic wave function using feed-forward neural networks [0.0]
We propose a framework for finding the ground state of many-body fermionic systems by using feed-forward neural networks.
We show that the accuracy of the approximation can be improved by optimizing the "variance" of the energy simultaneously with the energy itself.
These improvements can be applied to other approaches based on variational Monte Carlo methods.
arXiv Detail & Related papers (2021-08-19T11:51:36Z)
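A sketch of the joint energy-plus-variance objective on synthetic local-energy samples, assuming a made-up weighting `lam`; for an exact eigenstate the variance term would vanish, which is why minimizing it alongside the energy improves the approximation.

```python
# Combined variational objective: mean local energy plus weighted variance.
import numpy as np

rng = np.random.default_rng(8)
e_loc = -2.9 + 0.3 * rng.normal(size=4096)  # stand-in local energies

lam = 0.5  # illustrative weighting between energy and variance
objective = e_loc.mean() + lam * e_loc.var()  # variance is zero at an eigenstate
print("energy:", e_loc.mean(), "variance:", e_loc.var(), "objective:", objective)
```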
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
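A minimal Hamiltonian (hybrid) Monte Carlo sampler on a one-dimensional Gaussian shows the continuous baseline the summary starts from; the SurVAE-flow transport into discrete spaces is the paper's addition and is not reproduced here.

```python
# Minimal HMC with leapfrog integration on a standard normal target.
import numpy as np

rng = np.random.default_rng(9)
logp = lambda q: -0.5 * q**2  # standard normal target
grad = lambda q: -q           # gradient of logp

def leapfrog(q, p, eps=0.2, L=10):
    p = p + 0.5 * eps * grad(q)
    for _ in range(L - 1):
        q = q + eps * p
        p = p + eps * grad(q)
    q = q + eps * p
    p = p + 0.5 * eps * grad(q)
    return q, p

q, samples = 0.0, []
for _ in range(5000):
    p0 = rng.normal()
    q_new, p_new = leapfrog(q, p0)
    dh = (-logp(q) + 0.5 * p0**2) - (-logp(q_new) + 0.5 * p_new**2)
    if np.log(rng.uniform()) < dh:  # Metropolis correction for integration error
        q = q_new
    samples.append(q)

print("mean:", np.mean(samples), "variance:", np.var(samples))  # ~0 and ~1
```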
- Nonlinear State-Space Generalizations of Graph Convolutional Neural Networks [172.18295279061607]
Graph convolutional neural networks (GCNNs) learn compositional representations from network data by nesting linear graph convolutions into nonlinearities.
In this work, we approach GCNNs from a state-space perspective revealing that the graph convolutional module is a minimalistic linear state-space model.
We show that this state update may be problematic because it is nonparametric, and depending on the graph spectrum it may explode or vanish.
We propose a novel family of nodal aggregation rules that aggregate node features within a layer in a nonlinear state-space parametric fashion allowing for a better trade-off.
arXiv Detail & Related papers (2020-10-27T19:48:56Z)
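A quick numerical illustration of the explode-or-vanish point above: iterating the linear state update with a raw graph shift blows up, while spectral normalization keeps it bounded. The random graph and the absence of learned weights are simplifying assumptions.

```python
# State-space reading of graph convolutions: x_{k+1} = S x_k grows or
# shrinks with the spectral radius of the graph shift S.
import numpy as np

rng = np.random.default_rng(10)
A = (rng.random((20, 20)) < 0.2).astype(float)
A = np.maximum(A, A.T)  # undirected adjacency matrix

rho = np.abs(np.linalg.eigvals(A)).max()  # spectral radius
x = rng.normal(size=20)

for S, name in [(A, "raw adjacency"), (A / rho, "spectrally normalized")]:
    h = x.copy()
    for _ in range(10):
        h = S @ h  # linear state update, no nonlinearity or weights
    print(name, "-> norm after 10 steps:", np.linalg.norm(h))
```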
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.