Neural network quantum state with proximal optimization: a ground-state
searching scheme based on variational Monte Carlo
- URL: http://arxiv.org/abs/2210.16493v1
- Date: Sat, 29 Oct 2022 04:55:39 GMT
- Title: Neural network quantum state with proximal optimization: a ground-state
searching scheme based on variational Monte Carlo
- Authors: Feng Chen and Ming Xue
- Abstract summary: We introduce a novel objective function with proximal optimization (PO) that enables multiple updates by reusing mismatched samples.
We investigate the performance of our VMC-PO algorithm for ground-state searching with a one-dimensional transverse-field Ising model and a two-dimensional Heisenberg antiferromagnet on a square lattice.
- Score: 4.772126473623257
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural network quantum states (NQS), combined with the variational Monte
Carlo (VMC) method, are a promising way to investigate quantum many-body physics.
Whereas vanilla VMC methods perform one gradient update per sample, we introduce
a novel objective function with proximal optimization (PO) that enables multiple
updates by reusing mismatched samples. Our VMC-PO method retains the advantage of
the earlier importance-sampling gradient optimization algorithm [L. Yang et al.,
Phys. Rev. Research 2, 012039(R) (2020)], which uses sampled states efficiently.
PO mitigates numerical instabilities during network updates, much like stochastic
reconfiguration (SR) methods, but admits an alternative, simpler implementation
with lower computational complexity. We investigate the performance of our VMC-PO
algorithm for ground-state searching with a one-dimensional transverse-field Ising
model and a two-dimensional Heisenberg antiferromagnet on a square lattice, and
demonstrate that the ground-state energies reached are comparable to
state-of-the-art results.
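The abstract does not spell out the PO objective, but its description (reusing mismatched, importance-reweighted samples while damping instabilities) suggests a PPO-style clipped surrogate. The sketch below is one plausible reading under that assumption: the function name po_vmc_update, the clipping form, and the toy inputs are all illustrative, the gradient formula assumes real log-amplitudes, and none of this is the paper's actual implementation.

```python
import numpy as np

def po_vmc_update(log_psi_new, log_psi_old, e_loc, o_k, eps=0.2):
    """One proximal-optimization (PO) gradient estimate that reuses a stale
    batch drawn from |psi_old|^2 (real log-amplitudes assumed).

    log_psi_new : (B,)  log-amplitudes of the batch under the current params
    log_psi_old : (B,)  log-amplitudes under the params that generated the batch
    e_loc       : (B,)  local energies E_loc(s) under the current params
    o_k         : (B,P) log-derivatives O_k(s) = d log psi / d theta_k
    eps         : clipping range that keeps each update "proximal"
    """
    # ISGO-style importance weights w = |psi_new / psi_old|^2, self-normalized
    # over the batch so unknown normalization constants cancel.
    w = np.exp(2.0 * (log_psi_new - log_psi_old))
    w /= w.mean()
    # PPO-style clipping (our assumption about the PO form): weights far from 1
    # signal a badly mismatched batch, so their influence is bounded.
    w_clip = np.clip(w, 1.0 - eps, 1.0 + eps)
    e_mean = np.sum(w_clip * e_loc) / np.sum(w_clip)
    # Reweighted VMC energy gradient, g_k = 2 <w (E_loc - <E>) O_k>.
    grad = 2.0 * (w_clip * (e_loc - e_mean)) @ o_k / np.sum(w_clip)
    return grad, e_mean

# Toy usage with random stand-ins for B samples and P parameters.
rng = np.random.default_rng(0)
B, P = 512, 16
log_old = rng.normal(size=B)
log_new = log_old + 0.05 * rng.normal(size=B)  # parameters drifted slightly
e_loc = rng.normal(loc=-1.0, size=B)
o_k = rng.normal(size=(B, P))
grad, e = po_vmc_update(log_new, log_old, e_loc, o_k)
print(grad.shape, round(float(e), 4))
```

In a full VMC-PO loop one would draw a batch from |psi_old|^2, take several such clipped gradient steps on that same batch, and only then resample; the clipping bounds how far the updated distribution may drift from the one actually sampled, a stabilizing role loosely analogous to SR but at first-order cost.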
Related papers
- Paths towards time evolution with larger neural-network quantum states [17.826631514127012]
We consider a quantum quench from the paramagnetic to the anti-ferromagnetic phase in the tilted Ising model.
We show that for both types of networks, the projected time-dependent variational Monte Carlo (p-tVMC) method performs better than the non-projected approach.
arXiv Detail & Related papers (2024-06-05T15:32:38Z)
- Unleashing Network Potentials for Semantic Scene Completion [50.95486458217653]
This paper proposes a novel SSC framework, the Adversarial Modality Modulation Network (AMMNet).
AMMNet introduces two core modules: a cross-modal modulation enabling the interdependence of gradient flows between modalities, and a customized adversarial training scheme leveraging dynamic gradient competition.
Extensive experimental results demonstrate that AMMNet outperforms state-of-the-art SSC methods by a large margin.
arXiv Detail & Related papers (2024-03-12T11:48:49Z)
- Improving sample efficiency of high dimensional Bayesian optimization with MCMC [7.241485121318798]
We propose a new method based on Markov Chain Monte Carlo to efficiently sample from an approximated posterior.
We show experimentally that both the Metropolis-Hastings and the Langevin Dynamics version of our algorithm outperform state-of-the-art methods in high-dimensional sequential optimization and reinforcement learning benchmarks.
arXiv Detail & Related papers (2024-01-05T05:56:42Z)
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC performs both parameter estimation and particle proposal adaptation efficiently and entirely on the fly.
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
- Federated Conditional Stochastic Optimization [110.513884892319]
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and MAML.
This paper proposes conditional stochastic optimization algorithms for federated learning.
arXiv Detail & Related papers (2023-10-04T01:47:37Z)
- Scalable Imaginary Time Evolution with Neural Network Quantum States [0.0]
The representation of a quantum wave function as a neural network quantum state (NQS) provides a powerful variational ansatz for finding the ground states of many-body quantum systems.
We introduce an approach that bypasses the computation of the metric tensor and instead relies exclusively on first-order descent with the Euclidean metric.
We make this method adaptive and stable by determining the optimal time step and keeping the target fixed until the energy of the NQS decreases (see the toy sketch after this list).
arXiv Detail & Related papers (2023-07-28T12:26:43Z)
- Decomposition of Matrix Product States into Shallow Quantum Circuits [62.5210028594015]
Tensor network (TN) algorithms can be mapped to parametrized quantum circuits (PQCs).
We propose a new protocol for approximating TN states using realistic quantum circuits.
Our results reveal that one particular protocol, involving sequential growth and optimization of the quantum circuit, outperforms all other methods.
arXiv Detail & Related papers (2022-09-01T17:08:41Z)
- Continuous-variable optimization with neural network quantum states [6.791920570692005]
We investigate the utility of continuous-variable neural network quantum states (CV-NQS) for performing continuous optimization.
Numerical experiments conducted using variational Monte Carlo with CV-NQS indicate that although the non-local algorithm succeeds in finding ground states competitive with the local gradient search methods, the proposal suffers from unfavorable scaling.
arXiv Detail & Related papers (2021-08-06T22:45:09Z)
- Rayleigh-Gauss-Newton optimization with enhanced sampling for variational Monte Carlo [0.0]
We analyze optimization and sampling methods used in Variational Monte Carlo.
We introduce alterations to improve their performance.
In particular, we demonstrate that RGN can be made robust to energy spikes.
arXiv Detail & Related papers (2021-06-19T19:05:52Z)
- LocalDrop: A Hybrid Regularization for Deep Neural Networks [98.30782118441158]
We propose LocalDrop, a new approach to regularizing neural networks via the local Rademacher complexity.
A new regularization function for both fully-connected networks (FCNs) and convolutional neural networks (CNNs) has been developed based on the proposed upper bound of the local Rademacher complexity.
arXiv Detail & Related papers (2021-03-01T03:10:11Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
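The "Scalable Imaginary Time Evolution" entry above describes a control flow concrete enough to sketch: build a fixed imaginary-time target, descend toward it with plain first-order (Euclidean) steps, and refresh the target only once the energy has dropped. In the toy below the state amplitudes themselves serve as the variational parameters, so the descent is deliberately trivial and no metric tensor appears; the (1 - tau*H) target, the 50-step inner loop, and the tau-halving rule are illustrative guesses, not that paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy Hamiltonian: a random real symmetric matrix standing in for a
# many-body H; a real NQS calculation would never instantiate H densely.
D = 64
H = rng.normal(size=(D, D))
H = (H + H.T) / 2.0

def normalize(v):
    return v / np.linalg.norm(v)

def energy(psi):
    return psi @ H @ psi  # <psi|H|psi> for a normalized real state

psi = normalize(rng.normal(size=D))  # "parameters" = amplitudes
tau, lr = 0.1, 0.2                   # imaginary time step, descent rate

for outer in range(300):
    e_old = energy(psi)
    # Fixed target: one first-order imaginary-time step, (1 - tau*H)|psi>.
    target = normalize(psi - tau * (H @ psi))
    # Euclidean first-order descent on ||psi - target||^2, keeping the
    # target frozen until the energy actually decreases.
    for inner in range(50):
        psi = normalize(psi - lr * 2.0 * (psi - target))
        if energy(psi) < e_old:
            break
    else:
        tau *= 0.5  # no progress: shrink the time step (crude adaptivity)

print(f"variational E = {energy(psi):.4f}, "
      f"exact E0 = {np.linalg.eigvalsh(H).min():.4f}")
```

With a genuine NQS the inner loop would instead backpropagate through log psi_theta, but the outer logic (frozen target, energy-gated refresh, adaptive time step) stays the same.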
This list is automatically generated from the titles and abstracts of the papers on this site.