Direct sampling of projected entangled-pair states
- URL: http://arxiv.org/abs/2109.07356v2
- Date: Thu, 9 Jun 2022 14:11:57 GMT
- Title: Direct sampling of projected entangled-pair states
- Authors: Tom Vieijra, Jutho Haegeman, Frank Verstraete, Laurens Vanderstraeten
- Abstract summary: Variational Monte Carlo studies employing projected entangled-pair states (PEPS) have recently shown that they can provide answers to long-standing questions.
We propose a sampling algorithm that generates independent samples from a PEPS, bypassing all problems related to finite autocorrelation times.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Variational Monte Carlo studies employing projected entangled-pair states
(PEPS) have recently shown that they can provide answers to long-standing
questions such as the nature of the phases in the two-dimensional $J_1 - J_2$
model. The sampling in these Monte Carlo algorithms is typically performed with
Markov Chain Monte Carlo algorithms employing local update rules, which often
suffer from long autocorrelation times and interdependent samples. We propose a
sampling algorithm that generates independent samples from a PEPS, bypassing
all problems related to finite autocorrelation times. This algorithm is a
generalization of an existing direct sampling algorithm for unitary tensor
networks. We introduce an auxiliary probability distribution from which
independent samples can be drawn, and combine it with importance sampling in
order to evaluate expectation values accurately. We benchmark our algorithm on
the classical Ising model and on variational optimization of two-dimensional
quantum spin models.
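As a rough illustration of the estimator described above, the following Python sketch draws independent samples from a tractable auxiliary distribution and reweights them with self-normalized importance sampling. The toy target and auxiliary distributions (independent spins) are stand-ins chosen for brevity; in the paper the target is the PEPS Born distribution $|\langle s|\Psi\rangle|^2$ and the auxiliary distribution comes from the tensor-network construction.

```python
import numpy as np

# Sketch of the importance-sampling estimator: draw independent samples
# s_i from a tractable auxiliary distribution q(s), then reweight by
# w_i = p(s_i) / q(s_i) to estimate expectation values under the target p.
# Here p and q are toy product distributions over spins; in the paper,
# p = |<s|PEPS>|^2 and q is the paper's auxiliary distribution.

rng = np.random.default_rng(0)
n_spins, n_samples = 10, 20_000

# Toy target: independent spins with nonuniform probabilities (stand-in
# for the PEPS Born distribution, which is not a product in general).
p_up = np.full(n_spins, 0.7)
# Auxiliary distribution: uniform spins, trivially sampled independently.
q_up = np.full(n_spins, 0.5)

samples = rng.random((n_samples, n_spins)) < q_up          # independent draws from q
log_p = np.where(samples, np.log(p_up), np.log1p(-p_up)).sum(axis=1)
log_q = np.where(samples, np.log(q_up), np.log1p(-q_up)).sum(axis=1)
weights = np.exp(log_p - log_q)                             # importance weights p/q

observable = samples.sum(axis=1)                            # e.g. total magnetization
estimate = np.sum(weights * observable) / np.sum(weights)   # self-normalized estimator
print(estimate, n_spins * 0.7)                              # should agree closely
```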
Related papers
- Markov chain Monte Carlo without evaluating the target: an auxiliary variable approach [9.426953273977496]
Markov chain Monte Carlo algorithms can be unified under a simple common procedure.
We develop the theory of the new framework, applying it to existing algorithms to simplify and extend their results.
Several new algorithms emerge from this framework, with improved performance demonstrated on both synthetic and real datasets.
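For contrast, here is a minimal standard Metropolis-Hastings step, which evaluates the (unnormalized) target density in its acceptance ratio; this is the baseline that such auxiliary-variable frameworks generalize, not the paper's construction.

```python
import numpy as np

# Baseline Metropolis-Hastings: the acceptance ratio below evaluates the
# (unnormalized) target density directly, which is exactly the step an
# "MCMC without evaluating the target" framework seeks to avoid.

rng = np.random.default_rng(1)
log_target = lambda x: -0.5 * x**2           # unnormalized standard normal

x, chain = 0.0, []
for _ in range(10_000):
    proposal = x + rng.normal(scale=1.0)     # symmetric random-walk proposal
    # Accept with probability min(1, p(x') / p(x)) -- two target evaluations.
    if np.log(rng.random()) < log_target(proposal) - log_target(x):
        x = proposal
    chain.append(x)

print(np.mean(chain), np.var(chain))         # approx. 0 and 1
```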
arXiv Detail & Related papers (2024-06-07T20:06:23Z)
- You Only Accept Samples Once: Fast, Self-Correcting Stochastic Variational Inference [0.0]
YOASOVI is an algorithm for performing fast, self-correcting intuition optimization for Variational Inference (VI) on large Bayesian hierarchical models.
To accomplish this, we take advantage of available information on the objective function used for VI at each iteration and replace regular Monte Carlo sampling with acceptance sampling.
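A hypothetical toy rendering of the accept-only-on-improvement idea (not the actual YOASOVI update; the target, variational family, and step size below are all assumptions made for illustration):

```python
import numpy as np

# Illustrative only: replace plain Monte Carlo averaging in VI with an
# acceptance step, keeping a proposed variational parameter only if its
# one-sample objective estimate beats a fresh estimate at the current one.

rng = np.random.default_rng(2)

def elbo_estimate(mu):
    # One-sample ELBO for fitting q = N(mu, 1) to the target p = N(3, 1).
    z = mu + rng.normal()                     # reparameterized draw from q
    return -0.5 * (z - 3.0) ** 2 + 0.5 * (z - mu) ** 2

mu = 0.0
for _ in range(5_000):
    candidate = mu + rng.normal(scale=0.1)    # proposed parameter update
    if elbo_estimate(candidate) > elbo_estimate(mu):
        mu = candidate                        # accept only on improvement

print(mu)                                     # drifts toward the target mean 3
```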
arXiv Detail & Related papers (2024-06-05T01:28:53Z)
- Low-variance estimation in the Plackett-Luce model via quasi-Monte Carlo sampling [58.14878401145309]
We develop a novel approach to producing more sample-efficient estimators of expectations in the PL model.
We illustrate our findings both theoretically and empirically using real-world recommendation data from Amazon Music and the Yahoo learning-to-rank challenge.
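One natural way to combine quasi-Monte Carlo points with Plackett-Luce sampling is the Gumbel-max trick driven by scrambled Sobol points; the sketch below shows only that sampling step, not the paper's estimator construction (the item scores are assumed toy values).

```python
import numpy as np
from scipy.stats import qmc

# Draw Plackett-Luce permutations with the Gumbel-max trick, feeding it
# scrambled Sobol (quasi-Monte Carlo) points instead of pseudorandom
# uniforms.

weights = np.array([4.0, 2.0, 1.0, 0.5])      # PL item scores (toy values)
n_items, n_samples = len(weights), 1024

sobol = qmc.Sobol(d=n_items, scramble=True, seed=0)
u = sobol.random(n_samples)                   # QMC points in (0, 1)^d
u = np.clip(u, 1e-12, 1 - 1e-12)              # guard the log transforms
gumbel = -np.log(-np.log(u))                  # inverse Gumbel CDF

# Sorting log-weights + Gumbel noise in descending order yields an exact
# draw from the Plackett-Luce distribution over rankings.
rankings = np.argsort(-(np.log(weights) + gumbel), axis=1)

# Empirical probability that item 0 is ranked first: should approach
# w_0 / sum(w) = 4 / 7.5 ~ 0.533.
print((rankings[:, 0] == 0).mean())
```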
arXiv Detail & Related papers (2022-05-12T11:15:47Z)
- Sampling Approximately Low-Rank Ising Models: MCMC meets Variational Methods [35.24886589614034]
We consider Ising models on the hypercube with a general interaction matrix $J$.
Our general result implies the first polynomial-time sampling algorithms for low-rank Ising models.
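For reference, here is the vanilla Glauber-dynamics MCMC sampler for an Ising model with a general interaction matrix $J$, i.e. the kind of baseline this line of work improves on; this is not the paper's algorithm.

```python
import numpy as np

# Single-site Glauber dynamics for an Ising model with symmetric
# couplings J (toy random couplings below). Each step resamples one
# spin from its exact conditional distribution given the others.

rng = np.random.default_rng(3)
n = 20
J = rng.normal(scale=1.0 / np.sqrt(n), size=(n, n))
J = (J + J.T) / 2                              # symmetric couplings
np.fill_diagonal(J, 0.0)

sigma = rng.choice([-1, 1], size=n)
for _ in range(50_000):
    i = rng.integers(n)
    field = J[i] @ sigma                       # local field at site i
    # Glauber update: set sigma_i = +1 with its conditional probability.
    sigma[i] = 1 if rng.random() < 1.0 / (1.0 + np.exp(-2.0 * field)) else -1

print(sigma)
```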
arXiv Detail & Related papers (2022-02-17T21:43:50Z)
- Sampling from Arbitrary Functions via PSD Models [55.41644538483948]
We take a two-step approach by first modeling the probability distribution and then sampling from that model.
We show that these models can approximate a large class of densities concisely using few evaluations, and present a simple algorithm to effectively sample from these models.
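The two-step recipe, fit a density model and then sample from it, shown with a Gaussian KDE as a simple stand-in; the paper's PSD models carry approximation and sampling guarantees that a KDE does not provide.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Fit-then-sample with a KDE as a stand-in for a PSD model.

rng = np.random.default_rng(4)
data = np.concatenate([rng.normal(-2, 0.5, 500), rng.normal(2, 0.5, 500)])

model = gaussian_kde(data)                   # step 1: model the density
new_samples = model.resample(1000, seed=5)   # step 2: sample from the model

print(new_samples.mean(), new_samples.std())
```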
arXiv Detail & Related papers (2021-10-20T12:25:22Z)
- Deterministic Gibbs Sampling via Ordinary Differential Equations [77.42706423573573]
This paper presents a general construction of deterministic measure-preserving dynamics using autonomous ODEs and tools from differential geometry.
We show how Hybrid Monte Carlo and other deterministic samplers follow as special cases of our theory.
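The deterministic core of Hybrid (Hamiltonian) Monte Carlo, leapfrog integration of autonomous Hamiltonian dynamics, is the familiar special case the paper's theory recovers; a minimal sketch:

```python
import numpy as np

# Leapfrog integration of the autonomous ODE defined by
# H(q, p) = -log p(q) + p^2 / 2; this volume-preserving dynamics is the
# deterministic heart of Hybrid Monte Carlo.

def leapfrog(q, p, grad_log_p, step=0.1, n_steps=20):
    p = p + 0.5 * step * grad_log_p(q)       # half step in momentum
    for _ in range(n_steps - 1):
        q = q + step * p                     # full step in position
        p = p + step * grad_log_p(q)         # full step in momentum
    q = q + step * p
    p = p + 0.5 * step * grad_log_p(q)       # final half step
    return q, p

grad_log_p = lambda q: -q                    # standard normal target
q_new, p_new = leapfrog(np.array(0.5), np.array(1.0), grad_log_p)
print(q_new, p_new)                          # energy is nearly conserved
```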
arXiv Detail & Related papers (2021-06-18T15:36:09Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Adaptive Sequential SAA for Solving Two-stage Stochastic Linear Programs [1.6181085766811525]
We present adaptive sequential SAA (sample average approximation) algorithms to solve large-scale two-stage linear programs.
The proposed algorithm can be stopped in finite time to return a solution endowed with a probabilistic guarantee on quality.
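A one-shot sample average approximation (SAA) on a toy newsvendor problem illustrates the core idea; the paper's contribution is the adaptive, sequential choice of sample sizes and stopping rules, which this sketch omits.

```python
import numpy as np

# SAA: replace the expectation over random demand with an average over N
# sampled scenarios and optimize the resulting deterministic objective.

rng = np.random.default_rng(6)
cost, price = 1.0, 3.0
demand = rng.lognormal(mean=2.0, sigma=0.5, size=2_000)   # N scenarios

def saa_objective(order):
    # Average profit over the sampled scenarios (negated for minimization).
    sales = np.minimum(order, demand)
    return -(price * sales.mean() - cost * order)

orders = np.linspace(0, 30, 301)
best_order = orders[np.argmin([saa_objective(x) for x in orders])]
# Optimal newsvendor quantile: P(D <= q*) = (price - cost) / price = 2/3.
print(best_order, np.quantile(demand, (price - cost) / price))
```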
arXiv Detail & Related papers (2020-12-07T14:58:16Z)
- Analysis and Design of Thompson Sampling for Stochastic Partial Monitoring [91.22679787578438]
We present a novel Thompson-sampling-based algorithm for partial monitoring.
We prove that the new algorithm achieves the logarithmic problem-dependent expected pseudo-regret $\mathrm{O}(\log T)$ for a linearized variant of the problem with local observability.
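For intuition, here is classic Thompson sampling for Bernoulli bandits with full reward feedback; the partial-monitoring setting of the paper replaces this direct feedback with indirect observations.

```python
import numpy as np

# Thompson sampling with Beta posteriors: sample a mean for each arm,
# play the argmax, and update the played arm's conjugate posterior.

rng = np.random.default_rng(7)
true_means = np.array([0.3, 0.5, 0.7])       # unknown to the learner
wins = np.ones(3)                            # Beta(1, 1) priors per arm
losses = np.ones(3)

for _ in range(5_000):
    theta = rng.beta(wins, losses)           # sample a mean for each arm
    arm = int(np.argmax(theta))              # act greedily on the sample
    reward = rng.random() < true_means[arm]  # Bernoulli feedback
    wins[arm] += reward                      # conjugate posterior update
    losses[arm] += 1 - reward

print(wins + losses - 2)                     # pull counts concentrate on arm 2
```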
arXiv Detail & Related papers (2020-06-17T05:48:33Z)
- Quantum Sampling Algorithms for Near-Term Devices [0.0]
We introduce a family of quantum algorithms that provide unbiased samples by encoding the entire Gibbs distribution.
We show that this approach leads to a speedup over a classical Markov chain algorithm.
It opens the door to exploring potentially useful sampling algorithms on near-term quantum devices.
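For scale, encoding the entire Gibbs distribution classically by brute force is straightforward but exponential in system size, which is what motivates such samplers; a minimal sketch on a small 1D Ising chain (couplings and temperature are assumed toy values):

```python
import numpy as np
from itertools import product

# Brute-force Gibbs sampling: enumerate all spin configurations, compute
# Boltzmann weights, and draw exact unbiased samples. Cost is 2^n, which
# is why smarter (quantum or classical) samplers are of interest.

rng = np.random.default_rng(8)
n, beta = 10, 0.5
J = 1.0                                      # 1D nearest-neighbor coupling

configs = np.array(list(product([-1, 1], repeat=n)))          # 2^n states
energies = -J * np.sum(configs[:, :-1] * configs[:, 1:], axis=1)
weights = np.exp(-beta * energies)
probs = weights / weights.sum()              # exact Gibbs distribution

idx = rng.choice(len(configs), size=5, p=probs)               # exact samples
print(configs[idx])
```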
arXiv Detail & Related papers (2020-05-28T14:46:20Z)
- Learning Gaussian Graphical Models via Multiplicative Weights [54.252053139374205]
We adapt an algorithm of Klivans and Meka based on the method of multiplicative weight updates.
The algorithm enjoys a sample complexity bound that is qualitatively similar to others in the literature.
It has a low runtime $O(mp^2)$ in the case of $m$ samples and $p$ nodes, and can trivially be implemented in an online manner.
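The multiplicative weights update in its textbook "experts" form; the paper adapts the Klivans-Meka regression variant to learn graphical-model neighborhoods, and only the basic update rule is shown here.

```python
import numpy as np

# Multiplicative weights update (MWU): penalize each expert's weight
# multiplicatively by its observed loss, then renormalize. Mass
# concentrates on the best-performing expert over time.

rng = np.random.default_rng(9)
n_experts, T, eta = 5, 1_000, 0.1
weights = np.ones(n_experts)
expert_quality = rng.random(n_experts)       # hidden per-expert accuracy

for _ in range(T):
    losses = (rng.random(n_experts) > expert_quality).astype(float)
    weights *= (1.0 - eta) ** losses         # multiplicative penalty
    weights /= weights.sum()                 # renormalize for stability

print(weights.round(3), expert_quality.round(3))  # mass on the best expert
```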
arXiv Detail & Related papers (2020-02-20T10:50:58Z)