Spatial Monte Carlo Integration with Annealed Importance Sampling
- URL: http://arxiv.org/abs/2012.11198v2
- Date: Mon, 12 Apr 2021 08:24:02 GMT
- Title: Spatial Monte Carlo Integration with Annealed Importance Sampling
- Authors: Muneki Yasuda and Kaiji Sekimoto
- Abstract summary: A new method is proposed to evaluate the expectations on Ising models combining AIS and SMCI.
The proposed method performs efficiently in both high- and low-temperature regions.
- Score: 0.45687771576879593
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Evaluating expectations on an Ising model (or Boltzmann machine) is essential
for various applications, including statistical machine learning. However, in
general, the evaluation is computationally difficult because it involves
intractable multiple summations or integrations; therefore, it requires
approximation. Monte Carlo integration (MCI) is a well-known approximation
method; a more effective MCI-like approximation method was proposed recently,
called spatial Monte Carlo integration (SMCI). However, the estimations
obtained using SMCI (and MCI) exhibit a low accuracy in Ising models under a
low temperature owing to degradation of the sampling quality. Annealed
importance sampling (AIS) is a type of importance sampling based on Markov
chain Monte Carlo methods that can suppress performance degradation in
low-temperature regions through the use of importance weights. In this study, a
new method is proposed to evaluate the expectations on Ising models combining
AIS and SMCI. The proposed method performs efficiently in both high- and
low-temperature regions, which is demonstrated theoretically and numerically.
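The AIS ingredient of the abstract can be illustrated with a plain AIS estimator on a toy Ising chain. This is a generic sketch, not the paper's SMCI-based method: the chain size, temperature ladder, and single-spin Metropolis kernel are illustrative choices.

```python
import numpy as np

def energies(S, J=1.0):
    # Energies of a batch of 1D Ising chains with free boundaries:
    # E(s) = -J * sum_i s_i s_{i+1}; S has shape (n_chains, n_spins).
    return -J * np.sum(S[:, :-1] * S[:, 1:], axis=1)

def ais_nn_correlation(n_spins=8, beta=2.0, n_chains=2000, n_steps=50, seed=0):
    """Estimate the nearest-neighbour correlation <s_0 s_1> under
    p(s) ∝ exp(-beta * E(s)) with annealed importance sampling: start at
    beta_0 = 0 (uniform spins), anneal through a ladder of inverse
    temperatures, and accumulate log importance weights along the way."""
    rng = np.random.default_rng(seed)
    betas = np.linspace(0.0, beta, n_steps + 1)
    S = rng.choice([-1, 1], size=(n_chains, n_spins))  # exact samples at beta=0
    logw = np.zeros(n_chains)
    for k in range(1, n_steps + 1):
        # AIS weight update: log f_k(x) - log f_{k-1}(x) at the current state.
        logw -= (betas[k] - betas[k - 1]) * energies(S)
        # One Metropolis sweep leaving the beta_k distribution invariant.
        for i in range(n_spins):
            E_old = energies(S)
            S_prop = S.copy()
            S_prop[:, i] *= -1
            dE = energies(S_prop) - E_old
            accept = rng.random(n_chains) < np.exp(-betas[k] * dE)
            S[accept] = S_prop[accept]
    # Self-normalized importance-weighted estimate.
    w = np.exp(logw - logw.max())
    return float(np.sum(w * S[:, 0] * S[:, 1]) / w.sum())
```

For a free-boundary 1D chain the nearest-neighbour correlation is exactly tanh(beta * J), which gives a convenient sanity check even at the low temperature (beta = 2) where plain sampling degrades.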
Related papers
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC can perform both parameter estimation and particle proposal adaptation efficiently and entirely on the fly.
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
- A Tale of Sampling and Estimation in Discounted Reinforcement Learning [50.43256303670011]
We present a minimax lower bound on the discounted mean estimation problem.
We show that estimating the mean by directly sampling from the discounted kernel of the Markov process brings compelling statistical properties.
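Setting the paper's minimax analysis aside, "sampling from the discounted kernel" can be sketched for a Markov reward process: draw a geometric horizon and return the reward observed there. The two-state chain, rewards, and discount below are made up for illustration.

```python
import numpy as np

def geometric_horizon_estimate(P, r, gamma, s0, n_samples, rng):
    """Estimate the normalized discounted mean (1 - gamma) * E[sum_t gamma^t r_t]
    of a Markov reward process by sampling from its discounted kernel:
    draw T ~ Geometric(1 - gamma) on {0, 1, ...}, roll the chain for T
    steps, and return the reward observed at time T (an unbiased draw)."""
    n = len(r)
    total = 0.0
    for _ in range(n_samples):
        T = rng.geometric(1.0 - gamma) - 1  # shift support to {0, 1, ...}
        s = s0
        for _ in range(T):
            s = rng.choice(n, p=P[s])
        total += r[s]
    return total / n_samples

rng = np.random.default_rng(0)
P = np.array([[0.9, 0.1], [0.2, 0.8]])  # made-up two-state transition matrix
r = np.array([0.0, 1.0])
gamma = 0.9
# Analytic normalized discounted mean: (1 - gamma) * (I - gamma P)^{-1} r.
exact = float((1 - gamma) * np.linalg.solve(np.eye(2) - gamma * P, r)[0])
approx = geometric_horizon_estimate(P, r, gamma, 0, 10000, rng)
```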
arXiv Detail & Related papers (2023-04-11T09:13:17Z)
- Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
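The probabilistic representation this entry relies on can be shown in its simplest form, the Feynman-Kac formula for the heat equation: the solution is an average of the initial condition over Brownian endpoints. This is a minimal sketch of the representation only, not of the neural solver.

```python
import numpy as np

def mc_heat_solution(t, x, g, n=100000, seed=4):
    """Monte Carlo evaluation of the heat equation u_t = 0.5 * u_xx with
    u(0, x) = g(x), via its probabilistic (Feynman-Kac) representation
    u(t, x) = E[g(x + W_t)]: average g over Brownian-motion endpoints."""
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=np.sqrt(t), size=n)  # W_t ~ N(0, t)
    return float(np.mean(g(x + w)))

# For g = sin the exact solution is u(t, x) = exp(-t/2) * sin(x).
u_mc = mc_heat_solution(0.5, 1.0, np.sin)
u_exact = float(np.exp(-0.25) * np.sin(1.0))
```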
arXiv Detail & Related papers (2023-02-10T08:05:19Z)
- Context-aware learning of hierarchies of low-fidelity models for multi-fidelity uncertainty quantification [0.0]
Multi-fidelity Monte Carlo methods leverage low-fidelity and surrogate models for variance reduction to make tractable uncertainty quantification.
This work proposes a context-aware multi-fidelity Monte Carlo method that optimally balances the costs of training low-fidelity models with the costs of Monte Carlo sampling.
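The variance-reduction mechanism behind multi-fidelity Monte Carlo is a control variate built from a cheap surrogate. The sketch below uses made-up model functions and sample budgets, and does not include the paper's context-aware training of the low-fidelity hierarchy.

```python
import numpy as np

rng = np.random.default_rng(1)

def f_hi(x):
    # "Expensive" high-fidelity model (made up for illustration).
    return np.sin(x) + 0.05 * x**2

def f_lo(x):
    # Cheap, strongly correlated low-fidelity surrogate.
    return np.sin(x)

# Two-fidelity control-variate estimator of E[f_hi(X)], X ~ N(0, 1):
# few high-fidelity evaluations, many cheap low-fidelity ones, with the
# difference of low-fidelity means correcting the small-sample estimate.
x_few = rng.normal(size=500)     # budget for high-fidelity evaluations
x_many = rng.normal(size=50000)  # cheap low-fidelity evaluations
alpha = 1.0  # control-variate coefficient (the optimum uses cov/var ratios)
mf_est = f_hi(x_few).mean() + alpha * (f_lo(x_many).mean() - f_lo(x_few).mean())
# True value: E[sin X] = 0 and E[0.05 X^2] = 0.05, so E[f_hi(X)] = 0.05.
```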
arXiv Detail & Related papers (2022-11-20T01:12:51Z)
- Low-variance estimation in the Plackett-Luce model via quasi-Monte Carlo sampling [58.14878401145309]
We develop a novel approach to producing more sample-efficient estimators of expectations in the PL model.
We illustrate our findings both theoretically and empirically using real-world recommendation data from Amazon Music and the Yahoo learning-to-rank challenge.
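The plain Monte Carlo baseline that the paper improves upon can be sketched via the Gumbel-max trick, which draws exact rankings from a Plackett-Luce (PL) model. The scores, relevance labels, and DCG utility below are made up; the paper's quasi-Monte Carlo refinement is not shown.

```python
import numpy as np

rng = np.random.default_rng(2)
scores = np.array([2.0, 1.0, 0.5, 0.0])     # PL log-weights (made up)
relevance = np.array([1.0, 0.0, 1.0, 0.0])  # per-item relevance (made up)

def sample_ranking(rng, scores):
    # Gumbel-max trick: argsorting Gumbel-perturbed scores yields an exact
    # draw from the Plackett-Luce distribution with weights exp(scores).
    return np.argsort(-(scores + rng.gumbel(size=scores.shape)))

# Plain Monte Carlo estimate of the expected DCG of a PL-sampled ranking.
discounts = 1.0 / np.log2(np.arange(2, len(scores) + 2))
est = float(np.mean([relevance[sample_ranking(rng, scores)] @ discounts
                     for _ in range(20000)]))
```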
arXiv Detail & Related papers (2022-05-12T11:15:47Z)
- Composite Spatial Monte Carlo Integration Based on Generalized Least Squares [0.0]
Spatial Monte Carlo integration (SMCI) is a sampling-based approximation method.
A new, effective method is proposed by combining multiple SMCI estimators.
The results indicate that the proposed method can be effective for the inverse Ising problem (or Boltzmann machine learning).
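Combining several unbiased estimators of the same quantity by generalized least squares is a standard construction, shown generically below; this is not necessarily the paper's exact combination rule, and the covariance matrix is made up.

```python
import numpy as np

def gls_combine(estimates, cov):
    """Combine unbiased estimators of the same quantity by generalized
    least squares: the weights w = cov^{-1} 1 / (1' cov^{-1} 1) sum to one
    and minimize the variance of the weighted average."""
    ones = np.ones(len(estimates))
    siv = np.linalg.solve(cov, ones)
    w = siv / siv.sum()
    return float(w @ estimates), w

# Two estimators of the same expectation with a known covariance (made up).
cov = np.array([[1.0, 0.5], [0.5, 2.0]])
combined, w = gls_combine(np.array([1.0, 1.2]), cov)
```

With this covariance the weights are (0.75, 0.25), and the combined variance w' cov w = 0.875 is below the better single estimator's variance of 1.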
arXiv Detail & Related papers (2022-04-07T06:35:13Z)
- A Survey of Monte Carlo Methods for Parameter Estimation [0.0]
This paper reviews Monte Carlo (MC) methods for the estimation of static parameters in signal processing applications.
A historical note on the development of MC schemes is also provided, followed by the basic MC method and a brief description of the rejection sampling (RS) algorithm.
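The rejection sampling (RS) algorithm mentioned here is simple enough to sketch in a few lines; the Beta(2, 2) target and uniform proposal are illustrative choices.

```python
import numpy as np

def rejection_sample_beta22(rng, n):
    """Rejection sampling (RS) sketch: draw from Beta(2, 2), whose density
    p(x) = 6x(1 - x) is bounded by M = 1.5 on [0, 1]. Propose x ~ Uniform(0, 1)
    and accept with probability p(x) / (M * q(x)) = p(x) / 1.5."""
    out = []
    while len(out) < n:
        x = rng.random()
        if rng.random() < 6.0 * x * (1.0 - x) / 1.5:
            out.append(x)
    return np.array(out)

samples = rejection_sample_beta22(np.random.default_rng(6), 20000)
# Beta(2, 2) has mean 1/2 and variance 1/20.
```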
arXiv Detail & Related papers (2021-07-25T14:57:58Z)
- Annealed Flow Transport Monte Carlo [91.20263039913912]
Annealed Flow Transport (AFT) builds upon Annealed Importance Sampling (AIS) and Sequential Monte Carlo (SMC).
AFT relies on normalizing flows (NFs), which are learned sequentially to push particles towards the successive targets.
We show that a continuous-time scaling limit of the population version of AFT is given by a Feynman--Kac measure.
arXiv Detail & Related papers (2021-02-15T12:05:56Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- A Generalization of Spatial Monte Carlo Integration [0.0]
Spatial Monte Carlo integration (SMCI) is an extension of standard Monte Carlo integration and can approximate expectations on Markov random fields with high accuracy.
A new Boltzmann machine learning method based on SMCI is proposed, which is obtained by combining SMCI and the persistent contrastive divergence.
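The first-order SMCI idea common to these entries can be sketched on a toy Ising model: instead of averaging a raw spin (plain MCI), average its exact conditional expectation given the rest of the sample, which is a Rao-Blackwellized, lower-variance estimator. The 4-spin couplings and fields below are made up, and exact enumeration (feasible only at this size) provides the reference.

```python
import itertools
import numpy as np

def smci_magnetization(S, J, h, beta, i):
    """First-order SMCI estimate of <s_i> for an Ising model with
    p(s) ∝ exp(beta * (0.5 * s'Js + h's)): average the exact conditional
    expectation tanh(beta * (h_i + sum_j J_ij s_j)) over the samples
    rather than the raw spin values."""
    fields = beta * (h[i] + S @ J[i])
    return float(np.tanh(fields).mean())

# Toy 4-spin model (symmetric couplings, zero diagonal; values made up).
J = np.array([[0.0, 0.3, 0.0, 0.2],
              [0.3, 0.0, 0.4, 0.0],
              [0.0, 0.4, 0.0, 0.1],
              [0.2, 0.0, 0.1, 0.0]])
h = np.array([0.2, -0.1, 0.3, 0.0])
beta = 1.0

# Exact distribution over the 16 states by enumeration.
states = np.array(list(itertools.product([-1, 1], repeat=4)))
logp = beta * (0.5 * np.einsum('si,ij,sj->s', states, J, states) + states @ h)
p = np.exp(logp - logp.max())
p /= p.sum()

# Draw i.i.d. samples from the exact distribution and compare estimators.
rng = np.random.default_rng(5)
S = states[rng.choice(len(states), size=20000, p=p)]
exact_m0 = float(p @ states[:, 0])
smci_m0 = smci_magnetization(S, J, h, beta, 0)
```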
arXiv Detail & Related papers (2020-09-04T13:02:58Z)
- Efficient Debiased Evidence Estimation by Multilevel Monte Carlo Sampling [0.0]
We propose a new optimization algorithm for Bayesian inference based on multilevel Monte Carlo (MLMC) methods.
Our numerical results confirm considerable computational savings compared to the conventional estimators.
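The MLMC principle behind this entry is a telescoping sum: estimate a cheap coarse model with many samples and the small-variance fine-coarse correction with few. The sketch below uses made-up coarse and fine functions, not the paper's evidence estimator.

```python
import numpy as np

rng = np.random.default_rng(3)

def f_fine(x):
    # "Fine" (expensive) model: the quantity of interest is E[f_fine(X)].
    return np.exp(-x**2)

def f_coarse(x):
    # "Coarse" (cheap) surrogate that tracks f_fine closely.
    return 1.0 / (1.0 + x**2)

# Two-level MLMC telescoping estimator of E[f_fine(X)], X ~ N(0, 1):
# E[f_fine] = E[f_coarse] + E[f_fine - f_coarse]; the estimator stays
# unbiased because the coarse terms cancel in expectation.
x0 = rng.normal(size=50000)  # many cheap coarse evaluations
x1 = rng.normal(size=2000)   # few coupled fine/coarse evaluations
mlmc_est = f_coarse(x0).mean() + (f_fine(x1) - f_coarse(x1)).mean()
# Reference: E[exp(-X^2)] = 1 / sqrt(3) for X ~ N(0, 1).
```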
arXiv Detail & Related papers (2020-01-14T09:14:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.