Demystifying Orthogonal Monte Carlo and Beyond
- URL: http://arxiv.org/abs/2005.13590v1
- Date: Wed, 27 May 2020 18:44:38 GMT
- Title: Demystifying Orthogonal Monte Carlo and Beyond
- Authors: Han Lin, Haoxian Chen, Tianyi Zhang, Clement Laroche, and Krzysztof
Choromanski
- Abstract summary: Orthogonal Monte Carlo (OMC) is a very effective sampling algorithm imposing structural geometric conditions (orthogonality) on samples for variance reduction.
We shed new light on the theoretical principles behind OMC, applying the theory of negatively dependent random variables to obtain several new concentration results.
We propose a novel extension of the method, called Near-Orthogonal Monte Carlo (NOMC), which leverages number theory techniques and particle algorithms.
- Score: 20.745014324028386
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Orthogonal Monte Carlo (OMC) is a very effective sampling algorithm imposing
structural geometric conditions (orthogonality) on samples for variance
reduction. Due to its simplicity and superior performance as compared to its
Quasi Monte Carlo counterparts, OMC is used in a wide spectrum of challenging
machine learning applications ranging from scalable kernel methods to
predictive recurrent neural networks, generative models and reinforcement
learning. However, theoretical understanding of the method remains very limited.
In this paper we shed new light on the theoretical principles behind OMC,
applying the theory of negatively dependent random variables to obtain several
new concentration results. We also propose a novel extension of the method,
called Near-Orthogonal Monte Carlo (NOMC), which leverages number theory
techniques and particle algorithms. We show that NOMC is the first algorithm
consistently outperforming OMC in applications ranging from kernel methods to
approximating distances in probabilistic metric spaces.
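The orthogonal coupling at the heart of OMC is easy to illustrate: the directions of the Gaussian samples are made exactly orthogonal (e.g., via a QR decomposition of an i.i.d. Gaussian matrix) while each sample keeps its marginal Gaussian distribution. The following minimal NumPy sketch shows this construction used in a random-feature kernel estimate; the function names and the Gaussian-kernel example are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def orthogonal_gaussian_samples(n_samples, dim, rng=None):
    """Draw n_samples vectors in R^dim whose directions are exactly orthogonal
    (within each block of size <= dim) while each vector stays marginally
    Gaussian. Minimal sketch of an OMC-style coupling, not a reference
    implementation."""
    rng = np.random.default_rng() if rng is None else rng
    blocks, remaining = [], n_samples
    while remaining > 0:
        k = min(remaining, dim)
        # QR of an i.i.d. Gaussian matrix yields a Haar-random orthonormal frame.
        g = rng.standard_normal((dim, dim))
        q, _ = np.linalg.qr(g)
        # Rescale each orthonormal row by an independent chi-distributed norm so
        # every row is marginally N(0, I_dim), as for an i.i.d. Gaussian sample.
        norms = np.sqrt(rng.chisquare(df=dim, size=k))
        blocks.append(q[:k] * norms[:, None])
        remaining -= k
    return np.vstack(blocks)


if __name__ == "__main__":
    # Example: orthogonal random features approximating a Gaussian (RBF) kernel.
    d, m = 8, 8
    w = orthogonal_gaussian_samples(m, d)            # orthogonal frequency matrix
    x, y = np.ones(d) * 0.1, np.zeros(d)
    feats = lambda z: np.concatenate([np.cos(w @ z), np.sin(w @ z)]) / np.sqrt(m)
    approx = feats(x) @ feats(y)                     # Monte Carlo kernel estimate
    exact = np.exp(-np.linalg.norm(x - y) ** 2 / 2)
    print(f"approx={approx:.4f}  exact={exact:.4f}")
```

The variance reduction stems from the negative dependence induced by the orthogonality constraint, which is precisely the structure the paper analyzes.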
Related papers
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing both parameter estimation and particle proposal adaptation efficiently and entirely on-the-fly.
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
- Reverse Diffusion Monte Carlo [19.35592726471155]
We propose a novel Monte Carlo sampling algorithm called reverse diffusion Monte Carlo (rdMC).
rdMC is distinct from Markov chain Monte Carlo (MCMC) methods.
arXiv Detail & Related papers (2023-07-05T05:42:03Z)
- D4FT: A Deep Learning Approach to Kohn-Sham Density Functional Theory [79.50644650795012]
We propose a deep learning approach to solve Kohn-Sham Density Functional Theory (KS-DFT).
We prove that such an approach has the same expressivity as the SCF method, yet reduces the computational complexity.
In addition, we show that our approach enables us to explore more complex neural-based wave functions.
arXiv Detail & Related papers (2023-03-01T10:38:10Z)
- Convergence of Dirichlet Forms for MCMC Optimal Scaling with Dependent Target Distributions on Large Graphs [0.6599344783327054]
Markov chain Monte Carlo (MCMC) algorithms have played a significant role in statistics, physics, machine learning, and other fields.
The random walk Metropolis (RWM) algorithm, as the most classical MCMC algorithm, has had a great influence on the development and practice of science and engineering.
In this paper, we utilize the Mosco convergence of Dirichlet forms in analyzing the RWM algorithm on large graphs.
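Since this entry centers on the random walk Metropolis algorithm, a minimal sketch of RWM may help with orientation; the Gaussian target, step size, and function names below are illustrative assumptions and do not come from the paper.

```python
import numpy as np

def random_walk_metropolis(log_target, x0, n_steps, step_size=0.5, rng=None):
    """Minimal random walk Metropolis sampler (illustrative sketch only).

    log_target: function returning the log of an unnormalized target density.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = np.asarray(x0, dtype=float)
    logp = log_target(x)
    samples = np.empty((n_steps, x.size))
    for t in range(n_steps):
        # Symmetric Gaussian proposal centered at the current state.
        proposal = x + step_size * rng.standard_normal(x.size)
        logp_prop = log_target(proposal)
        # Accept with probability min(1, pi(proposal) / pi(x)).
        if np.log(rng.uniform()) < logp_prop - logp:
            x, logp = proposal, logp_prop
        samples[t] = x
    return samples


# Example target: a standard 2-D Gaussian (an assumption for illustration).
chain = random_walk_metropolis(lambda z: -0.5 * z @ z, x0=np.zeros(2), n_steps=5000)
print("posterior mean estimate:", chain.mean(axis=0))
```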
arXiv Detail & Related papers (2022-10-31T03:41:17Z)
- Stochastic normalizing flows as non-equilibrium transformations [62.997667081978825]
We show that normalizing flows provide a route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
arXiv Detail & Related papers (2022-01-21T19:00:18Z)
- Quantum algorithm for stochastic optimal stopping problems with applications in finance [60.54699116238087]
The famous least squares Monte Carlo (LSM) algorithm combines linear least-squares regression with Monte Carlo simulation to approximately solve problems in optimal stopping theory.
We propose a quantum LSM based on quantum access to a process, on quantum circuits for computing the optimal stopping times, and on quantum techniques for Monte Carlo.
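The classical LSM scheme the quantum algorithm builds on (Longstaff-Schwarz style) is short enough to sketch: simulate paths forward, then move backwards in time, regressing discounted continuation values on the current state to decide where stopping is optimal. The Bermudan put, geometric Brownian motion dynamics, and polynomial basis below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def lsm_put_price(s0=100.0, strike=100.0, rate=0.05, sigma=0.2, maturity=1.0,
                  n_steps=50, n_paths=20_000, rng=None):
    """Least squares Monte Carlo (Longstaff-Schwartz) price of a Bermudan put.

    Illustrative sketch of classical LSM, not the paper's quantum algorithm.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    dt = maturity / n_steps
    # Simulate geometric Brownian motion paths of the underlying asset.
    increments = ((rate - 0.5 * sigma**2) * dt
                  + sigma * np.sqrt(dt) * rng.standard_normal((n_paths, n_steps)))
    paths = s0 * np.exp(np.cumsum(increments, axis=1))

    payoff = lambda s: np.maximum(strike - s, 0.0)
    cashflow = payoff(paths[:, -1])            # exercise value at maturity
    discount = np.exp(-rate * dt)

    # Backward induction: regress discounted continuation values on a polynomial
    # basis of the current price; exercise when the immediate payoff is larger.
    for t in range(n_steps - 2, -1, -1):
        cashflow *= discount
        s = paths[:, t]
        itm = payoff(s) > 0                    # regress only on in-the-money paths
        if not np.any(itm):
            continue
        basis = np.vander(s[itm], N=3)         # quadratic polynomial basis
        coef, *_ = np.linalg.lstsq(basis, cashflow[itm], rcond=None)
        continuation = basis @ coef
        exercise = payoff(s[itm]) > continuation
        cashflow[itm] = np.where(exercise, payoff(s[itm]), cashflow[itm])
    return discount * cashflow.mean()

print("LSM Bermudan put price ~", round(lsm_put_price(), 3))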
arXiv Detail & Related papers (2021-11-30T12:21:41Z)
- Machine Learning and Variational Algorithms for Lattice Field Theory [1.198562319289569]
In lattice quantum field theory studies, parameters defining the lattice theory must be tuned toward criticality to access continuum physics.
We introduce an approach to "deform" Monte Carlo estimators based on contour deformations applied to the domain of the path integral.
We demonstrate that flow-based MCMC can mitigate critical slowing down and observifolds can exponentially reduce variance in proof-of-principle applications.
arXiv Detail & Related papers (2021-06-03T16:37:05Z)
- Annealed Flow Transport Monte Carlo [91.20263039913912]
Annealed Flow Transport (AFT) builds upon Annealed Importance Sampling (AIS) and Sequential Monte Carlo (SMC).
AFT relies on normalizing flows (NF), which are learned sequentially to push particles towards the successive targets.
We show that a continuous-time scaling limit of the population version of AFT is given by a Feynman--Kac measure.
arXiv Detail & Related papers (2021-02-15T12:05:56Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
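As background for this entry, here is a minimal sketch of the Hybrid (Hamiltonian) Monte Carlo baseline it mentions; the leapfrog parameters and Gaussian target are illustrative assumptions, and this is not the paper's SurVAE-augmented sampler.

```python
import numpy as np

def hmc_step(log_target, grad_log_target, x, step_size=0.1, n_leapfrog=20, rng=None):
    """One Hybrid/Hamiltonian Monte Carlo transition (illustrative sketch only)."""
    rng = np.random.default_rng() if rng is None else rng
    p = rng.standard_normal(x.size)                   # resample momentum
    x_new, p_new = x.copy(), p.copy()
    # Leapfrog integration of the Hamiltonian dynamics.
    p_new += 0.5 * step_size * grad_log_target(x_new)
    for _ in range(n_leapfrog - 1):
        x_new += step_size * p_new
        p_new += step_size * grad_log_target(x_new)
    x_new += step_size * p_new
    p_new += 0.5 * step_size * grad_log_target(x_new)
    # Metropolis correction based on the change in total energy.
    h_old = -log_target(x) + 0.5 * p @ p
    h_new = -log_target(x_new) + 0.5 * p_new @ p_new
    return x_new if np.log(rng.uniform()) < h_old - h_new else x


# Example: sample a standard 2-D Gaussian (an illustrative target).
log_target = lambda z: -0.5 * z @ z
grad_log_target = lambda z: -z
x, samples = np.zeros(2), []
for _ in range(2000):
    x = hmc_step(log_target, grad_log_target, x)
    samples.append(x)
print("sample mean:", np.mean(samples, axis=0))
```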
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Estimation of Thermodynamic Observables in Lattice Field Theories with Deep Generative Models [4.84753320115456]
We show that generative models can be used to estimate the absolute value of the free energy.
We demonstrate the effectiveness of the proposed method for two-dimensional $\phi^4$ theory.
arXiv Detail & Related papers (2020-07-14T15:31:05Z)