Stochastic automatic differentiation for Monte Carlo processes
- URL: http://arxiv.org/abs/2307.15406v1
- Date: Fri, 28 Jul 2023 08:59:01 GMT
- Title: Stochastic automatic differentiation for Monte Carlo processes
- Authors: Guilherme Catumba, Alberto Ramos, Bryan Zaldivar
- Abstract summary: We consider the extension of Automatic Differentiation (AD) techniques to Monte Carlo processes.
We show that the Hamiltonian approach can be understood as a change of variables of the reweighting approach.
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Monte Carlo methods represent a cornerstone of computer science. They
allow one to sample high-dimensional distribution functions efficiently. In this
paper we consider the extension of Automatic Differentiation (AD) techniques to
Monte Carlo processes, addressing the problem of obtaining derivatives (and, in
general, the Taylor series) of expectation values. Borrowing ideas from the
lattice field theory community, we examine two approaches. One is based on
reweighting while the other represents an extension of the Hamiltonian approach
typically used by the Hybrid Monte Carlo (HMC) and similar algorithms. We show
that the Hamiltonian approach can be understood as a change of variables of the
reweighting approach, resulting in much reduced variances of the coefficients
of the Taylor series. This work opens the door to finding other variance
reduction techniques for derivatives of expectation values.
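The core quantity in the abstract, a derivative of an expectation value under a parameter-dependent distribution, can be estimated from samples of the distribution itself. The following is a minimal illustrative sketch, not the paper's code: for a hypothetical Gaussian model p_theta = N(theta, 1) and observable O(x) = x^2, it estimates d/dtheta E[O] using the score-function form of the reweighting idea, d/dtheta E[O] = E[(O - E[O]) d log p/d theta].

```python
# Hedged sketch (illustrative toy model, not the paper's method):
# score-function / reweighting estimator for the derivative of an
# expectation value with respect to a distribution parameter.
import random

def grad_expectation(theta, n_samples=100_000, seed=0):
    """Estimate d/dtheta E_{x ~ N(theta, 1)}[x^2] from samples.

    For p_theta = N(theta, 1), d log p / d theta = (x - theta), so
    d/dtheta E[O] = E[(O - E[O]) * (x - theta)]   (mean-subtracted
    score-function form). The exact answer is d/dtheta (theta^2 + 1)
    = 2 * theta.
    """
    rng = random.Random(seed)
    xs = [rng.gauss(theta, 1.0) for _ in range(n_samples)]
    obs = [x * x for x in xs]
    mean_obs = sum(obs) / n_samples
    grads = [(o - mean_obs) * (x - theta) for o, x in zip(obs, xs)]
    return sum(grads) / n_samples

est = grad_expectation(1.0)  # fluctuates around the exact value 2.0
```

Subtracting the mean of the observable before multiplying by the score is a simple variance-reduction step; the paper's Hamiltonian change of variables pursues the same goal in a more systematic way.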
Related papers
- A Stein Gradient Descent Approach for Doubly Intractable Distributions [5.63014864822787]
We propose a novel Monte Carlo Stein variational gradient descent (MC-SVGD) approach for inference for doubly intractable distributions.
The proposed method achieves substantial computational gains over existing algorithms, while providing comparable inferential performance for the posterior distributions.
arXiv Detail & Related papers (2024-10-28T13:42:27Z) - von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Logistic Variational Bayes Revisited [1.256245863497516]
Variational logistic regression is a popular method for approximate Bayesian inference.
Due to the intractability of the Evidence Lower Bound, authors have turned to the use of Monte Carlo, quadrature or bounds to perform inference.
In this paper we introduce a new bound for the expectation of softplus function.
We show that this bound is tighter than the state-of-the-art, and that the resulting variational posterior achieves state-of-the-art performance.
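The specific bound introduced in that paper is not given here; as context, the following hedged sketch shows the quantity such bounds target in variational logistic regression, E_{w ~ N(mu, sigma^2)}[softplus(w)], estimated by the plain Monte Carlo baseline the abstract mentions. The Gaussian setup and function names are illustrative assumptions.

```python
# Hedged sketch (assumed Gaussian setup, not the paper's bound): Monte
# Carlo estimate of E[softplus(w)] for w ~ N(mu, sigma^2), the intractable
# term in the logistic-regression evidence lower bound.
import math
import random

def softplus(x):
    # numerically stable log(1 + exp(x))
    return max(x, 0.0) + math.log1p(math.exp(-abs(x)))

def mc_expected_softplus(mu, sigma, n=50_000, seed=0):
    rng = random.Random(seed)
    return sum(softplus(rng.gauss(mu, sigma)) for _ in range(n)) / n

val = mc_expected_softplus(0.0, 1.0)  # roughly 0.80 for mu=0, sigma=1
```

Because softplus is convex, Jensen's inequality guarantees the estimate exceeds softplus(mu) = log 2 here; deterministic bounds avoid the Monte Carlo noise of this baseline.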
arXiv Detail & Related papers (2024-06-02T11:32:28Z) - Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z) - Recursive Monte Carlo and Variational Inference with Auxiliary Variables [64.25762042361839]
Recursive auxiliary-variable inference (RAVI) is a new framework for exploiting flexible proposals.
RAVI generalizes and unifies several existing methods for inference with expressive families.
We illustrate RAVI's design framework and theorems by using them to analyze and improve upon Salimans et al.'s Markov Chain Variational Inference.
arXiv Detail & Related papers (2022-03-05T23:52:40Z) - q-Paths: Generalizing the Geometric Annealing Path using Power Means [51.73925445218366]
We introduce $q$-paths, a family of paths which includes the geometric and arithmetic mixtures as special cases.
We show that small deviations away from the geometric path yield empirical gains for Bayesian inference.
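The power-mean construction behind q-paths can be sketched in a few lines. This is a hedged illustration of the interpolation rule described in the abstract, applied to two scalar density values; the function name and limit handling are my own.

```python
# Hedged sketch of the q-path power-mean interpolation between two
# (unnormalized) density values a and b: q = 0 gives the arithmetic
# mixture, and q -> 1 recovers the geometric path a^(1-beta) * b^beta.
import math

def q_path(a, b, beta, q):
    """Power-mean interpolation between density values a, b > 0."""
    if abs(1.0 - q) < 1e-12:  # geometric path as the q -> 1 limit
        return a ** (1.0 - beta) * b ** beta
    e = 1.0 - q
    return ((1.0 - beta) * a ** e + beta * b ** e) ** (1.0 / e)

mix = q_path(1.0, 3.0, 0.5, 0.0)       # arithmetic mixture: 2.0
geo = q_path(1.0, 3.0, 0.5, 0.999999)  # approaches sqrt(3) ~ 1.732
```

The abstract's "small deviations away from the geometric path" correspond to choosing q slightly below 1 in this parameterization.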
arXiv Detail & Related papers (2021-07-01T21:09:06Z) - A Discrete Variational Derivation of Accelerated Methods in Optimization [68.8204255655161]
We introduce variational techniques which allow us to derive different methods for optimization.
We derive two families of optimization methods in one-to-one correspondence.
The preservation of symplecticity of autonomous systems occurs here solely on the fibers.
arXiv Detail & Related papers (2021-06-04T20:21:53Z) - Machine Learning and Variational Algorithms for Lattice Field Theory [1.198562319289569]
In lattice quantum field theory studies, parameters defining the lattice theory must be tuned toward criticality to access continuum physics.
We introduce an approach to "deform" Monte Carlo estimators based on contour deformations applied to the domain of the path integral.
We demonstrate that flow-based MCMC can mitigate critical slowing down and observifolds can exponentially reduce variance in proof-of-principle applications.
arXiv Detail & Related papers (2021-06-03T16:37:05Z) - Sampling Permutations for Shapley Value Estimation [1.0323063834827415]
Game-theoretic attribution techniques based on Shapley values are used extensively to interpret machine learning models.
As the computation of Shapley values can be expressed as a summation over a set of permutations, a common approach is to sample a subset of these permutations for approximation.
Unfortunately, standard Monte Carlo sampling methods can exhibit slow convergence, and more sophisticated quasi-Monte Carlo methods are not well defined on the space of permutations.
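The permutation-sampling approach mentioned above can be sketched concisely. This is a hedged, generic Monte Carlo Shapley estimator on a toy additive game of my own choosing, not the sampling scheme proposed in that paper: it averages each player's marginal contribution over uniformly random orderings.

```python
# Hedged sketch (toy game, generic estimator): Monte Carlo Shapley value
# estimation by averaging marginal contributions over sampled permutations.
import random

def shapley_mc(value, players, n_perms=200, seed=0):
    rng = random.Random(seed)
    phi = {p: 0.0 for p in players}
    for _ in range(n_perms):
        perm = players[:]
        rng.shuffle(perm)          # one uniformly random ordering
        coalition = set()
        prev = value(coalition)
        for p in perm:             # add players one by one
            coalition.add(p)
            cur = value(coalition)
            phi[p] += cur - prev   # marginal contribution of p
            prev = cur
    return {p: v / n_perms for p, v in phi.items()}

# Toy additive game v(S) = sum of weights: each player's exact Shapley
# value is simply its weight, so the estimator should recover it.
weights = {0: 1.0, 1: 2.0, 2: 3.0}
vals = shapley_mc(lambda s: sum(weights[p] for p in s), list(weights))
```

For non-additive games the estimate carries Monte Carlo noise, which is exactly the slow-convergence problem the paper addresses.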
arXiv Detail & Related papers (2021-04-25T16:44:18Z) - Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
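As background for the Hybrid (Hamiltonian) Monte Carlo method named above, here is a minimal hedged sketch of HMC on a 1-D standard normal target, with U(x) = x^2/2, leapfrog integration, and a Metropolis accept step. It illustrates the baseline continuous sampler, not the SurVAE-augmented discrete method of that paper.

```python
# Hedged background sketch: minimal Hamiltonian Monte Carlo on a 1-D
# standard normal, U(x) = x^2 / 2, via leapfrog + Metropolis correction.
import math
import random

def hmc_normal(n_samples=4000, step=0.3, n_leapfrog=10, seed=0):
    rng = random.Random(seed)
    u = lambda x: 0.5 * x * x        # potential energy (negative log density)
    grad_u = lambda x: x             # dU/dx
    x, samples = 0.0, []
    for _ in range(n_samples):
        p = rng.gauss(0.0, 1.0)      # resample momentum each trajectory
        x_new, p_new = x, p
        # leapfrog integration of Hamilton's equations
        p_new -= 0.5 * step * grad_u(x_new)
        for _ in range(n_leapfrog - 1):
            x_new += step * p_new
            p_new -= step * grad_u(x_new)
        x_new += step * p_new
        p_new -= 0.5 * step * grad_u(x_new)
        # Metropolis accept/reject on the total Hamiltonian
        dh = (u(x_new) + 0.5 * p_new ** 2) - (u(x) + 0.5 * p ** 2)
        if rng.random() < math.exp(min(0.0, -dh)):
            x = x_new
        samples.append(x)
    return samples

xs = hmc_normal()
mean = sum(xs) / len(xs)
var = sum((v - mean) ** 2 for v in xs) / len(xs)
```

The leapfrog integrator's near-conservation of the Hamiltonian keeps the acceptance rate high; the Metropolis step makes the chain exact despite discretization error.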
arXiv Detail & Related papers (2021-02-04T02:21:08Z) - Sparse Orthogonal Variational Inference for Gaussian Processes [34.476453597078894]
We introduce a new interpretation of sparse variational approximations for Gaussian processes using inducing points.
We show that this formulation recovers existing approximations and at the same time allows one to obtain tighter lower bounds on the marginal likelihood, as well as new variational inference algorithms.
arXiv Detail & Related papers (2019-10-23T15:01:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.