Comparison of Markov chains via weak Poincar\'e inequalities with
application to pseudo-marginal MCMC
- URL: http://arxiv.org/abs/2112.05605v1
- Date: Fri, 10 Dec 2021 15:36:30 GMT
- Authors: Christophe Andrieu, Anthony Lee, Sam Power, Andi Q. Wang
- Abstract summary: We investigate the use of a certain class of functional inequalities known as weak Poincar\'e inequalities to bound convergence of Markov chains to equilibrium.
We show that this enables the derivation of subgeometric convergence bounds for methods such as the Independent Metropolis--Hastings sampler and pseudo-marginal methods for intractable likelihoods.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We investigate the use of a certain class of functional inequalities known as
weak Poincar\'e inequalities to bound convergence of Markov chains to
equilibrium. We show that this enables the straightforward and transparent
derivation of subgeometric convergence bounds for methods such as the
Independent Metropolis--Hastings sampler and pseudo-marginal methods for
intractable likelihoods, the latter being subgeometric in many practical
settings. These results rely on novel quantitative comparison theorems between
Markov chains. Associated proofs are simpler than those relying on
drift/minorization conditions and the tools developed allow us to recover and
further extend known results as particular cases. We are then able to provide
new insights into the practical use of pseudo-marginal algorithms, analyse the
effect of averaging in Approximate Bayesian Computation (ABC) and the use of
products of independent averages, and also to study the case of lognormal
weights relevant to particle marginal Metropolis--Hastings (PMMH).
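To make the pseudo-marginal mechanism concrete, here is a minimal illustrative sketch (not code from the paper): a Metropolis--Hastings chain in which the intractable likelihood is replaced by a noisy unbiased estimate with lognormal multiplicative weights, the setting the abstract highlights for PMMH. The toy Gaussian target and the noise parameters are assumptions chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def loglik_hat(theta, n_particles=8):
    # Unbiased noisy estimate of an (unnormalised) N(0,1) likelihood:
    # multiply the exact value by lognormal weights with mean exactly 1
    # (mu = -sigma^2/2), mimicking e.g. an importance-sampling estimator.
    exact = np.exp(-0.5 * theta**2)
    weights = exact * rng.lognormal(mean=-0.125, sigma=0.5, size=n_particles)
    return np.log(np.mean(weights))

def pseudo_marginal_mh(n_iters=5000, step=1.0):
    theta = 0.0
    ll = loglik_hat(theta)
    chain = np.empty(n_iters)
    accepted = 0
    for i in range(n_iters):
        prop = theta + step * rng.normal()
        ll_prop = loglik_hat(prop)
        # The estimate ll for the current state is recycled, not refreshed:
        # this is what makes the chain target the exact posterior despite
        # only ever seeing noisy likelihood values.
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
            accepted += 1
        chain[i] = theta
    return chain, accepted / n_iters
```

Note that refreshing the current-state estimate at every iteration, instead of recycling it, would change the invariant distribution; the recycling step is what defines the pseudo-marginal construction.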
Related papers
- A Unified Theory of Stochastic Proximal Point Methods without Smoothness [52.30944052987393]
Proximal point methods have attracted considerable interest owing to their numerical stability and robustness against imperfect tuning.
This paper presents a comprehensive analysis of a broad range of variants of the stochastic proximal point method (SPPM).
arXiv Detail & Related papers (2024-05-24T21:09:19Z)
- Dimension-free Relaxation Times of Informed MCMC Samplers on Discrete Spaces [5.075066314996696]
We develop general mixing time bounds for Metropolis-Hastings algorithms on discrete spaces.
We establish sufficient conditions for a class of informed Metropolis-Hastings algorithms to attain relaxation times independent of the problem dimension.
arXiv Detail & Related papers (2024-04-05T02:40:45Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Hoeffding's Inequality for Markov Chains under Generalized Concentrability Condition [15.228649445346473]
This paper studies Hoeffding's inequality for Markov chains under the generalized concentrability condition defined via an integral probability metric (IPM).
The flexibility of our framework allows Hoeffding's inequality to be applied beyond the ergodic Markov chains in the traditional sense.
arXiv Detail & Related papers (2023-10-04T16:21:23Z)
- Sampling with Mollified Interaction Energy Descent [57.00583139477843]
We present a new optimization-based method for sampling called mollified interaction energy descent (MIED).
MIED minimizes a new class of energies on probability measures called mollified interaction energies (MIEs).
We show experimentally that for unconstrained sampling problems our algorithm performs on par with existing particle-based algorithms like SVGD.
arXiv Detail & Related papers (2022-10-24T16:54:18Z)
- Optimal variance-reduced stochastic approximation in Banach spaces [114.8734960258221]
We study the problem of estimating the fixed point of a contractive operator defined on a separable Banach space.
We establish non-asymptotic bounds for both the operator defect and the estimation error.
arXiv Detail & Related papers (2022-01-21T02:46:57Z)
- Three rates of convergence or separation via U-statistics in a dependent framework [5.929956715430167]
We put this theoretical breakthrough into action by pushing further the current state of knowledge in three different active fields of research.
First, we establish a new exponential inequality for the estimation of spectra of trace class integral operators with MCMC methods.
In addition, we investigate generalization performance of online algorithms working with pairwise loss functions and Markov chain samples.
arXiv Detail & Related papers (2021-06-24T07:10:36Z)
- Machine Learning and Variational Algorithms for Lattice Field Theory [1.198562319289569]
In lattice quantum field theory studies, parameters defining the lattice theory must be tuned toward criticality to access continuum physics.
We introduce an approach to "deform" Monte Carlo estimators based on contour deformations applied to the domain of the path integral.
We demonstrate that flow-based MCMC can mitigate critical slowing down and observifolds can exponentially reduce variance in proof-of-principle applications.
arXiv Detail & Related papers (2021-06-03T16:37:05Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- A Unified Joint Maximum Mean Discrepancy for Domain Adaptation [73.44809425486767]
This paper theoretically derives a unified form of JMMD that is easy to optimize.
From the revealed unified JMMD, we illustrate that JMMD degrades the feature-label dependence that benefits classification.
We propose a novel MMD matrix to promote the dependence, and devise a novel label kernel that is robust to label distribution shift.
arXiv Detail & Related papers (2021-01-25T09:46:14Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.