Dimension-free Relaxation Times of Informed MCMC Samplers on Discrete Spaces
- URL: http://arxiv.org/abs/2404.03867v1
- Date: Fri, 5 Apr 2024 02:40:45 GMT
- Title: Dimension-free Relaxation Times of Informed MCMC Samplers on Discrete Spaces
- Authors: Hyunwoong Chang, Quan Zhou
- Abstract summary: We develop general mixing time bounds for Metropolis-Hastings algorithms on discrete spaces.
We establish sufficient conditions for a class of informed Metropolis-Hastings algorithms to attain relaxation times independent of the problem dimension.
- Score: 5.075066314996696
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The convergence analysis of Markov chain Monte Carlo methods in high-dimensional statistical applications is receiving increasing attention. In this paper, we develop general mixing time bounds for Metropolis-Hastings algorithms on discrete spaces by building upon and refining some recent theoretical advancements in Bayesian model selection problems. We establish sufficient conditions for a class of informed Metropolis-Hastings algorithms to attain relaxation times that are independent of the problem dimension. These conditions are grounded in high-dimensional statistical theory and allow for possibly multimodal posterior distributions. We obtain our results through two independent techniques: the multicommodity flow method and single-element drift condition analysis; we find that the latter yields a tighter mixing time bound. Our results and proof techniques are readily applicable to a broad spectrum of statistical problems with discrete parameter spaces.
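For context, the relaxation time referenced in the abstract is the inverse spectral gap of the reversible transition kernel. The display below records the standard textbook relation between relaxation time and mixing time (a general fact about reversible chains, not a result taken from this paper); $\pi_{\min}$ denotes the smallest target probability.

```latex
% For a reversible, irreducible chain on a finite space with eigenvalues
% 1 = \lambda_1 > \lambda_2 \ge \cdots \ge \lambda_{|\mathcal{X}|} > -1,
% let \lambda_\ast = \max_{i\ge 2}|\lambda_i| be the second-largest eigenvalue in absolute value.
\[
  T_{\mathrm{rel}} \;=\; \frac{1}{1-\lambda_\ast},
  \qquad
  t_{\mathrm{mix}}(\epsilon) \;\le\; T_{\mathrm{rel}}\,
  \log\!\left(\frac{1}{\epsilon\,\pi_{\min}}\right).
\]
% A dimension-free T_{\mathrm{rel}} therefore controls the mixing time up to the
% \log(1/\pi_{\min}) factor.
```

The following sketch illustrates the general flavor of a locally informed Metropolis-Hastings sampler on the hypercube $\{0,1\}^p$: each single-coordinate flip is proposed with weight given by a balancing-type function of the posterior ratio, and the move is accepted with the usual Metropolis-Hastings correction. The toy bimodal target, the choice $h(r)=\min(1,e^r)$, and all function names are illustrative assumptions for this sketch, not the authors' exact construction or conditions.

```python
# Hypothetical sketch of a locally informed MH sampler on {0,1}^p (illustration only).
import itertools
import numpy as np

rng = np.random.default_rng(0)
p = 8  # problem dimension, kept small so the full transition matrix is cheap to form

def log_target(x):
    """Toy unnormalized log-posterior with two well-separated modes (all zeros, all ones)."""
    return np.logaddexp(-3.0 * x.sum(), -3.0 * (p - x.sum()))

def neighbors(x):
    """All single-coordinate flips of x."""
    out = []
    for i in range(p):
        y = x.copy()
        y[i] = 1 - y[i]
        out.append(y)
    return out

def informed_weights(x, h=lambda r: np.minimum(np.exp(r), 1.0)):
    """Weight each neighbor y by h(log pi(y) - log pi(x)), with h a balancing-type function."""
    lx = log_target(x)
    return np.array([h(log_target(y) - lx) for y in neighbors(x)])

def informed_mh_step(x):
    w = informed_weights(x)
    probs = w / w.sum()
    i = rng.choice(p, p=probs)        # propose flipping coordinate i with informed probability
    y = neighbors(x)[i]
    w_rev = informed_weights(y)
    q_yx = (w_rev / w_rev.sum())[i]   # probability of proposing the reverse move from y
    log_alpha = log_target(y) - log_target(x) + np.log(q_yx) - np.log(probs[i])
    return y if np.log(rng.random()) < log_alpha else x

# Run the chain for a few thousand steps.
x = rng.integers(0, 2, size=p)
for _ in range(5000):
    x = informed_mh_step(x)

# For this small p, form the full transition matrix and read off the relaxation time
# as the inverse of the (absolute) spectral gap.
states = [np.array(s) for s in itertools.product([0, 1], repeat=p)]
index = {tuple(s): k for k, s in enumerate(states)}
P = np.zeros((2 ** p, 2 ** p))
for k, xk in enumerate(states):
    w = informed_weights(xk)
    probs = w / w.sum()
    for i, y in enumerate(neighbors(xk)):
        w_rev = informed_weights(y)
        q_yx = (w_rev / w_rev.sum())[i]
        alpha = min(1.0, np.exp(log_target(y) - log_target(xk)) * q_yx / probs[i])
        P[k, index[tuple(y)]] = probs[i] * alpha
    P[k, k] = 1.0 - P[k].sum()        # rejection mass stays on the diagonal
eigvals = np.sort(np.real(np.linalg.eigvals(P)))
lam_star = max(abs(eigvals[0]), eigvals[-2])
print("relaxation time 1 / (1 - lambda_*):", 1.0 / (1.0 - lam_star))
```

The paper's sufficient conditions concern when relaxation times of such informed samplers remain bounded as the dimension grows; the toy computation above only shows how the quantity is defined and measured on a small example.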
Related papers
- Constrained Sampling with Primal-Dual Langevin Monte Carlo [15.634831573546041]
This work considers the problem of sampling from a probability distribution that is known up to a normalization constant, while satisfying a set of statistical constraints specified by the expected values of general nonlinear functions.
We put forward a discrete-time primal-dual Langevin Monte Carlo algorithm (PD-LMC) that simultaneously constrains the target distribution and samples from it.
arXiv Detail & Related papers (2024-11-01T13:26:13Z)
- Convergence of Score-Based Discrete Diffusion Models: A Discrete-Time Analysis [56.442307356162864]
We study the theoretical aspects of score-based discrete diffusion models under the Continuous Time Markov Chain (CTMC) framework.
We introduce a discrete-time sampling algorithm in the general state space $[S]^d$ that utilizes score estimators at predefined time points.
Our convergence analysis employs a Girsanov-based method and establishes key properties of the discrete score function.
arXiv Detail & Related papers (2024-10-03T09:07:13Z)
- Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z)
- Symmetric Mean-field Langevin Dynamics for Distributional Minimax Problems [78.96969465641024]
We extend mean-field Langevin dynamics to minimax optimization over probability distributions for the first time with symmetric and provably convergent updates.
We also study time and particle discretization regimes and prove a new uniform-in-time propagation of chaos result.
arXiv Detail & Related papers (2023-12-02T13:01:29Z)
- Weighted Riesz Particles [0.0]
We consider the target distribution as a mapping in which the infinite-dimensional parameter space consists of a number of deterministic submanifolds.
We study the properties of these points, called Riesz particles, and embed them into sequential MCMC.
We find that this yields higher acceptance rates with fewer evaluations.
arXiv Detail & Related papers (2023-12-01T14:36:46Z)
- A Multivariate Unimodality Test Harnessing the Dip Statistic of Mahalanobis Distances Over Random Projections [0.18416014644193066]
We extend one-dimensional unimodality principles to multi-dimensional spaces through linear random projections and point-to-point distancing.
Our method, rooted in $\alpha$-unimodality assumptions, presents a novel unimodality test named mud-pod.
Both theoretical and empirical studies confirm the efficacy of our method in unimodality assessment of multidimensional datasets.
arXiv Detail & Related papers (2023-11-28T09:11:02Z)
- Comparison of Markov chains via weak Poincaré inequalities with application to pseudo-marginal MCMC [0.0]
We investigate the use of a certain class of functional inequalities known as weak Poincaré inequalities to bound convergence of Markov chains to equilibrium.
We show that this enables the derivation of subgeometric convergence bounds for methods such as the independent Metropolis-Hastings sampler and pseudo-marginal methods for intractable likelihoods.
arXiv Detail & Related papers (2021-12-10T15:36:30Z)
- Partial Counterfactual Identification from Observational and Experimental Data [83.798237968683]
We develop effective Monte Carlo algorithms to approximate the optimal bounds from an arbitrary combination of observational and experimental data.
Our algorithms are validated extensively on synthetic and real-world datasets.
arXiv Detail & Related papers (2021-10-12T02:21:30Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- The Variational Method of Moments [65.91730154730905]
The conditional moment problem is a powerful formulation for describing structural causal parameters in terms of observables.
Motivated by a variational minimax reformulation of OWGMM, we define a very general class of estimators for the conditional moment problem.
We provide algorithms for valid statistical inference based on the same kind of variational reformulations.
arXiv Detail & Related papers (2020-12-17T07:21:06Z)
This list is automatically generated from the titles and abstracts of the papers on this site.