Central Limit Theorem for ergodic averages of Markov chains & the comparison of sampling algorithms for heavy-tailed distributions
- URL: http://arxiv.org/abs/2512.18255v1
- Date: Sat, 20 Dec 2025 07:37:55 GMT
- Title: Central Limit Theorem for ergodic averages of Markov chains & the comparison of sampling algorithms for heavy-tailed distributions
- Authors: Miha Brešar, Aleksandar Mijatović, Gareth Roberts
- Abstract summary: We provide verifiable necessary conditions for CLTs of ergodic averages of Markov chains on general state spaces. Our theory is based on drift conditions, which also yield lower bounds on the rates of convergence to stationarity in various metrics.
- Score: 42.17343824099138
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Establishing central limit theorems (CLTs) for ergodic averages of Markov chains is a fundamental problem in probability and its applications. Since the seminal work~\cite{MR834478}, a vast literature has emerged on the sufficient conditions for such CLTs. To counterbalance this, the present paper provides verifiable necessary conditions for CLTs of ergodic averages of Markov chains on general state spaces. Our theory is based on drift conditions, which also yield lower bounds on the rates of convergence to stationarity in various metrics. The validity of the ergodic CLT is of particular importance for sampling algorithms, where it underpins the error analysis of estimators in Bayesian statistics and machine learning. Although heavy-tailed sampling is of central importance in applications, the characterisation of the CLT and the convergence rates are theoretically poorly understood for almost all practically used Markov chain Monte Carlo (MCMC) algorithms. In this setting our results provide sharp conditions on the validity of the ergodic CLT and establish convergence rates for large families of MCMC sampling algorithms for heavy-tailed targets. Our study includes a rather complete analysis of random walk Metropolis samplers (with finite- and infinite-variance proposals), Metropolis-adjusted and unadjusted Langevin algorithms and the stereographic projection sampler (as well as the independence sampler). By providing these sharp results via our practical drift conditions, our theory offers significant insights into the problems of algorithm selection and comparison for sampling heavy-tailed distributions (see short YouTube presentations~\cite{YouTube_talk} describing our \href{https://youtu.be/m2y7U4cEqy4}{\underline{theory}} and \href{https://youtu.be/w8I_oOweuko}{\underline{applications}}).
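The abstract's central object, an ergodic average of a random walk Metropolis (RWM) chain on a heavy-tailed target, can be sketched in a few lines. The standard Cauchy target, the test function `arctan`, and all parameter values below are illustrative choices, not taken from the paper; this only shows the kind of estimator whose CLT the paper characterises.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    # Standard Cauchy log-density (up to an additive constant): heavy tails.
    return -np.log1p(x * x)

def rwm_chain(n_steps, step_size=2.0, x0=0.0):
    """Random walk Metropolis with a finite-variance (Gaussian) proposal."""
    x = x0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        prop = x + step_size * rng.standard_normal()
        # Accept with probability min(1, pi(prop)/pi(x)).
        if np.log(rng.random()) < log_target(prop) - log_target(x):
            x = prop
        samples[i] = x
    return samples

xs = rwm_chain(50_000)
# Ergodic average of the bounded test function f(x) = arctan(x); by symmetry
# its true expectation under the Cauchy target is 0. Whether such averages
# satisfy a CLT is exactly the question the paper's drift conditions answer.
est = np.mean(np.arctan(xs))
```

For a bounded test function the average converges regardless of the target's tails; the subtlety the paper addresses is the rate and the validity of Gaussian fluctuations around the limit.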
Related papers
- Improved Sample Complexity for Full Coverage in Compact and Continuous Spaces [0.0]
We study uniform random sampling on the $d$-dimensional unit hypercube. We derive a sample complexity bound with a logarithmic dependence on the failure probability. Our findings offer a sharper theoretical tool for algorithms that rely on grid-based coverage guarantees.
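The coverage guarantee above can be checked empirically. The sketch below uses the generic coupon-collector bound $n \geq K(\ln K + \ln(1/\delta))$ with $K = (1/\varepsilon)^d$ grid cells; the paper's sharper constants and exact form of the bound are not reproduced here, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
d, eps, delta = 2, 0.25, 1e-6
m = int(round(1 / eps))          # cells per axis
K = m ** d                       # total number of grid cells (here 16)
# Coupon-collector-style sample size for covering all K cells w.p. >= 1-delta.
n = int(np.ceil(K * (np.log(K) + np.log(1 / delta))))

pts = rng.random((n, d))                         # uniform samples in [0, 1)^d
cells = np.floor(pts / eps).astype(int)          # per-axis cell index in {0,...,m-1}
flat = np.ravel_multi_index(tuple(cells.T), (m,) * d)  # one cell index per sample
covered = np.unique(flat).size == K              # did every cell receive a sample?
```

With these values $n \approx 266$ samples suffice to cover all 16 cells except with probability about $\delta$, illustrating the logarithmic dependence on the failure probability.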
arXiv Detail & Related papers (2025-11-21T21:06:14Z) - Uncertainty quantification for Markov chain induced martingales with application to temporal difference learning [55.197497603087065]
We analyze the performance of the Temporal Difference (TD) learning algorithm with linear function approximations. We establish novel and general high-dimensional concentration inequalities and Berry-Esseen bounds for vector-valued martingales induced by Markov chains.
arXiv Detail & Related papers (2025-02-19T15:33:55Z) - Constrained Sampling with Primal-Dual Langevin Monte Carlo [15.634831573546041]
This work considers the problem of sampling from a probability distribution known up to a normalization constant that additionally satisfies a set of statistical constraints specified by the expected values of general nonlinear functions. We put forward a discrete-time primal-dual Langevin Monte Carlo algorithm (PD-LMC) that simultaneously constrains the target distribution and samples from it.
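As background for the primal-dual variant described above, the plain unadjusted Langevin algorithm (ULA) it builds on can be sketched as follows. This shows only the Langevin discretisation $x_{k+1} = x_k + h\,\nabla \log \pi(x_k) + \sqrt{2h}\,\xi_k$ for a standard Gaussian target; the dual update enforcing the constraints in PD-LMC is not reproduced, and the step size and target are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def grad_log_pi(x):
    # Standard Gaussian target: grad log pi(x) = -x.
    return -x

h, n_steps = 0.05, 20_000
x = np.zeros(2)
traj = np.empty((n_steps, 2))
for k in range(n_steps):
    # Euler-Maruyama discretisation of the Langevin diffusion.
    x = x + h * grad_log_pi(x) + np.sqrt(2 * h) * rng.standard_normal(2)
    traj[k] = x

# Empirical second moment of the second half of the trajectory; for ULA this
# is close to the target value 1 up to an O(h) discretisation bias.
m2 = traj[n_steps // 2:].var(axis=0)
```

Because ULA omits the Metropolis correction, its stationary distribution carries a step-size-dependent bias; the Metropolis-adjusted variant removes it at the cost of an accept/reject step.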
arXiv Detail & Related papers (2024-11-01T13:26:13Z) - Robust Barycenter Estimation using Semi-Unbalanced Neural Optimal Transport [84.51977664336056]
We propose a novel scalable approach for estimating the robust continuous barycenter. Our method is framed as a min-max optimization problem and is adaptable to general cost functions.
arXiv Detail & Related papers (2024-10-04T23:27:33Z) - Dimension-free Relaxation Times of Informed MCMC Samplers on Discrete Spaces [5.075066314996696]
We develop general mixing time bounds for Metropolis-Hastings algorithms on discrete spaces. We establish sufficient conditions for a class of informed Metropolis-Hastings algorithms to attain relaxation times independent of the problem dimension.
arXiv Detail & Related papers (2024-04-05T02:40:45Z) - Distributed Markov Chain Monte Carlo Sampling based on the Alternating Direction Method of Multipliers [143.6249073384419]
In this paper, we propose a distributed sampling scheme based on the alternating direction method of multipliers.
We provide both theoretical guarantees of our algorithm's convergence and experimental evidence of its superiority to the state-of-the-art.
In simulation, we deploy our algorithm on linear and logistic regression tasks and illustrate its fast convergence compared to existing gradient-based methods.
arXiv Detail & Related papers (2024-01-29T02:08:40Z) - Importance is Important: Generalized Markov Chain Importance Sampling Methods [4.611170084430822]
We show that for any multiple-try Metropolis algorithm, one can always accept the proposal and evaluate the importance weight that is needed to correct for the bias without extra computational cost.
We propose an alternative MCMC sampler on discrete spaces that is also outside the Metropolis--Hastings framework.
arXiv Detail & Related papers (2023-04-13T04:04:09Z) - Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z) - Comparison of Markov chains via weak Poincaré inequalities with application to pseudo-marginal MCMC [0.0]
We investigate the use of a certain class of functional inequalities known as weak Poincaré inequalities to bound convergence of Markov chains to equilibrium.
We show that this enables the derivation of subgeometric convergence bounds for methods such as the Independent Metropolis--Hastings sampler and pseudo-marginal methods for intractable likelihoods.
arXiv Detail & Related papers (2021-12-10T15:36:30Z) - Amortized Conditional Normalized Maximum Likelihood: Reliable Out of Distribution Uncertainty Estimation [99.92568326314667]
We propose the amortized conditional normalized maximum likelihood (ACNML) method as a scalable general-purpose approach for uncertainty estimation.
Our algorithm builds on the conditional normalized maximum likelihood (CNML) coding scheme, which has minimax optimal properties according to the minimum description length principle.
We demonstrate that ACNML compares favorably to a number of prior techniques for uncertainty estimation in terms of calibration on out-of-distribution inputs.
arXiv Detail & Related papers (2020-11-05T08:04:34Z) - Computationally efficient sparse clustering [67.95910835079825]
We provide a finite sample analysis of a new clustering algorithm based on PCA.
We show that it achieves the minimax optimal misclustering rate in the regime $\|\theta\| \to \infty$.
arXiv Detail & Related papers (2020-05-21T17:51:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.