Asymptotically exact variational flows via involutive MCMC kernels
- URL: http://arxiv.org/abs/2506.02162v1
- Date: Mon, 02 Jun 2025 18:44:35 GMT
- Title: Asymptotically exact variational flows via involutive MCMC kernels
- Authors: Zuheng Xu, Trevor Campbell
- Abstract summary: We present a general recipe for constructing tuning-free, asymptotically exact variational flows from involutive MCMC kernels. This leads to three new variational families with provable total variation convergence. We demonstrate the competitive performance of our flows across tasks including posterior approximation, Monte Carlo estimates, and normalization constant estimation.
- Score: 16.137032831974174
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Most expressive variational families -- such as normalizing flows -- lack practical convergence guarantees, as their theoretical assurances typically hold only at the intractable global optimum. In this work, we present a general recipe for constructing tuning-free, asymptotically exact variational flows from involutive MCMC kernels. The core methodological component is a novel representation of general involutive MCMC kernels as invertible, measure-preserving iterated random function systems, which act as the flow maps of our variational flows. This leads to three new variational families with provable total variation convergence. Our framework resolves key practical limitations of existing variational families with similar guarantees (e.g., MixFlows), while requiring substantially weaker theoretical assumptions. Finally, we demonstrate the competitive performance of our flows across tasks including posterior approximation, Monte Carlo estimates, and normalization constant estimation, outperforming or matching the No-U-Turn sampler (NUTS) and black-box normalizing flows.
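The recipe's starting point is the standard involutive MCMC template: refresh an auxiliary variable, apply an involution on the extended space, then accept or reject. Below is a minimal numpy sketch of such a kernel, using HMC-style leapfrog integration with a momentum flip as the involution; the paper's actual contribution, recasting these kernels as invertible, measure-preserving iterated random function systems with tractable flow densities, is not reproduced here, and all names are illustrative.

```python
import numpy as np

def leapfrog_flip(x, v, grad_log_target, eps=0.1, steps=10):
    """Leapfrog integration followed by a momentum flip. The flip makes
    the map an involution: applying it twice returns (x, v) exactly."""
    v = v + 0.5 * eps * grad_log_target(x)
    for _ in range(steps - 1):
        x = x + eps * v
        v = v + eps * grad_log_target(x)
    x = x + eps * v
    v = v + 0.5 * eps * grad_log_target(x)
    return x, -v

def involutive_mh_step(x, log_target, grad_log_target, rng):
    """One involutive Metropolis-Hastings step. The leapfrog involution
    is volume preserving, so no Jacobian term enters the ratio."""
    v = rng.standard_normal(x.shape)            # refresh auxiliary variable
    x_new, v_new = leapfrog_flip(x, v, grad_log_target)
    log_acc = (log_target(x_new) - 0.5 * v_new @ v_new) \
            - (log_target(x) - 0.5 * v @ v)
    return x_new if np.log(rng.uniform()) < log_acc else x

# Usage: sample a 2D standard normal target.
rng = np.random.default_rng(0)
log_p = lambda y: -0.5 * y @ y
grad_log_p = lambda y: -y
x, draws = np.zeros(2), []
for _ in range(1000):
    x = involutive_mh_step(x, log_p, grad_log_p, rng)
    draws.append(x)
```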
Related papers
- Marginalization Consistent Probabilistic Forecasting of Irregular Time Series via Mixture of Separable flows [4.489135297410294]
Probabilistic forecasting models for joint distributions of targets in irregular time series with missing values are a heavily under-researched area in machine learning. We propose MOSES (Marginalization Consistent Mixture of Separable Flows), a model that parametrizes a mixture of several latent Gaussian processes combined with separable univariate normalizing flows. Experiments on four datasets show that MOSES achieves both accurate joint and marginal predictions, surpassing all other marginalization consistent baselines; it trails slightly behind ProFITi in joint prediction but is vastly superior when predicting marginal distributions.
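The "separable" construction is what buys marginalization consistency: each output coordinate is a univariate transformation of the corresponding coordinate of a correlated Gaussian, so marginalizing requires only the Gaussian's own marginals. A speculative sketch of that idea (not the paper's model; all names are placeholders):

```python
import numpy as np

def separable_flow_sample(mean, cov, marginal_flows, rng):
    """Draw a correlated Gaussian vector, then push each coordinate
    through its own univariate invertible map. The marginal of
    coordinate i depends only on (mean[i], cov[i, i]) and flow i,
    which is the marginalization-consistency property."""
    z = rng.multivariate_normal(mean, cov)
    return np.array([f(zi) for f, zi in zip(marginal_flows, z)])

# Usage with monotone univariate maps per coordinate (illustrative only).
rng = np.random.default_rng(0)
flows = [np.exp, np.tanh]
x = separable_flow_sample(np.zeros(2), np.eye(2), flows, rng)
```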
arXiv Detail & Related papers (2024-06-11T13:28:43Z)
- Weakly Convex Regularisers for Inverse Problems: Convergence of Critical Points and Primal-Dual Optimisation [12.455342327482223]
We present a generalised formulation of convergent regularisation in terms of critical points.
We show that this is achieved by a class of weakly convex regularisers.
Applying this theory to learned regularisation, we prove universal approximation for input weakly convex neural networks.
arXiv Detail & Related papers (2024-02-01T22:54:45Z)
- MixFlows: principled variational inference via mixed flows [16.393322369105864]
MixFlows are a new variational family that consists of a mixture of repeated applications of a map to an initial reference distribution.
We show that MixFlows have MCMC-like convergence guarantees when the flow map is ergodic and measure-preserving.
We also develop an implementation of MixFlows based on uncorrected discretized Hamiltonian dynamics combined with deterministic momentum refreshment.
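Both operations a variational family needs, i.i.d. sampling and exact density evaluation, fall out of this mixture-of-pushforwards construction. A minimal sketch, assuming an invertible flow map T with unit Jacobian (as for volume-preserving Hamiltonian maps); q0_sample, q0_logpdf, T, and T_inv are placeholders:

```python
import numpy as np

def mixflow_sample(q0_sample, T, N, rng):
    """Sample from the mixture (1/N) * sum_n T^n_# q0: pick a component
    n uniformly, draw from the reference q0, and apply T n times."""
    n = rng.integers(N)
    x = q0_sample(rng)
    for _ in range(n):
        x = T(x)
    return x

def mixflow_logpdf(x, q0_logpdf, T_inv, N):
    """Exact mixture log-density: pull x back through T repeatedly and
    average the reference densities (log|det J| terms would be added
    here if T were not volume preserving)."""
    logs, y = [], x
    for _ in range(N):
        logs.append(q0_logpdf(y))
        y = T_inv(y)
    return np.logaddexp.reduce(logs) - np.log(N)
```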
arXiv Detail & Related papers (2022-05-16T06:57:57Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
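For reference, the "traditional estimator" compared against here is plain Monte Carlo over flow samples, as sketched below; the paper's more sample-efficient estimator is not reproduced, and flow_sample and in_region are placeholders.

```python
import numpy as np

def mc_region_probability(flow_sample, in_region, n_samples, rng):
    """Naive Monte Carlo estimate of P(X in A) for a region A: draw
    from the flow and count the fraction of samples landing in A."""
    hits = sum(in_region(flow_sample(rng)) for _ in range(n_samples))
    return hits / n_samples

# Usage: probability of the unit box under a stand-in for a real flow.
rng = np.random.default_rng(0)
flow_sample = lambda r: r.standard_normal(2)
in_region = lambda x: bool(np.all(np.abs(x) < 1.0))
p_hat = mc_region_probability(flow_sample, in_region, 10_000, rng)
```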
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Stochastic normalizing flows as non-equilibrium transformations [62.997667081978825]
We show that normalizing flows provide a route to sample lattice field theories more efficiently than conventional Monte Carlo simulations.
We lay out a strategy to optimize the efficiency of this extended class of generative models and present examples of applications.
arXiv Detail & Related papers (2022-01-21T19:00:18Z)
- Trustworthy Multimodal Regression with Mixture of Normal-inverse Gamma Distributions [91.63716984911278]
We introduce a novel Mixture of Normal-Inverse Gamma distributions (MoNIG) algorithm, which efficiently estimates uncertainty in a principled way for the adaptive integration of different modalities and produces trustworthy regression results.
Experimental results on both synthetic and different real-world data demonstrate the effectiveness and trustworthiness of our method on various multimodal regression tasks.
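MoNIG composes per-modality Normal-Inverse-Gamma (NIG) distributions, whose predictive summaries are closed form and come from the evidential-regression literature the method builds on. A sketch of those standard moments (the paper's cross-modality fusion rule is not shown, and parameter names follow the evidential-regression convention rather than the paper's notation):

```python
def nig_predictive_moments(gamma, nu, alpha, beta):
    """Closed-form summaries of NIG(gamma, nu, alpha, beta), alpha > 1:
    the prediction, the expected noise variance (aleatoric), and the
    variance of the predicted mean (epistemic)."""
    prediction = gamma
    aleatoric = beta / (alpha - 1.0)         # E[sigma^2]
    epistemic = beta / (nu * (alpha - 1.0))  # Var[mu]
    return prediction, aleatoric, epistemic
```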
arXiv Detail & Related papers (2021-11-11T14:28:12Z)
- Sinusoidal Flow: A Fast Invertible Autoregressive Flow [0.0]
We propose a new type of normalising flow that inherits the expressive power and triangular Jacobian from fully autoregressive flows.
Experiments show that our Sinusoidal Flow is not only able to model complex distributions, but can also be reliably inverted to generate realistic-looking samples.
arXiv Detail & Related papers (2021-10-26T01:20:29Z)
- The Variational Method of Moments [65.91730154730905]
The conditional moment problem is a powerful formulation for describing structural causal parameters in terms of observables.
Motivated by a variational minimax reformulation of the optimally weighted generalized method of moments (OWGMM), we define a very general class of estimators for the conditional moment problem.
We provide algorithms for valid statistical inference based on the same kind of variational reformulations.
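For concreteness, the conditional moment problem asks for structural parameters that make a known residual vanish in conditional mean given an instrument; in generic notation (symbols are assumed here, not taken from the paper):

```latex
\[
  \mathbb{E}\bigl[\rho(X;\theta)\mid Z\bigr] = 0 \quad \text{almost surely},
\]
```

where rho is a known residual function (for instance Y - g(T; theta) in instrumental-variable regression), X the observed data, and Z the conditioning variable or instrument.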
arXiv Detail & Related papers (2020-12-17T07:21:06Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
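As a concrete instance of such a surjective layer, dequantization pairs a deterministic, many-to-one generative direction with a stochastic inference direction. A minimal sketch of the two directions (the SurVAE likelihood-bound bookkeeping is omitted):

```python
import numpy as np

def dequantize(x_int, rng):
    """Inference direction: lift integers to the reals by adding
    uniform noise on [0, 1), a stochastic right inverse of floor."""
    return x_int + rng.uniform(size=x_int.shape)

def quantize(x_real):
    """Generative direction: the surjection, rounding down so that
    quantize(dequantize(x)) == x for any integer array x."""
    return np.floor(x_real)
```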
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
The resulting flows are provably universal approximators; owing to this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
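The underlying mechanism is classic iterative Gaussianization: transform each marginal toward a standard normal, rotate, and repeat; Gaussianization Flows make both steps trainable. A non-trainable sketch of one such layer, assuming scipy is available (illustrative only):

```python
import numpy as np
from scipy.stats import norm

def gaussianization_layer(X, rng):
    """One fixed Gaussianization step on an (n, d) sample matrix:
    empirical-CDF + probit transform per coordinate, then a random
    rotation so the next layer sees fresh 1D projections."""
    n, d = X.shape
    ranks = np.argsort(np.argsort(X, axis=0), axis=0)
    U = (ranks + 0.5) / n               # empirical CDF values in (0, 1)
    Z = norm.ppf(U)                     # marginals become ~N(0, 1)
    Q, _ = np.linalg.qr(rng.standard_normal((d, d)))
    return Z @ Q                        # random orthogonal rotation
```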
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
- MetFlow: A New Efficient Method for Bridging the Gap between Markov Chain Monte Carlo and Variational Inference [20.312106392307406]
We propose a new computationally efficient method to combine Variational Inference (VI) with Markov Chain Monte Carlo (MCMC).
This approach can be used with generic MCMC kernels, but is especially well suited to MetFlow, a novel family of MCMC algorithms we introduce.
arXiv Detail & Related papers (2020-02-27T16:50:30Z)
- Stochastic Normalizing Flows [2.323220706791067]
We show that normalizing flows can be used to learn the transformation from a simple prior distribution to a complex target distribution.
We derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end.
We illustrate the representational power, sampling efficiency and correctness of SNFs on several benchmarks, including applications to sampling molecular systems in equilibrium.
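In this construction, each stochastic layer contributes the log-ratio of backward to forward transition densities to the flow's importance weight. A hedged sketch of one such layer built from an unadjusted Langevin step (a common choice, though not necessarily the paper's; names are illustrative):

```python
import numpy as np

def langevin_layer(x, grad_log_target, eps, rng):
    """One stochastic SNF-style layer: an unadjusted Langevin move plus
    the log backward/forward proposal ratio entering the path weight."""
    noise = rng.standard_normal(x.shape)
    drift = eps * grad_log_target(x)
    x_new = x + drift + np.sqrt(2.0 * eps) * noise
    # Both proposals are Gaussian with variance 2*eps per coordinate;
    # the normalizing constants cancel in the ratio.
    log_fwd = -np.sum((x_new - x - drift) ** 2) / (4.0 * eps)
    log_bwd = -np.sum((x - x_new - eps * grad_log_target(x_new)) ** 2) / (4.0 * eps)
    return x_new, log_bwd - log_fwd
```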
arXiv Detail & Related papers (2020-02-16T23:29:32Z)
This list is automatically generated from the titles and abstracts of the papers listed on this site.