Adaptive Heterogeneous Mixtures of Normalising Flows for Robust Variational Inference
- URL: http://arxiv.org/abs/2510.02056v1
- Date: Thu, 02 Oct 2025 14:25:29 GMT
- Title: Adaptive Heterogeneous Mixtures of Normalising Flows for Robust Variational Inference
- Authors: Benjamin Wiriyapong, Oktay Karakuş, Kirill Sidorov
- Abstract summary: We propose Adaptive Mixture Flow Variational Inference (AMF-VI). AMF-VI is trained in two stages: (i) sequential expert training of individual flows, and (ii) adaptive global weight estimation via likelihood-driven updates. We evaluate AMF-VI on six canonical posterior families: banana, X-shape, two-moons, rings, a bimodal, and a five-mode mixture.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Normalising-flow variational inference (VI) can approximate complex posteriors, yet single-flow models often behave inconsistently across qualitatively different distributions. We propose Adaptive Mixture Flow Variational Inference (AMF-VI), a heterogeneous mixture of complementary flows (MAF, RealNVP, RBIG) trained in two stages: (i) sequential expert training of individual flows, and (ii) adaptive global weight estimation via likelihood-driven updates, without per-sample gating or architectural changes. Evaluated on six canonical posterior families of banana, X-shape, two-moons, rings, a bimodal, and a five-mode mixture, AMF-VI achieves consistently lower negative log-likelihood than each single-flow baseline and delivers stable gains in transport (Wasserstein-2) and maximum mean discrepancy (MMD) metrics, indicating improved robustness across shapes and modalities. The procedure is efficient and architecture-agnostic, incurring minimal overhead relative to standard flow training, and demonstrates that adaptive mixtures of diverse flows provide a reliable route to robust VI across diverse posterior families whilst preserving each expert's inductive bias.
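A minimal sketch of the two-stage recipe, assuming hypothetical flow objects with `fit` and `log_prob` methods standing in for the MAF/RealNVP/RBIG experts; the EM-style responsibility average used below is one plausible likelihood-driven weight update, not necessarily the paper's exact rule:

```python
import numpy as np

def fit_amf_vi(flows, x, n_weight_iters=100):
    """Two-stage mixture-of-flows fit: train experts, then mix them.

    `flows`: list of flow models exposing (hypothetical) fit(x) and
    log_prob(x) methods; `x`: samples representative of the target.
    Returns the global mixture weights over the experts.
    """
    # Stage (i): sequential expert training -- each flow is trained on
    # its own, so every expert keeps its inductive bias intact.
    for flow in flows:
        flow.fit(x)

    # Stage (ii): adaptive global weight estimation via likelihood-driven
    # updates (an EM-style responsibility average over the experts).
    log_p = np.stack([f.log_prob(x) for f in flows], axis=1)  # shape (N, K)
    w = np.full(len(flows), 1.0 / len(flows))                 # uniform start
    for _ in range(n_weight_iters):
        log_joint = log_p + np.log(np.maximum(w, 1e-12))      # (N, K)
        log_joint -= log_joint.max(axis=1, keepdims=True)     # stabilise
        resp = np.exp(log_joint)
        resp /= resp.sum(axis=1, keepdims=True)               # responsibilities
        w = resp.mean(axis=0)                                 # new global weights
    return w

# Mixture log-density: logsumexp_k [log w_k + log q_k(x)] -- no per-sample
# gating; a single global weight vector is shared across all inputs.
```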
Related papers
- Active Flow Matching [14.437387789022354]
Active Flow Matching (AFM) reformulates variational objectives to operate on conditional endpoint distributions along the flow. We derive forward and reverse Kullback-Leibler (KL) variants using self-normalised importance sampling (a generic SNIS sketch follows this entry).
arXiv Detail & Related papers (2026-03-01T02:50:07Z)
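Self-normalised importance sampling of the kind AFM builds on can be sketched generically; `log_p` and `log_q` below are target and proposal log-densities at samples drawn from the proposal, and the forward-KL estimator is a standard construction rather than AFM's specific objective:

```python
import numpy as np

def snis_weights(log_p, log_q):
    """Self-normalised importance weights for samples drawn from q.

    Normalisation makes any unknown constant in the target density cancel,
    so log_p may be unnormalised.
    """
    log_w = log_p - log_q
    log_w -= log_w.max()          # stabilise before exponentiating
    w = np.exp(log_w)
    return w / w.sum()

def forward_kl_estimate(log_p, log_q):
    """SNIS estimate of E_p[log p - log q], i.e. KL(p || q).

    With an unnormalised target this is off by the constant log Z, which
    does not affect optimisation of q. The reverse KL, E_q[log q - log p],
    needs no reweighting: it is a plain average over proposal samples.
    """
    w = snis_weights(log_p, log_q)
    return float(np.sum(w * (log_p - log_q)))
```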
- Asymptotically exact variational flows via involutive MCMC kernels [16.137032831974174]
We present a general recipe for constructing tuning-free, asymptotically exact variational flows from involutive MCMC kernels. This leads to three new variational families with provable total variation convergence. We demonstrate the competitive performance of our flows across tasks including posterior approximation, Monte Carlo estimation, and normalization constant estimation.
arXiv Detail & Related papers (2025-06-02T18:44:35Z)
- Stable Derivative Free Gaussian Mixture Variational Inference for Bayesian Inverse Problems [4.842853252452336]
Key challenges include costly repeated evaluations of forward models, multimodality, and inaccessible gradients for the forward model. We develop a variational inference framework that combines Fisher-Rao natural gradients with specialized quadrature rules to enable derivative-free updates of Gaussian mixture variational families. The resulting method, termed Derivative Free Gaussian Mixture Variational Inference (DF-GMVI), guarantees covariance positivity and affine invariance, offering a stable and efficient framework for approximating complex posterior distributions.
arXiv Detail & Related papers (2025-01-08T03:50:15Z)
- Consistency Flow Matching: Defining Straight Flows with Velocity Consistency [97.28511135503176]
We introduce Consistency Flow Matching (Consistency-FM), a novel FM method that explicitly enforces self-consistency in the velocity field.
Preliminary experiments demonstrate that our Consistency-FM significantly improves training efficiency by converging 4.4x faster than consistency models.
arXiv Detail & Related papers (2024-07-02T16:15:37Z)
- Stable Training of Normalizing Flows for High-dimensional Variational Inference [2.139348034155473]
Variational inference with normalizing flows (NFs) is an increasingly popular alternative to MCMC methods.
In practice, training deep normalizing flows for approximating high-dimensional distributions is often infeasible due to the high variance of the gradients.
We show that previous methods for stabilizing the variance of gradient estimates can be insufficient to achieve stable training of Real NVPs.
arXiv Detail & Related papers (2024-02-26T09:04:07Z)
- MGF: Mixed Gaussian Flow for Diverse Trajectory Prediction [72.70572835589158]
We propose constructing a mixed Gaussian prior for a normalizing flow model for trajectory prediction. Our method achieves state-of-the-art performance in the evaluation of both trajectory alignment and diversity on the popular UCY/ETH and SDD datasets.
arXiv Detail & Related papers (2024-02-19T15:48:55Z)
- Equivariant Flow Matching with Hybrid Probability Transport [69.11915545210393]
Diffusion Models (DMs) have demonstrated effectiveness in generating feature-rich geometries.
DMs typically suffer from unstable probability dynamics and inefficient sampling.
We introduce geometric flow matching, which enjoys the advantages of both equivariant modeling and stabilized probability dynamics.
arXiv Detail & Related papers (2023-12-12T11:13:13Z)
- Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm, Manifold Gaussian Variational Bayes on the Precision matrix (MGVBP), for Gaussian variational inference whose updates satisfy the positive-definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models (a positive-definite parameterization sketch follows this entry).
arXiv Detail & Related papers (2022-10-26T10:12:31Z)
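One standard way to keep a variational precision (and hence covariance) matrix positive definite during unconstrained optimization, in the spirit of the precision-matrix parameterization above, is a Cholesky-factor construction; this is a generic illustration, not MGVBP's manifold update itself:

```python
import numpy as np

def precision_from_factor(theta, d):
    """Map d*(d+1)/2 unconstrained parameters to a positive-definite matrix.

    A lower-triangular factor L is filled from theta, its diagonal is made
    strictly positive, and Lambda = L @ L.T is positive definite by
    construction, so any gradient step on theta respects the constraint.
    """
    L = np.zeros((d, d))
    L[np.tril_indices(d)] = theta
    L[np.diag_indices(d)] = np.exp(np.diag(L))   # strictly positive diagonal
    return L @ L.T

theta = np.random.default_rng(0).normal(size=6)  # d = 3 -> 6 parameters
Lam = precision_from_factor(theta, 3)
print(np.linalg.eigvalsh(Lam))                   # all eigenvalues > 0
```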
- Discretely Indexed Flows [1.0079626733116611]
We propose Discretely Indexed Flows (DIF) as a new tool for solving variational estimation problems.
DIF are built as an extension of Normalizing Flows (NF), in which the deterministic transport becomes discretely indexed.
They benefit from both a tractable density and a straightforward sampling scheme, and can thus be used for the dual problems of Variational Inference (VI) and Variational Density Estimation (VDE); a minimal density-and-sampling sketch follows this entry.
arXiv Detail & Related papers (2022-04-04T10:13:43Z)
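The "tractable density plus straightforward sampling" property of discretely indexed transports can be shown with a toy model; the two affine maps below stand in for arbitrary invertible flows and are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical discretely indexed transports: K invertible affine maps
# x = a_k * z + b_k applied to a standard-normal base variable z.
a = np.array([0.5, 2.0]); b = np.array([-2.0, 2.0])
w = np.array([0.3, 0.7])                      # index probabilities

def log_density(x):
    """Tractable density: log sum_k w_k N(f_k^{-1}(x)) |d f_k^{-1} / dx|."""
    z = (x[:, None] - b) / a                  # inverse transport per index
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))
    log_det = -np.log(np.abs(a))              # log |dz/dx| for each map
    comp = np.log(w) + log_base + log_det     # (N, K) component log-masses
    m = comp.max(axis=1, keepdims=True)       # stabilised logsumexp
    return (m + np.log(np.exp(comp - m).sum(axis=1, keepdims=True))).ravel()

def sample(n):
    """Straightforward sampling: draw an index, push base noise through it."""
    k = rng.choice(len(w), size=n, p=w)
    return a[k] * rng.standard_normal(n) + b[k]

x = sample(5)
print(log_density(x))
```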
- Trustworthy Multimodal Regression with Mixture of Normal-inverse Gamma Distributions [91.63716984911278]
We introduce a novel Mixture of Normal-Inverse Gamma distributions (MoNIG) algorithm, which efficiently estimates uncertainty in a principled way for adaptive integration of different modalities and produces a trustworthy regression result.
Experimental results on both synthetic and different real-world data demonstrate the effectiveness and trustworthiness of our method on various multimodal regression tasks.
arXiv Detail & Related papers (2021-11-11T14:28:12Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the base density to output space mapping is conditioned on an input x, to model conditional densities p(y|x) (a toy conditional-flow sketch follows this entry).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
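As a rough illustration of the conditional-flow idea in the last entry, a single affine flow layer can be conditioned on the input x so the model represents p(y|x); the fixed linear and softplus maps below stand in for the neural networks a real CNF would use:

```python
import numpy as np

# Toy conditional affine flow: y = mu(x) + sigma(x) * z with z ~ N(0, 1).
def mu(x):    return 0.5 * x + 1.0
def sigma(x): return np.log1p(np.exp(0.1 * x))   # softplus keeps the scale > 0

def log_prob(y, x):
    """log p(y|x) via change of variables: base log-density plus log|dz/dy|."""
    z = (y - mu(x)) / sigma(x)                   # inverse flow
    log_base = -0.5 * (z**2 + np.log(2 * np.pi))
    return log_base - np.log(sigma(x))           # log|dz/dy| = -log sigma(x)

def sample(x, rng=np.random.default_rng(0)):
    """Sampling is a single forward pass through the conditional flow."""
    return mu(x) + sigma(x) * rng.standard_normal(np.shape(x))

x = np.linspace(-1.0, 1.0, 3)
print(log_prob(sample(x), x))
```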