MixFlows: principled variational inference via mixed flows
- URL: http://arxiv.org/abs/2205.07475v5
- Date: Thu, 1 Jun 2023 06:36:29 GMT
- Title: MixFlows: principled variational inference via mixed flows
- Authors: Zuheng Xu, Naitong Chen, Trevor Campbell
- Abstract summary: MixFlows are a new variational family that consists of a mixture of repeated applications of a map to an initial reference distribution.
We show that MixFlows have MCMC-like convergence guarantees when the flow map is ergodic and measure-preserving.
We also develop an implementation of MixFlows based on uncorrected discretized Hamiltonian dynamics combined with deterministic momentum refreshment.
- Score: 16.393322369105864
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work presents mixed variational flows (MixFlows), a new variational
family that consists of a mixture of repeated applications of a map to an
initial reference distribution. First, we provide efficient algorithms for
i.i.d. sampling, density evaluation, and unbiased ELBO estimation. We then show
that MixFlows have MCMC-like convergence guarantees when the flow map is
ergodic and measure-preserving, and provide bounds on the accumulation of error
for practical implementations where the flow map is approximated. Finally, we
develop an implementation of MixFlows based on uncorrected discretized
Hamiltonian dynamics combined with deterministic momentum refreshment.
Simulated and real data experiments show that MixFlows can provide more
reliable posterior approximations than several black-box normalizing flows, as
well as samples of comparable quality to those obtained from state-of-the-art
MCMC methods.
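In code terms, the family averages pushforwards of the reference: q_T = (1/(T+1)) * sum_{t=0}^{T} (F^t)_# q_0, where F is the flow map. Below is a minimal, illustrative sketch of i.i.d. sampling, exact density evaluation, and a simple ELBO estimate for this construction. The rotation map, Gaussian reference, and all function names are assumptions made for the example, not the paper's implementation (which builds F from uncorrected discretized Hamiltonian dynamics with deterministic momentum refreshment).

```python
# Minimal sketch of the MixFlow construction described in the abstract,
# assuming a user-supplied invertible map F with a tractable inverse and
# log-Jacobian. Everything here is an illustrative stand-in, not the
# authors' implementation.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical invertible map on R^2: a fixed rotation. A rotation is
# volume-preserving, so its log|det Jacobian| is zero (a stand-in for a
# measure-preserving flow map).
theta = 0.5
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

def F(x):                 # forward map
    return R @ x

def F_inv(x):             # inverse map
    return R.T @ x

def log_det_jac_inv(x):   # log|det J_{F^{-1}}(x)|; zero for a rotation
    return 0.0

# Reference distribution q0
ref = stats.multivariate_normal(mean=np.zeros(2), cov=np.eye(2))

def mixflow_sample(T):
    """Draw one i.i.d. sample from (1/(T+1)) * sum_t (F^t)_# q0."""
    t = rng.integers(0, T + 1)       # pick a mixture component uniformly
    x = ref.rvs(random_state=rng)    # sample the reference q0
    for _ in range(t):               # push forward t times
        x = F(x)
    return x

def mixflow_logpdf(x, T):
    """Exact log density: average the reference pulled back through F^{-t}."""
    log_terms = []
    y, log_jac = x.copy(), 0.0
    for t in range(T + 1):
        # density of (F^t)_# q0 at x is q0(F^{-t} x) * |det J_{F^{-t}}(x)|
        log_terms.append(ref.logpdf(y) + log_jac)
        log_jac += log_det_jac_inv(y)  # accumulate change-of-variables terms
        y = F_inv(y)
    return np.logaddexp.reduce(log_terms) - np.log(T + 1)

def elbo_estimate(log_p, T, n=100):
    """Monte Carlo ELBO estimate for an unnormalized target log density log_p."""
    xs = [mixflow_sample(T) for _ in range(n)]
    return np.mean([log_p(x) - mixflow_logpdf(x, T) for x in xs])
```

Because the rotation is volume-preserving, its log-Jacobian term vanishes; a general invertible map would supply a nontrivial log_det_jac_inv, and the same accumulation of change-of-variables terms applies. Under the paper's assumption that the flow map is ergodic and measure-preserving, larger T tightens the approximation, which is the source of the MCMC-like convergence guarantees.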
Related papers
- Gaussian Mixture Flow Matching Models [51.976452482535954]
Diffusion models approximate the denoising distribution as a Gaussian and predict its mean, whereas flow matching models reparameterize the Gaussian mean as a flow velocity.
Both underperform in few-step sampling due to discretization error, and tend to produce over-saturated colors under classifier-free guidance (CFG).
We introduce a novel probabilistic guidance scheme that mitigates the over-saturation issues of CFG and improves image generation quality.
arXiv Detail & Related papers (2025-04-07T17:59:42Z)
- Stream-level flow matching with Gaussian processes [4.935875591615496]
Conditional flow matching (CFM) is a family of training algorithms for fitting continuous normalizing flows (CNFs).
We extend the CFM algorithm by defining conditional probability paths along "streams", instances of latent paths that connect pairs of source and target data.
We show that this generalization of the CFM can effectively reduce the variance in the estimated marginal vector field at a moderate computational cost.
arXiv Detail & Related papers (2024-09-30T15:47:22Z)
- Consistency Flow Matching: Defining Straight Flows with Velocity Consistency [97.28511135503176]
We introduce Consistency Flow Matching (Consistency-FM), a novel FM method that explicitly enforces self-consistency in the velocity field.
Preliminary experiments demonstrate that our Consistency-FM significantly improves training efficiency by converging 4.4x faster than consistency models.
arXiv Detail & Related papers (2024-07-02T16:15:37Z)
- MGF: Mixed Gaussian Flow for Diverse Trajectory Prediction [72.70572835589158]
We propose constructing a mixed Gaussian prior for a normalizing flow model for trajectory prediction.
Our method achieves state-of-the-art performance in the evaluation of both trajectory alignment and diversity on the popular UCY/ETH and SDD datasets.
arXiv Detail & Related papers (2024-02-19T15:48:55Z)
- Quantum Normalizing Flows for Anomaly Detection [23.262276593120305]
We introduce Normalizing Flows for Quantum architectures, describe how to model and optimize such a flow and evaluate our method on example datasets.
Our proposed models show competitive performance for anomaly detection compared to classical methods.
In the experiments we compare our performance to isolation forests (IF), the local outlier factor (LOF), and one-class SVMs.
arXiv Detail & Related papers (2024-02-05T10:28:20Z)
- Fast Semisupervised Unmixing Using Nonconvex Optimization [80.11512905623417]
We introduce a novel nonconvex model for semisupervised/library-based unmixing.
We demonstrate the efficacy of an alternating optimization method for sparse unmixing.
arXiv Detail & Related papers (2024-01-23T10:07:41Z)
- Detecting and Mitigating Mode-Collapse for Flow-based Sampling of Lattice Field Theories [6.222204646855336]
We study the consequences of mode-collapse of normalizing flows in the context of lattice field theory.
We propose a metric to quantify the degree of mode-collapse and derive a bound on the resulting bias.
arXiv Detail & Related papers (2023-02-27T19:00:22Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- Integrable Nonparametric Flows [5.9774834479750805]
We introduce a method for reconstructing an infinitesimal normalizing flow given only an infinitesimal change to a probability distribution.
This reverses the conventional task of normalizing flows.
We discuss potential applications to problems in quantum Monte Carlo and machine learning.
arXiv Detail & Related papers (2020-12-03T16:19:52Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Gaussianization flows are provably expressive under mild conditions; because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
- Stochastic Normalizing Flows [2.323220706791067]
We show that normalizing flows can be used to learn the transformation from a simple prior distribution to a complex target distribution.
We derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end.
We illustrate the representational power, sampling efficiency and correctness of SNFs on several benchmarks including applications to molecular sampling systems in equilibrium.
arXiv Detail & Related papers (2020-02-16T23:29:32Z)
- Semi-Supervised Learning with Normalizing Flows [54.376602201489995]
FlowGMM is an end-to-end approach to generative semi-supervised learning with normalizing flows.
We show promising results on a wide range of applications, including AG-News and Yahoo Answers text data.
arXiv Detail & Related papers (2019-12-30T17:36:33Z)