Discretely Indexed Flows
- URL: http://arxiv.org/abs/2204.01361v1
- Date: Mon, 4 Apr 2022 10:13:43 GMT
- Title: Discretely Indexed Flows
- Authors: Elouan Argouarc'h, François Desbouvries, Eric Barat, Eiji
Kawasaki, Thomas Dautremer
- Abstract summary: We propose Discretely Indexed flows (DIF) as a new tool for solving variational estimation problems.
DIF are built as an extension of Normalizing Flows (NF), in which the deterministic transport becomes discretely indexed.
They benefit from both a tractable density and a straightforward sampling scheme, and can thus be used for the dual problems of Variational Inference (VI) and Variational density estimation (VDE).
- Score: 1.0079626733116611
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper we propose Discretely Indexed flows (DIF) as a new tool for
solving variational estimation problems. Roughly speaking, DIF are built as an
extension of Normalizing Flows (NF), in which the deterministic transport
becomes stochastic, and more precisely discretely indexed. Due to the discrete
nature of the underlying additional latent variable, DIF inherit the good
computational behavior of NF: they benefit from both a tractable density and
a straightforward sampling scheme, and can thus be used for the dual
problems of Variational Inference (VI) and of Variational density estimation
(VDE). On the other hand, DIF can also be understood as an extension of mixture
density models, in which the constant mixture weights are replaced by flexible
functions. As a consequence, DIF are better suited for capturing distributions
with discontinuities, sharp edges and fine details, which is a main advantage
of this construction. Finally, we propose a methodology for constructing DIF
in practice, and show that DIF can be sequentially cascaded, and cascaded
with NF.
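The construction in the abstract is compact enough to sketch numerically. Below is a minimal, illustrative 1-D example, not the authors' code: K affine maps are indexed by a categorical variable whose weights depend on the base sample through a softmax, which is what makes the discrete index marginalizable in closed form. The class name, the affine maps, and the linear logits are all assumptions made for illustration.

```python
import numpy as np

class ToyDIF1D:
    """Illustrative 1-D Discretely Indexed Flow (not the authors' code).

    Transport: z ~ N(0,1), k ~ Categorical(pi(z)), x = a_k * z + b_k.
    The z-dependent weights pi(z) are the "flexible functions" that
    replace the constant weights of a classical mixture model.
    """

    def __init__(self, a, b, w, c):
        self.a = np.asarray(a, dtype=float)  # slopes (nonzero => invertible)
        self.b = np.asarray(b, dtype=float)  # offsets
        self.w = np.asarray(w, dtype=float)  # weight-function slopes (assumed linear logits)
        self.c = np.asarray(c, dtype=float)  # weight-function offsets

    def _pi(self, z):
        # Flexible mixture weights: pi_k(z) = softmax_k(w_k * z + c_k).
        logits = self.w * z + self.c
        logits -= logits.max()               # numerical stability
        p = np.exp(logits)
        return p / p.sum()

    def sample(self, rng):
        z = rng.standard_normal()                     # base sample z ~ p_Z
        k = rng.choice(self.a.size, p=self._pi(z))    # discrete index k ~ Cat(pi(z))
        return self.a[k] * z + self.b[k]              # indexed transport x = f_k(z)

    def log_density(self, x):
        # Marginalizing the discrete index gives a tractable density:
        #   p(x) = sum_k pi_k(z_k) * p_Z(z_k) / |a_k|,  with z_k = (x - b_k) / a_k.
        z = (x - self.b) / self.a
        log_pz = -0.5 * z**2 - 0.5 * np.log(2.0 * np.pi)
        log_pi = np.log([self._pi(zk)[k] for k, zk in enumerate(z)])
        terms = log_pi + log_pz - np.log(np.abs(self.a))
        m = terms.max()                               # log-sum-exp for stability
        return m + np.log(np.exp(terms - m).sum())

rng = np.random.default_rng(0)
dif = ToyDIF1D(a=[1.0, 0.5], b=[-2.0, 2.0], w=[4.0, -4.0], c=[0.0, 0.0])
samples = np.array([dif.sample(rng) for _ in range(1000)])
print(samples.mean(), dif.log_density(0.0))
```

Because the weights pi_k depend on z, different regions of the base space can be routed through different maps; this routing is what lets the model place sharp boundaries that constant-weight mixtures smooth over.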
Related papers
- Entropy-Informed Weighting Channel Normalizing Flow [7.751853409569806]
We propose a regularized and feature-dependent $\mathtt{Shuffle}$ operation and integrate it into the vanilla multi-scale architecture.
We observe that this operation guides the variables to evolve in the direction of increasing entropy, hence we refer to NFs with the $\mathtt{Shuffle}$ operation as the Entropy-Informed Weighting Channel Normalizing Flow (EIW-Flow).
arXiv Detail & Related papers (2024-07-06T04:46:41Z)
- PINF: Continuous Normalizing Flows for Physics-Constrained Deep Learning [8.000355537589224]
In this paper, we introduce Physics-Informed Normalizing Flows (PINF), a novel extension of continuous normalizing flows.
Our method, which is mesh-free and causality-free, can efficiently solve high-dimensional time-dependent and steady-state Fokker-Planck equations.
arXiv Detail & Related papers (2023-09-26T15:38:57Z)
- Learning Discretized Neural Networks under Ricci Flow [51.36292559262042]
We study Discretized Neural Networks (DNNs) composed of low-precision weights and activations.
DNNs suffer from either infinite or zero gradients due to the non-differentiable discrete function during training.
arXiv Detail & Related papers (2023-02-07T10:51:53Z)
- Matching Normalizing Flows and Probability Paths on Manifolds [57.95251557443005]
Continuous Normalizing Flows (CNFs) are generative models that transform a prior distribution into a model distribution by solving an ordinary differential equation (ODE).
We propose to train CNFs by minimizing probability path divergence (PPD), a novel family of divergences between the probability density path generated by the CNF and a target probability density path.
We show that CNFs learned by minimizing PPD achieve state-of-the-art results in likelihoods and sample quality on existing low-dimensional manifold benchmarks.
arXiv Detail & Related papers (2022-07-11T08:50:19Z)
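Both CNF-based entries above rest on the same standard device, recalled here for reference; this is the generic instantaneous change-of-variables formulation from the neural-ODE literature (Chen et al., 2018), not a result of either paper:

```latex
\frac{\mathrm{d}z(t)}{\mathrm{d}t} = f_\theta\big(z(t), t\big), \qquad
\frac{\mathrm{d}\log p\big(z(t)\big)}{\mathrm{d}t}
  = -\operatorname{tr}\!\left(\frac{\partial f_\theta}{\partial z}\big(z(t), t\big)\right), \qquad
z(0) \sim p_0 .
```

Integrating both equations from t = 0 to t = 1 transports the prior to the model distribution while tracking the exact log-density along the path, which is what makes objectives such as probability path divergences or Fokker-Planck residuals tractable.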
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNF).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z)
- Variational Inference with Continuously-Indexed Normalizing Flows [29.95927906900098]
Continuously-indexed flows (CIFs) have recently achieved improvements over baseline normalizing flows on a variety of density estimation tasks.
We show here how CIFs can be used as part of an auxiliary variational inference scheme to formulate and train expressive posterior approximations.
arXiv Detail & Related papers (2020-07-10T15:00:04Z)
- Stochastic Normalizing Flows [52.92110730286403]
We introduce stochastic normalizing flows for maximum likelihood estimation and variational inference (VI) using stochastic differential equations (SDEs).
Using the theory of rough paths, the underlying Brownian motion is treated as a latent variable and approximated, enabling efficient training of neural SDEs.
These SDEs can be used for constructing efficient chains to sample from the underlying distribution of a given dataset.
arXiv Detail & Related papers (2020-02-21T20:47:55Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs in which the mapping from the base density to the output space is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
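As a minimal illustration of that last idea (an assumed form, not the paper's architecture): condition an affine flow on x by letting its shift and log-scale be functions of x, then evaluate p(y|x) by change of variables. The linear conditioners mu and log_sigma below are hypothetical stand-ins for neural networks.

```python
import numpy as np

# Hypothetical linear conditioners; in practice these are neural networks.
def mu(x):
    return 0.5 * x

def log_sigma(x):
    return 0.1 * x

def log_p_y_given_x(y, x):
    """log p(y|x) for the conditional affine flow y = mu(x) + exp(log_sigma(x)) * z."""
    z = (y - mu(x)) * np.exp(-log_sigma(x))            # invert the flow
    log_pz = -0.5 * z**2 - 0.5 * np.log(2.0 * np.pi)   # standard normal base density
    return log_pz - log_sigma(x)                       # subtract log |dy/dz|

print(log_p_y_given_x(1.0, 2.0))
```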