A Tale of Two Latent Flows: Learning Latent Space Normalizing Flow with
Short-run Langevin Flow for Approximate Inference
- URL: http://arxiv.org/abs/2301.09300v1
- Date: Mon, 23 Jan 2023 07:15:43 GMT
- Title: A Tale of Two Latent Flows: Learning Latent Space Normalizing Flow with
Short-run Langevin Flow for Approximate Inference
- Authors: Jianwen Xie, Yaxuan Zhu, Yifei Xu, Dingcheng Li, Ping Li
- Abstract summary: We study a normalizing flow in the latent space of a top-down generator model, in which the normalizing flow model plays the role of the informative prior model of the generator.
We propose to jointly learn the latent space normalizing flow prior model and the top-down generator model by a Markov chain Monte Carlo (MCMC)-based maximum likelihood algorithm.
- Score: 44.97938062814525
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We study a normalizing flow in the latent space of a top-down generator
model, in which the normalizing flow model plays the role of the informative
prior model of the generator. We propose to jointly learn the latent space
normalizing flow prior model and the top-down generator model by a Markov chain
Monte Carlo (MCMC)-based maximum likelihood algorithm, where a short-run
Langevin sampling from the intractable posterior distribution is performed to
infer the latent variables for each observed example, so that the parameters of
the normalizing flow prior and the generator can be updated with the inferred
latent variables. We show that, under the scenario of non-convergent short-run
MCMC, the finite-step Langevin dynamics acts as a flow-like approximate inference
model, and the learning objective actually follows a perturbation of
maximum likelihood estimation (MLE). We further point out that the learning
framework seeks to (i) match the latent space normalizing flow and the
aggregated posterior produced by the short-run Langevin flow, and (ii) bias the
model from MLE such that the short-run Langevin flow inference is close to the
true posterior. Extensive empirical results validate the effectiveness of the
proposed latent space normalizing flow model on image generation, image
reconstruction, anomaly detection, supervised image inpainting, and
unsupervised image recovery.
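The abstract describes an alternating algorithm: short-run Langevin inference of the latents, then parameter updates for the flow prior and the generator. Below is a minimal PyTorch sketch of one training step under a Gaussian observation model x ~ N(g(z), sigma^2 I); `FlowPrior`-style `sample`/`log_prob` methods, module names, and all hyperparameters are illustrative assumptions, not the paper's actual settings.

```python
# Hedged sketch of the MCMC-based MLE loop; `flow_prior` is assumed to expose
# sample(n) and log_prob(z), and `generator` maps latents to images.
import torch

def short_run_langevin(z, x, flow_prior, generator, steps=20, s=0.1, sigma=0.3):
    """Finite-step Langevin flow: noisy gradient ascent on log p(z | x)."""
    z = z.detach().requires_grad_(True)
    for _ in range(steps):
        log_prior = flow_prior.log_prob(z).sum()                 # log p_alpha(z)
        log_lik = -((x - generator(z)) ** 2).sum() / (2 * sigma ** 2)
        grad = torch.autograd.grad(log_prior + log_lik, z)[0]
        # z <- z + (s^2 / 2) * grad_z log p(z | x) + s * N(0, I)
        z = z + 0.5 * s ** 2 * grad + s * torch.randn_like(z)
        z = z.detach().requires_grad_(True)
    return z.detach()

def train_step(x, flow_prior, generator, opt_prior, opt_gen, sigma=0.3):
    z0 = flow_prior.sample(x.shape[0])        # initialize from the flow prior
    z = short_run_langevin(z0, x, flow_prior, generator)
    # (i) pull the flow prior toward the aggregated short-run posterior
    loss_prior = -flow_prior.log_prob(z).mean()
    opt_prior.zero_grad(); loss_prior.backward(); opt_prior.step()
    # update the generator to explain x from the inferred latents
    loss_gen = ((x - generator(z)) ** 2).mean() / (2 * sigma ** 2)
    opt_gen.zero_grad(); loss_gen.backward(); opt_gen.step()
```

The bias away from exact MLE noted in point (ii) is implicit here: the updates use non-convergent short-run samples in place of exact posterior samples.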
Related papers
- Unfolding Time: Generative Modeling for Turbulent Flows in 4D [49.843505326598596]
This work introduces a 4D generative diffusion model and a physics-informed guidance technique that enables the generation of realistic sequences of flow states.
Our findings indicate that the proposed method can successfully sample entire subsequences from the turbulent manifold.
This advancement opens doors for the application of generative modeling in analyzing the temporal evolution of turbulent flows.
arXiv Detail & Related papers (2024-06-17T10:21:01Z)
- Verlet Flows: Exact-Likelihood Integrators for Flow-Based Generative Models [4.9425328004453375]
We present Verlet flows, a class of continuous normalizing flows (CNFs) on an augmented state-space inspired by symplectic integrators from Hamiltonian dynamics.
Verlet flows provide exact-likelihood generative models which generalize coupled flow architectures from a non-continuous setting while imposing minimal expressivity constraints.
On experiments over toy densities, we demonstrate that the variance of the commonly used Hutchinson trace estimator makes it unsuitable for importance sampling, whereas Verlet flows perform comparably to full autograd trace computations while being significantly faster.
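For context, the Hutchinson estimator mentioned above replaces the exact Jacobian trace Tr(df/dz) in a CNF's likelihood with the unbiased Monte Carlo estimate E_v[v^T (df/dz) v]. A minimal PyTorch sketch of that standard estimator (illustrative, not the Verlet Flows code):

```python
import torch

def hutchinson_trace(f, z, n_probes=1):
    """Unbiased Monte Carlo estimate of Tr(df/dz), one VJP per probe vector."""
    z = z.detach().requires_grad_(True)
    out = f(z)                                   # f: R^d -> R^d, batched (B, d)
    est = 0.0
    for _ in range(n_probes):
        v = torch.randn_like(z)                  # Gaussian (or Rademacher) probe
        vjp = torch.autograd.grad(out, z, v, retain_graph=True)[0]   # v^T J
        est = est + (vjp * v).flatten(1).sum(dim=1)                  # v^T J v
    return est / n_probes
```

The estimator's variance grows with dimension, which is the failure mode for importance sampling that the Verlet Flows abstract points to.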
arXiv Detail & Related papers (2024-05-05T03:47:56Z)
- Training Dynamics of Multi-Head Softmax Attention for In-Context Learning: Emergence, Convergence, and Optimality [54.20763128054692]
We study the dynamics of gradient flow for training a multi-head softmax attention model for in-context learning of multi-task linear regression.
We prove that an interesting "task allocation" phenomenon emerges during the gradient flow dynamics.
arXiv Detail & Related papers (2024-02-29T18:43:52Z)
- Mixed Gaussian Flow for Diverse Trajectory Prediction [78.00204650749453]
We propose a flow-based model to transform a mixed Gaussian prior into the future trajectory manifold.
The model shows a better capacity for generating diverse trajectory patterns.
We also demonstrate that it can generate diverse, controllable, and out-of-distribution trajectories.
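As a rough illustration of the mixed-Gaussian-prior idea (module names, shapes, and the sampling interface below are assumptions, not the paper's API), sampling amounts to drawing a latent from a Gaussian mixture and pushing it through the learned flow:

```python
import torch

def sample_trajectories(flow, means, std, n):
    """Draw n latents from a K-component Gaussian mixture, map to trajectories."""
    idx = torch.randint(means.shape[0], (n,))        # pick mixture components
    z = means[idx] + std * torch.randn(n, means.shape[1])
    return flow(z)                                   # flow: latent -> trajectory
```

Selecting or perturbing the component means is one plausible route to the controllable and out-of-distribution behavior the summary mentions.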
arXiv Detail & Related papers (2024-02-19T15:48:55Z)
- Generative Modeling with Phase Stochastic Bridges [49.4474628881673]
Diffusion models (DMs) represent state-of-the-art generative models for continuous inputs.
We introduce a novel generative modeling framework grounded in phase space dynamics.
Our framework demonstrates the capability to generate realistic data points at an early stage of dynamics propagation.
arXiv Detail & Related papers (2023-10-11T18:38:28Z)
- Data-driven low-dimensional dynamic model of Kolmogorov flow [0.0]
Reduced order models (ROMs) that capture flow dynamics are of interest for decreasing computational costs for simulation.
This work presents a data-driven framework for minimal-dimensional models that effectively capture the dynamics and properties of the flow.
We apply this to Kolmogorov flow in a regime consisting of chaotic and intermittent behavior.
arXiv Detail & Related papers (2022-10-29T23:05:39Z)
- A Tale of Two Flows: Cooperative Learning of Langevin Flow and Normalizing Flow Toward Energy-Based Model [43.53802699867521]
We study the cooperative learning of two generative flow models, in which the two models are iteratively updated based on jointly synthesized examples.
We show that the trained CoopFlow is capable of synthesizing realistic images, reconstructing images, and interpolating between images.
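A hedged sketch of one cooperative update, assuming an energy-based model where `ebm(x)` returns a log unnormalized density and a `flow` with `sample`/`log_prob` methods (names and hyperparameters are illustrative, not the CoopFlow code):

```python
import torch

def coop_step(x_real, flow, ebm, opt_flow, opt_ebm, k=30, s=0.01):
    x = flow.sample(x_real.shape[0]).detach()   # flow proposes initial samples
    for _ in range(k):                          # short-run Langevin refines them
        x = x.requires_grad_(True)
        grad = torch.autograd.grad(ebm(x).sum(), x)[0]
        x = (x + 0.5 * s ** 2 * grad + s * torch.randn_like(x)).detach()
    # EBM update: raise the density of data, lower it on synthesized samples
    loss_ebm = ebm(x).mean() - ebm(x_real).mean()
    opt_ebm.zero_grad(); loss_ebm.backward(); opt_ebm.step()
    # Flow update: chase the refined samples by maximum likelihood
    loss_flow = -flow.log_prob(x).mean()
    opt_flow.zero_grad(); loss_flow.backward(); opt_flow.step()
```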
arXiv Detail & Related papers (2022-05-13T23:12:38Z)
- Predicting the temporal dynamics of turbulent channels through deep learning [0.0]
We aim to assess the capability of neural networks to reproduce the temporal evolution of a minimal turbulent channel flow.
Long short-term memory (LSTM) networks and a Koopman-based framework (KNF) are trained to predict the temporal dynamics of the minimal-channel-flow modes.
arXiv Detail & Related papers (2022-03-02T09:31:03Z)
- Augmented Normalizing Flows: Bridging the Gap Between Generative Flows and Latent Variable Models [11.206144910991481]
We propose a new family of generative flows on an augmented data space, with an aim to improve expressivity without drastically increasing the computational cost of sampling and evaluation of a lower bound on the likelihood.
We demonstrate state-of-the-art performance on standard benchmarks of flow-based generative modeling.
arXiv Detail & Related papers (2020-02-17T17:45:48Z)