Variational Flow Models: Flowing in Your Style
- URL: http://arxiv.org/abs/2402.02977v4
- Date: Mon, 5 Aug 2024 01:24:52 GMT
- Title: Variational Flow Models: Flowing in Your Style
- Authors: Kien Do, Duc Kieu, Toan Nguyen, Dang Nguyen, Hung Le, Dung Nguyen, Thin Nguyen
- Abstract summary: We transform the probability flow of a "linear" process into a straight constant-speed (SC) flow, reminiscent of Rectified Flow.
This transformation facilitates fast sampling along the original probability flow via the Euler method without training a new model of the SC flow.
We can easily integrate high-order numerical solvers into the transformed SC flow, further enhancing the sampling accuracy and efficiency.
- Score: 32.913511518425864
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We propose a systematic training-free method to transform the probability flow of a "linear" stochastic process characterized by the equation $X_{t}=a_{t}X_{0}+\sigma_{t}X_{1}$ into a straight constant-speed (SC) flow, reminiscent of Rectified Flow. This transformation facilitates fast sampling along the original probability flow via the Euler method without training a new model of the SC flow. The flexibility of our approach allows us to extend our transformation to inter-convert two posterior flows of two distinct linear stochastic processes. Moreover, we can easily integrate high-order numerical solvers into the transformed SC flow, further enhancing the sampling accuracy and efficiency. Rigorous theoretical analysis and extensive experimental results substantiate the advantages of our framework. Our code is available at https://github.com/clarken92/VFM.
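To make the fast-sampling claim concrete, below is a minimal sketch of explicit Euler integration along a straight constant-speed flow. The `sc_velocity` callable, which would re-express a pretrained model of the linear process as an SC velocity field, is a hypothetical stand-in for the paper's training-free transformation; see the linked repository for the actual implementation.

```python
# Minimal sketch (assumptions noted above): explicit Euler sampling of a
# straight constant-speed flow running from noise at s=0 to data at s=1.
# `sc_velocity` stands in for the transformed velocity field.
import torch

def euler_sample(sc_velocity, x1, n_steps=10):
    """Integrate dx/ds = v(x, s) from s = 0 (noise x1) to s = 1 (data)."""
    x, ds = x1, 1.0 / n_steps
    for i in range(n_steps):
        s = torch.full(x.shape[:1], i * ds, device=x.device)
        x = x + ds * sc_velocity(x, s)  # one Euler step along the SC flow
    return x
```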
Related papers
- Consistency Flow Matching: Defining Straight Flows with Velocity Consistency [97.28511135503176]
We introduce Consistency Flow Matching (Consistency-FM), a novel flow matching (FM) method that explicitly enforces self-consistency in the velocity field.
Preliminary experiments demonstrate that our Consistency-FM significantly improves training efficiency by converging 4.4x faster than consistency models.
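As an illustration of velocity consistency, the sketch below penalizes disagreement between straight-line endpoint extrapolations $x_t + (1 - t)\,v(x_t, t)$ made at two nearby times; the interpolation convention and loss are assumptions, not the paper's verbatim objective.

```python
# Hedged sketch of a velocity-consistency loss in the spirit of
# Consistency-FM (conventions assumed, not the paper's exact objective).
import torch

def consistency_loss(v, x0, x1, delta=1e-2):
    # Sample one time t per example, broadcastable over the data shape.
    t = torch.rand(x0.shape[0], *([1] * (x0.dim() - 1)))
    t2 = (t + delta).clamp(max=1.0)
    xt = (1 - t) * x0 + t * x1
    xt2 = (1 - t2) * x0 + t2 * x1
    f1 = xt + (1 - t) * v(xt, t)      # endpoint predicted from time t
    f2 = xt2 + (1 - t2) * v(xt2, t2)  # endpoint predicted from t + delta
    return ((f1 - f2) ** 2).mean()    # enforce self-consistency
```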
arXiv Detail & Related papers (2024-07-02T16:15:37Z)
- Flow Map Matching [15.520853806024943]
Flow map matching is an algorithm that learns the two-time flow map of an underlying ordinary differential equation.
We show that flow map matching leads to high-quality samples with significantly reduced sampling cost compared to diffusion or interpolant methods.
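Concretely, a two-time flow map $\Phi(x, s, t)$ should reduce to the identity when $s = t$ and compose across an intermediate time; the sketch below penalizes violations of both properties and is an assumed simplification of the paper's training objective.

```python
# Hedged sketch of two-time flow-map constraints (phi is a learned network;
# the combined loss is illustrative, not the paper's exact objective).
import torch

def flow_map_losses(phi, x, s, t, u):
    identity = ((phi(x, s, s) - x) ** 2).mean()          # phi(x, s, s) = x
    composed = phi(phi(x, s, t), t, u)                   # go s -> t -> u
    semigroup = ((composed - phi(x, s, u)) ** 2).mean()  # must equal s -> u
    return identity + semigroup
```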
arXiv Detail & Related papers (2024-06-11T17:41:26Z)
- DiffuSeq-v2: Bridging Discrete and Continuous Text Spaces for Accelerated Seq2Seq Diffusion Models [58.450152413700586]
We introduce a soft absorbing state that facilitates the diffusion model in learning to reconstruct discrete mutations based on the underlying Gaussian space.
We employ state-of-the-art ODE solvers within the continuous space to expedite the sampling process.
Our proposed method effectively accelerates the training convergence by 4x and generates samples of similar quality 800x faster.
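Loosely, a soft absorbing state can be pictured as one learned embedding toward which token embeddings are blended in the continuous Gaussian space before reconstruction; the sketch below uses hypothetical names and a simplified schedule, not DiffuSeq-v2's actual parameterization.

```python
# Loose illustration of a "soft absorbing state" (hypothetical names, not
# the DiffuSeq-v2 API): token embeddings are blended toward one learned
# absorbing vector and perturbed with Gaussian noise; the denoiser is then
# trained to reconstruct the original discrete tokens from this corruption.
import torch

absorb = torch.nn.Parameter(torch.zeros(128))  # learned absorbing embedding

def corrupt(emb, t):
    """Blend embeddings toward the absorbing state at noise level t in [0, 1]."""
    return (1 - t) * emb + t * absorb + t ** 0.5 * torch.randn_like(emb)
```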
arXiv Detail & Related papers (2023-10-09T15:29:10Z)
- Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers [90.45898746733397]
We develop a framework for non-asymptotic analysis of deterministic samplers used for diffusion generative modeling.
We show that one step along the probability flow ODE can be expressed as two steps: 1) a restoration step that runs gradient ascent on the conditional log-likelihood at some infinitesimally previous time, and 2) a degradation step that runs the forward process using noise pointing back towards the current iterate.
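Schematically, with step size $\Delta t$, the decomposition reads as below; the coefficients $c_1(t)$ and $c_2(t)$ are placeholders for the exact ones derived in the paper.

```latex
% Hedged schematic of the two-step reading, not the paper's exact formulas.
% Restoration: gradient ascent on the conditional log-likelihood.
\hat{x} \;=\; x_t + c_1(t)\,\nabla_x \log p_{t-\Delta t}(x_t)
% Degradation: one forward-process step with noise pointing back toward x_t.
x_{t-\Delta t} \;=\; \hat{x} + c_2(t)\,\varepsilon,
  \qquad \varepsilon \propto x_t - \hat{x}
```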
arXiv Detail & Related papers (2023-03-06T18:59:19Z)
- Flow Straight and Fast: Learning to Generate and Transfer Data with Rectified Flow [32.459587479351846]
We present rectified flow, a surprisingly simple approach to learning (neural) ordinary differential equation (ODE) models.
We show that rectified flow performs superbly on image generation, image-to-image translation, and domain adaptation.
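The core objective can be stated in a few lines: regress a velocity network onto the constant displacement $X_1 - X_0$ along straight interpolation paths $X_t = t X_1 + (1 - t) X_0$. The sketch below follows this standard rectified-flow loss; the network `v` and the pairing of `x0`/`x1` are placeholders.

```python
# Sketch of the rectified-flow training loss: match the velocity field to
# the constant displacement along straight paths between paired samples.
import torch

def rectified_flow_loss(v, x0, x1):
    t = torch.rand(x0.shape[0], *([1] * (x0.dim() - 1)))
    xt = t * x1 + (1 - t) * x0   # point on the straight interpolation path
    target = x1 - x0             # constant velocity of that path
    return ((v(xt, t) - target) ** 2).mean()
```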
arXiv Detail & Related papers (2022-09-07T08:59:55Z)
- Towards extraction of orthogonal and parsimonious non-linear modes from turbulent flows [0.0]
We propose a deep probabilistic-neural-network architecture for learning a minimal and near-orthogonal set of non-linear modes.
Our approach is based on $\beta$-variational autoencoders ($\beta$-VAEs) and convolutional neural networks (CNNs).
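For reference, the generic $\beta$-VAE objective underlying this architecture is the usual ELBO with the KL term scaled by $\beta$ to encourage disentangled, near-orthogonal latent modes; the convolutional encoder/decoder for the flow fields is omitted here.

```python
# Generic beta-VAE loss (the paper's CNN encoder/decoder for turbulent flow
# fields is omitted): reconstruction plus a beta-weighted KL between the
# diagonal-Gaussian posterior and a standard-normal prior.
import torch

def beta_vae_loss(x, x_rec, mu, logvar, beta=4.0):
    rec = ((x - x_rec) ** 2).mean()
    kl = 0.5 * (mu ** 2 + logvar.exp() - 1 - logvar).mean()
    return rec + beta * kl
```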
arXiv Detail & Related papers (2021-09-03T13:38:51Z)
- Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
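The general recipe can be sketched as: sample in a continuous space with MCMC, then push samples through a surjective map onto the discrete space. Below, a plain random-walk Metropolis step and a generic `surject` stand in for the paper's HMC and learned SurVAE components.

```python
# Hedged sketch of the overall recipe (random-walk Metropolis and a generic
# `surject` stand in for the paper's HMC + learned SurVAE components).
import torch

def discrete_sample(log_prob, surject, z, steps=100, step_size=0.1):
    for _ in range(steps):
        z_new = z + step_size * torch.randn_like(z)
        # Metropolis accept/reject on the continuous surrogate density.
        if torch.rand(()) < torch.exp(log_prob(z_new) - log_prob(z)):
            z = z_new
    return surject(z)  # e.g. rounding or argmax onto the discrete space
```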
arXiv Detail & Related papers (2021-02-04T02:21:08Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
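A concrete instance of such a transformation is rounding with uniform dequantization: deterministic and many-to-one in the forward direction, stochastic in the inverse, contributing a likelihood term where a bijection would contribute a log-determinant. The sketch below is illustrative, not the framework's actual API.

```python
# Rounding as a surjective SurVAE-style layer (illustrative, not the
# framework's API). Forward: floor, a many-to-one map. Stochastic inverse:
# uniform dequantization noise. Likelihood contribution: log-density of the
# noise, which is 0 per dimension for Uniform[0, 1).
import torch

def floor_forward(z):
    x = torch.floor(z)
    return x, torch.zeros(z.shape[0])  # stands in for the log-det term

def floor_inverse(x):
    return x + torch.rand_like(x)      # right inverse: floor of it recovers x
```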
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- Towards Recurrent Autoregressive Flow Models [39.25035894474609]
We present Recurrent Autoregressive Flows as a method toward general process modeling with normalizing flows.
The proposed method defines a conditional distribution for each variable in a sequential process by conditioning the parameters of a normalizing flow with recurrent neural connections.
We demonstrate the effectiveness of this class of models through a series of experiments in which models are trained on three complex processes.
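The conditioning scheme can be sketched as a recurrent network whose hidden state emits the parameters of a simple affine flow for the next observation; the GRU, dimensions, and affine transform here are illustrative assumptions rather than the paper's exact architecture.

```python
# Sketch: an RNN summarizes the history x_{<t}; its hidden state produces
# per-step scale/shift parameters of an affine normalizing flow for x_t.
# Architecture choices are illustrative, not the paper's exact model.
import math
import torch
import torch.nn as nn

class RecurrentAffineFlow(nn.Module):
    def __init__(self, dim=8, hidden=32):
        super().__init__()
        self.rnn = nn.GRU(dim, hidden, batch_first=True)
        self.params = nn.Linear(hidden, 2 * dim)  # log-scale and shift

    def log_prob(self, x):  # x: (batch, time, dim)
        # Shift inputs so the state at step t only sees x_{<t}.
        hist = torch.cat([torch.zeros_like(x[:, :1]), x[:, :-1]], dim=1)
        h, _ = self.rnn(hist)
        log_s, b = self.params(h).chunk(2, dim=-1)
        z = (x - b) * torch.exp(-log_s)  # invert the affine transform
        base = -0.5 * (z ** 2 + math.log(2 * math.pi))  # standard normal
        return (base - log_s).sum(dim=(1, 2))  # change-of-variables term
```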
arXiv Detail & Related papers (2020-06-17T18:38:36Z)
- Gaussianization Flows [113.79542218282282]
We propose Gaussianization flows, a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of their guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)