Equivariant flow matching
- URL: http://arxiv.org/abs/2306.15030v2
- Date: Thu, 23 Nov 2023 21:53:19 GMT
- Title: Equivariant flow matching
- Authors: Leon Klein, Andreas Krämer, Frank Noé
- Abstract summary: We introduce equivariant flow matching, a new training objective for equivariant continuous normalizing flows (CNFs).
Equivariant flow matching exploits the physical symmetries of the target energy for efficient, simulation-free training of equivariant CNFs.
Our results show that the equivariant flow matching objective yields flows with shorter integration paths, improved sampling efficiency, and higher scalability compared to existing methods.
- Score: 0.9208007322096533
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Normalizing flows are a class of deep generative models that are especially
interesting for modeling probability distributions in physics, where the exact
likelihood of flows allows reweighting to known target energy functions and
computing unbiased observables. For instance, Boltzmann generators tackle the
long-standing sampling problem in statistical physics by training flows to
produce equilibrium samples of many-body systems such as small molecules and
proteins. To build effective models for such systems, it is crucial to
incorporate the symmetries of the target energy into the model, which can be
achieved by equivariant continuous normalizing flows (CNFs). However, CNFs can
be computationally expensive to train and generate samples from, which has
hampered their scalability and practical application. In this paper, we
introduce equivariant flow matching, a new training objective for equivariant
CNFs that is based on the recently proposed optimal transport flow matching.
Equivariant flow matching exploits the physical symmetries of the target energy
for efficient, simulation-free training of equivariant CNFs. We demonstrate the
effectiveness of flow matching on rotation and permutation invariant
many-particle systems and a small molecule, alanine dipeptide, where for the
first time we obtain a Boltzmann generator with significant sampling efficiency
without relying on tailored internal coordinate featurization. Our results show
that the equivariant flow matching objective yields flows with shorter
integration paths, improved sampling efficiency, and higher scalability
compared to existing methods.
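The core idea can be sketched compactly. Below is a minimal, illustrative Python/NumPy version (hypothetical helper names, not the authors' implementation, and restricted to a single pair of 3D particle clouds): a prior sample is first aligned to the data sample under the symmetries of the target energy, here the optimal particle permutation (Hungarian algorithm) and the optimal rotation (Kabsch), and only then is the standard flow-matching regression target formed.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def kabsch_rotation(p, q):
    """Rotation R minimizing ||p @ R - q||_F for zero-centered (n, 3) clouds."""
    u, _, vt = np.linalg.svd(p.T @ q)
    d = np.sign(np.linalg.det(u @ vt))           # guard against reflections
    return u @ np.diag([1.0, 1.0, d]) @ vt

def align_pair(x0, x1):
    """Align a prior sample x0 to a data sample x1 (both (n, 3) clouds)."""
    # Optimal particle permutation: exploit permutation invariance.
    cost = ((x0[:, None, :] - x1[None, :, :]) ** 2).sum(-1)
    _, col = linear_sum_assignment(cost)         # x0[i] matched to x1[col[i]]
    x0 = x0[np.argsort(col)]                     # reorder so x0[j] matches x1[j]
    # Optimal rotation: exploit rotation invariance (after centering).
    x0c, x1c = x0 - x0.mean(0), x1 - x1.mean(0)
    return x0c @ kabsch_rotation(x0c, x1c), x1c

def equivariant_fm_target(x0, x1, rng):
    """Simulation-free flow-matching target on a symmetry-aligned pair."""
    x0, x1 = align_pair(x0, x1)
    t = rng.uniform()
    xt = (1.0 - t) * x0 + t * x1                 # point on the straight path
    ut = x1 - x0                                 # regression target for v(xt, t)
    return t, xt, ut
```

Training then minimizes the mean squared error between an equivariant vector field v(xt, t) and ut; because the endpoints are symmetry-aligned, the conditional paths are shorter, which is what yields the reported gains in integration length and sampling efficiency.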
Related papers
- Hessian-Informed Flow Matching [4.542719108171107]
Hessian-Informed Flow Matching (HI-FM) is a novel approach that integrates the Hessian of an energy function into conditional flows.
This integration allows HI-FM to account for local curvature and anisotropic covariance structures.
Empirical evaluations on the MNIST and Lennard-Jones particles datasets demonstrate that HI-FM improves the likelihood of test samples.
arXiv Detail & Related papers (2024-10-15T09:34:52Z)
- Iterated Energy-based Flow Matching for Sampling from Boltzmann Densities [11.850515912491657]
We propose iterated energy-based flow matching (iEFM) to train continuous normalizing flow (CNF) models from unnormalized densities.
Our results demonstrate that iEFM outperforms existing methods, showcasing its potential for efficient and scalable probabilistic modeling.
arXiv Detail & Related papers (2024-08-29T04:06:34Z)
- Dynamical Measure Transport and Neural PDE Solvers for Sampling [77.38204731939273]
We cast the task of sampling from a probability density as transporting a tractable density to the target.
We employ physics-informed neural networks (PINNs) to approximate the solutions of the corresponding partial differential equations (PDEs).
PINNs allow for simulation- and discretization-free optimization and can be trained very efficiently; a generic residual sketch follows this entry.
arXiv Detail & Related papers (2024-07-10T17:39:50Z)
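As a generic illustration of the PINN idea in the entry above (not that paper's specific scheme), the sketch below penalizes the residual of the one-dimensional continuity equation, d(rho)/dt + d(rho*v)/dx = 0, at sampled collocation points; rho_net and v_net are hypothetical PyTorch modules mapping (t, x) to scalars.

```python
import torch

def continuity_residual(rho_net, v_net, t, x):
    """Residual of d(rho)/dt + d(rho * v)/dx at collocation points (t, x).

    A PINN drives this residual toward zero, with no simulation and no
    time discretization; train with loss = residual.pow(2).mean().
    """
    t = t.clone().requires_grad_(True)
    x = x.clone().requires_grad_(True)
    rho = rho_net(t, x)                          # learned density
    flux = rho * v_net(t, x)                     # learned probability flux
    drho_dt = torch.autograd.grad(rho.sum(), t, create_graph=True)[0]
    dflux_dx = torch.autograd.grad(flux.sum(), x, create_graph=True)[0]
    return drho_dt + dflux_dx
```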
- Equivariant Flow Matching with Hybrid Probability Transport [69.11915545210393]
Diffusion Models (DMs) have demonstrated effectiveness in generating feature-rich geometries.
However, DMs typically suffer from unstable probability dynamics and inefficient sampling.
We introduce geometric flow matching, which enjoys the advantages of both equivariant modeling and stabilized probability dynamics.
arXiv Detail & Related papers (2023-12-12T11:13:13Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference; a minimal sketch of the minibatch pairing step follows this entry.
arXiv Detail & Related papers (2023-02-01T14:47:17Z)
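The coupling step of OT-CFM can be sketched in a few lines (an illustration assuming exact assignment via SciPy, not necessarily the paper's implementation): prior and data samples within a minibatch are re-paired to minimize the total squared transport cost before the usual flow-matching target xt = (1 - t) * x0 + t * x1, ut = x1 - x0 is formed.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def ot_pair_minibatch(x0, x1):
    """Re-pair prior samples x0 with data samples x1, both (batch, dim).

    Replaces the independent coupling of vanilla flow matching with the
    minibatch optimal-transport coupling, yielding straighter conditional
    paths and a more stable regression objective.
    """
    cost = ((x0[:, None, :] - x1[None, :, :]) ** 2).sum(-1)  # pairwise costs
    row, col = linear_sum_assignment(cost)                   # Hungarian solve
    return x0[row], x1[col]                                  # matched pairs
```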
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Equivariant Flows: Exact Likelihood Generative Learning for Symmetric Densities [1.7188280334580197]
Normalizing flows are exact-likelihood generative neural networks which transform samples from a simple prior distribution to samples of the probability distribution of interest.
Recent work showed that such generative models can be utilized in statistical mechanics to sample equilibrium states of many-body systems in physics and chemistry.
arXiv Detail & Related papers (2020-06-03T17:54:26Z)
- Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Owing to their guaranteed expressivity, Gaussianization flows can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z)
- Stochastic Normalizing Flows [2.323220706791067]
We show that normalizing flows can be combined with stochastic sampling steps to learn the transformation from a simple prior distribution to the target; the resulting models are stochastic normalizing flows (SNFs).
We derive an efficient training procedure by which both the sampler's and the flow's parameters can be optimized end-to-end.
We illustrate the representational power, sampling efficiency and correctness of SNFs on several benchmarks including applications to molecular sampling systems in equilibrium.
arXiv Detail & Related papers (2020-02-16T23:29:32Z)
- A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize the gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from the Fokker-Planck equation; the underlying identity is written out after this entry.
Compared with existing schemes, the Wasserstein gradient flow is a smoother and near-optimal numerical scheme to approximate real data densities.
arXiv Detail & Related papers (2019-10-31T02:26:20Z)
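For reference, the standard connection behind the last entry (with notation assumed here: target Gibbs density pi proportional to exp(-E)) writes the Fokker-Planck equation as the Wasserstein-2 gradient flow of the relative entropy:

```latex
F[\rho] = \mathrm{KL}\bigl(\rho \,\|\, \pi\bigr), \qquad \pi \propto e^{-E},
\qquad
\partial_t \rho
  = \nabla \cdot \Bigl( \rho \, \nabla \tfrac{\delta F}{\delta \rho} \Bigr)
  = \nabla \cdot \bigl( \rho \, \nabla E \bigr) + \Delta \rho .
```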
This list is automatically generated from the titles and abstracts of the papers on this site.