SE(3) Equivariant Augmented Coupling Flows
- URL: http://arxiv.org/abs/2308.10364v6
- Date: Tue, 5 Mar 2024 14:59:29 GMT
- Title: SE(3) Equivariant Augmented Coupling Flows
- Authors: Laurence I. Midgley and Vincent Stimper and Javier Antorán and Emile Mathieu and Bernhard Schölkopf and José Miguel Hernández-Lobato
- Abstract summary: Coupling normalizing flows allow for fast sampling and density evaluation.
The standard coupling architecture precludes endowing flows that operate on the Cartesian coordinates of atoms with the SE(3) and permutation invariances of physical systems.
- Score: 16.65770540017618
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Coupling normalizing flows allow for fast sampling and density evaluation,
making them the tool of choice for probabilistic modeling of physical systems.
However, the standard coupling architecture precludes endowing flows that
operate on the Cartesian coordinates of atoms with the SE(3) and permutation
invariances of physical systems. This work proposes a coupling flow that
preserves SE(3) and permutation equivariance by performing coordinate splits
along additional augmented dimensions. At each layer, the flow maps atoms'
positions into learned SE(3) invariant bases, where we apply standard flow
transformations, such as monotonic rational-quadratic splines, before returning
to the original basis. Crucially, our flow preserves fast sampling and density
evaluation, and may be used to produce unbiased estimates of expectations with
respect to the target distribution via importance sampling. When trained on the
DW4, LJ13, and QM9-positional datasets, our flow is competitive with
equivariant continuous normalizing flows and diffusion models, while allowing
sampling more than an order of magnitude faster. Moreover, to the best of our
knowledge, we are the first to learn the full Boltzmann distribution of alanine
dipeptide by only modeling the Cartesian positions of its atoms. Lastly, we
demonstrate that our flow can be trained to approximately sample from the
Boltzmann distribution of the DW4 and LJ13 particle systems using only their
energy functions.
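The abstract notes that the flow's exact densities enable unbiased estimates of expectations under the target via importance sampling. A minimal sketch of that reweighting step, using a toy 1D double-well Boltzmann target and a Gaussian proposal standing in for the trained flow (both hypothetical stand-ins, not the paper's SE(3)-equivariant architecture), and the self-normalized variant of the estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for the paper's setup: unnormalized Boltzmann target
# exp(-U(x)) for a 1D double-well potential U(x) = (x^2 - 1)^2.
def log_target(x):
    return -(x**2 - 1.0) ** 2  # log of the unnormalized target density

# Hypothetical proposal: a Gaussian plays the role of the trained flow,
# since a flow likewise provides exact samples and log-densities.
mu, sigma = 0.0, 1.5
x = rng.normal(mu, sigma, size=100_000)
log_q = -0.5 * ((x - mu) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))

log_w = log_target(x) - log_q            # log importance weights
w = np.exp(log_w - log_w.max())          # subtract max for numerical stability
w /= w.sum()                             # self-normalize the weights

# Estimate of E_target[x^2] (consistent; unbiased when the target's
# normalizing constant is known instead of self-normalizing).
est = np.sum(w * x**2)
```

With enough samples, `est` concentrates near the target's second moment, which sits close to 1 because the double-well modes lie at ±1.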
Related papers
- Flow matching achieves almost minimax optimal convergence [50.38891696297888]
Flow matching (FM) has gained significant attention as a simulation-free generative model.
This paper discusses the convergence properties of FM for large sample size under the $p$-Wasserstein distance.
We establish that FM can achieve an almost minimax optimal convergence rate for $1 \leq p \leq 2$, presenting the first theoretical evidence that FM can reach convergence rates comparable to those of diffusion models.
arXiv Detail & Related papers (2024-05-31T14:54:51Z)
- Convergence Analysis of Flow Matching in Latent Space with Transformers [7.069772598731282]
We present theoretical convergence guarantees for ODE-based generative models, specifically flow matching.
We use a pre-trained autoencoder network to map high-dimensional original inputs to a low-dimensional latent space, where a transformer network is trained to predict the velocity field of the transformation from a standard normal distribution to the target latent distribution.
arXiv Detail & Related papers (2024-04-03T07:50:53Z)
- Equivariant Flow Matching with Hybrid Probability Transport [69.11915545210393]
Diffusion Models (DMs) have demonstrated effectiveness in generating feature-rich geometries.
DMs typically suffer from unstable probability dynamics with inefficient sampling speed.
We introduce geometric flow matching, which enjoys the advantages of both equivariant modeling and stabilized probability dynamics.
arXiv Detail & Related papers (2023-12-12T11:13:13Z)
- Equivariant flow matching [0.9208007322096533]
We introduce equivariant flow matching, a new training objective for equivariant continuous normalizing flows (CNFs).
Equivariant flow matching exploits the physical symmetries of the target energy for efficient, simulation-free training of equivariant CNFs.
Our results show that the equivariant flow matching objective yields flows with shorter integration paths, improved sampling efficiency, and higher scalability compared to existing methods.
arXiv Detail & Related papers (2023-06-26T19:40:10Z)
- Delving into Discrete Normalizing Flows on SO(3) Manifold for Probabilistic Rotation Modeling [30.09829541716024]
We propose a novel normalizing flow on SO(3) manifold.
We show that our rotation normalizing flows significantly outperform the baselines on both unconditional and conditional tasks.
arXiv Detail & Related papers (2023-04-08T06:52:02Z)
- Third quantization of open quantum systems: new dissipative symmetries and connections to phase-space and Keldysh field theory formulations [77.34726150561087]
We reformulate the technique of third quantization in a way that explicitly connects all three methods.
We first show that our formulation reveals a fundamental dissipative symmetry present in all quadratic bosonic or fermionic Lindbladians.
For bosons, we then show that the Wigner function and the characteristic function can be thought of as ''wavefunctions'' of the density matrix.
arXiv Detail & Related papers (2023-02-27T18:56:40Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
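The "traditional estimators" such CDF methods are compared against amount to naive Monte Carlo: sample the model and count how often samples land in the region. A minimal sketch of that baseline, with a 2D standard normal playing the role of a trained flow (a hypothetical stand-in, since both provide exact samples):

```python
import numpy as np

rng = np.random.default_rng(1)

# Naive Monte Carlo baseline for the probability mass a distribution
# assigns to a closed region, the kind of estimator flow-based CDF
# approximations aim to improve upon. A 2D standard normal stands in
# for a trained normalizing flow's sampling path.
samples = rng.standard_normal((200_000, 2))

# Closed box region [-1, 1] x [-1, 1].
lo, hi = np.array([-1.0, -1.0]), np.array([1.0, 1.0])
inside = np.all((samples >= lo) & (samples <= hi), axis=1)

# Fraction of samples inside estimates P(X in [-1,1]^2) ~ 0.6827^2 ~ 0.466.
mass = inside.mean()
```

The estimate is unbiased but its accuracy scales as 1/sqrt(n), which is the sample-inefficiency the cited work addresses by exploiting the flow's diffeomorphic structure instead.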
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- E(n) Equivariant Normalizing Flows for Molecule Generation in 3D [87.12477361140716]
This paper introduces a generative model equivariant to Euclidean symmetries: E(n) Equivariant Normalizing Flows (E-NFs).
To the best of our knowledge, this is the first likelihood-based deep generative model that generates molecules in 3D.
arXiv Detail & Related papers (2021-05-19T09:28:54Z)
- SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
arXiv Detail & Related papers (2020-07-06T13:13:22Z)
- You say Normalizing Flows I see Bayesian Networks [11.23030807455021]
We show that normalizing flows reduce to Bayesian networks with a pre-defined topology and a learnable density at each node.
We show that stacking multiple transformations in a normalizing flow relaxes independence assumptions and entangles the model distribution.
We prove the non-universality of the affine normalizing flow, regardless of its depth.
arXiv Detail & Related papers (2020-06-01T11:54:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.