Smooth Normalizing Flows
- URL: http://arxiv.org/abs/2110.00351v1
- Date: Fri, 1 Oct 2021 12:27:14 GMT
- Title: Smooth Normalizing Flows
- Authors: Jonas Köhler, Andreas Krämer, Frank Noé
- Abstract summary: We introduce a class of smooth mixture transformations working on both compact intervals and hypertori.
We show that such inverses can be computed from forward evaluations via the inverse function theorem.
We demonstrate two advantages of such smooth flows: they allow training by force matching to simulation data and can be used as potentials in molecular dynamics simulations.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Normalizing flows are a promising tool for modeling probability distributions
in physical systems. While state-of-the-art flows accurately approximate
distributions and energies, applications in physics additionally require smooth
energies to compute forces and higher-order derivatives. Furthermore, such
densities are often defined on non-trivial topologies. A recent example are
Boltzmann Generators for generating 3D-structures of peptides and small
proteins. These generative models leverage the space of internal coordinates
(dihedrals, angles, and bonds), which is a product of hypertori and compact
intervals. In this work, we introduce a class of smooth mixture transformations
working on both compact intervals and hypertori. In practice, mixture
transformations must be inverted numerically via root-finding, which has so far
prevented bi-directional flow training. To address this, we show that parameter gradients and
forces of such inverses can be computed from forward evaluations via the
inverse function theorem. We demonstrate two advantages of such smooth flows:
they allow training by force matching to simulation data and can be used as
potentials in molecular dynamics simulations.
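The key trick in the abstract can be sketched in a few lines: invert a monotone mixture transformation numerically, then recover the derivative of the inverse from a single forward evaluation via the inverse function theorem, (F^{-1})'(y) = 1 / F'(F^{-1}(y)). The sigmoid-mixture form and all function names below are illustrative stand-ins, not the paper's exact transformation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def mixture_cdf(x, mu, s, w):
    """Monotone mixture-of-sigmoids transformation F (illustrative stand-in)."""
    return float(np.sum(w * sigmoid((x - mu) / s)))

def mixture_pdf(x, mu, s, w):
    """Derivative F' of the transformation (the flow's 1D Jacobian)."""
    z = sigmoid((x - mu) / s)
    return float(np.sum(w * z * (1.0 - z) / s))

def invert_by_bisection(y, mu, s, w, lo=0.0, hi=1.0, iters=60):
    """Numerical inverse: find x with F(x) = y by bisection on [lo, hi]."""
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if mixture_cdf(mid, mu, s, w) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def inverse_derivative(y, mu, s, w):
    """Inverse function theorem: (F^{-1})'(y) = 1 / F'(F^{-1}(y)).

    Only forward evaluations of F and F' are needed, so no gradients
    ever flow through the root-finding iterations themselves.
    """
    x = invert_by_bisection(y, mu, s, w)
    return 1.0 / mixture_pdf(x, mu, s, w)
```

Because the derivative of the inverse comes from a forward evaluation at the root, the same idea extends to parameter gradients and forces, which is what makes bi-directional (density plus force-matching) training of these flows practical.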
Related papers
- Combining Wasserstein-1 and Wasserstein-2 proximals: robust manifold learning via well-posed generative flows [6.799748192975493]
We formulate well-posed continuous-time generative flows for learning distributions supported on low-dimensional manifolds.
We show that the Wasserstein-1 proximal operator regularizes $f$-divergences so that singular distributions can be compared.
We also show that the Wasserstein-2 proximal operator regularizes the paths of the generative flows by adding an optimal transport cost.
arXiv Detail & Related papers (2024-07-16T16:34:31Z) - Equivariant Flow Matching with Hybrid Probability Transport [69.11915545210393]
Diffusion Models (DMs) have demonstrated effectiveness in generating feature-rich geometries.
However, DMs typically suffer from unstable probability dynamics and inefficient sampling.
We introduce geometric flow matching, which enjoys the advantages of both equivariant modeling and stabilized probability dynamics.
arXiv Detail & Related papers (2023-12-12T11:13:13Z) - Equivariant flow matching [0.9208007322096533]
We introduce equivariant flow matching, a new training objective for equivariant continuous normalizing flows (CNFs).
Equivariant flow matching exploits the physical symmetries of the target energy for efficient, simulation-free training of equivariant CNFs.
Our results show that the equivariant flow matching objective yields flows with shorter integration paths, improved sampling efficiency, and higher scalability compared to existing methods.
arXiv Detail & Related papers (2023-06-26T19:40:10Z) - Machine learning of hidden variables in multiscale fluid simulation [77.34726150561087]
Solving fluid dynamics equations often requires the use of closure relations that account for missing microphysics.
In our study, a partial differential equation simulator that is end-to-end differentiable is used to train judiciously placed neural networks.
We show that this method enables an equation-based approach to reproducing non-linear, large-Knudsen-number plasma physics.
arXiv Detail & Related papers (2023-06-19T06:02:53Z) - Delving into Discrete Normalizing Flows on SO(3) Manifold for Probabilistic Rotation Modeling [30.09829541716024]
We propose a novel normalizing flow on the SO(3) manifold.
We show that our rotation normalizing flows significantly outperform the baselines on both unconditional and conditional tasks.
arXiv Detail & Related papers (2023-04-08T06:52:02Z) - Rigid Body Flows for Sampling Molecular Crystal Structures [4.368185344922342]
We introduce a new type of normalizing flow that is tailored for modeling positions and orientations of multiple objects in three-dimensional space.
Our approach is based on two key ideas: first, we define smooth and expressive flows on the group of unit quaternions, which allows us to capture the continuous rotational motion of rigid bodies.
We evaluate the method by training Boltzmann generators for two molecular examples, namely the multi-modal density of a tetrahedral system in an external field and the ice XI phase in the TIP4P water model.
arXiv Detail & Related papers (2023-01-26T19:07:40Z) - E(n) Equivariant Normalizing Flows for Molecule Generation in 3D [87.12477361140716]
This paper introduces a generative model equivariant to Euclidean symmetries: E(n) Equivariant Normalizing Flows (E-NFs)
To the best of our knowledge, this is the first likelihood-based deep generative model that generates molecules in 3D.
arXiv Detail & Related papers (2021-05-19T09:28:54Z) - SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
arXiv Detail & Related papers (2020-07-06T13:13:22Z) - Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Thanks to their guaranteed expressivity, Gaussianization flows can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z) - A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from the Fokker-Planck equation.
Compared with existing schemes, the Wasserstein gradient flow is a smoother and near-optimal numerical scheme for approximating real data densities.
arXiv Detail & Related papers (2019-10-31T02:26:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents (including all information) and is not responsible for any consequences of its use.