Smooth Normalizing Flows
- URL: http://arxiv.org/abs/2110.00351v1
- Date: Fri, 1 Oct 2021 12:27:14 GMT
- Title: Smooth Normalizing Flows
- Authors: Jonas Köhler, Andreas Krämer, Frank Noé
- Abstract summary: We introduce a class of smooth mixture transformations working on both compact intervals and hypertori.
We show that gradients and forces of the numerically computed inverses can be obtained from forward evaluations via the inverse function theorem.
We demonstrate two advantages of such smooth flows: they allow training by force matching to simulation data and can be used as potentials in molecular dynamics simulations.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Normalizing flows are a promising tool for modeling probability distributions
in physical systems. While state-of-the-art flows accurately approximate
distributions and energies, applications in physics additionally require smooth
energies to compute forces and higher-order derivatives. Furthermore, such
densities are often defined on non-trivial topologies. A recent example is
Boltzmann Generators for generating 3D-structures of peptides and small
proteins. These generative models leverage the space of internal coordinates
(dihedrals, angles, and bonds), which is a product of hypertori and compact
intervals. In this work, we introduce a class of smooth mixture transformations
working on both compact intervals and hypertori. In practice, mixture
transformations must be inverted numerically via root-finding, which has so far
prevented bi-directional flow training. To address this, we show that parameter gradients and
forces of such inverses can be computed from forward evaluations via the
inverse function theorem. We demonstrate two advantages of such smooth flows:
they allow training by force matching to simulation data and can be used as
potentials in molecular dynamics simulations.
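The inverse-function-theorem trick can be illustrated in one dimension. The sketch below is not the authors' implementation: the mixture of rescaled sigmoids, the bisection inverter, and all parameter names are illustrative assumptions. The key point is that the derivative of the numerically obtained inverse, (f⁻¹)'(y) = 1 / f'(f⁻¹(y)), needs only forward evaluations of the transform.

```python
# Illustrative sketch (assumed form, not the paper's code): a smooth,
# strictly increasing mixture transform on [0, 1], inverted by bisection,
# with the inverse's derivative recovered via the inverse function theorem.
import math

def mixture_transform(x, weights, shifts, scales):
    """Convex mixture of rescaled sigmoids mapping [0, 1] -> [0, 1]."""
    def sig(z):
        return 1.0 / (1.0 + math.exp(-z))
    total = 0.0
    for w, m, s in zip(weights, shifts, scales):
        lo, hi = sig(-m * s), sig((1.0 - m) * s)
        total += w * (sig((x - m) * s) - lo) / (hi - lo)
    return total

def mixture_deriv(x, weights, shifts, scales, eps=1e-6):
    # Central finite difference stands in for the analytic derivative.
    return (mixture_transform(x + eps, weights, shifts, scales)
            - mixture_transform(x - eps, weights, shifts, scales)) / (2 * eps)

def invert(y, weights, shifts, scales, tol=1e-12):
    """Numerical inverse by bisection: find x in [0, 1] with f(x) = y."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mixture_transform(mid, weights, shifts, scales) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def inverse_deriv(y, weights, shifts, scales):
    """Inverse function theorem: (f^{-1})'(y) = 1 / f'(f^{-1}(y)).
    Only forward evaluations of f are needed."""
    x = invert(y, weights, shifts, scales)
    return 1.0 / mixture_deriv(x, weights, shifts, scales)
```

The same identity extends to parameter gradients, which is what makes bi-directional training of such flows tractable despite the lack of a closed-form inverse.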
Related papers
- Geometric Trajectory Diffusion Models [58.853975433383326]
Generative models have shown great promise in generating 3D geometric systems.
Existing approaches only operate on static structures, neglecting the fact that physical systems are always dynamic in nature.
We propose geometric trajectory diffusion models (GeoTDM), the first diffusion model for modeling the temporal distribution of 3D geometric trajectories.
arXiv Detail & Related papers (2024-10-16T20:36:41Z)
- Hessian-Informed Flow Matching [4.542719108171107]
Hessian-Informed Flow Matching is a novel approach that integrates the Hessian of an energy function into conditional flows.
This integration allows HI-FM to account for local curvature and anisotropic covariance structures.
Empirical evaluations on the MNIST and Lennard-Jones particles datasets demonstrate that HI-FM improves the likelihood of test samples.
arXiv Detail & Related papers (2024-10-15T09:34:52Z)
- Iterated Energy-based Flow Matching for Sampling from Boltzmann Densities [11.850515912491657]
We propose iterated energy-based flow matching (iEFM) to train continuous normalizing flow (CNF) models from unnormalized densities.
Our results demonstrate that iEFM outperforms existing methods, showcasing its potential for efficient and scalable probabilistic modeling.
arXiv Detail & Related papers (2024-08-29T04:06:34Z)
- Combining Wasserstein-1 and Wasserstein-2 proximals: robust manifold learning via well-posed generative flows [6.799748192975493]
We formulate well-posed continuous-time generative flows for learning distributions supported on a low-dimensional manifold.
We show that the Wasserstein-1 proximal operator regularizes $f$-divergences so that singular distributions can be compared.
We also show that the Wasserstein-2 proximal operator regularizes the paths of the generative flows by adding an optimal transport cost.
arXiv Detail & Related papers (2024-07-16T16:34:31Z)
- Equivariant Flow Matching with Hybrid Probability Transport [69.11915545210393]
Diffusion Models (DMs) have demonstrated effectiveness in generating feature-rich geometries.
However, DMs typically suffer from unstable probability dynamics and inefficient sampling.
We introduce geometric flow matching, which enjoys the advantages of both equivariant modeling and stabilized probability dynamics.
arXiv Detail & Related papers (2023-12-12T11:13:13Z)
- Equivariant flow matching [0.9208007322096533]
We introduce equivariant flow matching, a new training objective for equivariant continuous normalizing flows (CNFs).
Equivariant flow matching exploits the physical symmetries of the target energy for efficient, simulation-free training of equivariant CNFs.
Our results show that the equivariant flow matching objective yields flows with shorter integration paths, improved sampling efficiency, and higher scalability compared to existing methods.
arXiv Detail & Related papers (2023-06-26T19:40:10Z)
- Rigid Body Flows for Sampling Molecular Crystal Structures [4.368185344922342]
We introduce a new type of normalizing flow that is tailored for modeling positions and orientations of multiple objects in three-dimensional space.
Our approach is based on two key ideas: first, we define smooth and expressive flows on the group of unit quaternions, which allows us to capture the continuous rotational motion of rigid bodies.
We evaluate the method by training Boltzmann generators for two molecular examples, namely the multi-modal density of a tetrahedral system in an external field and the ice XI phase in the TIP4P water model.
arXiv Detail & Related papers (2023-01-26T19:07:40Z)
- E(n) Equivariant Normalizing Flows for Molecule Generation in 3D [87.12477361140716]
This paper introduces a generative model equivariant to Euclidean symmetries: E(n) Equivariant Normalizing Flows (E-NFs).
To the best of our knowledge, this is the first likelihood-based deep generative model that generates molecules in 3D.
arXiv Detail & Related papers (2021-05-19T09:28:54Z) - SurVAE Flows: Surjections to Bridge the Gap between VAEs and Flows [78.77808270452974]
SurVAE Flows is a modular framework for composable transformations that encompasses VAEs and normalizing flows.
We show that several recently proposed methods, including dequantization and augmented normalizing flows, can be expressed as SurVAE Flows.
arXiv Detail & Related papers (2020-07-06T13:13:22Z) - Gaussianization Flows [113.79542218282282]
We propose a new type of normalizing flow model that enables both efficient computation of likelihoods and efficient inversion for sample generation.
Because of this guaranteed expressivity, they can capture multimodal target distributions without compromising the efficiency of sample generation.
arXiv Detail & Related papers (2020-03-04T08:15:06Z) - A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize the gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from the Fokker-Planck equation.
Compared with existing schemes, Wasserstein gradient flow is a smoother and near-optimal numerical scheme to approximate real data densities.
arXiv Detail & Related papers (2019-10-31T02:26:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.