FALCON: Few-step Accurate Likelihoods for Continuous Flows
- URL: http://arxiv.org/abs/2512.09914v1
- Date: Wed, 10 Dec 2025 18:47:25 GMT
- Title: FALCON: Few-step Accurate Likelihoods for Continuous Flows
- Authors: Danyal Rehman, Tara Akhound-Sadegh, Artem Gazizov, Yoshua Bengio, Alexander Tong
- Abstract summary: We propose Few-step Accurate Likelihoods for Continuous Flows (FALCON), which allows for few-step sampling with a likelihood accurate enough for importance sampling applications. We show FALCON outperforms state-of-the-art normalizing flow models for molecular Boltzmann sampling and is two orders of magnitude faster than the equivalently performing CNF model.
- Score: 78.37361800856583
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Scalable sampling of molecular states in thermodynamic equilibrium is a long-standing challenge in statistical physics. Boltzmann Generators tackle this problem by pairing a generative model, capable of exact likelihood computation, with importance sampling to obtain consistent samples under the target distribution. Current Boltzmann Generators primarily use continuous normalizing flows (CNFs) trained with flow matching for efficient training of powerful models. However, likelihood calculation for these models is extremely costly, requiring thousands of function evaluations per sample, severely limiting their adoption. In this work, we propose Few-step Accurate Likelihoods for Continuous Flows (FALCON), a method which allows for few-step sampling with a likelihood accurate enough for importance sampling applications by introducing a hybrid training objective that encourages invertibility. We show FALCON outperforms state-of-the-art normalizing flow models for molecular Boltzmann sampling and is two orders of magnitude faster than the equivalently performing CNF model.
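To make concrete why FALCON needs likelihoods "accurate enough for importance sampling," below is a minimal sketch of the reweighting step that Boltzmann Generators perform, written under stated assumptions rather than as FALCON's actual implementation: `model.sample`, `model.log_prob`, and `energy` are hypothetical placeholders for a generative model with exact likelihoods and the target potential energy. For a CNF, `log_prob` requires integrating the divergence of the learned velocity field along the sampling ODE, which is the thousands-of-evaluations cost the abstract describes.

```python
import torch

def boltzmann_reweight(model, energy, n_samples, beta=1.0):
    """Self-normalized importance sampling toward p(x) ~ exp(-beta * E(x)).

    `model` stands in for any generator with exact likelihoods:
    sample() draws x ~ q(x) and log_prob(x) returns log q(x).
    Both names are illustrative placeholders, not a published API.
    """
    x = model.sample(n_samples)          # proposal samples from the flow
    log_q = model.log_prob(x)            # exact model log-likelihood
    log_w = -beta * energy(x) - log_q    # log of unnormalized weights
    w = torch.softmax(log_w, dim=0)      # self-normalize stably
    return x, w

def effective_sample_size(w):
    """Kish effective sample size, a standard weight-degeneracy diagnostic."""
    return 1.0 / (w ** 2).sum()
```

A systematic error in `log_q` biases every reweighted estimate no matter how many samples are drawn, which is why a fast sampler whose likelihood is only roughly correct cannot simply be dropped into this loop.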
Related papers
- Coarse-Grained Boltzmann Generators [2.8880597165704]
We propose a principled framework that unifies scalable reduced-order modeling with the exactness of importance sampling. CG-BGs act in a coarse-grained coordinate space, using a learned potential of mean force to reweight samples generated by a flow-based model. Our results demonstrate that CG-BGs faithfully capture complex interactions mediated by explicit solvent within highly reduced representations.
arXiv Detail & Related papers (2026-02-11T08:37:13Z)
- Joint Distillation for Fast Likelihood Evaluation and Sampling in Flow-based Models [100.28111930893188]
Some of today's best generative models still require hundreds to thousands of neural function evaluations (NFEs) to compute a single likelihood. We present fast flow joint distillation (F2D2), a framework that simultaneously reduces the number of NFEs required for both sampling and likelihood evaluation by two orders of magnitude. F2D2 is modular, compatible with existing flow-based few-step sampling models, and requires only an additional divergence prediction head.
arXiv Detail & Related papers (2025-12-02T10:48:20Z)
- Energy-Weighted Flow Matching: Unlocking Continuous Normalizing Flows for Efficient and Scalable Boltzmann Sampling [42.79674268979455]
Energy-Weighted Flow Matching is a novel training objective enabling continuous normalizing flows to model Boltzmann distributions. Our algorithms demonstrate sample quality competitive with state-of-the-art energy-only methods.
arXiv Detail & Related papers (2025-09-03T21:16:03Z)
- BoltzNCE: Learning Likelihoods for Boltzmann Generation with Stochastic Interpolants and Noise Contrastive Estimation [1.2874523233023452]
Efficient sampling from the Boltzmann distribution is a key challenge for modeling complex physical systems such as molecules. We train an energy-based model (EBM) to approximate likelihoods using both noise contrastive estimation (NCE) and score matching. Our approach also exhibits effective transfer learning, generalizing to new systems at inference time and achieving at least a $6\times$ speedup over standard MD.
arXiv Detail & Related papers (2025-07-01T15:18:28Z)
- Efficient Regression-Based Training of Normalizing Flows for Boltzmann Generators [85.25962679349551]
Boltzmann Generators (BGs) offer efficient sampling and likelihoods, but their training via maximum likelihood is often unstable and computationally challenging. We propose Regression Training of Normalizing Flows (RegFlow), a novel and scalable regression-based training objective that bypasses the numerical instability and computational challenge of conventional maximum likelihood training.
arXiv Detail & Related papers (2025-06-01T20:32:27Z)
- Gaussian Mixture Flow Matching Models [63.092956669059824]
Diffusion models approximate the denoising distribution as a Gaussian and predict its mean, whereas flow matching models reparameterize the Gaussian mean as flow velocity. However, both underperform in few-step sampling due to discretization error and tend to produce over-saturated colors under classifier-free guidance (CFG). We introduce a novel probabilistic guidance scheme that mitigates the over-saturation issue of CFG and improves image generation quality.
arXiv Detail & Related papers (2025-04-07T17:59:42Z)
- Scalable Equilibrium Sampling with Sequential Boltzmann Generators [60.00515282300297]
We extend the Boltzmann generator framework with two key contributions. The first is a highly efficient Transformer-based normalizing flow operating directly on all-atom Cartesian coordinates. The second is inference-time scaling of flow samples using a continuous-time variant of sequential Monte Carlo.
arXiv Detail & Related papers (2025-02-25T18:59:13Z)
- Equivariant flow matching [0.9208007322096533]
We introduce equivariant flow matching, a new training objective for equivariant continuous normalizing flows (CNFs).
Equivariant flow matching exploits the physical symmetries of the target energy for efficient, simulation-free training of equivariant CNFs.
Our results show that the equivariant flow matching objective yields flows with shorter integration paths, improved sampling efficiency, and higher scalability compared to existing methods.
arXiv Detail & Related papers (2023-06-26T19:40:10Z)
- Improving and generalizing flow-based generative models with minibatch optimal transport [90.01613198337833]
We introduce the generalized conditional flow matching (CFM) technique for continuous normalizing flows (CNFs).
CFM features a stable regression objective like that used to train the flow in diffusion models but enjoys the efficient inference of deterministic flow models.
A variant of our objective is optimal transport CFM (OT-CFM), which creates simpler flows that are more stable to train and lead to faster inference (see the sketch after this list).
arXiv Detail & Related papers (2023-02-01T14:47:17Z)
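For context on the regression objective described in the OT-CFM entry above, here is a minimal sketch of a conditional flow matching loss with a Gaussian conditional path. It is a sketch under assumptions, not the authors' code: `v_theta` is any velocity network taking `(t, x)`, inputs are flat `(batch, dim)` tensors, and pairing `x0` with `x1` through a minibatch optimal-transport plan (not shown) would give the OT-CFM variant.

```python
import torch

def cfm_loss(v_theta, x0, x1, sigma=0.01):
    """Conditional flow matching: regress the network onto the conditional
    target velocity of a straight-line Gaussian path from x0 to x1.
    """
    t = torch.rand(x0.shape[0], 1)                  # random times t ~ U(0, 1)
    x_t = t * x1 + (1 - t) * x0 + sigma * torch.randn_like(x0)
    target = x1 - x0                                # conditional velocity
    return ((v_theta(t, x_t) - target) ** 2).mean()
```

With independently drawn (x0, x1) pairs this is the plain CFM objective; replacing that pairing with a minibatch OT coupling straightens the learned flow, which is what the abstract above credits for more stable training and faster inference.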