Energy Matching: Unifying Flow Matching and Energy-Based Models for Generative Modeling
- URL: http://arxiv.org/abs/2504.10612v1
- Date: Mon, 14 Apr 2025 18:10:58 GMT
- Title: Energy Matching: Unifying Flow Matching and Energy-Based Models for Generative Modeling
- Authors: Michal Balcerak, Tamaz Amiranashvili, Suprosanna Shit, Antonio Terpin, Sebastian Kaltenbach, Petros Koumoutsakos, Bjoern Menze
- Abstract summary: Generative models often map noise to data by matching flows or scores, but these approaches become cumbersome for incorporating partial observations or additional priors. Inspired by recent advances in Wasserstein gradient flows, we propose Energy Matching, a framework that unifies flow-based approaches with the flexibility of energy-based models (EBMs). We parameterize this dynamic with a single time-independent scalar field, which serves as both a powerful generator and a flexible prior for effective regularization of inverse problems.
- Score: 4.584647857042494
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Generative models often map noise to data by matching flows or scores, but these approaches become cumbersome for incorporating partial observations or additional priors. Inspired by recent advances in Wasserstein gradient flows, we propose Energy Matching, a framework that unifies flow-based approaches with the flexibility of energy-based models (EBMs). Far from the data manifold, samples move along curl-free, optimal transport paths from noise to data. As they approach the data manifold, an entropic energy term guides the system into a Boltzmann equilibrium distribution, explicitly capturing the underlying likelihood structure of the data. We parameterize this dynamic with a single time-independent scalar field, which serves as both a powerful generator and a flexible prior for effective regularization of inverse problems. Our method substantially outperforms existing EBMs on CIFAR-10 generation (FID 3.97 compared to 8.61), while retaining the simulation-free training of transport-based approaches away from the data manifold. Additionally, we exploit the flexibility of our method and introduce an interaction energy for diverse mode exploration. Our approach focuses on learning a static scalar potential energy -- without time conditioning, auxiliary generators, or additional networks -- marking a significant departure from recent EBM methods. We believe this simplified framework significantly advances EBM capabilities and paves the way for their broader adoption in generative modeling across diverse domains.
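As a concrete illustration of the two-regime dynamics the abstract describes, here is a minimal sketch (not the authors' released code) of sampling from a single time-independent potential $E_\theta$: a deterministic gradient-flow transport phase far from the data, followed by Langevin steps that equilibrate toward the Boltzmann distribution. The toy quadratic potential, phase lengths, step size, and temperature are illustrative assumptions.

```python
import torch

def grad_E(energy, x):
    """Gradient of the scalar potential w.r.t. a batch of samples."""
    with torch.enable_grad():
        x = x.detach().requires_grad_(True)
        return torch.autograd.grad(energy(x).sum(), x)[0]

def sample(energy, n=64, dim=2, n_transport=200, n_langevin=100,
           dt=1e-2, temperature=0.05):
    """Transport noise toward the data manifold along -grad E, then run
    Langevin dynamics to settle into exp(-E / temperature)."""
    x = torch.randn(n, dim)                      # noise initialization
    for _ in range(n_transport):                 # curl-free transport phase
        x = x - dt * grad_E(energy, x)
    for _ in range(n_langevin):                  # equilibration phase
        x = (x - dt * grad_E(energy, x)
             + (2 * dt * temperature) ** 0.5 * torch.randn_like(x))
    return x

# Toy usage with a quadratic potential standing in for a trained network:
samples = sample(lambda x: 0.5 * (x ** 2).sum(dim=-1))
print(samples.shape)  # torch.Size([64, 2])
```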
Related papers
- Learning Energy-Based Generative Models via Potential Flow: A Variational Principle Approach to Probability Density Homotopy Matching [9.12119858170289]
Energy-based models (EBMs) are a powerful class of probabilistic generative models.
We propose Variational Potential Flow Bayes (VPFB), a new energy-based generative framework.
arXiv Detail & Related papers (2025-04-22T20:39:07Z)
- Energy-Weighted Flow Matching for Offline Reinforcement Learning [53.64306385597818]
This paper investigates energy guidance in generative modeling, where the target distribution is defined as $q(\mathbf{x}) \propto p(\mathbf{x})\exp(-\beta \mathcal{E}(\mathbf{x}))$, with $p(\mathbf{x})$ being the data distribution and $\mathcal{E}(\mathbf{x})$ the energy function. We introduce energy-weighted flow matching (EFM), a method that directly learns the energy-guided flow without the need for auxiliary models.
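The target density above admits a direct reading as importance reweighting: samples from $p$ carry Boltzmann weights $\exp(-\beta \mathcal{E}(\mathbf{x}))$. The sketch below shows only this generic reweighting of a flow-matching regression loss; it is an assumption for illustration, not the paper's EFM estimator.

```python
import torch

def boltzmann_weights(energies, beta):
    """Self-normalized importance weights w_i proportional to exp(-beta * E(x_i))."""
    return torch.softmax(-beta * energies, dim=0)

def weighted_fm_loss(v_pred, v_target, weights):
    """Flow-matching regression with per-sample Boltzmann weights."""
    per_sample = ((v_pred - v_target) ** 2).mean(dim=-1)
    return (weights * per_sample).sum()

# Toy usage: a quadratic stand-in energy over a batch of samples from p.
x = torch.randn(128, 2)
w = boltzmann_weights((x ** 2).sum(dim=-1), beta=1.0)
loss = weighted_fm_loss(torch.randn(128, 2), torch.randn(128, 2), w)
```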
arXiv Detail & Related papers (2025-03-06T21:10:12Z)
- Energy-Based Modelling for Discrete and Mixed Data via Heat Equations on Structured Spaces [19.92604781654767]
Energy-based models (EBMs) offer a flexible framework for probabilistic modelling across various data domains. We propose to train discrete EBMs with Energy Discrepancy, a loss function which only requires the evaluation of the energy function at data points.
arXiv Detail & Related papers (2024-12-02T00:35:29Z)
- Energy-Based Diffusion Language Models for Text Generation [126.23425882687195]
The Energy-based Diffusion Language Model (EDLM) is an energy-based model operating at the full sequence level for each diffusion step. Our framework offers a 1.3$\times$ sampling speedup over existing diffusion models.
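One generic way a sequence-level energy can act at a sampling step is importance resampling: draw candidate sequences from the base model and resample them under weights $\exp(-E)$. This sketch illustrates only that idea, with a hypothetical `energy_fn` scoring token-id tensors; it is not EDLM's actual training or sampling procedure.

```python
import torch

def energy_resample(candidates, energy_fn, num_keep):
    """Resample candidates drawn from a base model so the kept set is
    approximately distributed as base(x) * exp(-E(x))."""
    log_w = -energy_fn(candidates)               # importance log-weights
    idx = torch.multinomial(torch.softmax(log_w, dim=0),
                            num_keep, replacement=True)
    return candidates[idx]

# Toy usage: 16 candidate token sequences of length 10, random energies.
cands = torch.randint(0, 1000, (16, 10))
kept = energy_resample(cands, lambda c: torch.randn(c.shape[0]), num_keep=4)
```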
arXiv Detail & Related papers (2024-10-28T17:25:56Z)
- Generalized Flow Matching for Transition Dynamics Modeling [14.76793118877456]
We propose a data-driven approach to warm up the simulation by learning nonlinearities from local dynamics.
Specifically, we infer a potential energy function from local dynamics data to find plausible paths between two metastable states.
We validate the effectiveness of the proposed method to sample probable paths on both synthetic and real-world molecular systems.
arXiv Detail & Related papers (2024-10-19T15:03:39Z)
- Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experimental results on oscillating systems, videos and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
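Maximum likelihood training of an energy-based prior typically needs MCMC samples from $\exp(-E)$; a short-run Langevin sampler of the following generic form is a common choice. Step count and step size are illustrative assumptions, not the paper's settings.

```python
import torch

def langevin(energy, z0, steps=60, step_size=0.1):
    """Approximately sample z ~ exp(-energy(z)) starting from z0."""
    z = z0.detach()
    for _ in range(steps):
        z = z.requires_grad_(True)
        grad = torch.autograd.grad(energy(z).sum(), z)[0]
        z = (z - 0.5 * step_size * grad
             + step_size ** 0.5 * torch.randn_like(z)).detach()
    return z

# Toy usage in a latent space of dimension 16:
z = langevin(lambda z: 0.5 * (z ** 2).sum(dim=-1), torch.randn(32, 16))
```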
arXiv Detail & Related papers (2024-09-05T18:14:22Z)
- Variational Potential Flow: A Novel Probabilistic Framework for Energy-Based Generative Modelling [10.926841288976684]
We present a novel energy-based generative framework, Variational Potential Flow (VAPO).
VAPO aims to learn a potential energy function whose gradient (flow) guides the prior samples, so that their density evolution closely follows an approximate data likelihood homotopy.
Images can be generated after training the potential energy by initializing the samples from a Gaussian prior and solving the ODE governing the potential flow on a fixed time interval.
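The generation recipe in the summary maps to a simple numerical procedure: draw Gaussian noise and integrate the potential-flow ODE $\dot{x} = -\nabla\Phi(x)$ over a fixed interval. A minimal sketch, assuming explicit Euler discretization and a toy potential in place of the trained one:

```python
import torch

def generate(potential, n_samples, dim, t_end=1.0, n_steps=100):
    """Integrate dx/dt = -grad potential(x) from Gaussian noise to t_end."""
    x = torch.randn(n_samples, dim)          # Gaussian prior initialization
    dt = t_end / n_steps
    for _ in range(n_steps):
        x = x.requires_grad_(True)
        grad = torch.autograd.grad(potential(x).sum(), x)[0]
        x = (x - dt * grad).detach()         # explicit Euler step
    return x

# Toy usage with a quadratic stand-in potential:
out = generate(lambda x: 0.5 * (x ** 2).sum(dim=-1), n_samples=8, dim=4)
```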
arXiv Detail & Related papers (2024-07-21T18:08:12Z)
- Hitchhiker's guide on Energy-Based Models: a comprehensive review on the relation with other generative models, sampling and statistical physics [0.0]
Energy-Based Models (EBMs) have emerged as a powerful framework in the realm of generative modeling.
This review aims to provide physicists with a comprehensive understanding of EBMs, delineating their connection to other generative models.
arXiv Detail & Related papers (2024-06-19T16:08:00Z)
- Revisiting Energy Based Models as Policies: Ranking Noise Contrastive Estimation and Interpolating Energy Models [18.949193683555237]
In this work, we revisit the choice of energy-based models (EBM) as a policy class.
We develop a training objective and algorithm for energy models which combines several key ingredients.
We show that the Implicit Behavior Cloning (IBC) objective is actually biased even at the population level.
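A ranking-style noise contrastive estimation objective for an EBM policy $E(s, a)$ has the generic form of classifying the expert action against sampled counter-actions. The sketch below shows that generic form only; the negative sampler (uniform over a unit action box) and the shapes are assumptions, not the paper's full algorithm.

```python
import torch
import torch.nn.functional as F

def ranking_nce_loss(energy, state, expert_action, num_negatives=64):
    """InfoNCE-style loss: the expert action (index 0) should get the
    lowest energy among sampled counter-examples."""
    negatives = torch.rand(num_negatives, expert_action.shape[-1])
    actions = torch.cat([expert_action.unsqueeze(0), negatives], dim=0)
    states = state.unsqueeze(0).expand(len(actions), -1)
    logits = -energy(states, actions)        # low energy -> high logit
    return F.cross_entropy(logits.unsqueeze(0),
                           torch.zeros(1, dtype=torch.long))

# Toy usage with a quadratic stand-in energy over (state, action) pairs:
E = lambda s, a: ((s[:, :2] - a) ** 2).sum(dim=-1)
loss = ranking_nce_loss(E, torch.randn(5), torch.rand(2))
```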
arXiv Detail & Related papers (2023-09-11T20:13:47Z)
- Your Autoregressive Generative Model Can be Better If You Treat It as an Energy-Based One [83.5162421521224]
We propose a unique method termed E-ARM for training autoregressive generative models.
E-ARM takes advantage of a well-designed energy-based learning objective.
We show that E-ARM can be trained efficiently and is capable of alleviating the exposure bias problem.
arXiv Detail & Related papers (2022-06-26T10:58:41Z)
- No MCMC for me: Amortized sampling for fast and stable training of energy-based models [62.1234885852552]
Energy-Based Models (EBMs) present a flexible and appealing way to represent uncertainty.
We present a simple method for training EBMs at scale using an entropy-regularized generator to amortize the MCMC sampling.
Next, we apply our estimator to the recently proposed Joint Energy Model (JEM), where we match the original performance with faster and more stable training.
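The core substitution the summary describes, generator samples standing in for MCMC negatives in the EBM's contrastive update, can be sketched as follows. The paper additionally entropy-regularizes the generator; that estimator is omitted here, so this is only the amortized-negatives skeleton under assumed module interfaces.

```python
import torch

def ebm_step(energy, generator, data, opt_e, opt_g, z_dim=64):
    """One training step: generator samples replace MCMC negatives."""
    fake = generator(torch.randn(data.shape[0], z_dim))

    # EBM update: push data energy down, generated-sample energy up.
    loss_e = energy(data).mean() - energy(fake.detach()).mean()
    opt_e.zero_grad(); loss_e.backward(); opt_e.step()

    # Generator update: move its samples toward low-energy regions
    # (the paper adds an entropy regularizer here, omitted in this sketch).
    loss_g = energy(generator(torch.randn(data.shape[0], z_dim))).mean()
    opt_g.zero_grad(); loss_g.backward(); opt_g.step()
    return loss_e.item(), loss_g.item()
```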
arXiv Detail & Related papers (2020-10-08T19:17:20Z)
- Energy-Based Processes for Exchangeable Data [109.04978766553612]
We introduce Energy-Based Processes (EBPs) to extend energy-based models to exchangeable data.
A key advantage of EBPs is the ability to express more flexible distributions over sets without restricting their cardinality.
We develop an efficient training procedure for EBPs that demonstrates state-of-the-art performance on a variety of tasks.
arXiv Detail & Related papers (2020-03-17T04:26:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.