Energy Matching: Unifying Flow Matching and Energy-Based Models for Generative Modeling
- URL: http://arxiv.org/abs/2504.10612v4
- Date: Thu, 26 Jun 2025 14:04:51 GMT
- Title: Energy Matching: Unifying Flow Matching and Energy-Based Models for Generative Modeling
- Authors: Michal Balcerak, Tamaz Amiranashvili, Antonio Terpin, Suprosanna Shit, Lea Bogensperger, Sebastian Kaltenbach, Petros Koumoutsakos, Bjoern Menze
- Abstract summary: The most widely used generative models map noise and data distributions by matching flows or scores. We propose Energy Matching, a framework that endows flow-based approaches with the flexibility of EBMs. Our method substantially outperforms existing EBMs on CIFAR-10 and ImageNet generation in terms of fidelity.
- Score: 4.395339671282145
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The most widely used generative models map noise and data distributions by matching flows or scores. However, they struggle to incorporate partial observations and additional priors--something energy-based models (EBMs) handle elegantly by simply adding corresponding scalar energy terms. We address this issue by proposing Energy Matching, a framework that endows flow-based approaches with the flexibility of EBMs. Far from the data manifold, samples move along curl-free, optimal transport paths from noise to data. As they approach the data manifold, an entropic energy term guides the system into a Boltzmann equilibrium distribution, explicitly capturing the underlying likelihood structure of the data. We parameterize this dynamic with a single time-independent scalar field, which serves as both a powerful generator and a flexible prior for effective regularization of inverse problems. Our method substantially outperforms existing EBMs on CIFAR-10 and ImageNet generation in terms of fidelity, while retaining simulation-free training of transport-based approaches away from the data manifold. Furthermore, we leverage the method's flexibility to introduce an interaction energy that supports diverse mode exploration, which we demonstrate in a controlled protein-generation setting. Our approach focuses on learning a scalar potential energy--without time-conditioning, auxiliary generators, or additional networks--which marks a significant departure from recent EBM methods. We believe that this simplified framework significantly advances the capabilities of EBMs and paves the way for their wider adoption in generative modeling across diverse domains.
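To make the two-phase dynamic above concrete, here is a minimal PyTorch sketch of sampling from a single time-independent scalar potential: deterministic transport along $-\nabla E$ far from the data manifold, followed by Langevin steps that settle into the Boltzmann distribution near it. The network `energy_net`, the step counts, and the temperature are illustrative assumptions, not the authors' released code.

```python
import torch

def sample_energy_matching(energy, x, n_transport=80, n_langevin=20,
                           dt=1e-2, temperature=1e-3):
    # Two-phase sampler for a time-independent scalar potential E(x).
    # `energy` maps a batch of samples to per-sample scalar energies.
    # Phase 1: deterministic gradient-flow transport far from the data manifold.
    # Phase 2: Langevin steps that relax into the Boltzmann density exp(-E/T).
    for step in range(n_transport + n_langevin):
        x = x.detach().requires_grad_(True)
        (grad,) = torch.autograd.grad(energy(x).sum(), x)
        with torch.no_grad():
            x = x - dt * grad  # follow the curl-free velocity -grad E
            if step >= n_transport:  # entropic phase near the manifold
                x = x + (2 * dt * temperature) ** 0.5 * torch.randn_like(x)
    return x.detach()

# Usage with a hypothetical trained field mapping images to scalars:
# samples = sample_energy_matching(energy_net, torch.randn(64, 3, 32, 32))
```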
Related papers
- Learning Energy-Based Generative Models via Potential Flow: A Variational Principle Approach to Probability Density Homotopy Matching [9.12119858170289]
Energy-based models (EBMs) are a powerful class of probabilistic generative models.
We propose Variational Potential Flow Bayes (VPFB), a new energy-based generative framework.
arXiv Detail & Related papers (2025-04-22T20:39:07Z) - Energy-Weighted Flow Matching for Offline Reinforcement Learning [53.64306385597818]
This paper investigates energy guidance in generative modeling, where the target distribution is defined as $q(\mathbf{x}) \propto p(\mathbf{x})\exp(-\beta \mathcal{E}(\mathbf{x}))$, with $p(\mathbf{x})$ being the data distribution and $\mathcal{E}(\mathbf{x})$ the energy function. We introduce energy-weighted flow matching (EFM), a method that directly learns the energy-guided flow without the need for auxiliary models. We extend this methodology to energy-weighted …
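Written out, the target above together with one natural self-normalized weighting of the conditional flow-matching loss reads as follows; this is a sketch consistent with the summary, not necessarily the paper's exact objective:

```latex
q(\mathbf{x}) \;\propto\; p(\mathbf{x})\, e^{-\beta \mathcal{E}(\mathbf{x})},
\qquad
\mathcal{L}(\theta) \;=\; \mathbb{E}_{t,\;\mathbf{x}_1 \sim p,\;\mathbf{x}_0 \sim \mathcal{N}(0, I)}
\!\left[
  \frac{e^{-\beta \mathcal{E}(\mathbf{x}_1)}}{\mathbb{E}_{p}\!\left[e^{-\beta \mathcal{E}}\right]}
  \left\| v_\theta(\mathbf{x}_t, t) - (\mathbf{x}_1 - \mathbf{x}_0) \right\|^2
\right],
\qquad
\mathbf{x}_t = (1 - t)\,\mathbf{x}_0 + t\,\mathbf{x}_1 .
```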
arXiv Detail & Related papers (2025-03-06T21:10:12Z) - Energy-Based Modelling for Discrete and Mixed Data via Heat Equations on Structured Spaces [19.92604781654767]
Energy-based models (EBMs) offer a flexible framework for probabilistic modelling across various data domains. We propose to train discrete EBMs with Energy Discrepancy, a loss function which only requires the evaluation of the energy function at data points.
arXiv Detail & Related papers (2024-12-02T00:35:29Z) - Energy-Based Diffusion Language Models for Text Generation [126.23425882687195]
Energy-based Diffusion Language Model (EDLM) is an energy-based model operating at the full sequence level for each diffusion step. Our framework offers a 1.3$\times$ sampling speedup over existing diffusion models.
arXiv Detail & Related papers (2024-10-28T17:25:56Z) - Generalized Flow Matching for Transition Dynamics Modeling [14.76793118877456]
We propose a data-driven approach to warm up the simulation by learning nonlinearities from local dynamics.
Specifically, we infer a potential energy function from local dynamics data to find plausible paths between two metastable states.
We validate the effectiveness of the proposed method to sample probable paths on both synthetic and real-world molecular systems.
arXiv Detail & Related papers (2024-10-19T15:03:39Z) - Latent Space Energy-based Neural ODEs [73.01344439786524]
This paper introduces novel deep dynamical models designed to represent continuous-time sequences.
We train the model using maximum likelihood estimation with Markov chain Monte Carlo.
Experimental results on oscillating systems, videos and real-world state sequences (MuJoCo) demonstrate that our model with the learnable energy-based prior outperforms existing counterparts.
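For reference, the maximum-likelihood training mentioned above rests on the standard EBM identity below, where the intractable model expectation is what the MCMC samples estimate (written here for a generic energy $E_\theta$; the paper applies it to a latent-space prior):

```latex
% MLE gradient for p_\theta(\mathbf{x}) \propto e^{-E_\theta(\mathbf{x})}
\nabla_\theta \log p_\theta(\mathbf{x})
  \;=\; -\,\nabla_\theta E_\theta(\mathbf{x})
  \;+\; \mathbb{E}_{\mathbf{x}' \sim p_\theta}\!\left[\nabla_\theta E_\theta(\mathbf{x}')\right].
```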
arXiv Detail & Related papers (2024-09-05T18:14:22Z) - Variational Potential Flow: A Novel Probabilistic Framework for Energy-Based Generative Modelling [10.926841288976684]
We present a novel energy-based generative framework, Variational Potential Flow (VAPO).
VAPO aims to learn a potential energy function whose gradient (flow) guides the prior samples, so that their density evolution closely follows an approximate data likelihood homotopy.
Images can be generated after training the potential energy function by initializing the samples from a Gaussian prior and solving the ODE governing the potential flow on a fixed time interval.
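A minimal sketch of that generation procedure, assuming a trained scalar network `energy` and forward-Euler integration over a unit interval (one of several possible integrators):

```python
import torch

def generate_potential_flow(energy, shape, n_steps=100, t1=1.0):
    # Euler-integrate the potential flow dx/dt = -grad E(x) on [0, t1],
    # starting from the Gaussian prior, as described above.
    dt = t1 / n_steps
    x = torch.randn(shape)  # samples from the Gaussian prior
    for _ in range(n_steps):
        x.requires_grad_(True)
        (grad,) = torch.autograd.grad(energy(x).sum(), x)
        x = (x - dt * grad).detach()  # one Euler step along the flow
    return x
```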
arXiv Detail & Related papers (2024-07-21T18:08:12Z) - Hitchhiker's guide on Energy-Based Models: a comprehensive review on the relation with other generative models, sampling and statistical physics [0.0]
Energy-Based Models (EBMs) have emerged as a powerful framework in the realm of generative modeling.
This review aims to provide physicists with a comprehensive understanding of EBMs, delineating their connection to other generative models.
arXiv Detail & Related papers (2024-06-19T16:08:00Z) - Improving Adversarial Energy-Based Model via Diffusion Process [25.023967485839155]
Adversarial EBMs introduce a generator to form a minimax training game.
Inspired by diffusion-based models, we embed EBMs into each denoising step to split a long generation process into several smaller steps.
Our experiments show significant improvement in generation compared to existing adversarial EBMs.
arXiv Detail & Related papers (2024-03-04T01:33:53Z) - Revisiting Energy Based Models as Policies: Ranking Noise Contrastive Estimation and Interpolating Energy Models [18.949193683555237]
In this work, we revisit the choice of energy-based models (EBM) as a policy class.
We develop a training objective and algorithm for energy models which combines several key ingredients.
We show that the Implicit Behavior Cloning (IBC) objective is actually biased even at the population level.
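As a rough illustration of the ranking-style objective named in the title, here is an InfoNCE-flavored loss in which the demonstrated action competes against sampled counter-examples; the `energy(obs, actions)` signature and the shapes are assumptions, not the paper's code:

```python
import torch
import torch.nn.functional as F

def ranking_nce_loss(energy, obs, expert_action, negative_actions):
    # Rank the expert action (index 0) above sampled negatives by
    # treating negated energies as classification logits.
    actions = torch.cat([expert_action.unsqueeze(1), negative_actions], dim=1)
    logits = -energy(obs, actions)  # shape: (batch, 1 + n_negatives)
    labels = torch.zeros(obs.shape[0], dtype=torch.long)  # expert at index 0
    return F.cross_entropy(logits, labels)
```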
arXiv Detail & Related papers (2023-09-11T20:13:47Z) - Your Autoregressive Generative Model Can be Better If You Treat It as an Energy-Based One [83.5162421521224]
We propose a unique method termed E-ARM for training autoregressive generative models.
E-ARM takes advantage of a well-designed energy-based learning objective.
We show that E-ARM can be trained efficiently and is capable of alleviating the exposure bias problem.
arXiv Detail & Related papers (2022-06-26T10:58:41Z) - Learning Discrete Energy-based Models via Auxiliary-variable Local Exploration [130.89746032163106]
We propose ALOE, a new algorithm for learning conditional and unconditional EBMs for discrete structured data.
We show that the energy function and sampler can be trained efficiently via a new variational form of power iteration.
We also present an energy-model-guided fuzzer for software testing that achieves performance comparable to well-engineered fuzzing engines like libFuzzer.
arXiv Detail & Related papers (2020-11-10T19:31:29Z) - No MCMC for me: Amortized sampling for fast and stable training of energy-based models [62.1234885852552]
Energy-Based Models (EBMs) present a flexible and appealing way to represent uncertainty.
We present a simple method for training EBMs at scale using an entropy-regularized generator to amortize the MCMC sampling.
Next, we apply our estimator to the recently proposed Joint Energy Model (JEM), matching the original performance with faster and more stable training.
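In code, the amortization idea might look roughly like the sketch below, with a nearest-neighbor log-distance standing in for the entropy term; the function names and the entropy surrogate are illustrative assumptions, and the paper's actual estimator differs:

```python
import torch

def amortized_ebm_losses(energy, generator, x_data, z, lam=1.0):
    # Generator samples replace MCMC chains: the EBM contrasts data
    # against generated samples, while the generator minimizes energy
    # plus an entropy bonus that discourages collapse.
    x_fake = generator(z)
    ebm_loss = energy(x_data).mean() - energy(x_fake.detach()).mean()
    f = x_fake.flatten(1)
    d = torch.cdist(f, f) + torch.eye(f.shape[0]) * 1e9  # mask self-distances
    entropy_proxy = d.min(dim=1).values.log().mean()  # crude NN entropy surrogate
    gen_loss = energy(x_fake).mean() - lam * entropy_proxy
    return ebm_loss, gen_loss  # step each with its own optimizer
```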
arXiv Detail & Related papers (2020-10-08T19:17:20Z) - Energy-Based Processes for Exchangeable Data [109.04978766553612]
We introduce Energy-Based Processes (EBPs) to extend energy-based models to exchangeable data.
A key advantage of EBPs is the ability to express more flexible distributions over sets without restricting their cardinality.
We develop an efficient training procedure for EBPs that demonstrates state-of-the-art performance on a variety of tasks.
arXiv Detail & Related papers (2020-03-17T04:26:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.