Generalized Energy Based Models
- URL: http://arxiv.org/abs/2003.05033v5
- Date: Tue, 21 Dec 2021 12:02:53 GMT
- Title: Generalized Energy Based Models
- Authors: Michael Arbel and Liang Zhou and Arthur Gretton
- Abstract summary: We introduce the Generalized Energy Based Model (GEBM) for generative modelling.
Both the energy function and base jointly constitute the final model, unlike GANs, which retain only the base distribution.
We show that both training stages are well-defined.
- Score: 35.49065282173972
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce the Generalized Energy Based Model (GEBM) for generative
modelling. These models combine two trained components: a base distribution
(generally an implicit model), which can learn the support of data with low
intrinsic dimension in a high-dimensional space; and an energy function, to
refine the probability mass on the learned support. Both the energy function
and base jointly constitute the final model, unlike GANs, which retain only the
base distribution (the "generator"). GEBMs are trained by alternating between
learning the energy and the base. We show that both training stages are
well-defined: the energy is learned by maximising a generalized likelihood, and
the resulting energy-based loss provides informative gradients for learning the
base. Samples from the posterior on the latent space of the trained model can
be obtained via MCMC, thus finding regions in this space that produce better
quality samples. Empirically, the GEBM samples on image-generation tasks are of
much better quality than those from the learned generator alone, indicating
that all else being equal, the GEBM will outperform a GAN of the same
complexity. When using normalizing flows as base measures, GEBMs succeed on
density modelling tasks, returning comparable performance to direct maximum
likelihood of the same networks.
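The abstract describes a two-stage, alternating procedure (learn the energy on top of a fixed base, then update the base using the energy-based loss) followed by latent-space MCMC at sampling time. The toy sketch below illustrates that overall shape in PyTorch. It is a minimal sketch only: the ring dataset, network sizes, learning rates, and the Donsker-Varadhan-style contrastive surrogate are illustrative assumptions, not the paper's exact generalized-likelihood objective or the authors' implementation.
```python
# Illustrative GEBM-shaped training loop (a sketch, not the authors' code).
import torch
import torch.nn as nn

torch.manual_seed(0)
dim_z, dim_x = 2, 2

# Base distribution g_theta: an implicit model mapping latent noise to data space.
base = nn.Sequential(nn.Linear(dim_z, 64), nn.ReLU(), nn.Linear(64, dim_x))
# Energy E_phi: refines probability mass on the support learned by the base.
energy = nn.Sequential(nn.Linear(dim_x, 64), nn.ReLU(), nn.Linear(64, 1))

opt_base = torch.optim.Adam(base.parameters(), lr=1e-3)
opt_energy = torch.optim.Adam(energy.parameters(), lr=1e-3)

def sample_data(n):
    # Toy target: a ring of radius 2 (low intrinsic dimension in 2-D space).
    angles = torch.rand(n, 1) * 2 * torch.pi
    return torch.cat([2 * angles.cos(), 2 * angles.sin()], dim=1)

for step in range(1000):
    x_real = sample_data(128)
    z = torch.randn(128, dim_z)

    # Stage 1: learn the energy with a Donsker-Varadhan-style contrastive
    # bound (a stand-in for the paper's generalized likelihood); the energy
    # is pushed up on data and down on base samples.
    loss_energy = -(energy(x_real).mean()
                    - energy(base(z).detach()).exp().mean().log())
    opt_energy.zero_grad()
    loss_energy.backward()
    opt_energy.step()

    # Stage 2: the energy-based loss provides gradients for the base;
    # base samples are pushed toward high-energy (data-like) regions.
    loss_base = -energy(base(torch.randn(128, dim_z))).mean()
    opt_base.zero_grad()
    loss_base.backward()
    opt_base.step()

# Sampling: unadjusted Langevin dynamics on the latent posterior, targeting
# the unnormalized log-density log N(z; 0, I) + E(g(z)).
z = torch.randn(64, dim_z, requires_grad=True)
step_size = 1e-2
for _ in range(200):
    log_p = (-0.5 * (z ** 2).sum(dim=1) + energy(base(z)).squeeze(1)).sum()
    grad, = torch.autograd.grad(log_p, z)
    with torch.no_grad():
        z = z + 0.5 * step_size * grad + step_size ** 0.5 * torch.randn_like(z)
    z.requires_grad_(True)

gebm_samples = base(z).detach()  # refined samples from the trained model
```
With a normalizing flow as the base, as in the abstract's density-modelling experiments, the same two stages apply but the base additionally provides exact log-densities.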
Related papers
- Variational Potential Flow: A Novel Probabilistic Framework for Energy-Based Generative Modelling [10.926841288976684]
We present a novel energy-based generative framework, Variational Potential Flow (VAPO).
VAPO aims to learn a potential energy function whose gradient (flow) guides the prior samples, so that their density evolution closely follows an approximate data likelihood homotopy.
Images can be generated after training the potential energy by initializing samples from a Gaussian prior and solving the ODE governing the potential flow on a fixed time interval.
arXiv Detail & Related papers (2024-07-21T18:08:12Z)
- Generalized Contrastive Divergence: Joint Training of Energy-Based Model and Diffusion Model through Inverse Reinforcement Learning [13.22531381403974]
Generalized Contrastive Divergence (GCD) is a novel objective function for training an energy-based model (EBM) and a sampler simultaneously.
We present preliminary yet promising results showing that joint training benefits both the EBM and the diffusion model.
arXiv Detail & Related papers (2023-12-06T10:10:21Z)
- Learning Energy-Based Models by Cooperative Diffusion Recovery Likelihood [64.95663299945171]
Training energy-based models (EBMs) on high-dimensional data can be both challenging and time-consuming.
There exists a noticeable gap in sample quality between EBMs and other generative frameworks like GANs and diffusion models.
We propose cooperative diffusion recovery likelihood (CDRL), an effective approach to tractably learn and sample from a series of EBMs.
arXiv Detail & Related papers (2023-09-10T22:05:24Z)
- Learning Joint Latent Space EBM Prior Model for Multi-layer Generator [44.4434704520236]
We study the fundamental problem of learning multi-layer generator models.
We propose an energy-based model (EBM) on the joint latent space over all layers of latent variables.
Our experiments demonstrate that the learned model can be expressive in generating high-quality images.
arXiv Detail & Related papers (2023-06-10T00:27:37Z)
- On Feature Diversity in Energy-based Models [98.78384185493624]
An energy-based model (EBM) is typically composed of one or more inner models that learn to combine different features into an energy value for each input configuration.
We extend the probably approximately correct (PAC) theory of EBMs and analyze the effect of redundancy reduction on the performance of EBMs.
arXiv Detail & Related papers (2023-06-02T12:30:42Z)
- An Energy-Based Prior for Generative Saliency [62.79775297611203]
We propose a novel generative saliency prediction framework that adopts an informative energy-based model as a prior distribution.
With the generative saliency model, we can obtain a pixel-wise uncertainty map from an image, indicating model confidence in the saliency prediction.
Experimental results show that our generative saliency model with an energy-based prior can achieve not only accurate saliency predictions but also reliable uncertainty maps consistent with human perception.
arXiv Detail & Related papers (2022-04-19T10:51:00Z)
- Controllable and Compositional Generation with Latent-Space Energy-Based Models [60.87740144816278]
Controllable generation is one of the key requirements for successful adoption of deep generative models in real-world applications.
In this work, we use energy-based models (EBMs) to handle compositional generation over a set of attributes.
By composing energy functions with logical operators, this work is the first to achieve such compositionality in generating photo-realistic images of resolution 1024x1024 (a minimal sketch of such operator composition follows this list).
arXiv Detail & Related papers (2021-10-21T03:31:45Z)
- Learning Latent Space Energy-Based Prior Model [118.86447805707094]
We learn an energy-based model (EBM) in the latent space of a generator model.
We show that the learned model exhibits strong performance in image and text generation and anomaly detection.
arXiv Detail & Related papers (2020-06-15T08:11:58Z)
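One entry above (Controllable and Compositional Generation with Latent-Space Energy-Based Models) composes energy functions with logical operators. The sketch below shows how such operators are commonly realized on energies under the convention p(x) ∝ exp(-E(x)); these definitions reflect standard EBM composition practice and are an assumption here, not necessarily that paper's exact formulation.
```python
# Hedged sketch of logical composition of energy functions.
# Convention assumed: density p(x) ∝ exp(-E(x)), so low energy means likely.
import torch

def e_and(e1: torch.Tensor, e2: torch.Tensor) -> torch.Tensor:
    # Conjunction: densities multiply, so energies add.
    return e1 + e2

def e_or(e1: torch.Tensor, e2: torch.Tensor) -> torch.Tensor:
    # Disjunction: mixture of densities, i.e. a soft minimum of energies.
    return -torch.logsumexp(torch.stack([-e1, -e2]), dim=0)

def e_not(e: torch.Tensor, temperature: float = 1.0) -> torch.Tensor:
    # Negation (heuristic): invert the energy landscape, temperature-scaled.
    return -e / temperature

# Example: combine per-attribute energies for a batch of 8 samples.
e_attr_a = torch.randn(8)  # stand-in for a hypothetical attribute energy E_a(x)
e_attr_b = torch.randn(8)  # stand-in for E_b(x)
e_a_and_b = e_and(e_attr_a, e_attr_b)
e_a_or_not_b = e_or(e_attr_a, e_not(e_attr_b))
```
Sampling from a composed energy (e.g. via Langevin dynamics, as in the GEBM sketch earlier) then yields configurations satisfying the combined attributes.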