Learning Probabilistic Models from Generator Latent Spaces with Hat EBM
- URL: http://arxiv.org/abs/2210.16486v1
- Date: Sat, 29 Oct 2022 03:55:34 GMT
- Title: Learning Probabilistic Models from Generator Latent Spaces with Hat EBM
- Authors: Mitch Hill, Erik Nijkamp, Jonathan Mitchell, Bo Pang, Song-Chun Zhu
- Abstract summary: This work proposes a method for using any generator network as the foundation of an Energy-Based Model (EBM).
Experiments show strong performance of the proposed method on (1) unconditional ImageNet synthesis at 128x128 resolution, (2) refining the output of existing generators, and (3) learning EBMs that incorporate non-probabilistic generators.
- Score: 81.35199221254763
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This work proposes a method for using any generator network as the foundation
of an Energy-Based Model (EBM). Our formulation posits that observed images are
the sum of unobserved latent variables passed through the generator network and
a residual random variable that spans the gap between the generator output and
the image manifold. One can then define an EBM that includes the generator as
part of its forward pass, which we call the Hat EBM. The model can be trained
without inferring the latent variables of the observed data or calculating the
generator Jacobian determinant. This enables explicit probabilistic modeling of
the output distribution of any type of generator network. Experiments show
strong performance of the proposed method on (1) unconditional ImageNet
synthesis at 128x128 resolution, (2) refining the output of existing
generators, and (3) learning EBMs that incorporate non-probabilistic
generators. Code and pretrained models to reproduce our results are available
at https://github.com/point0bar1/hat-ebm.
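To make the formulation concrete, below is a minimal, hypothetical PyTorch-style sketch, not the authors' released code: `energy_net`, `generator`, and the Langevin hyperparameters are placeholder assumptions. It shows the two ingredients the abstract describes: an energy evaluated on g(z) + r, so the generator sits inside the EBM's forward pass, and sampling by MCMC (here, plain Langevin dynamics) jointly over the latent z and the residual r rather than over raw pixels.

```python
import torch

def hat_energy(energy_net, generator, z, r):
    # The generator is part of the EBM's forward pass: a (latent, residual)
    # pair is scored as the image x = g(z) + r.
    x = generator(z) + r
    return energy_net(x)

def langevin_sample(energy_net, generator, z, r, n_steps=50, step_size=1e-2):
    # Plain Langevin dynamics over (z, r); step count and step size here are
    # illustrative placeholders, not the paper's settings.
    z = z.clone().requires_grad_(True)
    r = r.clone().requires_grad_(True)
    for _ in range(n_steps):
        energy = hat_energy(energy_net, generator, z, r).sum()
        grad_z, grad_r = torch.autograd.grad(energy, (z, r))
        with torch.no_grad():
            z -= 0.5 * step_size**2 * grad_z + step_size * torch.randn_like(z)
            r -= 0.5 * step_size**2 * grad_r + step_size * torch.randn_like(r)
    with torch.no_grad():
        return generator(z) + r
```

Samples drawn this way serve as the model's negative examples during training, while observed images enter the data term directly; per the abstract, neither latent inference for observed images nor the generator Jacobian determinant is required.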
Related papers
- Learning Energy-based Model via Dual-MCMC Teaching [5.31573596283377]
This paper studies the fundamental learning problem of the energy-based model (EBM).
Learning the EBM can be achieved via maximum likelihood estimation (MLE).
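For context, the MLE objective referred to here is the standard one for EBMs: with p_theta(x) proportional to exp(-E_theta(x)), the log-likelihood gradient contrasts a data expectation with a model expectation, and the model expectation is the term that is typically approximated with MCMC samples.

```latex
\nabla_\theta \, \mathbb{E}_{x \sim p_{\text{data}}}\big[\log p_\theta(x)\big]
  = \mathbb{E}_{x \sim p_{\text{data}}}\big[-\nabla_\theta E_\theta(x)\big]
  - \mathbb{E}_{x \sim p_\theta}\big[-\nabla_\theta E_\theta(x)\big]
```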
arXiv Detail & Related papers (2023-12-05T03:39:54Z)
- Generative Visual Prompt: Unifying Distributional Control of Pre-Trained Generative Models [77.47505141269035]
Generative Visual Prompt (PromptGen) is a framework for distributional control over pre-trained generative models.
PromptGen approximates an energy-based model (EBM) and samples images in a feed-forward manner.
Code is available at https://github.com/ChenWu98/Generative-Visual-Prompt.
arXiv Detail & Related papers (2022-09-14T22:55:18Z)
- An Energy-Based Prior for Generative Saliency [62.79775297611203]
We propose a novel generative saliency prediction framework that adopts an informative energy-based model as a prior distribution.
With the generative saliency model, we can obtain a pixel-wise uncertainty map from an image, indicating model confidence in the saliency prediction.
Experimental results show that our generative saliency model with an energy-based prior can achieve not only accurate saliency predictions but also reliable uncertainty maps consistent with human perception.
arXiv Detail & Related papers (2022-04-19T10:51:00Z)
- Perturb-and-max-product: Sampling and learning in discrete energy-based models [3.056751497358646]
We present perturb-and-max-product (PMP), a parallel and scalable mechanism for sampling and learning in discrete energy-based models.
We show that (a) for Ising models, PMP is orders of magnitude faster than Gibbs and Gibbs-with-Gradients at learning and generating samples of similar or better quality; (b) PMP is able to learn and sample from RBMs; (c) in a large, entangled graphical model in which Gibbs and GWG fail to mix, PMP succeeds.
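As a toy illustration of the underlying perturb-and-MAP idea (not the authors' PMP implementation, which runs parallel max-product on general graphs): on a chain-structured Ising model, max-product is exact, so an approximate sample can be drawn by perturbing the unary potentials with Gumbel noise and solving the resulting MAP problem with dynamic programming. All names and parameters below are illustrative.

```python
import numpy as np

def map_chain_ising(h, J):
    # Exact MAP for a chain Ising model with unary fields h[i] and couplings
    # J[i] between spins i and i+1 (spins in {-1, +1}), via Viterbi-style DP,
    # i.e. max-product on a chain.
    n, states = len(h), np.array([-1, 1])
    score, back = h[0] * states, []
    for i in range(1, n):
        # cand[p, c] = best score of a prefix ending in (prev=p, cur=c)
        cand = score[:, None] + J[i - 1] * np.outer(states, states) + h[i] * states[None, :]
        back.append(np.argmax(cand, axis=0))
        score = np.max(cand, axis=0)
    idx = np.empty(n, dtype=int)
    idx[-1] = np.argmax(score)
    for i in range(n - 2, -1, -1):
        idx[i] = back[i][idx[i + 1]]
    return states[idx]

def perturb_and_map_sample(h, J, rng):
    # Perturb the unary potentials with Gumbel noise, then solve MAP exactly.
    # Only the difference g(+1) - g(-1) matters, so it folds into the field h.
    g = rng.gumbel(size=(len(h), 2))
    return map_chain_ising(h + (g[:, 1] - g[:, 0]) / 2.0, J)

# Example usage with hypothetical parameters:
# rng = np.random.default_rng(0)
# sample = perturb_and_map_sample(np.zeros(10), 0.5 * np.ones(9), rng)
```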
arXiv Detail & Related papers (2021-11-03T18:23:31Z)
- Controllable and Compositional Generation with Latent-Space Energy-Based Models [60.87740144816278]
Controllable generation is one of the key requirements for successful adoption of deep generative models in real-world applications.
In this work, we use energy-based models (EBMs) to handle compositional generation over a set of attributes.
By composing energy functions with logical operators, this work is the first to achieve such compositionality in generating photo-realistic images of resolution 1024x1024.
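The logical composition mentioned here follows standard compositional-EBM identities; the sketch below is a generic illustration (the attribute energy networks and any latent-space details of the paper are omitted, and the function names are placeholders): conjunction corresponds to a product of experts (sum of energies), and disjunction to a mixture (soft-min of energies).

```python
import torch

def energy_and(energies, x):
    # Conjunction / product of experts: p(x) ~ prod_i exp(-E_i(x)),
    # which amounts to summing the energies.
    return sum(E(x) for E in energies)

def energy_or(energies, x):
    # Disjunction / mixture: p(x) ~ sum_i exp(-E_i(x)),
    # i.e. a soft minimum, -logsumexp(-E_i(x)).
    stacked = torch.stack([E(x) for E in energies], dim=0)
    return -torch.logsumexp(-stacked, dim=0)
```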
arXiv Detail & Related papers (2021-10-21T03:31:45Z)
- Unsupervised Controllable Generation with Self-Training [90.04287577605723]
Controllable generation with GANs remains a challenging research problem.
We propose an unsupervised framework to learn a distribution of latent codes that control the generator through self-training.
Our framework exhibits better disentanglement compared to other variants such as the variational autoencoder.
arXiv Detail & Related papers (2020-07-17T21:50:35Z)
- Learning Latent Space Energy-Based Prior Model [118.86447805707094]
We learn an energy-based model (EBM) in the latent space of a generator model.
We show that the learned model exhibits strong performance on image and text generation and anomaly detection.
arXiv Detail & Related papers (2020-06-15T08:11:58Z)
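For reference, the latent-space prior in this line of work is, roughly, an exponential tilting of a simple reference distribution: with generator p_beta(x|z) and a reference prior p_0(z) (typically a standard Gaussian), the learned prior and marginal are

```latex
p_\alpha(z) \;\propto\; \exp\!\big(f_\alpha(z)\big)\, p_0(z),
\qquad
p_{\alpha,\beta}(x) \;=\; \int p_\beta(x \mid z)\, p_\alpha(z)\, dz
```

so sampling and learning can run MCMC in the low-dimensional latent space rather than in pixel space.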