Learning Latent Space Hierarchical EBM Diffusion Models
- URL: http://arxiv.org/abs/2405.13910v2
- Date: Mon, 27 May 2024 18:05:55 GMT
- Title: Learning Latent Space Hierarchical EBM Diffusion Models
- Authors: Jiali Cui, Tian Han
- Abstract summary: We study the learning problem of the energy-based prior model and the multi-layer generator model.
Recent works have explored learning the energy-based (EBM) prior model as a second-stage, complementary model to bridge the gap.
We propose to leverage the diffusion probabilistic scheme to mitigate the burden of EBM sampling and thus facilitate EBM learning.
- Score: 4.4996462447311725
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: This work studies the learning problem of the energy-based prior model and the multi-layer generator model. The multi-layer generator model, which contains multiple layers of latent variables organized in a top-down hierarchical structure, typically assumes a Gaussian prior model. Such a prior model can be limited in modelling expressivity, which results in a gap between the generator posterior and the prior model, known as the prior hole problem. Recent works have explored learning the energy-based (EBM) prior model as a second-stage, complementary model to bridge the gap. However, the EBM defined on a multi-layer latent space can be highly multi-modal, which makes sampling from such a marginal EBM prior challenging in practice, resulting in an ineffectively learned EBM. To tackle the challenge, we propose to leverage the diffusion probabilistic scheme to mitigate the burden of EBM sampling and thus facilitate EBM learning. Our extensive experiments demonstrate the superior performance of our diffusion-learned EBM prior on various challenging tasks.
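The sampling burden the abstract refers to is usually incurred by running short-run Langevin dynamics on the EBM prior in latent space. The following is a minimal NumPy sketch of that baseline sampler, not the paper's method or code: the toy energy network, step size, and step count are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy energy function E(z): a small random two-layer network with tanh,
# so the gradient can be written analytically and the sketch needs only NumPy.
z_dim, hidden = 8, 32
W1 = rng.standard_normal((z_dim, hidden)) * 0.3
w2 = rng.standard_normal(hidden) * 0.3

def energy(z):
    """E(z) for a batch of latents, shape (n,)."""
    return np.tanh(z @ W1) @ w2

def grad_energy(z):
    """Analytic dE/dz, shape (n, z_dim)."""
    h = z @ W1
    return ((1.0 - np.tanh(h) ** 2) * w2) @ W1.T

def langevin_sample(n=16, steps=100, eps=0.1):
    """Short-run Langevin dynamics targeting the tilted prior
    p(z) ∝ exp(-E(z)) * N(z; 0, I). When E is multi-modal, these
    chains mix poorly -- the difficulty that motivates replacing
    plain MCMC with a diffusion-based sampling scheme."""
    z = rng.standard_normal((n, z_dim))
    for _ in range(steps):
        # Gradient of -log[exp(-E(z)) N(z; 0, I)] = grad E(z) + z.
        g = grad_energy(z) + z
        z = z - 0.5 * eps ** 2 * g + eps * rng.standard_normal(z.shape)
    return z

samples = langevin_sample()
print(samples.shape)  # (16, 8)
```

Each Langevin step is a gradient-descent move on the negative log density plus injected Gaussian noise; the multi-layer, multi-modal energies studied in the paper make exactly this inner loop expensive and unreliable.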
Related papers
- Improving Adversarial Energy-Based Model via Diffusion Process [25.023967485839155]
Adversarial EBMs introduce a generator to form a minimax training game.
Inspired by diffusion-based models, we embed EBMs into each denoising step, splitting a long generation process into several smaller steps.
Our experiments show significant improvement in generation compared to existing adversarial EBMs.
arXiv Detail & Related papers (2024-03-04T01:33:53Z)
- Learning Energy-Based Prior Model with Diffusion-Amortized MCMC [89.95629196907082]
The common practice of learning latent-space EBMs with non-convergent short-run MCMC for prior and posterior sampling hinders the model from further progress.
We introduce a simple but effective diffusion-based amortization method for long-run MCMC sampling and develop a novel learning algorithm for the latent space EBM based on it.
arXiv Detail & Related papers (2023-10-05T00:23:34Z)
- Learning Energy-Based Models by Cooperative Diffusion Recovery Likelihood [64.95663299945171]
Training energy-based models (EBMs) on high-dimensional data can be both challenging and time-consuming.
There exists a noticeable gap in sample quality between EBMs and other generative frameworks like GANs and diffusion models.
We propose cooperative diffusion recovery likelihood (CDRL), an effective approach to tractably learn and sample from a series of EBMs.
arXiv Detail & Related papers (2023-09-10T22:05:24Z)
- Learning Joint Latent Space EBM Prior Model for Multi-layer Generator [44.4434704520236]
We study the fundamental problem of learning multi-layer generator models.
We propose an energy-based model (EBM) on the joint latent space over all layers of latent variables.
Our experiments demonstrate that the learned model can be expressive in generating high-quality images.
arXiv Detail & Related papers (2023-06-10T00:27:37Z)
- Latent Diffusion Energy-Based Model for Interpretable Text Modeling [104.85356157724372]
We introduce a novel symbiosis between the diffusion models and latent space EBMs in a variational learning framework.
We develop a geometric clustering-based regularization jointly with the information bottleneck to further improve the quality of the learned latent space.
arXiv Detail & Related papers (2022-06-13T03:41:31Z)
- Particle Dynamics for Learning EBMs [83.59335980576637]
Energy-based modeling is a promising approach to unsupervised learning, which yields many downstream applications from a single model.
The main difficulty in learning energy-based models with the "contrastive approaches" is the generation of samples from the current energy function at each iteration.
This paper proposes an alternative approach to getting these samples and avoiding crude MCMC sampling from the current model.
arXiv Detail & Related papers (2021-11-26T23:41:07Z)
- Learning Latent Space Energy-Based Prior Model [118.86447805707094]
We learn an energy-based model (EBM) in the latent space of a generator model.
We show that the learned model exhibits strong performance in image and text generation and anomaly detection.
arXiv Detail & Related papers (2020-06-15T08:11:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.