On Feature Diversity in Energy-based Models
- URL: http://arxiv.org/abs/2306.01489v1
- Date: Fri, 2 Jun 2023 12:30:42 GMT
- Title: On Feature Diversity in Energy-based Models
- Authors: Firas Laakom, Jenni Raitoharju, Alexandros Iosifidis, Moncef Gabbouj
- Abstract summary: An energy-based model (EBM) is typically formed of one or more inner models that learn a combination of different features to generate an energy value for each input configuration.
We extend the probably approximately correct (PAC) theory of EBMs and analyze the effect of redundancy reduction on their performance.
- Score: 98.78384185493624
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Energy-based learning is a powerful learning paradigm that encapsulates
various discriminative and generative approaches. An energy-based model (EBM)
is typically formed of one or more inner models that learn a combination of
different features to generate an energy value for each input configuration.
In this paper, we focus on the diversity of the produced feature set. We extend
the probably approximately correct (PAC) theory of EBMs and analyze the effect
of redundancy reduction on the performance of EBMs. We derive generalization
bounds for various learning contexts, i.e., regression, classification, and
implicit regression, with different energy functions, and we show that reducing
the redundancy of the feature set consistently decreases the gap between the
true and empirical expectation of the energy and boosts the performance of the
model.
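To make the object of the bounds concrete: in generic notation (illustrative, not the paper's exact definitions), the quantity of interest is the deviation of the empirical average energy from its expectation under the data distribution, uniformly over the hypothesis class:

```latex
% Generalization gap of the energy, in schematic notation:
% E(h; x, y) is the energy assigned by hypothesis h to the pair (x, y),
% D is the data distribution, and (x_1, y_1), ..., (x_m, y_m) is the training sample.
\sup_{h \in \mathcal{H}}
\left|
  \mathbb{E}_{(x,y) \sim \mathcal{D}}\bigl[E(h; x, y)\bigr]
  - \frac{1}{m} \sum_{i=1}^{m} E(h; x_i, y_i)
\right| .
```

As a minimal illustration of how a redundancy-reduction term can enter training, the sketch below adds a pairwise feature-decorrelation penalty to the loss of a small energy-based classifier. The penalty, the architecture, and the weight `lam` are assumptions chosen for illustration; they are not the paper's diversity measure or experimental setup.

```python
# Minimal sketch (assumed formulation): an EBM whose per-class energies are a
# learned combination of intermediate features, trained with an extra penalty
# that discourages redundant (highly correlated) features.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FeatureEBM(nn.Module):
    def __init__(self, in_dim: int, feat_dim: int, num_classes: int):
        super().__init__()
        self.features = nn.Sequential(nn.Linear(in_dim, feat_dim), nn.ReLU())
        self.head = nn.Linear(feat_dim, num_classes)  # combines features into per-class scores

    def forward(self, x):
        phi = self.features(x)        # feature set phi(x)
        energy = -self.head(phi)      # low energy = compatible (input, class) pair
        return energy, phi

def redundancy_penalty(phi: torch.Tensor) -> torch.Tensor:
    # Off-diagonal mass of the feature correlation matrix: a simple stand-in
    # redundancy measure; smaller values mean a more diverse feature set.
    phi = (phi - phi.mean(dim=0)) / (phi.std(dim=0) + 1e-6)
    corr = (phi.T @ phi) / phi.shape[0]
    off_diag = corr - torch.diag(torch.diag(corr))
    return (off_diag ** 2).mean()

def training_loss(model: FeatureEBM, x: torch.Tensor, y: torch.Tensor, lam: float = 0.1):
    energy, phi = model(x)
    nll = F.cross_entropy(-energy, y)           # negative log-likelihood over class energies
    return nll + lam * redundancy_penalty(phi)  # data fit + redundancy reduction
```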
Related papers
- Hitchhiker's guide on Energy-Based Models: a comprehensive review on the relation with other generative models, sampling and statistical physics [0.0]
Energy-Based Models (EBMs) have emerged as a powerful framework in the realm of generative modeling.
This review aims to provide physicists with a comprehensive understanding of EBMs, delineating their connection to other generative models.
arXiv Detail & Related papers (2024-06-19T16:08:00Z)
- Improving Adversarial Energy-Based Model via Diffusion Process [25.023967485839155]
Adversarial EBMs introduce a generator to form a minimax training game.
Inspired by diffusion-based models, we embed EBMs into each denoising step to split a long generation process into several smaller steps.
Our experiments show significant improvement in generation compared to existing adversarial EBMs.
arXiv Detail & Related papers (2024-03-04T01:33:53Z)
- Energy Transformer [64.22957136952725]
Our work combines aspects of three promising paradigms in machine learning, namely, the attention mechanism, energy-based models, and associative memory.
We propose a novel architecture, called the Energy Transformer (or ET for short), that uses a sequence of attention layers that are purposely designed to minimize a specifically engineered energy function.
arXiv Detail & Related papers (2023-02-14T18:51:22Z)
- Latent Diffusion Energy-Based Model for Interpretable Text Modeling [104.85356157724372]
We introduce a novel symbiosis between diffusion models and latent space EBMs in a variational learning framework.
We develop a geometric clustering-based regularization jointly with the information bottleneck to further improve the quality of the learned latent space.
arXiv Detail & Related papers (2022-06-13T03:41:31Z)
- Energy-Based Models for Continual Learning [36.05297743063411]
We motivate Energy-Based Models (EBMs) as a promising model class for continual learning problems.
Our proposed version of EBMs for continual learning is simple and efficient, and it outperforms baseline methods by a large margin on several benchmarks.
arXiv Detail & Related papers (2020-11-24T17:08:13Z)
- Energy-based View of Retrosynthesis [70.66156081030766]
We propose a framework that unifies sequence- and graph-based methods as energy-based models.
We present a novel dual variant within the framework that performs consistent training over Bayesian forward- and backward-prediction.
This model improves state-of-the-art performance by 9.6% for template-free approaches where the reaction type is unknown.
arXiv Detail & Related papers (2020-07-14T18:51:06Z)
- Joint Training of Variational Auto-Encoder and Latent Energy-Based Model [112.7509497792616]
This paper proposes a joint training method to learn both the variational auto-encoder (VAE) and the latent energy-based model (EBM).
The joint training of the VAE and the latent EBM is based on an objective function that consists of three Kullback-Leibler divergences between three joint distributions over the latent vector and the image.
arXiv Detail & Related papers (2020-06-10T20:32:25Z)
- Training Deep Energy-Based Models with f-Divergence Minimization [113.97274898282343]
Deep energy-based models (EBMs) are very flexible in distribution parametrization but computationally challenging.
We propose a general variational framework termed f-EBM to train EBMs using any desired f-divergence.
Experimental results demonstrate the superiority of f-EBM over contrastive divergence, as well as the benefits of training EBMs using f-divergences other than KL (see the formula sketch after this list).
arXiv Detail & Related papers (2020-03-06T23:11:13Z)
- ICE-BeeM: Identifiable Conditional Energy-Based Deep Models Based on Nonlinear ICA [11.919315372249802]
We consider the identifiability theory of probabilistic models.
We show that our model can be used for the estimation of the components in the framework of Independently Modulated Component Analysis.
arXiv Detail & Related papers (2020-02-26T14:43:30Z)
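For background on the quantities several of the entries above rely on (in particular the f-EBM entry), the block below spells out the standard Gibbs form of an EBM density and the generic f-divergence used to train it. This is textbook notation added for context, not an excerpt from any of the listed papers; E_theta, Z_theta, and f are generic placeholders.

```latex
% Gibbs/Boltzmann form of an energy-based model over inputs x:
% the energy E_theta assigns low values to likely configurations.
p_\theta(x) = \frac{\exp\bigl(-E_\theta(x)\bigr)}{Z_\theta},
\qquad
Z_\theta = \int \exp\bigl(-E_\theta(x)\bigr)\, dx .

% Generic f-divergence between the data distribution p_data and the model p_theta,
% for a convex f with f(1) = 0.
D_f\!\left(p_{\mathrm{data}} \,\|\, p_\theta\right)
  = \mathbb{E}_{x \sim p_\theta}\!\left[ f\!\left(\frac{p_{\mathrm{data}}(x)}{p_\theta(x)}\right) \right].
```

Choosing f(t) = t log t recovers the KL divergence behind standard maximum-likelihood training of EBMs, while other choices of f yield the alternative training criteria that the f-EBM entry refers to.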