Neural Boltzmann Machines
- URL: http://arxiv.org/abs/2305.08337v1
- Date: Mon, 15 May 2023 04:03:51 GMT
- Title: Neural Boltzmann Machines
- Authors: Alex H. Lang, Anton D. Loukianov, and Charles K. Fisher
- Abstract summary: Conditional generative models are capable of using contextual information as input to create new imaginative outputs.
Conditional Restricted Boltzmann Machines (CRBMs) are one class of conditional generative models that have proven to be especially adept at modeling noisy discrete or continuous data.
We generalize CRBMs by converting each of the CRBM parameters to their own neural networks that are allowed to be functions of the conditional inputs.
- Score: 2.179313476241343
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Conditional generative models are capable of using contextual information as
input to create new imaginative outputs. Conditional Restricted Boltzmann
Machines (CRBMs) are one class of conditional generative models that have
proven to be especially adept at modeling noisy discrete or continuous data,
but the lack of expressivity in CRBMs has limited their widespread adoption.
Here we introduce Neural Boltzmann Machines (NBMs) which generalize CRBMs by
converting each of the CRBM parameters to their own neural networks that are
allowed to be functions of the conditional inputs. NBMs are highly flexible
conditional generative models that can be trained via stochastic gradient
descent to approximately maximize the log-likelihood of the data. We
demonstrate the utility of NBMs especially with normally distributed data which
has historically caused problems for Gaussian-Bernoulli CRBMs. Code to
reproduce our results can be found at
https://github.com/unlearnai/neural-boltzmann-machines.
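To make the construction concrete, here is a minimal sketch of the idea: a Gaussian-Bernoulli RBM whose biases and weights are each produced by a small neural network applied to the conditional input. All names, layer sizes, and architectural choices below are illustrative assumptions, not the authors' reference implementation (see the linked repository for that).

```python
import torch
import torch.nn as nn

class NeuralBoltzmannMachine(nn.Module):
    """Hypothetical NBM sketch: every Gaussian-Bernoulli RBM parameter
    (visible bias, hidden bias, coupling weights) is the output of its
    own neural network evaluated on the conditional input c."""

    def __init__(self, cond_dim, visible_dim, hidden_dim):
        super().__init__()
        self.v_dim, self.h_dim = visible_dim, hidden_dim

        def mlp(out_dim):  # one small network per RBM parameter
            return nn.Sequential(nn.Linear(cond_dim, 64), nn.ReLU(),
                                 nn.Linear(64, out_dim))

        self.visible_bias_net = mlp(visible_dim)          # b_v(c)
        self.hidden_bias_net = mlp(hidden_dim)            # b_h(c)
        self.weight_net = mlp(visible_dim * hidden_dim)   # W(c), flattened

    def energy(self, v, h, c):
        """Gaussian-Bernoulli RBM energy with condition-dependent parameters:
        E(v, h | c) = ||v - b_v(c)||^2 / 2 - v^T W(c) h - b_h(c)^T h."""
        b_v = self.visible_bias_net(c)                             # (B, V)
        b_h = self.hidden_bias_net(c)                              # (B, H)
        W = self.weight_net(c).view(-1, self.v_dim, self.h_dim)   # (B, V, H)
        quadratic = 0.5 * ((v - b_v) ** 2).sum(dim=1)
        coupling = torch.einsum('bv,bvh,bh->b', v, W, h)
        return quadratic - coupling - (h * b_h).sum(dim=1)
```

Training then follows the usual RBM recipe, except that the gradients of the approximate log-likelihood objective flow through the condition-dependent energies into the parameter networks via stochastic gradient descent.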
Related papers
- Quantum Generative Modeling of Sequential Data with Trainable Token Embedding [0.0]
Born machines, a class of quantum-inspired generative models, have shown great advances in learning classical and quantum data.
We generalize the embedding method to trainable quantum measurement operators that can be honed simultaneously with the MPS.
Our study indicates that, combined with trainable embeddings, Born machines exhibit better performance and learn deeper correlations from the dataset.
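For context, the contraction pattern behind a matrix-product-state (MPS) Born machine with a learnable token embedding can be sketched in a few lines of NumPy. Everything below (shapes, names, and the plain linear embedding standing in for the paper's trainable measurement operators) is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab, d_phys, D, length = 8, 4, 5, 6

# Trainable pieces: an embedding of tokens into the physical dimension,
# and one MPS core per site with shape (left bond, physical, right bond).
embedding = rng.normal(size=(vocab, d_phys))
cores = [rng.normal(size=(D if t > 0 else 1, d_phys,
                          D if t < length - 1 else 1))
         for t in range(length)]

def amplitude(tokens):
    """psi(x): contract the embedded MPS cores left to right."""
    left = np.ones(1)  # trivial left boundary
    for t, tok in enumerate(tokens):
        # contract the core's physical leg with the token embedding
        mat = np.einsum('ipj,p->ij', cores[t], embedding[tok])
        left = left @ mat
    return left.item()

# Born rule: unnormalized probability of a sequence
x = [1, 3, 0, 2, 7, 5]
p_unnorm = amplitude(x) ** 2
```

The paper's contribution replaces this fixed-dimension embedding with trainable quantum measurement operators optimized jointly with the MPS cores; the sketch only conveys the basic Born-rule evaluation.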
arXiv Detail & Related papers (2023-11-08T22:56:37Z)
- Human Trajectory Forecasting with Explainable Behavioral Uncertainty [63.62824628085961]
Human trajectory forecasting helps to understand and predict human behaviors, enabling applications from social robots to self-driving cars.
Model-free methods offer superior prediction accuracy but lack explainability, while model-based methods provide explainability but predict less accurately.
We show that our proposed BNSP-SFM achieves up to a 50% improvement in prediction accuracy compared with 11 state-of-the-art methods.
arXiv Detail & Related papers (2023-07-04T16:45:21Z)
- Investigating the generative dynamics of energy-based neural networks [0.35911228556176483]
We study the generative dynamics of Restricted Boltzmann Machines (RBMs).
We show that the capacity to produce diverse data prototypes can be increased by initiating top-down sampling from chimera states.
We also find that the model cannot transition between all possible digit states within a single generation trajectory.
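A minimal sketch of the top-down sampling being analyzed, i.e., standard block Gibbs sampling in a Bernoulli-Bernoulli RBM started from a chosen hidden configuration (the NumPy formulation and all names are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def top_down_sample(W, b_v, b_h, h_init, n_steps=100):
    """Block Gibbs sampling started from a hidden configuration h_init.
    Roughly, a 'chimera' initialization would mix the hidden activations
    of two different learned prototypes before sampling begins."""
    h = h_init.copy()
    for _ in range(n_steps):
        v = (rng.random(b_v.shape) < sigmoid(W @ h + b_v)).astype(float)
        h = (rng.random(b_h.shape) < sigmoid(W.T @ v + b_h)).astype(float)
    return v, h
```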
arXiv Detail & Related papers (2023-05-11T12:05:40Z)
- Learning Probabilistic Models from Generator Latent Spaces with Hat EBM [81.35199221254763]
This work proposes a method for using any generator network as the foundation of an Energy-Based Model (EBM).
Experiments show strong performance of the proposed method on (1) unconditional ImageNet synthesis at 128x128 resolution, (2) refining the output of existing generators, and (3) learning EBMs that incorporate non-probabilistic generators.
arXiv Detail & Related papers (2022-10-29T03:55:34Z)
- Multi-layered Discriminative Restricted Boltzmann Machine with Untrained Probabilistic Layer [0.0]
An extreme learning machine (ELM) is a three-layered feed-forward neural network whose hidden-layer parameters are random and left untrained.
Inspired by the ELM, a probabilistic untrained layer called a probabilistic-ELM layer is proposed.
It is combined with a discriminative restricted Boltzmann machine (DRBM) to solve classification problems.
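For reference, a classic (deterministic) ELM fits only the output layer on top of a fixed random hidden layer. A minimal sketch, not the paper's probabilistic-ELM layer:

```python
import numpy as np

rng = np.random.default_rng(0)

def train_elm(X, Y, n_hidden=256):
    """Classic extreme learning machine: input-to-hidden weights are
    random and never trained; only the hidden-to-output weights are fit,
    here in closed form via the pseudo-inverse."""
    W_in = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W_in + b)        # untrained random features
    W_out = np.linalg.pinv(H) @ Y    # least-squares output weights
    return W_in, b, W_out
```

The paper replaces this deterministic random layer with a probabilistic analogue and couples it to a discriminative RBM.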
arXiv Detail & Related papers (2022-10-27T13:56:17Z)
- Gaussian-Bernoulli RBMs Without Tears [113.62579223055958]
We propose a novel Gibbs-Langevin sampling algorithm that outperforms existing methods like Gibbs sampling.
We propose a modified contrastive divergence (CD) algorithm so that one can generate images with GRBMs starting from noise.
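For context, this is the plain block Gibbs baseline for a Gaussian-Bernoulli RBM (GRBM) that the proposed Gibbs-Langevin sampler is compared against; a minimal sketch assuming unit visible variance, with illustrative names:

```python
import numpy as np

rng = np.random.default_rng(0)

def grbm_gibbs_step(v, W, b_v, b_h):
    """One Gibbs sweep for a Gaussian-Bernoulli RBM with unit variance:
    h | v is Bernoulli(sigmoid(W^T v + b_h)), and
    v | h is Gaussian with mean b_v + W h."""
    p_h = 1.0 / (1.0 + np.exp(-(W.T @ v + b_h)))
    h = (rng.random(p_h.shape) < p_h).astype(float)
    v = b_v + W @ h + rng.normal(size=b_v.shape)
    return v, h
```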
arXiv Detail & Related papers (2022-10-19T06:22:55Z)
- Your Autoregressive Generative Model Can be Better If You Treat It as an Energy-Based One [83.5162421521224]
We propose a unique method termed E-ARM for training autoregressive generative models.
E-ARM takes advantage of a well-designed energy-based learning objective.
We show that E-ARM can be trained efficiently and is capable of alleviating the exposure bias problem.
arXiv Detail & Related papers (2022-06-26T10:58:41Z)
- Conditional Born machine for Monte Carlo events generation [0.0]
This paper presents an application of Born machines to Monte Carlo simulations and extends their reach to conditional distributions.
Born machines are used to generate muonic force carrier (MFC) events in high-energy-physics collider experiments.
Empirical evidence suggests that Born machines can reproduce the underlying distribution of datasets produced by Monte Carlo simulations.
arXiv Detail & Related papers (2022-05-16T13:41:03Z)
- Continual Learning with Fully Probabilistic Models [70.3497683558609]
We present an approach for continual learning based on fully probabilistic (or generative) models of machine learning.
We propose a pseudo-rehearsal approach using a Gaussian Mixture Model (GMM) instance for both generator and classifier functionalities.
We show that the resulting Gaussian Mixture Replay (GMR) approach achieves state-of-the-art performance on common class-incremental learning problems at very competitive time and memory complexity.
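A minimal sketch of the pseudo-rehearsal loop, using scikit-learn's GaussianMixture as a stand-in for the paper's GMR model (the names and the simple refitting strategy are illustrative assumptions):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def continual_update(gmm_old, X_new, n_pseudo=1000, n_components=10):
    """Pseudo-rehearsal: replay samples drawn from the generative model
    fit on earlier tasks alongside the new task's data, then refit a
    single mixture so old knowledge is not overwritten."""
    X_pseudo, _ = gmm_old.sample(n_pseudo)   # rehearsal set from old tasks
    X_combined = np.vstack([X_pseudo, X_new])
    return GaussianMixture(n_components=n_components).fit(X_combined)
```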
arXiv Detail & Related papers (2021-04-19T12:26:26Z)
- Highly-scalable stochastic neuron based on Ovonic Threshold Switch (OTS) and its applications in Restricted Boltzmann Machine (RBM) [0.2529563359433233]
We propose a highly scalable stochastic neuron device based on the Ovonic Threshold Switch (OTS).
As a candidate true random number generator (TRNG), it passes 15 of the 16 tests in the National Institute of Standards and Technology (NIST) Statistical Test Suite.
Image reconstruction is successfully demonstrated on noise-contaminated inputs, yielding images with the noise removed.
arXiv Detail & Related papers (2020-10-21T13:20:01Z)
- Exact representations of many body interactions with RBM neural networks [77.34726150561087]
We exploit the representation power of RBMs to provide an exact decomposition of many-body contact interactions into one-body operators.
This construction generalizes the well-known Hirsch transformation used for the Hubbard model to more complicated theories such as Pionless EFT in nuclear physics.
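For reference, the Hirsch (discrete Hubbard-Stratonovich) transformation being generalized decouples the on-site Hubbard interaction with a single auxiliary Ising spin per site and time slice:

```latex
e^{-\Delta\tau\, U\, n_{\uparrow} n_{\downarrow}}
  = \tfrac{1}{2}\, e^{-\Delta\tau\, U\,(n_{\uparrow} + n_{\downarrow})/2}
    \sum_{s = \pm 1} e^{\lambda s\,(n_{\uparrow} - n_{\downarrow})},
\qquad \cosh\lambda = e^{\Delta\tau\, U / 2}.
```

In the paper, the RBM plays the analogous role of the auxiliary-field sum, decomposing more general many-body contact interactions into one-body operators.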
arXiv Detail & Related papers (2020-05-07T15:59:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.