Restricted Boltzmann Machine and Deep Belief Network: Tutorial and
Survey
- URL: http://arxiv.org/abs/2107.12521v1
- Date: Mon, 26 Jul 2021 23:59:12 GMT
- Title: Restricted Boltzmann Machine and Deep Belief Network: Tutorial and
Survey
- Authors: Benyamin Ghojogh, Ali Ghodsi, Fakhri Karray, Mark Crowley
- Abstract summary: This tutorial and survey paper is on the Boltzmann Machine (BM), Restricted Boltzmann Machine (RBM), and Deep Belief Network (DBN).
We start with the required background on probabilistic graphical models, Markov random field, Gibbs sampling, statistical physics, Ising model, and the Hopfield network.
The conditional distributions of visible and hidden variables, Gibbs sampling in RBM for generating variables, training BM and RBM by maximum likelihood estimation, and contrastive divergence are explained.
- Score: 5.967999555890417
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This is a tutorial and survey paper on Boltzmann Machine (BM), Restricted
Boltzmann Machine (RBM), and Deep Belief Network (DBN). We start with the
required background on probabilistic graphical models, Markov random field,
Gibbs sampling, statistical physics, Ising model, and the Hopfield network.
Then, we introduce the structures of BM and RBM. The conditional distributions
of visible and hidden variables, Gibbs sampling in RBM for generating
variables, training BM and RBM by maximum likelihood estimation, and
contrastive divergence are explained. Then, we discuss different possible
discrete and continuous distributions for the variables. We introduce
conditional RBM and how it is trained. Finally, we explain deep belief network
as a stack of RBM models. This paper on Boltzmann machines can be useful in
various fields including data science, statistics, neural computation, and
statistical physics.
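The training procedure the abstract describes (alternating Gibbs sampling of the factorized conditionals p(h|v) and p(v|h), with contrastive divergence approximating the maximum-likelihood gradient) can be sketched as follows. This is a generic CD-1 sketch for a Bernoulli-Bernoulli RBM, not code from the paper; the sizes, learning rate, and toy data are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy configuration; the paper gives no specific sizes.
n_visible, n_hidden, lr = 6, 4, 0.1

W = rng.normal(0, 0.01, size=(n_visible, n_hidden))  # weights
b = np.zeros(n_visible)   # visible biases
c = np.zeros(n_hidden)    # hidden biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_h(v):
    """p(h=1|v) and a Bernoulli sample; the conditional factorizes in an RBM."""
    p = sigmoid(v @ W + c)
    return p, (rng.random(p.shape) < p).astype(float)

def sample_v(h):
    """p(v=1|h) and a Bernoulli sample."""
    p = sigmoid(h @ W.T + b)
    return p, (rng.random(p.shape) < p).astype(float)

def cd1_step(v0):
    """One contrastive-divergence (CD-1) update on a batch of visible vectors."""
    global W, b, c
    ph0, h0 = sample_h(v0)    # positive phase
    pv1, v1 = sample_v(h0)    # one Gibbs step back to the visible layer
    ph1, _ = sample_h(v1)     # negative-phase statistics
    n = v0.shape[0]
    W += lr * (v0.T @ ph0 - v1.T @ ph1) / n
    b += lr * (v0 - v1).mean(axis=0)
    c += lr * (ph0 - ph1).mean(axis=0)

# Toy data: two repeated binary patterns.
data = np.array([[1, 1, 1, 0, 0, 0],
                 [0, 0, 0, 1, 1, 1]], dtype=float)
batch = np.tile(data, (8, 1))
for _ in range(500):
    cd1_step(batch)

# After training, reconstruction probabilities should resemble the patterns.
_, h = sample_h(data)
pv, _ = sample_v(h)
print(np.round(pv, 2))
```

A DBN, as the abstract notes, would stack such RBMs: after training the first layer, the hidden activations p(h|v) serve as the "visible" data for the next RBM.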
Related papers
- Monotone deep Boltzmann machines [86.50247625239406]
Deep Boltzmann machines (DBMs) are multi-layered probabilistic models governed by a pairwise energy function.
We develop a new class of restricted model, the monotone DBM, which allows for arbitrary self-connection in each layer.
We show that a particular choice of activation results in a fixed-point iteration that gives a variational mean-field solution.
arXiv Detail & Related papers (2023-07-11T03:02:44Z) - Human Trajectory Forecasting with Explainable Behavioral Uncertainty [63.62824628085961]
Human trajectory forecasting helps to understand and predict human behaviors, enabling applications from social robots to self-driving cars.
Model-free methods offer superior prediction accuracy but lack explainability, while model-based methods provide explainability but predict less accurately.
The proposed BNSP-SFM model achieves up to a 50% improvement in prediction accuracy compared with 11 state-of-the-art methods.
arXiv Detail & Related papers (2023-07-04T16:45:21Z) - Neural Boltzmann Machines [2.179313476241343]
Conditional generative models are capable of using contextual information as input to create new imaginative outputs.
Conditional Restricted Boltzmann Machines (CRBMs) are one class of conditional generative models that have proven to be especially adept at modeling noisy discrete or continuous data.
We generalize CRBMs by converting each of the CRBM parameters to their own neural networks that are allowed to be functions of the conditional inputs.
arXiv Detail & Related papers (2023-05-15T04:03:51Z) - Learning Probabilistic Models from Generator Latent Spaces with Hat EBM [81.35199221254763]
This work proposes a method for using any generator network as the foundation of an Energy-Based Model (EBM).
Experiments show strong performance of the proposed method on (1) unconditional ImageNet synthesis at 128x128 resolution, (2) refining the output of existing generators, and (3) learning EBMs that incorporate non-probabilistic generators.
arXiv Detail & Related papers (2022-10-29T03:55:34Z) - Gaussian-Bernoulli RBMs Without Tears [113.62579223055958]
We propose a novel Gibbs-Langevin sampling algorithm that outperforms existing methods like Gibbs sampling.
We propose a modified contrastive divergence (CD) algorithm so that one can generate images with GRBMs starting from noise.
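The Gaussian-Bernoulli RBM (GRBM) this entry refers to pairs real-valued visible units with binary hidden units. A minimal Gibbs chain started from noise can be sketched as below; this is a generic unit-variance GRBM sampler for illustration, not the paper's Gibbs-Langevin algorithm or its modified CD, and all sizes and parameters are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative unit-variance Gaussian-Bernoulli RBM (hypothetical sizes).
n_v, n_h = 5, 3
W = rng.normal(0, 0.1, size=(n_v, n_h))
b = rng.normal(size=n_v)   # visible (Gaussian) biases
c = rng.normal(size=n_h)   # hidden (Bernoulli) biases

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_step(v):
    """One block-Gibbs sweep: Bernoulli hidden given v, Gaussian visible given h."""
    p_h = sigmoid(v @ W + c)
    h = (rng.random(n_h) < p_h).astype(float)
    v_new = rng.normal(loc=b + W @ h, scale=1.0)  # p(v|h) is Gaussian
    return v_new, h

# Start the chain from pure noise, as in sampling-from-noise schemes.
v = rng.normal(size=n_v)
for _ in range(100):
    v, h = gibbs_step(v)
print(v)
```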
arXiv Detail & Related papers (2022-10-19T06:22:55Z) - Probabilistic Gradient Boosting Machines for Large-Scale Probabilistic
Regression [51.770998056563094]
Probabilistic Gradient Boosting Machines (PGBM) is a method to create probabilistic predictions with a single ensemble of decision trees.
We empirically demonstrate the advantages of PGBM compared to existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-03T08:32:13Z) - Boltzmann machines as two-dimensional tensor networks [7.041258064903578]
We show that RBM and DBM can be exactly represented as a two-dimensional tensor network.
This representation gives an understanding of the expressive power of RBM and DBM.
It also provides an efficient tensor network contraction algorithm for computing the partition function of RBM and DBM.
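For an RBM specifically, the partition function already has exploitable structure: the binary hidden units can be summed out analytically, leaving a sum over visible configurations only. The sketch below verifies this marginalization against brute-force enumeration on a tiny RBM; it is illustrative only (the paper's tensor-network contraction is what scales beyond such toy sizes), and the parameters are random assumptions.

```python
import itertools
import numpy as np

rng = np.random.default_rng(1)

# Tiny RBM so that brute-force enumeration stays feasible.
n_v, n_h = 4, 3
W = rng.normal(size=(n_v, n_h))
b = rng.normal(size=n_v)
c = rng.normal(size=n_h)

def log_Z_sum_out_hidden():
    """Sum the hidden units analytically; enumerate only the 2^n_v visible states."""
    total = 0.0
    for v in itertools.product([0, 1], repeat=n_v):
        v = np.array(v, dtype=float)
        # sum_h exp(c.h + v.W.h) = prod_j (1 + exp(c_j + (v W)_j))
        total += np.exp(v @ b + np.sum(np.log1p(np.exp(c + v @ W))))
    return np.log(total)

def log_Z_brute_force():
    """Enumerate both layers; must agree with the marginalized version."""
    total = 0.0
    for v in itertools.product([0, 1], repeat=n_v):
        for h in itertools.product([0, 1], repeat=n_h):
            v_, h_ = np.array(v, dtype=float), np.array(h, dtype=float)
            total += np.exp(v_ @ b + h_ @ c + v_ @ W @ h_)
    return np.log(total)

print(log_Z_sum_out_hidden(), log_Z_brute_force())
```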
arXiv Detail & Related papers (2021-05-10T06:14:49Z) - On the mapping between Hopfield networks and Restricted Boltzmann
Machines [0.0]
We show an exact mapping between Hopfield networks (HNs) and Restricted Boltzmann Machines (RBMs).
We outline the conditions under which the reverse mapping exists, and conduct experiments on the MNIST dataset.
We discuss extensions, the potential importance of this correspondence for the training of RBMs, and for understanding the performance of deep architectures which utilize RBMs.
arXiv Detail & Related papers (2021-01-27T23:49:48Z) - Using Restricted Boltzmann Machines to Model Molecular Geometries [0.0]
This paper proposes a new methodology for modeling a set of physical parameters by taking advantage of the Boltzmann machine's fast learning capacity and representational power.
It introduces a new RBM based on the tanh activation function and compares RBMs with different activation functions.
We demonstrate the ability of Gaussian RBMs to model small molecules such as water and ethane.
arXiv Detail & Related papers (2020-12-13T07:02:32Z) - Restricted Boltzmann Machine, recent advances and mean-field theory [0.8702432681310401]
This review examines the Restricted Boltzmann Machine (RBM) in the light of statistical physics.
The RBM is a classical family of machine learning (ML) models that played a central role in the development of deep learning.
arXiv Detail & Related papers (2020-11-23T10:08:53Z) - Exact representations of many body interactions with RBM neural networks [77.34726150561087]
We exploit the representation power of RBMs to provide an exact decomposition of many-body contact interactions into one-body operators.
This construction generalizes the well known Hirsch's transform used for the Hubbard model to more complicated theories such as Pionless EFT in nuclear physics.
arXiv Detail & Related papers (2020-05-07T15:59:29Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.