Inferring effective couplings with Restricted Boltzmann Machines
- URL: http://arxiv.org/abs/2309.02292v3
- Date: Wed, 24 Jan 2024 11:10:07 GMT
- Title: Inferring effective couplings with Restricted Boltzmann Machines
- Authors: Aurélien Decelle, Cyril Furtlehner, Alfonso De Jesus Navas Gómez,
Beatriz Seoane
- Abstract summary: Energy-based generative models attempt to encode the statistical correlations observed in the data at the level of the Boltzmann weight associated with an energy function in the form of a neural network.
We propose a solution by implementing a direct mapping between the Restricted Boltzmann Machine and an effective Ising spin Hamiltonian.
- Score: 3.150368120416908
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Generative models offer a direct way of modeling complex data. Energy-based
models attempt to encode the statistical correlations observed in the data at
the level of the Boltzmann weight associated with an energy function in the
form of a neural network. We address here the challenge of understanding the
physical interpretation of such models. In this study, we propose a simple
solution by implementing a direct mapping between the Restricted Boltzmann
Machine and an effective Ising spin Hamiltonian. This mapping includes
interactions of all possible orders, going beyond the conventional pairwise
interactions typically considered in the inverse Ising (or Boltzmann Machine)
approach, and allowing the description of complex datasets. Earlier works
attempted to achieve this goal, but the proposed mappings were inaccurate for
inference applications, did not properly treat the complexity of the problem,
or did not provide precise prescriptions for practical application. To validate
our method, we performed several controlled inverse numerical experiments in
which we trained the RBMs using equilibrium samples of predefined models with
local external fields, 2-body and 3-body interactions in different sparse
topologies. The results demonstrate the effectiveness of our proposed approach
in learning the correct interaction network and pave the way for its
application to modeling binary-variable datasets of interest. We also evaluate
the quality of models inferred with different training methods.
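The core object of such a mapping can be sketched directly: tracing out the binary hidden units of an RBM yields an effective energy over the visible units, and the multilinear expansion of that energy in {0,1} variables defines effective couplings of all orders. The snippet below is a minimal illustration with toy random weights, not the paper's exact prescription; it recovers the effective 2-body coefficient as a discrete second difference of the marginal energy.

```python
import numpy as np

rng = np.random.default_rng(0)
n_vis, n_hid = 6, 4
# Toy RBM parameters (random; a trained machine would supply these)
W = 0.3 * rng.standard_normal((n_vis, n_hid))
a = 0.1 * rng.standard_normal(n_vis)   # visible biases
b = 0.1 * rng.standard_normal(n_hid)   # hidden biases

def eff_energy(v):
    """Effective energy after tracing out binary hidden units:
    H(v) = -a.v - sum_mu log(1 + exp(b_mu + (v W)_mu))."""
    return -v @ a - np.logaddexp(0.0, b + v @ W).sum()

def pair_coupling(i, j):
    """Effective 2-body coefficient of the {0,1} multilinear expansion,
    obtained as a discrete second difference around the all-zeros state."""
    v = np.zeros(n_vis)
    e00 = eff_energy(v)
    v[i] = 1.0; e10 = eff_energy(v)
    v[i], v[j] = 0.0, 1.0; e01 = eff_energy(v)
    v[i] = 1.0; e11 = eff_energy(v)
    return -(e11 - e10 - e01 + e00)

J = np.zeros((n_vis, n_vis))
for i in range(n_vis):
    for j in range(i + 1, n_vis):
        J[i, j] = J[j, i] = pair_coupling(i, j)
```

Higher-order couplings follow from higher-order differences in the same way; the paper's mapping additionally addresses the spin convention and the accuracy issues that made earlier mappings unreliable for inference.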
Related papers
- Conditional score-based diffusion models for solving inverse problems in mechanics [6.319616423658121]
We propose a framework to perform Bayesian inference using conditional score-based diffusion models.
Conditional score-based diffusion models are generative models that learn to approximate the score function of a conditional distribution.
We demonstrate the efficacy of the proposed approach on a suite of high-dimensional inverse problems in mechanics.
arXiv Detail & Related papers (2024-06-19T02:09:15Z) - Integrating GNN and Neural ODEs for Estimating Non-Reciprocal Two-Body Interactions in Mixed-Species Collective Motion [0.0]
We present a novel deep learning framework for estimating the underlying equations of motion from observed trajectories.
Our framework integrates graph neural networks with neural differential equations, enabling effective prediction of two-body interactions.
arXiv Detail & Related papers (2024-05-26T09:47:17Z) - Fusion of Gaussian Processes Predictions with Monte Carlo Sampling [61.31380086717422]
In science and engineering, we often work with models designed for accurate prediction of variables of interest.
Recognizing that these models are approximations of reality, it becomes desirable to apply multiple models to the same data and integrate their outcomes.
arXiv Detail & Related papers (2024-03-03T04:21:21Z) - Towards Theoretical Understandings of Self-Consuming Generative Models [56.84592466204185]
This paper tackles the emerging challenge of training generative models within a self-consuming loop.
We construct a theoretical framework to rigorously evaluate how this training procedure impacts the data distributions learned by future models.
We present results for kernel density estimation, delivering nuanced insights such as the impact of mixed data training on error propagation.
arXiv Detail & Related papers (2024-02-19T02:08:09Z) - Efficient Training of Energy-Based Models Using Jarzynski Equality [13.636994997309307]
Energy-based models (EBMs) are generative models inspired by statistical physics.
Training them by maximum likelihood requires the gradient of the log-likelihood with respect to the model parameters, which in turn requires sampling the model distribution.
Here we show how results for nonequilibrium thermodynamics based on Jarzynski equality can be used to perform this computation efficiently.
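To make the difficulty concrete, here is a sketch of the standard two-phase maximum-likelihood gradient whose negative phase the Jarzynski-based method targets. The toy energy E(x) = θx²/2 gives a Gaussian model, so the model expectation can be sampled exactly here; for a general EBM this step needs MCMC, which is what makes training costly.

```python
import numpy as np

rng = np.random.default_rng(1)
data = rng.normal(0.0, 0.5, size=10000)  # synthetic data; true theta = 1/var = 4

theta = 1.0  # parameter of the toy energy E(x) = theta * x^2 / 2
for _ in range(500):
    # Positive phase: E_data[dE/dtheta] = mean of x^2/2 over the data
    pos = np.mean(data**2) / 2.0
    # Negative phase: same expectation under p(x) ∝ exp(-theta x^2/2),
    # i.e. a Gaussian with variance 1/theta (exact sampling only in this toy)
    model_samples = rng.normal(0.0, 1.0 / np.sqrt(theta), size=10000)
    neg = np.mean(model_samples**2) / 2.0
    # Ascend the log-likelihood: d/dtheta <log p> = -pos + neg
    theta += 1.0 * (neg - pos)
```

At convergence the two phases balance and θ approaches 1/Var(data) ≈ 4.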
arXiv Detail & Related papers (2023-05-30T21:07:52Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Mixed Effects Neural ODE: A Variational Approximation for Analyzing the
Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Lower Bounds (ELBOs) for ME-NODE, and develop efficient training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z) - MINIMALIST: Mutual INformatIon Maximization for Amortized Likelihood
Inference from Sampled Trajectories [61.3299263929289]
Simulation-based inference enables learning the parameters of a model even when its likelihood cannot be computed in practice.
One class of methods uses data simulated with different parameters to infer an amortized estimator for the likelihood-to-evidence ratio.
We show that this approach can be formulated in terms of mutual information between model parameters and simulated data.
arXiv Detail & Related papers (2021-06-03T12:59:16Z) - A Twin Neural Model for Uplift [59.38563723706796]
Uplift is a particular case of conditional treatment effect modeling.
We propose a new loss function defined by leveraging a connection with the Bayesian interpretation of the relative risk.
We show our proposed method is competitive with the state-of-the-art in simulation setting and on real data from large scale randomized experiments.
arXiv Detail & Related papers (2021-05-11T16:02:39Z) - Reconstruction of Pairwise Interactions using Energy-Based Models [3.553493344868414]
We show that hybrid models, which combine a pairwise model and a neural network, can lead to significant improvements in the reconstruction of pairwise interactions.
This is in line with the general idea that simple interpretable models and complex black-box models are not necessarily a dichotomy.
arXiv Detail & Related papers (2020-12-11T20:15:10Z) - Combining data assimilation and machine learning to emulate a dynamical
model from sparse and noisy observations: a case study with the Lorenz 96
model [0.0]
The method consists in applying iteratively a data assimilation step, here an ensemble Kalman filter, and a neural network.
Data assimilation is used to optimally combine a surrogate model with sparse data.
The output analysis is spatially complete and is used as a training set by the neural network to update the surrogate model.
Numerical experiments have been carried out using the chaotic 40-variable Lorenz 96 model, demonstrating both convergence and statistical skill of the proposed hybrid approach.
arXiv Detail & Related papers (2020-01-06T12:26:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.