Analysis of Bayesian Inference Algorithms by the Dynamical Functional
Approach
- URL: http://arxiv.org/abs/2001.04918v1
- Date: Tue, 14 Jan 2020 17:22:02 GMT
- Title: Analysis of Bayesian Inference Algorithms by the Dynamical Functional
Approach
- Authors: Burak Çakmak and Manfred Opper
- Abstract summary: We analyze the dynamics of an algorithm for approximate inference with large Gaussian latent variable models in a student-teacher scenario.
For the case of perfect data-model matching, the knowledge of static order parameters derived from the replica method allows us to obtain efficient algorithmic updates.
- Score: 2.8021833233819486
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We analyze the dynamics of an algorithm for approximate inference with large
Gaussian latent variable models in a student-teacher scenario. To model
nontrivial dependencies between the latent variables, we assume random
covariance matrices drawn from rotation invariant ensembles. For the case of
perfect data-model matching, the knowledge of static order parameters derived
from the replica method allows us to obtain efficient algorithmic updates in
terms of matrix-vector multiplications with a fixed matrix. Using the dynamical
functional approach, we obtain an exact effective stochastic process in the
thermodynamic limit for a single node. From this, we obtain closed-form
expressions for the rate of convergence. Analytical results are in excellent
agreement with simulations of single instances of large models.
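The fixed-matrix update structure described in the abstract can be illustrated with a minimal toy sketch. Everything below is an assumption for illustration: a generic symmetric positive definite matrix built from a random rotation, and a Richardson-type fixed-point iteration whose cost per step is one matrix-vector product with a fixed matrix. This is not the paper's exact algorithm, but it shows how a closed-form convergence rate (the spectral radius of the iteration matrix) can be checked against a single large instance.

```python
import numpy as np

# Toy setup: A = Q diag(d) Q^T mimics a rotation-invariant ensemble.
rng = np.random.default_rng(0)
n = 200
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))
d = rng.uniform(0.5, 2.0, n)
A = Q @ np.diag(d) @ Q.T          # fixed symmetric positive definite matrix
b = rng.standard_normal(n)
x_true = np.linalg.solve(A, b)    # target (e.g., a posterior mean)

# Richardson iteration: each update is one matrix-vector product with A.
gamma = 2.0 / (d.min() + d.max())  # step size minimizing the spectral radius
x = np.zeros(n)
errs = []
for _ in range(40):
    x = x + gamma * (b - A @ x)
    errs.append(np.linalg.norm(x - x_true))

# Linear convergence: the error shrinks by a fixed factor per step,
# equal to the spectral radius of I - gamma * A.
predicted = max(abs(1 - gamma * d.min()), abs(1 - gamma * d.max()))
rate = errs[-1] / errs[-2]
```

On a single large instance the empirically measured `rate` matches the spectral prediction, which is the kind of theory-versus-simulation agreement the abstract reports for the actual algorithm.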
Related papers
- Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood
Estimation for Latent Gaussian Models [69.22568644711113]
We introduce probabilistic unrolling, a method that combines Monte Carlo sampling with iterative linear solvers to circumvent matrix inversions.
Our theoretical analyses reveal that unrolling and backpropagation through the iterations of the solver can accelerate gradient estimation for maximum likelihood estimation.
In experiments on simulated and real data, we demonstrate that probabilistic unrolling learns latent Gaussian models up to an order of magnitude faster than gradient EM, with minimal losses in model performance.
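The inversion-free idea in this entry can be sketched with a generic iterative linear solver. The conjugate gradient implementation and the synthetic positive definite system below are illustrative assumptions, not the paper's implementation; they only show how a solve can replace an explicit inverse using matrix-vector products alone.

```python
import numpy as np

def conjugate_gradient(A, b, n_iters=50, tol=1e-10):
    """Solve A x = b for symmetric positive definite A without forming A^-1."""
    x = np.zeros_like(b)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs = r @ r
    for _ in range(n_iters):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

# Synthetic well-conditioned positive definite system.
rng = np.random.default_rng(1)
n = 100
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)
b = rng.standard_normal(n)
x_cg = conjugate_gradient(A, b)   # matches np.linalg.solve(A, b)
```

Each iteration only needs `A @ p`, which is what makes such solvers attractive inside an unrolled, backpropagation-friendly estimation loop.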
arXiv Detail & Related papers (2023-06-05T21:08:34Z) - Auxiliary Functions as Koopman Observables: Data-Driven Analysis of
Dynamical Systems via Polynomial Optimization [0.0]
We present a flexible data-driven method for system analysis that does not require explicit model discovery.
The method is rooted in well-established techniques for approximating the Koopman operator from data and is implemented as a semidefinite program that can be solved numerically.
arXiv Detail & Related papers (2023-03-02T18:44:18Z) - Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, the resulting method, Manifold Gaussian Variational Bayes on the Precision matrix (MGVBP), stands as a ready-to-use solution for VI in complex models.
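A hedged illustration of the positive definite constraint mentioned above (not the MGVBP update itself): one standard way to keep a variational covariance positive definite during unconstrained optimization is to parameterize a lower-triangular Cholesky factor with a strictly positive diagonal. The construction below is a generic sketch under that assumption.

```python
import numpy as np

# Unconstrained parameters: a lower-triangular matrix with arbitrary entries.
rng = np.random.default_rng(2)
n = 5
L = np.tril(rng.standard_normal((n, n)))

# Map the diagonal through exp so it is strictly positive; L is then
# nonsingular and Sigma = L L^T is guaranteed symmetric positive definite.
np.fill_diagonal(L, np.exp(np.diag(L)))
Sigma = L @ L.T
eigvals = np.linalg.eigvalsh(Sigma)   # all strictly positive
```

Gradient steps on the entries of `L` can then never leave the positive definite cone, which is the practical point of such parameterizations.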
arXiv Detail & Related papers (2022-10-26T10:12:31Z) - Learning Graphical Factor Models with Riemannian Optimization [70.13748170371889]
This paper proposes a flexible algorithmic framework for graph learning under low-rank structural constraints.
The problem is expressed as penalized maximum likelihood estimation of an elliptical distribution.
We leverage geometries of positive definite matrices and positive semi-definite matrices of fixed rank that are well suited to elliptical models.
arXiv Detail & Related papers (2022-10-21T13:19:45Z) - Sparse high-dimensional linear regression with a partitioned empirical
Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are imposed through the use of plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z) - Analysis of Random Sequential Message Passing Algorithms for Approximate
Inference [18.185200593985844]
We analyze the dynamics of a random sequential message passing algorithm for approximate inference with large Gaussian latent variable models.
We derive a range of model parameters for which the sequential algorithm does not converge.
arXiv Detail & Related papers (2022-02-16T17:16:22Z) - Generalized Matrix Factorization: efficient algorithms for fitting
generalized linear latent variable models to large data arrays [62.997667081978825]
Generalized Linear Latent Variable Models (GLLVMs) generalize Gaussian factor models to non-Gaussian responses.
Current algorithms for estimating model parameters in GLLVMs require intensive computation and do not scale to large datasets.
We propose a new approach for fitting GLLVMs to high-dimensional datasets, based on approximating the model using penalized quasi-likelihood.
arXiv Detail & Related papers (2020-10-06T04:28:19Z) - Model Fusion with Kullback--Leibler Divergence [58.20269014662046]
We propose a method to fuse posterior distributions learned from heterogeneous datasets.
Our algorithm relies on a mean field assumption for both the fused model and the individual dataset posteriors.
arXiv Detail & Related papers (2020-07-13T03:27:45Z) - A Dynamical Mean-Field Theory for Learning in Restricted Boltzmann
Machines [2.8021833233819486]
We define a message-passing algorithm for computing magnetizations in Boltzmann machines.
We prove the global convergence of the algorithm under a stability criterion and compute convergence rates showing excellent agreement with numerical simulations.
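As a hedged illustration of computing magnetizations by iteration (a naive mean-field fixed point, simpler than the paper's message-passing scheme; the weak random couplings and fields below are synthetic assumptions chosen so the iteration is a contraction):

```python
import numpy as np

# Synthetic weakly coupled system: symmetric couplings J, external fields h.
rng = np.random.default_rng(3)
n = 50
J = rng.standard_normal((n, n)) / (10 * np.sqrt(n))
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)
h = 0.1 * rng.standard_normal(n)

# Naive mean-field update: m_i <- tanh(sum_j J_ij m_j + h_i).
m = np.zeros(n)
for _ in range(200):
    m_new = np.tanh(J @ m + h)
    if np.max(np.abs(m_new - m)) < 1e-10:
        m = m_new
        break
    m = m_new
```

In the weak-coupling regime the map is a contraction, so the fixed point is reached geometrically fast; the paper's stability criterion plays the analogous role for its more refined algorithm.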
arXiv Detail & Related papers (2020-05-04T15:19:31Z) - Estimation of sparse Gaussian graphical models with hidden clustering
structure [8.258451067861932]
We propose a model to estimate sparse Gaussian graphical models with hidden clustering structure.
We develop a symmetric Gauss-Seidel based alternating direction method of multipliers (ADMM).
Numerical experiments on both synthetic data and real data demonstrate the good performance of our model.
arXiv Detail & Related papers (2020-04-17T08:43:31Z) - Semi-analytic approximate stability selection for correlated data in
generalized linear models [3.42658286826597]
We propose a novel approximate inference algorithm that can conduct Stability Selection without repeated model fitting.
The algorithm is based on the replica method of statistical mechanics and vector approximate message passing from information theory.
Numerical experiments indicate that the algorithm exhibits fast convergence and high approximation accuracy for both synthetic and real-world data.
arXiv Detail & Related papers (2020-03-19T10:43:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.