Prior Density Learning in Variational Bayesian Phylogenetic Parameters Inference
- URL: http://arxiv.org/abs/2302.02522v3
- Date: Fri, 8 Sep 2023 21:06:06 GMT
- Title: Prior Density Learning in Variational Bayesian Phylogenetic Parameters Inference
- Authors: Amine M. Remita, Golrokh Vitae and Abdoulaye Baniré Diallo
- Abstract summary: We propose an approach that relaxes the rigidity of prior densities by learning their parameters with a gradient-based method and a neural-network-based parameterization. Simulations show that the approach is effective in estimating branch lengths and evolutionary model parameters.
- Score: 1.03590082373586
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Advances in variational inference are opening promising paths for Bayesian estimation problems. These advances make variational phylogenetic inference an alternative to Markov chain Monte Carlo methods for approximating the phylogenetic posterior. However, one of the main drawbacks of such approaches is modelling the prior through fixed distributions, which can bias the posterior approximation if they are far from the data distribution. In this paper, we propose an approach and an implementation framework to relax the rigidity of the prior densities by learning their parameters using a gradient-based method and a neural-network-based parameterization. We applied this approach to the estimation of branch lengths and evolutionary parameters under several Markov chain substitution models. Simulation results show that the approach is effective in estimating branch lengths and evolutionary model parameters. They also show that a flexible prior model can provide better results than a predefined prior model. Finally, the results highlight that using neural networks improves the initialization of the optimization of the prior density parameters.
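To make the mechanism concrete, below is a minimal sketch of the idea in PyTorch, not the authors' implementation: a log-normal mean-field posterior over branch lengths is optimized jointly with the rate of an exponential prior by stochastic gradient ascent on the ELBO. The toy likelihood, variable names, and distribution choices are illustrative assumptions.

```python
# Hypothetical sketch: learn the prior's parameters jointly with the
# variational parameters by gradient ascent on a Monte Carlo ELBO.
import torch

n_branches = 5

# Variational posterior q(b) = LogNormal(mu, sigma) over branch lengths.
q_mu = torch.zeros(n_branches, requires_grad=True)
q_log_sigma = torch.full((n_branches,), -2.0, requires_grad=True)

# Learnable prior p(b) = Exponential(rate); the rate is optimized on the
# log scale so it stays positive.
prior_log_rate = torch.zeros(1, requires_grad=True)

opt = torch.optim.Adam([q_mu, q_log_sigma, prior_log_rate], lr=0.01)

def log_likelihood(b):
    # Toy stand-in for the phylogenetic likelihood (e.g., Felsenstein's
    # pruning algorithm under a Markov substitution model).
    return -((b - 0.1) ** 2).sum()

for step in range(2000):
    opt.zero_grad()
    q = torch.distributions.LogNormal(q_mu, q_log_sigma.exp())
    prior = torch.distributions.Exponential(prior_log_rate.exp())
    b = q.rsample()  # reparameterized sample keeps gradients flowing
    # Single-sample Monte Carlo ELBO: E_q[log p(D|b) + log p(b) - log q(b)].
    elbo = log_likelihood(b) + prior.log_prob(b).sum() - q.log_prob(b).sum()
    (-elbo).backward()  # minimize the negative ELBO
    opt.step()
```

In the same spirit, the paper's neural-network variant would replace the free prior_log_rate above with the output of a small network, which the abstract reports mainly improves the initialization of the prior-parameter optimization.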
Related papers
- Variational Autoencoders for Efficient Simulation-Based Inference [0.3495246564946556]
We present a generative modeling approach based on the variational inference framework for likelihood-free simulation-based inference.
We demonstrate the efficacy of these models on well-established benchmark problems, achieving results comparable to flow-based approaches.
arXiv Detail & Related papers (2024-11-21T12:24:13Z)
- Joint Bayesian Inference of Graphical Structure and Parameters with a Single Generative Flow Network [59.79008107609297]
We propose in this paper to approximate the joint posterior over the structure and parameters of a Bayesian network.
We use a single GFlowNet whose sampling policy follows a two-phase process.
Since the parameters are included in the posterior distribution, this leaves more flexibility for the local probability models.
arXiv Detail & Related papers (2023-05-30T19:16:44Z)
- Bayesian Analysis for Over-parameterized Linear Model without Sparsity [8.1585306387285]
This study introduces a Bayesian approach that employs a prior distribution dependent on the eigenvectors of data covariance matrices without inducing parameter sparsity.
We also provide contraction rates for the derived posterior estimator and develop a truncated Gaussian approximation of the posterior distribution.
These findings suggest that Bayesian methods capable of handling data spectra and estimating non-sparse high-dimensional parameters are feasible.
arXiv Detail & Related papers (2023-05-25T06:07:47Z)
- Variational EP with Probabilistic Backpropagation for Bayesian Neural Networks [0.0]
I propose a novel approach for nonlinear logistic regression using a two-layer neural network (NN) model structure with hierarchical priors on the network weights.
I derive a computationally efficient algorithm, whose complexity scales similarly to an ensemble of independent sparse logistic models.
arXiv Detail & Related papers (2023-03-02T19:09:47Z)
- Variational Laplace Autoencoders [53.08170674326728]
Variational autoencoders employ an amortized inference model to approximate the posterior of latent variables.
We present a novel approach that addresses the limited posterior expressiveness of the fully-factorized Gaussian assumption.
We also present a general framework named Variational Laplace Autoencoders (VLAEs) for training deep generative models.
arXiv Detail & Related papers (2022-11-30T18:59:27Z)
- Sparse high-dimensional linear regression with a partitioned empirical Bayes ECM algorithm [62.997667081978825]
We propose a computationally efficient and powerful Bayesian approach for sparse high-dimensional linear regression.
Minimal prior assumptions on the parameters are imposed through plug-in empirical Bayes estimates.
The proposed approach is implemented in the R package probe.
arXiv Detail & Related papers (2022-09-16T19:15:50Z)
- A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z)
- Estimation of Switched Markov Polynomial NARX models [75.91002178647165]
We identify a class of models for hybrid dynamical systems characterized by nonlinear autoregressive exogenous (NARX) components.
The proposed approach is demonstrated on an SMNARX problem composed of three nonlinear sub-models with specific regressors.
arXiv Detail & Related papers (2020-09-29T15:00:47Z)
- Robust, Accurate Stochastic Optimization for Variational Inference [68.83746081733464]
We show that common optimization methods lead to poor variational approximations if the problem is moderately large.
Motivated by these findings, we develop a more robust and accurate optimization framework by viewing the underlying algorithm as producing a Markov chain.
arXiv Detail & Related papers (2020-09-01T19:12:11Z)
- Sparse Gaussian Processes Revisited: Bayesian Approaches to Inducing-Variable Approximations [27.43948386608]
Variational inference techniques based on inducing variables provide an elegant framework for scalable estimation in Gaussian process (GP) models.
In this work we challenge the common wisdom that optimizing the inducing inputs in the variational framework yields optimal performance.
arXiv Detail & Related papers (2020-03-06T08:53:18Z)
- The k-tied Normal Distribution: A Compact Parameterization of Gaussian Mean Field Posteriors in Bayesian Neural Networks [46.677567663908185]
Variational Bayesian inference is a popular methodology for approximating posteriors over Bayesian neural network weights.
Recent work has explored ever richer parameterizations of the approximate posterior in the hope of improving performance.
We find that by decomposing these variational parameters into a low-rank factorization, we can make the variational approximation more compact without decreasing the models' performance (see the sketch after this entry).
arXiv Detail & Related papers (2020-02-07T07:33:15Z)
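As a rough, hypothetical illustration of the low-rank idea in the k-tied entry above (shapes, names, and the softplus positivity constraint are assumptions, not the paper's code), the per-weight standard deviations of a mean-field Gaussian posterior over an m-by-n weight matrix can be tied through a rank-k factorization, shrinking m*n scale parameters to k*(m+n):

```python
# Hypothetical sketch of a rank-k (k-tied) parameterization of the standard
# deviations of a Gaussian mean-field posterior over a weight matrix.
import torch
import torch.nn.functional as F

m, n, k = 256, 128, 2

mu = torch.zeros(m, n, requires_grad=True)  # per-weight means stay free
u = torch.randn(m, k, requires_grad=True)   # unconstrained row factors
v = torch.randn(n, k, requires_grad=True)   # unconstrained column factors

# Rank-k standard-deviation matrix; softplus keeps every factor positive.
sigma = F.softplus(u) @ F.softplus(v).T     # shape (m, n)
q = torch.distributions.Normal(mu, sigma)
w = q.rsample()  # reparameterized weight sample for a stochastic forward pass
```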
This list is automatically generated from the titles and abstracts of the papers on this site.