Improved Variational Bayesian Phylogenetic Inference with Normalizing Flows
- URL: http://arxiv.org/abs/2012.00459v1
- Date: Tue, 1 Dec 2020 13:10:00 GMT
- Title: Improved Variational Bayesian Phylogenetic Inference with Normalizing Flows
- Authors: Cheng Zhang
- Abstract summary: We propose a new type of VBPI, VBPI-NF, as a first step to empower phylogenetic posterior estimation with deep learning techniques.
VBPI-NF uses normalizing flows to provide a rich family of flexible branch length distributions that generalize across different tree topologies.
- Score: 7.119831726757417
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Variational Bayesian phylogenetic inference (VBPI) provides a promising
general variational framework for efficient estimation of phylogenetic
posteriors. However, the current diagonal Lognormal branch length approximation
would significantly restrict the quality of the approximating distributions. In
this paper, we propose a new type of VBPI, VBPI-NF, as a first step to empower
phylogenetic posterior estimation with deep learning techniques. By handling
the non-Euclidean branch length space of phylogenetic models with carefully
designed permutation equivariant transformations, VBPI-NF uses normalizing
flows to provide a rich family of flexible branch length distributions that
generalize across different tree topologies. We show that VBPI-NF significantly
improves upon the vanilla VBPI on a benchmark of challenging real data Bayesian
phylogenetic inference problems. Further investigation also reveals that the
structured parameterization in those permutation equivariant transformations
can provide additional amortization benefit.
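For intuition, the sketch below (NumPy) shows the kind of building block such a flow can use: one RealNVP-style affine coupling layer whose conditioner embeds each conditioning edge with shared weights and sum-pools, so the update of the transformed edges does not depend on how the conditioning edges are ordered. This is a toy under stated assumptions, not VBPI-NF's architecture; the paper ties flow parameters to tree topologies through structured parameterizations with learnable topological features, for which the random `feat` array below is only a stand-in.

```python
# Minimal sketch (NumPy): one RealNVP-style affine coupling layer whose
# conditioner is a shared per-edge embedding followed by sum pooling, so the
# layer is insensitive to how the conditioning edges are ordered. The `feat`
# array is a stand-in for VBPI-NF's learnable topological features; all
# shapes and weights here are illustrative assumptions, not the paper's
# architecture.
import numpy as np

rng = np.random.default_rng(0)
n, hid = 6, 8                               # edges per half, hidden width
W_embed = rng.normal(0, 0.1, (1, hid))      # shared per-edge embedding
W_out = rng.normal(0, 0.1, (hid + 1, 2))    # [pooled, feat_j] -> (log_s, t)

def coupling_forward(z_a, z_b, feat_b):
    """Transform z_b conditioned on z_a; z_a passes through unchanged."""
    h = np.tanh(z_a[:, None] @ W_embed)     # (n, hid), weights tied per edge
    pooled = h.sum(axis=0)                  # symmetric in the ordering of z_a
    inp = np.column_stack([np.tile(pooled, (len(z_b), 1)), feat_b])
    log_s, t = (inp @ W_out).T              # per-edge scale and shift
    return z_a, z_b * np.exp(log_s) + t, log_s.sum()  # last term: log|det J|

z = rng.normal(size=2 * n)                  # unconstrained log branch lengths
feat = rng.normal(size=(n, 1))              # stand-in topological features
z_a, z_b_new, log_det = coupling_forward(z[:n], z[n:], feat)
print(z_b_new.shape, log_det)
```

The layer inverts in closed form via z_b = (z_b_new - t) * np.exp(-log_s), and stacking such layers with alternating halves gives a flow whose density follows from the summed log-determinants.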
Related papers
- Improving Tree Probability Estimation with Stochastic Optimization and Variance Reduction [11.417249588622926]
Subsplit Bayesian networks (SBNs) provide a powerful probabilistic graphical model for tree probability estimation.
The expectation maximization (EM) method currently used for learning SBN parameters does not scale up to large data sets.
We introduce several computationally efficient methods for training SBNs and show that variance reduction could be the key for better performance.
arXiv Detail & Related papers (2024-09-09T02:22:52Z)
- Variational Bayesian Phylogenetic Inference with Semi-implicit Branch Length Distributions [6.553961278427792]
We propose a more flexible family of branch length variational posteriors based on semi-implicit hierarchical distributions using graph neural networks.
We show that this construction yields straightforward permutation equivariant distributions, and therefore can handle the non-Euclidean branch length space across different tree topologies with ease.
arXiv Detail & Related papers (2024-08-09T13:29:08Z)
- PhyloGFN: Phylogenetic inference with generative flow networks [57.104166650526416]
We introduce the framework of generative flow networks (GFlowNets) to tackle two core problems in phylogenetics: parsimony-based and Bayesian phylogenetic inference.
Because GFlowNets are well-suited for sampling complex structures, they are a natural choice for exploring and sampling from the multimodal posterior distribution over tree topologies.
We demonstrate that our amortized posterior sampler, PhyloGFN, produces diverse and high-quality evolutionary hypotheses on real benchmark datasets; a sketch of the trajectory-balance objective typically used to train such models appears after this list.
arXiv Detail & Related papers (2023-10-12T23:46:08Z)
- Improved Variational Bayesian Phylogenetic Inference using Mixtures [4.551386476350572]
VBPI-Mixtures is an algorithm designed to enhance the accuracy of phylogenetic posterior distributions.
VBPI-Mixtures is capable of capturing distributions over tree topologies that VBPI fails to model.
arXiv Detail & Related papers (2023-10-02T07:18:48Z)
- Prior Density Learning in Variational Bayesian Phylogenetic Parameters Inference [1.03590082373586]
We propose an approach that relaxes the rigidity of the prior densities by learning their parameters using a gradient-based method and a neural network-based parameterization.
Simulation results show that the approach is effective at estimating branch lengths and evolutionary model parameters.
arXiv Detail & Related papers (2023-02-06T01:29:15Z)
- Manifold Gaussian Variational Bayes on the Precision Matrix [70.44024861252554]
We propose an optimization algorithm for Variational Inference (VI) in complex models.
We develop an efficient algorithm for Gaussian Variational Inference whose updates satisfy the positive definite constraint on the variational covariance matrix.
Due to its black-box nature, MGVBP stands as a ready-to-use solution for VI in complex models; a sketch of a simpler positive-definite parameterization for the same constraint appears after this list.
arXiv Detail & Related papers (2022-10-26T10:12:31Z)
- Quasi Black-Box Variational Inference with Natural Gradients for Bayesian Learning [84.90242084523565]
We develop an optimization algorithm suitable for Bayesian learning in complex models.
Our approach relies on natural gradient updates within a general black-box framework for efficient training with limited model-specific derivations.
arXiv Detail & Related papers (2022-05-23T18:54:27Z)
- Fat-Tailed Variational Inference with Anisotropic Tail Adaptive Flows [53.32246823168763]
Fat-tailed densities commonly arise as posterior and marginal distributions in robust models and scale mixtures.
We first improve previous theory on the tails of Lipschitz flows by quantifying how they constrain the rate of tail decay.
We then develop an alternative theory for tail parameters which is sensitive to tail-anisotropy.
arXiv Detail & Related papers (2022-05-16T18:03:41Z)
- A Variational Approach to Bayesian Phylogenetic Inference [7.251627034538359]
We present a variational framework for Bayesian phylogenetic analysis.
We train the variational approximation via stochastic gradient ascent and adopt suitable gradient estimators for the continuous and discrete variational parameters.
Experiments on a benchmark of challenging real data phylogenetic inference problems demonstrate the effectiveness and efficiency of our methods.
arXiv Detail & Related papers (2022-04-16T08:23:48Z)
- A Variational Bayesian Approach to Learning Latent Variables for Acoustic Knowledge Transfer [55.20627066525205]
We propose a variational Bayesian (VB) approach to learning distributions of latent variables in deep neural network (DNN) models.
Our proposed VB approach can obtain good improvements on target devices, and consistently outperforms 13 state-of-the-art knowledge transfer algorithms.
arXiv Detail & Related papers (2021-10-16T15:54:01Z)
- Quantitative Understanding of VAE as a Non-linearly Scaled Isometric Embedding [52.48298164494608]
Variational autoencoder (VAE) estimates the posterior parameters of latent variables corresponding to each input data.
This paper provides a quantitative understanding of VAE properties through differential geometric and information-theoretic interpretations of the VAE.
arXiv Detail & Related papers (2020-07-30T02:37:46Z)
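As referenced in the PhyloGFN entry above, GFlowNets of this kind are commonly trained with the trajectory balance objective of Malkin et al. (2022). The sketch below shows only that generic loss on placeholder arrays; PhyloGFN's actual states, policies, and rewards are much richer and are not modeled here.

```python
# Generic trajectory balance loss for training a GFlowNet (Malkin et al.,
# 2022). Inputs are placeholder NumPy arrays, not PhyloGFN's API: log_pf and
# log_pb hold the per-step forward/backward log-probabilities along one
# sampled trajectory, and log_reward is the log unnormalized posterior of
# the final tree.
import numpy as np

def trajectory_balance_loss(log_Z, log_pf, log_pb, log_reward):
    # Zero exactly when the sampler draws terminal states (trees) with
    # probability proportional to the reward, i.e. the target posterior.
    resid = log_Z + np.sum(log_pf) - log_reward - np.sum(log_pb)
    return resid ** 2

# Toy call with made-up numbers:
print(trajectory_balance_loss(1.3, np.log([0.5, 0.4]),
                              np.log([1.0, 0.5]), log_reward=-2.0))
```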
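For the precision-matrix entry: MGVBP's manifold updates do not fit in a short snippet, but the constraint it handles can also be met by a simpler, standard device, optimizing an unconstrained Cholesky factor so that every gradient step yields a valid precision matrix. A minimal sketch of that substitute technique (not the paper's method):

```python
# Standard Cholesky trick (not MGVBP's manifold updates): parameterize the
# precision matrix by an unconstrained lower-triangular factor whose
# diagonal is stored on a log scale, so any parameter setting maps to a
# positive definite matrix.
import numpy as np

def precision(L_raw):
    L = np.tril(L_raw, k=-1) + np.diag(np.exp(np.diag(L_raw)))
    return L @ L.T                      # positive definite by construction

L_raw = np.zeros((3, 3))                # unconstrained parameters
P = precision(L_raw + 0.3)              # any perturbation stays valid
assert np.all(np.linalg.eigvalsh(P) > 0)
print(P)
```

Gradient-based VI can then update L_raw freely; this enforces the same positive-definiteness that MGVBP maintains directly on the precision manifold.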