Variational Combinatorial Sequential Monte Carlo Methods for Bayesian
Phylogenetic Inference
- URL: http://arxiv.org/abs/2106.00075v1
- Date: Mon, 31 May 2021 19:44:24 GMT
- Title: Variational Combinatorial Sequential Monte Carlo Methods for Bayesian
Phylogenetic Inference
- Authors: Antonio Khalil Moretti, Liyi Zhang, Christian A. Naesseth, Hadiah
Venner, David Blei, Itsik Pe'er
- Abstract summary: We introduce Variational Combinatorial Sequential Monte Carlo (VCSMC), a powerful framework that establishes variational sequential search to learn distributions over intricate combinatorial structures.
We show that VCSMC and VNCSMC are computationally efficient and explore higher probability spaces than existing methods on a range of tasks.
- Score: 4.339931151475307
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Bayesian phylogenetic inference is often conducted via local or sequential
search over topologies and branch lengths using algorithms such as random-walk
Markov chain Monte Carlo (MCMC) or Combinatorial Sequential Monte Carlo (CSMC).
However, when MCMC is used for evolutionary parameter learning, convergence
requires long runs with inefficient exploration of the state space. We
introduce Variational Combinatorial Sequential Monte Carlo (VCSMC), a powerful
framework that establishes variational sequential search to learn distributions
over intricate combinatorial structures. We then develop nested CSMC, an
efficient proposal distribution for CSMC and prove that nested CSMC is an exact
approximation to the (intractable) locally optimal proposal. We use nested CSMC
to define a second objective, VNCSMC which yields tighter lower bounds than
VCSMC. We show that VCSMC and VNCSMC are computationally efficient and explore
higher probability spaces than existing methods on a range of tasks.
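The abstract describes VCSMC as a variational framework built on sequential Monte Carlo. As background, the generic SMC loop it extends — propagate particles through a proposal, reweight, resample, and accumulate an estimate of the normalizing constant — can be sketched as follows. This is a minimal generic skeleton, not the paper's VCSMC or CSMC; the function and parameter names are illustrative assumptions.

```python
import numpy as np

def smc(log_weight_steps, propose, n_particles=100, seed=0):
    """Generic SMC skeleton: at each step, propagate particles through the
    proposal, compute incremental importance weights, update the running
    log-normalizer estimate, and resample. Not the paper's VCSMC."""
    rng = np.random.default_rng(seed)
    particles = np.zeros(n_particles)          # toy 1-D initial states
    log_Z = 0.0                                # running log-normalizer estimate
    for log_w_fn in log_weight_steps:
        particles = propose(rng, particles)    # sample from the proposal
        log_w = log_w_fn(particles)            # incremental importance weights
        m = log_w.max()
        w = np.exp(log_w - m)
        log_Z += m + np.log(w.mean())          # stable log-mean-exp of weights
        idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())
        particles = particles[idx]             # multinomial resampling
    return particles, log_Z
```

In a variational scheme such as the one the abstract describes, the proposal would carry learnable parameters trained to tighten a lower bound on the marginal likelihood; here the proposal is fixed.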
Related papers
- Learning to Explore for Stochastic Gradient MCMC [15.286308920219446]
We propose a meta-learning strategy to build SGMCMC samplers which can efficiently explore multi-modal target distributions.
Our algorithm allows the learned SGMCMC to quickly explore the high-density region of the posterior landscape.
arXiv Detail & Related papers (2024-08-17T08:36:42Z) - Convergence Bounds for Sequential Monte Carlo on Multimodal Distributions using Soft Decomposition [6.872242798058046]
We prove bounds on the variance of a function $f$ under the empirical measure of the samples obtained by the Sequential Monte Carlo (SMC) algorithm.
We show that bounds can be obtained in the truly multi-modal setting, with mixing times that depend on local MCMC dynamics.
arXiv Detail & Related papers (2024-05-29T22:43:45Z) - SMC Is All You Need: Parallel Strong Scaling [0.695967916921061]
We develop a fully parallel sequential Monte Carlo (pSMC) method which provably delivers parallel strong scaling.
The pSMC converges to infinitesimal accuracy MSE$=O(\varepsilon^2)$ with a fixed finite time-complexity Cost$=O(1)$ and with no efficiency leakage.
arXiv Detail & Related papers (2024-02-09T04:13:38Z) - Bayesian Decision Trees Inspired from Evolutionary Algorithms [64.80360020499555]
We propose a replacement of the Markov Chain Monte Carlo (MCMC) with an inherently parallel algorithm, the Sequential Monte Carlo (SMC).
Experiments show that SMC combined with the Evolutionary Algorithms (EA) can produce more accurate results compared to MCMC in 100 times fewer iterations.
arXiv Detail & Related papers (2023-05-30T06:17:35Z) - Quick Adaptive Ternary Segmentation: An Efficient Decoding Procedure For
Hidden Markov Models [70.26374282390401]
Decoding the original signal (i.e., hidden chain) from the noisy observations is one of the main goals in nearly all HMM based data analyses.
We present Quick Adaptive Ternary (QATS), a divide-and-conquer procedure which decodes the hidden sequence in polylogarithmic computational complexity.
arXiv Detail & Related papers (2023-05-29T19:37:48Z) - Parallel Approaches to Accelerate Bayesian Decision Trees [1.9728521995447947]
We propose two methods for exploiting parallelism in the MCMC.
In the first, we replace the MCMC with another numerical Bayesian approach.
In the second, we consider data partitioning.
arXiv Detail & Related papers (2023-01-22T09:56:26Z) - Continuous Monte Carlo Graph Search [61.11769232283621]
Continuous Monte Carlo Graph Search (CMCGS) is an extension of Monte Carlo Tree Search (MCTS) to online planning.
CMCGS takes advantage of the insight that, during planning, sharing the same action policy between several states can yield high performance.
It can be scaled up through parallelization, and it outperforms the Cross-Entropy Method (CEM) in continuous control with learned dynamics models.
arXiv Detail & Related papers (2022-10-04T07:34:06Z) - Local Sample-weighted Multiple Kernel Clustering with Consensus
Discriminative Graph [73.68184322526338]
Multiple kernel clustering (MKC) is committed to achieving optimal information fusion from a set of base kernels.
This paper proposes a novel local sample-weighted multiple kernel clustering model.
Experimental results demonstrate that our LSWMKC possesses better local manifold representation and outperforms existing kernel or graph-based clustering algorithms.
arXiv Detail & Related papers (2022-07-05T05:00:38Z) - Compressed Monte Carlo with application in particle filtering [11.84836209560411]
We introduce the theory and practice of a Compressed MC (C-MC) scheme to compress the statistical information contained in a set of random samples.
C-MC is useful within particle filtering and adaptive IS algorithms, as shown by three novel schemes introduced in this work.
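One simple way to compress a set of weighted Monte Carlo samples, in the spirit of the C-MC idea above, is to partition the samples into groups and replace each group by its weighted mean carrying the group's total weight. This is an illustrative sketch of one such compression strategy, not the paper's exact schemes; the function name is an assumption.

```python
import numpy as np

def compress_samples(samples, weights, n_groups):
    """Compress N weighted 1-D samples into n_groups summary particles.
    Each group is replaced by its weighted mean, with the group's total
    weight attached; the overall weighted mean is preserved exactly."""
    order = np.argsort(samples)                # sort so groups are local
    groups = np.array_split(order, n_groups)
    summary, summary_w = [], []
    for g in groups:
        wg = weights[g]
        summary_w.append(wg.sum())
        summary.append(np.average(samples[g], weights=wg))
    return np.array(summary), np.array(summary_w)
```

Because each summary particle stores a weighted group mean, estimators that are linear in the samples (such as the posterior mean) are unchanged by the compression; higher moments are only approximated.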
arXiv Detail & Related papers (2021-07-18T14:32:04Z) - Kernel learning approaches for summarising and combining posterior
similarity matrices [68.8204255655161]
We build upon the notion of the posterior similarity matrix (PSM) in order to suggest new approaches for summarising the output of MCMC algorithms for Bayesian clustering models.
A key contribution of our work is the observation that PSMs are positive semi-definite, and hence can be used to define probabilistically-motivated kernel matrices.
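The PSD observation above follows because each sweep's co-clustering indicator matrix is the Gram matrix of one-hot cluster memberships, and an average of PSD matrices is PSD. A minimal sketch of building a PSM from MCMC cluster labelings (function name assumed, not from the paper):

```python
import numpy as np

def posterior_similarity_matrix(labelings):
    """Build the PSM from MCMC cluster labelings: entry (i, j) is the
    fraction of sweeps in which items i and j share a cluster. Each
    per-sweep co-clustering matrix is PSD (a Gram matrix of one-hot
    memberships), so the averaged PSM is PSD and can be used directly
    as a kernel matrix."""
    labelings = np.asarray(labelings)          # shape (n_sweeps, n_items)
    return np.mean(
        [np.equal.outer(z, z).astype(float) for z in labelings], axis=0)
```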
arXiv Detail & Related papers (2020-09-27T14:16:14Z) - Involutive MCMC: a Unifying Framework [64.46316409766764]
We describe a wide range of MCMC algorithms in terms of iMCMC.
We formulate a number of "tricks" which one can use as design principles for developing new MCMC algorithms.
We demonstrate the latter with two examples where we transform known reversible MCMC algorithms into more efficient irreversible ones.
arXiv Detail & Related papers (2020-06-30T10:21:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.