elhmc: An R Package for Hamiltonian Monte Carlo Sampling in Bayesian
Empirical Likelihood
- URL: http://arxiv.org/abs/2209.01289v1
- Date: Fri, 2 Sep 2022 22:22:16 GMT
- Authors: Dang Trung Kien and Neo Han Wei and Sanjay Chaudhuri
- Abstract summary: We describe an R package for sampling from an empirical likelihood-based posterior using a Hamiltonian Monte Carlo method.
An MCMC sample is drawn from the BayesEL posterior of the parameters, together with various details requested by the user.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this article, we describe an R package for sampling from an empirical
likelihood-based posterior using a Hamiltonian Monte Carlo method. Empirical
likelihood-based methodologies have been used in Bayesian modeling of many
problems of interest in recent times. This semiparametric procedure can easily
combine the flexibility of a non-parametric distribution estimator together
with the interpretability of a parametric model. The model is specified by
estimating equations-based constraints. Drawing an inference from a Bayesian
empirical likelihood (BayesEL) posterior is challenging. The likelihood is
computed numerically, so no closed expression of the posterior exists.
Moreover, for any sample of finite size, the support of the likelihood is
non-convex, which hinders the fast mixing of many Markov Chain Monte Carlo
(MCMC) procedures. It has been recently shown that using the properties of the
gradient of log empirical likelihood, one can devise an efficient Hamiltonian
Monte Carlo (HMC) algorithm to sample from a BayesEL posterior.
The package requires the user to specify only the estimating equations, the
prior, and their respective gradients. It returns an MCMC sample drawn from the
BayesEL posterior of the parameters, together with various details requested by
the user.
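The sampler the abstract describes combines two ingredients: a numerical profile of the empirical likelihood (a Newton solve for the Lagrange multiplier of the estimating-equation constraints) and leapfrog HMC driven by the gradient of the log empirical likelihood. The following is a minimal sketch of that idea in Python rather than the package's R interface, for the simplest estimating equation g(x, θ) = x − θ (the mean) and a flat prior; all names and tuning constants are illustrative and are not part of elhmc:

```python
import math
import random

random.seed(7)

# Toy data; the estimating equation g(x, theta) = x - theta identifies the mean.
data = [random.gauss(1.0, 1.0) for _ in range(50)]
n = len(data)

def lam_solve(theta, iters=100):
    """Newton iteration for the Lagrange multiplier lambda(theta) solving
    sum_i g_i / (1 + lambda * g_i) = 0, with g_i = x_i - theta."""
    g = [x - theta for x in data]
    lam = 0.0
    for _ in range(iters):
        f = sum(gi / (1.0 + lam * gi) for gi in g)
        df = -sum((gi / (1.0 + lam * gi)) ** 2 for gi in g)
        step = f / df
        new = lam - step
        # Damp the step so every weight denominator 1 + lam*g_i stays positive.
        while any(1.0 + new * gi <= 1e-10 for gi in g):
            step *= 0.5
            new = lam - step
        if abs(new - lam) < 1e-12:
            lam = new
            break
        lam = new
    return lam

def log_el(theta):
    """Log empirical likelihood ratio and its gradient in theta.
    Returns (-inf, 0) outside the convex hull of the data, where the
    likelihood has no support."""
    if not (min(data) < theta < max(data)):
        return float("-inf"), 0.0
    lam = lam_solve(theta)
    g = [x - theta for x in data]
    val = -sum(math.log(1.0 + lam * gi) for gi in g)
    # Envelope theorem: d/dtheta log EL = n * lambda(theta) for g = x - theta.
    return val, n * lam

def hmc(theta0, n_samples=500, eps=0.01, n_leap=20):
    """Leapfrog HMC with a flat prior; off-support proposals are rejected."""
    theta = theta0
    logp, grad = log_el(theta)
    out = []
    for _ in range(n_samples):
        p0 = random.gauss(0.0, 1.0)
        th, lp, g, p = theta, logp, grad, p0
        ok = True
        for _ in range(n_leap):
            p += 0.5 * eps * g           # half step for momentum
            th += eps * p                # full step for position
            lp, g = log_el(th)
            if lp == float("-inf"):      # trajectory left the support: reject
                ok = False
                break
            p += 0.5 * eps * g           # second half step for momentum
        if ok:
            log_acc = (lp - 0.5 * p * p) - (logp - 0.5 * p0 * p0)
            if math.log(random.random()) < log_acc:
                theta, logp, grad = th, lp, g
        out.append(theta)
    return out

samples = hmc(theta0=sum(data) / n)
post_mean = sum(samples) / len(samples)
```

The gradient identity used here, d/dθ log EL(θ) = n·λ(θ), follows from the envelope theorem because λ(θ) is a stationary point of the dual objective; trajectories that leave the convex hull of the data, where the empirical likelihood has no support, are simply rejected, which mirrors the non-convex-support difficulty the abstract mentions.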
Related papers
- Feynman-Kac Correctors in Diffusion: Annealing, Guidance, and Product of Experts [64.34482582690927]
We provide an efficient and principled method for sampling from a sequence of annealed, geometric-averaged, or product distributions derived from pretrained score-based models.
We propose Sequential Monte Carlo (SMC) resampling algorithms that leverage inference-time scaling to improve sampling quality.
arXiv Detail & Related papers (2025-03-04T17:46:51Z) - von Mises Quasi-Processes for Bayesian Circular Regression [57.88921637944379]
We explore a family of expressive and interpretable distributions over circle-valued random functions.
The resulting probability model has connections with continuous spin models in statistical physics.
For posterior inference, we introduce a new Stratonovich-like augmentation that lends itself to fast Markov Chain Monte Carlo sampling.
arXiv Detail & Related papers (2024-06-19T01:57:21Z) - Gaussian Process Regression with Soft Inequality and Monotonicity Constraints [0.0]
We introduce a new GP method that enforces the physical constraints in a probabilistic manner.
This GP model is trained by the quantum-inspired Hamiltonian Monte Carlo (QHMC)
arXiv Detail & Related papers (2024-04-03T17:09:25Z) - Unbiased Kinetic Langevin Monte Carlo with Inexact Gradients [0.8749675983608172]
We present an unbiased method for posterior means based on kinetic Langevin dynamics.
Our proposed estimator is unbiased, attains finite variance, and satisfies a central limit theorem.
Our results demonstrate that in large-scale applications, the unbiased algorithm we present can be 2-3 orders of magnitude more efficient than the "gold-standard" randomized Hamiltonian Monte Carlo.
arXiv Detail & Related papers (2023-11-08T21:19:52Z) - Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood
Estimation for Latent Gaussian Models [69.22568644711113]
We introduce probabilistic unrolling, a method that combines Monte Carlo sampling with iterative linear solvers to circumvent matrix inversions.
Our theoretical analyses reveal that unrolling and backpropagation through the iterations of the solver can accelerate gradient estimation for maximum likelihood estimation.
In experiments on simulated and real data, we demonstrate that probabilistic unrolling learns latent Gaussian models up to an order of magnitude faster than gradient EM, with minimal losses in model performance.
arXiv Detail & Related papers (2023-06-05T21:08:34Z) - Object based Bayesian full-waveform inversion for shear elastography [0.0]
We develop a computational framework to quantify uncertainty in shear elastography imaging of anomalies in tissues.
We find the posterior probability of parameter fields representing the geometry of the anomalies and their shear moduli.
We demonstrate the approach on synthetic two dimensional tests with smooth and irregular shapes.
arXiv Detail & Related papers (2023-05-11T08:25:25Z) - A Two-step Metropolis Hastings Method for Bayesian Empirical Likelihood
Computation with Application to Bayesian Model Selection [0.0]
We propose a two-step reversible Metropolis-Hastings algorithm to sample from a posterior supported on a constrained set.
We also discuss empirical likelihood and extend our two-step algorithm to a jump-chain procedure for sampling from the resulting posterior.
arXiv Detail & Related papers (2022-09-02T20:40:21Z) - Low-variance estimation in the Plackett-Luce model via quasi-Monte Carlo
sampling [58.14878401145309]
We develop a novel approach to producing more sample-efficient estimators of expectations in the PL model.
We illustrate our findings both theoretically and empirically using real-world recommendation data from Amazon Music and the Yahoo learning-to-rank challenge.
arXiv Detail & Related papers (2022-05-12T11:15:47Z) - Inverting brain grey matter models with likelihood-free inference: a
tool for trustable cytoarchitecture measurements [62.997667081978825]
Characterisation of the brain grey matter cytoarchitecture with quantitative sensitivity to soma density and volume remains an unsolved challenge in dMRI.
We propose a new forward model, specifically a new system of equations, requiring a few relatively sparse b-shells.
We then apply modern tools from Bayesian analysis known as likelihood-free inference (LFI) to invert our proposed model.
arXiv Detail & Related papers (2021-11-15T09:08:27Z) - A fast asynchronous MCMC sampler for sparse Bayesian inference [10.535140830570256]
We propose a very fast approximate Markov Chain Monte Carlo (MCMC) sampling framework that is applicable to a large class of sparse Bayesian inference problems.
We show that in high-dimensional linear regression problems, the Markov chain generated by the proposed algorithm admits an invariant distribution that recovers correctly the main signal.
arXiv Detail & Related papers (2021-08-14T02:20:49Z) - Efficiently Sampling Functions from Gaussian Process Posteriors [76.94808614373609]
We propose an easy-to-use and general-purpose approach for fast posterior sampling.
We demonstrate how decoupled sample paths accurately represent Gaussian process posteriors at a fraction of the usual cost.
arXiv Detail & Related papers (2020-02-21T14:03:16Z) - Distributed, partially collapsed MCMC for Bayesian Nonparametrics [68.5279360794418]
We exploit the fact that completely random measures, in terms of which commonly used models such as the Dirichlet process and the beta-Bernoulli process can be expressed, decompose into independent sub-measures.
We use this decomposition to partition the latent measure into a finite measure containing only instantiated components, and an infinite measure containing all other components.
The resulting hybrid algorithm can be applied to allow scalable inference without sacrificing convergence guarantees.
arXiv Detail & Related papers (2020-01-15T23:10:13Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.