Weighted Riesz Particles
- URL: http://arxiv.org/abs/2312.00621v1
- Date: Fri, 1 Dec 2023 14:36:46 GMT
- Title: Weighted Riesz Particles
- Authors: Xiongming Dai, Gerald Baumgartner
- Abstract summary: We consider the target distribution as a mapping where the infinite-dimensional space of the parameters consists of a number of deterministic submanifolds.
We study the properties of these points, called Riesz particles, and embed them into sequential MCMC.
We find that there will be higher acceptance rates with fewer evaluations.
- Score: 0.0
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Markov chain Monte Carlo (MCMC) methods explore complex statistical distributions by local moves and thereby bypass the cumbersome requirement of a specific analytical expression for the target; however, this stochastic exploration of an uncertain parameter space comes at the expense of a large number of samples, and the computational cost grows with the parameter dimensionality. Although several techniques have been proposed to accelerate convergence at the exploration level, such as tempering, Hamiltonian Monte Carlo, Rao-Blackwellization, and scalable variants, they cannot avoid the stochastic nature of this exploration. We consider the target distribution as a mapping in which the infinite-dimensional Euclidean space of the parameters consists of a number of deterministic submanifolds, and we propose a generalized energy metric, termed the weighted Riesz energy, under which a set of points is generated through pairwise interactions to discretize rectifiable submanifolds. We study the properties of these points, called Riesz particles, and embed them into sequential MCMC; we find that they yield higher acceptance rates with fewer evaluations, which we validate through a comparative experimental analysis on a linear Gaussian state-space model with synthetic data and a non-linear stochastic volatility model with real-world data.
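As a rough illustration of the pairwise-interaction construction described in the abstract, the sketch below spreads points by gradient descent on a Riesz s-energy. The box domain, the constant weight function, the exponent s=2, the optimiser, and the helper names are illustrative assumptions; the paper's construction on rectifiable submanifolds and its embedding into sequential MCMC are not reproduced here.

```python
import numpy as np

def weighted_riesz_energy(x, weight, s=2.0):
    """Pairwise weighted Riesz s-energy: sum_{i<j} w(x_i, x_j) / ||x_i - x_j||^s."""
    n = len(x)
    e = 0.0
    for i in range(n):
        for j in range(i + 1, n):
            e += weight(x[i], x[j]) / np.linalg.norm(x[i] - x[j]) ** s
    return e

def riesz_particles(n, dim, weight, s=2.0, steps=500, lr=1e-3, seed=0):
    """Spread n particles by descending the weighted Riesz energy.

    Illustrative only: weights are treated as constants within each gradient
    step, and the particles are confined to a box rather than a submanifold.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1.0, 1.0, size=(n, dim))
    for _ in range(steps):
        grad = np.zeros_like(x)
        for i in range(n):
            others = np.delete(x, i, axis=0)
            diff = x[i] - others                                   # (n-1, dim)
            dist = np.linalg.norm(diff, axis=1, keepdims=True)
            w = np.array([weight(x[i], xj) for xj in others])
            grad[i] = (w[:, None] * (-s) * diff / dist ** (s + 2)).sum(axis=0)
        x -= lr * grad                                             # mutual repulsion step
        x = np.clip(x, -1.0, 1.0)
    return x

if __name__ == "__main__":
    w = lambda a, b: 1.0                                           # constant weight for the demo
    pts = riesz_particles(n=16, dim=2, weight=w)
    print("energy:", weighted_riesz_energy(pts, w))
```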
Related papers
- Kinetic Interacting Particle Langevin Monte Carlo [0.0]
This paper introduces and analyses interacting underdamped Langevin algorithms for statistical inference in latent variable models.
We propose a diffusion process that evolves jointly in the space of parameters and latent variables.
We provide two explicit discretisations of this diffusion as practical algorithms to estimate parameters of statistical models.
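For illustration only, here is a numpy sketch in the spirit of such algorithms: a parameter and a cloud of latent particles evolve jointly under a naively discretised underdamped (kinetic) Langevin dynamics, with the parameter drift averaged over the particles. The toy Gaussian latent-variable model, the Euler-Maruyama discretisation, and all tuning constants are assumptions, not the paper's scheme.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy latent-variable model: x_j ~ N(theta, 1), y_j | x_j ~ N(x_j, 1), flat prior on theta.
y = rng.normal(loc=2.0, size=50)            # synthetic observations
N = 100                                      # number of interacting latent particles
gamma, h, T = 1.0, 1e-2, 2000                # friction, step size, number of steps

theta, v_theta = 0.0, 0.0                    # parameter and its momentum
x = rng.normal(size=(N, y.size))             # latent particles (one copy of x per particle)
v_x = np.zeros_like(x)                       # latent momenta

def grad_theta(theta, x):
    """d/dtheta log p(theta, x, y), averaged over the particle cloud."""
    return (x - theta).sum(axis=1).mean()

def grad_x(theta, x):
    """d/dx log p(theta, x, y), computed for every particle."""
    return (theta - x) + (y - x)

for _ in range(T):
    # Naive Euler-Maruyama step of the underdamped (kinetic) Langevin system.
    v_theta += h * (grad_theta(theta, x) - gamma * v_theta) + np.sqrt(2 * gamma * h) * rng.normal()
    v_x += h * (grad_x(theta, x) - gamma * v_x) + np.sqrt(2 * gamma * h) * rng.normal(size=x.shape)
    theta += h * v_theta
    x += h * v_x

# For this model the marginal maximum-likelihood estimate of theta is the sample mean of y.
print("estimated theta:", theta, " sample mean of y:", y.mean())
```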
arXiv Detail & Related papers (2024-07-08T09:52:46Z) - Dimension-free Relaxation Times of Informed MCMC Samplers on Discrete Spaces [5.075066314996696]
We develop general mixing time bounds for Metropolis-Hastings algorithms on discrete spaces.
We establish sufficient conditions for a class of informed Metropolis-Hastings algorithms to attain relaxation times independent of the problem dimension.
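The sketch below shows one common instance of an informed proposal on binary vectors: a locally balanced flip proposal with weight function sqrt(t), followed by the usual Metropolis-Hastings correction. The toy target logpi and all constants are assumptions for illustration; the paper's contribution is the relaxation-time analysis of this class of samplers, not this particular implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 30
theta = rng.normal(size=d)

def logpi(x):
    """Unnormalised log target over {0,1}^d (a toy choice for the demo)."""
    return theta @ x - 0.1 * (x.sum() - d / 2) ** 2

def flip_logpis(x):
    """log pi of the d neighbours obtained by flipping one coordinate of x."""
    return np.array([logpi(np.where(np.arange(d) == i, 1 - x, x)) for i in range(d)])

def informed_mh(x, n_iter=2000):
    lp = logpi(x)
    for _ in range(n_iter):
        # Informed ("locally balanced") proposal: flip coordinate i with
        # probability proportional to sqrt(pi(x with bit i flipped) / pi(x)).
        nbr = flip_logpis(x)
        w = np.exp(0.5 * (nbr - lp))
        q_fwd = w / w.sum()
        i = rng.choice(d, p=q_fwd)
        x_new = x.copy()
        x_new[i] = 1 - x_new[i]
        lp_new = nbr[i]
        # Probability of proposing the reverse flip from the new state.
        w_new = np.exp(0.5 * (flip_logpis(x_new) - lp_new))
        q_rev = w_new[i] / w_new.sum()
        # Metropolis-Hastings acceptance for the informed proposal.
        if np.log(rng.uniform()) < (lp_new - lp) + np.log(q_rev) - np.log(q_fwd[i]):
            x, lp = x_new, lp_new
    return x

x = informed_mh(np.zeros(d, dtype=int))
print("final state:", x, " log pi:", logpi(x))
```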
arXiv Detail & Related papers (2024-04-05T02:40:45Z) - Chebyshev Particles [0.0]
We are the first to consider the posterior distribution of the objective as a mapping of samples in an infinite-dimensional Euclidean space.
We propose a new criterion by maximizing the weighted Riesz polarization quantity, to discretize rectifiable submanifolds via pairwise interaction.
We achieve high performance in experiments on parameter inference in a linear state-space model with synthetic data and a non-linear volatility model with real-world data.
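As a crude stand-in for the polarization criterion, the sketch below greedily places points from a candidate grid so that the minimum Riesz potential over the region is as large as possible. The unit-square domain, the grid, s=2, the greedy search, and the omission of the weights are all simplifying assumptions.

```python
import numpy as np

def polarization(points, grid, s=2.0, eps=1e-9):
    """Riesz s-polarization of a configuration: the minimum, over the grid, of
    the total potential sum_i 1/||y - x_i||^s generated by the points."""
    d = np.linalg.norm(grid[:, None, :] - points[None, :, :], axis=-1)
    return (1.0 / (d + eps) ** s).sum(axis=1).min()

def greedy_polarization_points(n, grid, s=2.0):
    """Greedily add the candidate that maximises the resulting polarization --
    an unweighted surrogate for maximising the weighted Riesz polarization."""
    pts = [grid[0]]
    for _ in range(n - 1):
        best_val, best_y = -np.inf, None
        for y in grid:
            val = polarization(np.array(pts + [y]), grid, s)
            if val > best_val:
                best_val, best_y = val, y
        pts.append(best_y)
    return np.array(pts)

# Discretise the unit square with 8 points chosen from a coarse candidate grid.
g = np.stack(np.meshgrid(np.linspace(0, 1, 15), np.linspace(0, 1, 15)), -1).reshape(-1, 2)
pts = greedy_polarization_points(8, g)
print("polarization of the configuration:", polarization(pts, g))
```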
arXiv Detail & Related papers (2023-09-10T16:40:30Z) - Probabilistic Unrolling: Scalable, Inverse-Free Maximum Likelihood Estimation for Latent Gaussian Models [69.22568644711113]
We introduce probabilistic unrolling, a method that combines Monte Carlo sampling with iterative linear solvers to circumvent matrix inversions.
Our theoretical analyses reveal that unrolling and backpropagation through the iterations of the solver can accelerate gradient estimation for maximum likelihood estimation.
In experiments on simulated and real data, we demonstrate that probabilistic unrolling learns latent Gaussian models up to an order of magnitude faster than gradient EM, with minimal losses in model performance.
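The inverse-free ingredient can be sketched as follows for a toy latent Gaussian (GP-style) model: the marginal log-likelihood gradient is estimated with conjugate gradients and Hutchinson trace probes instead of explicit matrix inverses. The kernel, the two-parameter covariance, and the plain CG loop are illustrative assumptions; the unrolled backpropagation through the solver iterations, which requires an automatic-differentiation framework, is omitted.

```python
import numpy as np

def cg(A_mv, b, iters=60, tol=1e-8):
    """Plain conjugate gradients: solve A x = b using only matrix-vector products."""
    x = np.zeros_like(b)
    r = b - A_mv(x)
    p, rs = r.copy(), r @ r
    for _ in range(iters):
        Ap = A_mv(p)
        a = rs / (p @ Ap)
        x, r = x + a * p, r - a * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p, rs = r + (rs_new / rs) * p, rs_new
    return x

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 60)
K = np.exp(-0.5 * (t[:, None] - t[None, :]) ** 2 / 0.1 ** 2)        # fixed RBF kernel
y = rng.multivariate_normal(np.zeros(60), 1.5 * K + 0.1 * np.eye(60))

def grad_loglik(theta, n_probes=10):
    """Monte Carlo + CG estimate of the marginal log-likelihood gradient for
    Sigma(theta) = exp(theta[0]) K + exp(theta[1]) I, with no explicit inverses."""
    s_k, s_n = np.exp(theta)
    Sigma_mv = lambda v: s_k * (K @ v) + s_n * v
    dSigmas = [lambda v: s_k * (K @ v), lambda v: s_n * v]           # dSigma/dtheta_i
    alpha = cg(Sigma_mv, y)                                          # Sigma^{-1} y via CG
    g = np.zeros(2)
    for i, dS in enumerate(dSigmas):
        quad = alpha @ dS(alpha)                                     # y' S^-1 dS S^-1 y
        probes = rng.choice([-1.0, 1.0], size=(n_probes, y.size))    # Rademacher probes
        trace = np.mean([v @ cg(Sigma_mv, dS(v)) for v in probes])   # tr(S^-1 dS)
        g[i] = 0.5 * (quad - trace)
    return g

print("gradient estimate at theta = (0, 0):", grad_loglik(np.zeros(2)))
```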
arXiv Detail & Related papers (2023-06-05T21:08:34Z) - Object based Bayesian full-waveform inversion for shear elastography [0.0]
We develop a computational framework to quantify uncertainty in shear elastography imaging of anomalies in tissues.
We find the posterior probability of parameter fields representing the geometry of the anomalies and their shear moduli.
We demonstrate the approach on synthetic two dimensional tests with smooth and irregular shapes.
arXiv Detail & Related papers (2023-05-11T08:25:25Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
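A dependency-free caricature of the surrogate-plus-gradients workflow: a cheap differentiable surrogate is fitted to simulated "spectra" over a grid of couplings and then used to recover an unknown coupling from noisy data by gradient descent. The toy simulator, the polynomial surrogate (standing in for the neural network), and the analytic polynomial derivative (standing in for automatic differentiation) are all assumptions made to keep the sketch self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "simulator": a toy spectrum S(q; J) playing the role of data
# generated from a model Hamiltonian with one unknown coupling J.
q = np.linspace(0, np.pi, 40)
simulate = lambda J: np.sin(q) ** 2 / (1 + J * np.cos(q) ** 2)

# 1) Offline: fit a differentiable surrogate to simulations on a coupling grid.
J_grid = np.linspace(0.2, 2.0, 30)
S_grid = np.array([simulate(J) for J in J_grid])                  # (30, 40)
coefs = [np.polyfit(J_grid, S_grid[:, k], 4) for k in range(q.size)]
dcoefs = [np.polyder(c) for c in coefs]
surrogate = lambda J: np.array([np.polyval(c, J) for c in coefs])
surrogate_grad = lambda J: np.array([np.polyval(c, J) for c in dcoefs])

# 2) Online: recover the coupling from noisy "experimental" data by gradient
#    descent on the squared error, differentiating through the surrogate.
J_true = 1.3
data = simulate(J_true) + 0.01 * rng.normal(size=q.size)
J = 0.5
for _ in range(500):
    resid = surrogate(J) - data
    J -= 0.2 * (2 * resid @ surrogate_grad(J))                    # d/dJ of the squared error
print("recovered J:", J, " true J:", J_true)
```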
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Monte Carlo Neural PDE Solver for Learning PDEs via Probabilistic Representation [59.45669299295436]
We propose a Monte Carlo PDE solver for training unsupervised neural solvers.
We use the PDEs' probabilistic representation, which regards macroscopic phenomena as ensembles of random particles.
Our experiments on convection-diffusion, Allen-Cahn, and Navier-Stokes equations demonstrate significant improvements in accuracy and efficiency.
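A toy version of the probabilistic representation: the solution of a convection-diffusion equation is written as an expectation over random particles driven by the associated SDE (a Feynman-Kac estimator) and compared with the closed-form solution for a Gaussian initial condition. The coefficients, the initial condition, and the omission of the neural solver trained on such estimates are all simplifications.

```python
import numpy as np

rng = np.random.default_rng(0)

# Convection-diffusion  u_t + a u_x = kappa u_xx  with initial condition u0.
a, kappa, t_final = 1.0, 0.05, 0.5
u0 = lambda x: np.exp(-0.5 * (x / 0.2) ** 2)                      # Gaussian bump

def u_mc(x, n_particles=20000, n_steps=50):
    """Monte Carlo estimate of u(x, t_final) from the probabilistic representation
    u(t, x) = E[u0(X_t)], where dX = -a dt + sqrt(2 kappa) dW and X_0 = x."""
    dt = t_final / n_steps
    X = np.full(n_particles, float(x))
    for _ in range(n_steps):
        X += -a * dt + np.sqrt(2 * kappa * dt) * rng.normal(size=n_particles)
    return u0(X).mean()

def u_exact(x):
    """Closed-form solution for the Gaussian initial condition, for comparison."""
    var = 0.2 ** 2 + 2 * kappa * t_final
    return np.sqrt(0.2 ** 2 / var) * np.exp(-0.5 * (x - a * t_final) ** 2 / var)

for x in [-0.5, 0.0, 0.5, 1.0]:
    print(f"x={x:+.1f}  MC={u_mc(x):.4f}  exact={u_exact(x):.4f}")
```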
arXiv Detail & Related papers (2023-02-10T08:05:19Z) - Nonconvex Stochastic Scaled-Gradient Descent and Generalized Eigenvector Problems [98.34292831923335]
Motivated by the problem of online correlation analysis, we propose the Stochastic Scaled-Gradient Descent (SSD) algorithm.
We bring these ideas together in an application to online correlation analysis, deriving for the first time an optimal one-time-scale algorithm with an explicit rate of local convergence to normality.
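For a rough sense of the setting, the sketch below performs stochastic ascent on the generalized Rayleigh quotient w'Aw / w'Bw from minibatch estimates of A and B, which is the generalized eigenvector problem the paper studies. This illustrative recursion, its step-size schedule, and the synthetic matrices are assumptions; it is not claimed to be the paper's exact SSD update.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 10

# Ground-truth matrices for the generalized eigenvector problem  A w = lambda B w.
Q = rng.normal(size=(d, d)); A = Q @ Q.T / d
C = rng.normal(size=(d, d)); B = np.eye(d) + 0.1 * C @ C.T / d

def sample_cov(M, n=50):
    """Unbiased minibatch estimate of M from n Gaussian samples with covariance M."""
    X = rng.multivariate_normal(np.zeros(d), M, size=n)
    return X.T @ X / n

w = rng.normal(size=d)
for t in range(3000):
    At, Bt = sample_cov(A), sample_cov(B)
    # Stochastic ascent on the generalized Rayleigh quotient w'Aw / w'Bw,
    # using only minibatch estimates of A and B.
    r = (w @ At @ w) / (w @ Bt @ w)
    grad = 2.0 * (At @ w - r * (Bt @ w)) / (w @ Bt @ w)
    w += 0.05 / (1 + 0.01 * t) * grad
    w /= np.linalg.norm(w)

# Compare with the exact leading generalized eigenvector of (A, B).
vals, vecs = np.linalg.eig(np.linalg.solve(B, A))
v = np.real(vecs[:, np.argmax(np.real(vals))])
v /= np.linalg.norm(v)
print("alignment |<w, v>|:", abs(w @ v))
```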
arXiv Detail & Related papers (2021-12-29T18:46:52Z) - Surrogate models for quantum spin systems based on reduced order modeling [0.0]
We present a methodology to investigate phase diagrams of quantum models based on the principle of the reduced basis method (RBM).
We benchmark the method in two test cases, a chain of excited Rydberg atoms and a geometrically frustrated antiferromagnetic two-dimensional lattice model.
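A toy reduced-basis scan for a small transverse-field Ising chain illustrates the offline/online split behind such a methodology: exact ground states at a few training couplings form a reduced basis, and the Hamiltonian projected onto that basis is then diagonalised cheaply across the coupling range. The model, chain length, training couplings, and the absence of greedy snapshot selection or error estimators are simplifying assumptions.

```python
import numpy as np

# Transverse-field Ising chain on L spins:  H(g) = -sum_i Z_i Z_{i+1} - g sum_i X_i.
L = 6
I2 = np.eye(2)
X = np.array([[0.0, 1.0], [1.0, 0.0]])
Z = np.diag([1.0, -1.0])

def embed(op, i):
    """Embed a single-site operator at site i in the 2^L-dimensional Hilbert space."""
    mats = [I2] * L
    mats[i] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

H_zz = -sum(embed(Z, i) @ embed(Z, i + 1) for i in range(L - 1))
H_x = -sum(embed(X, i) for i in range(L))
H = lambda g: H_zz + g * H_x

# 1) Offline stage: exact ground states ("snapshots") at a few training couplings.
snapshots = []
for g in [0.2, 0.6, 1.0, 1.4, 1.8]:
    _, v = np.linalg.eigh(H(g))
    snapshots.append(v[:, 0])
basis, _ = np.linalg.qr(np.array(snapshots).T)        # orthonormal reduced basis (64 x 5)

# 2) Online stage: project H(g) onto the reduced basis and solve a 5x5 eigenproblem
#    to sweep the ground-state energy across the coupling range cheaply.
for g in np.linspace(0.1, 1.9, 7):
    e_rb = np.linalg.eigh(basis.T @ H(g) @ basis)[0][0]
    e_exact = np.linalg.eigh(H(g))[0][0]
    print(f"g={g:.2f}  reduced-basis E0={e_rb:.4f}  exact E0={e_exact:.4f}")
```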
arXiv Detail & Related papers (2021-10-29T10:17:39Z) - Sampling in Combinatorial Spaces with SurVAE Flow Augmented MCMC [83.48593305367523]
Hybrid Monte Carlo is a powerful Markov Chain Monte Carlo method for sampling from complex continuous distributions.
We introduce a new approach based on augmenting Monte Carlo methods with SurVAE Flows to sample from discrete distributions.
We demonstrate the efficacy of our algorithm on a range of examples from statistics, computational physics and machine learning, and observe improvements compared to alternative algorithms.
arXiv Detail & Related papers (2021-02-04T02:21:08Z) - Understanding Implicit Regularization in Over-Parameterized Single Index Model [55.41685740015095]
We design regularization-free algorithms for the high-dimensional single index model.
We provide theoretical guarantees for the induced implicit regularization phenomenon.
arXiv Detail & Related papers (2020-07-16T13:27:47Z)