Multiplicative Gaussian Particle Filter
- URL: http://arxiv.org/abs/2003.00218v1
- Date: Sat, 29 Feb 2020 09:19:38 GMT
- Title: Multiplicative Gaussian Particle Filter
- Authors: Xuan Su, Wee Sun Lee, Zhen Zhang
- Abstract summary: We propose a new sampling-based approach for approximate inference in filtering problems.
Instead of approximating conditional distributions with a finite set of states, as done in particle filters, our approach approximates the distribution with a weighted sum of functions from a set of continuous functions.
- Score: 18.615555573235987
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a new sampling-based approach for approximate inference in filtering problems. Instead of approximating conditional distributions with a finite set of states, as done in particle filters, our approach approximates the distribution with a weighted sum of functions from a set of continuous functions. Central to the approach is the use of sampling to approximate multiplications in the Bayes filter. We provide theoretical analysis, giving conditions under which sampling yields a good approximation. We next specialize to the case of weighted sums of Gaussians, and show how properties of Gaussians enable closed-form transitions and efficient multiplication. Lastly, we conduct preliminary experiments on a robot localization problem and compare performance with the particle filter, demonstrating the potential of the proposed method.
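To make the ingredients above concrete, here is a minimal 1-D sketch of the two Gaussian properties the abstract relies on: a closed-form component-wise transition, and a sampling-based approximation of the product of two Gaussian mixtures. Names and structure are illustrative, not the authors' implementation.

```python
import numpy as np

def predict(belief, a, q):
    """Closed-form transition for a linear-Gaussian model x' = a*x + N(0, q):
    each mixture component (weight, mean, variance) stays Gaussian."""
    return [(w, a * m, a * a * v + q) for w, m, v in belief]

def gaussian_product(m1, v1, m2, v2):
    """Product of two 1-D Gaussians: N(m1, v1) * N(m2, v2) = z * N(m, v)."""
    v = 1.0 / (1.0 / v1 + 1.0 / v2)
    m = v * (m1 / v1 + m2 / v2)
    z = np.exp(-0.5 * (m1 - m2) ** 2 / (v1 + v2)) / np.sqrt(2.0 * np.pi * (v1 + v2))
    return m, v, z

def sampled_product(belief, likelihood, n_samples, rng):
    """Approximate the product of two Gaussian mixtures by sampling component
    pairs in proportion to their weights instead of forming all pairwise products."""
    w1 = np.array([w for w, _, _ in belief]); w1 /= w1.sum()
    w2 = np.array([w for w, _, _ in likelihood]); w2 /= w2.sum()
    out = []
    for _ in range(n_samples):
        i = rng.choice(len(belief), p=w1)
        j = rng.choice(len(likelihood), p=w2)
        m, v, z = gaussian_product(belief[i][1], belief[i][2],
                                   likelihood[j][1], likelihood[j][2])
        out.append((z, m, v))            # each sampled product keeps its mass z as weight
    total = sum(w for w, _, _ in out)
    return [(w / total, m, v) for w, m, v in out]
```

A Bayes-filter step would then alternate `predict` with `sampled_product` of the predicted belief and a Gaussian (or Gaussian-mixture) observation likelihood.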
Related papers
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
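For background, a Gaussian PSD model represents a nonnegative function as a kernel quadratic form with a positive semidefinite coefficient matrix; a minimal evaluation sketch follows (`centers`, `A`, and `sigma` are assumed parameters, and this is not the paper's filtering algorithm):

```python
import numpy as np

def rbf_features(x, centers, sigma):
    """Gaussian kernel evaluations k(x, c_i) for each model center c_i."""
    d = centers - x                                   # (m, dim) differences to the centers
    return np.exp(-np.sum(d * d, axis=-1) / (2.0 * sigma ** 2))

def psd_model_density(x, centers, A, sigma):
    """Unnormalized Gaussian PSD model f(x) = k(x)^T A k(x); f >= 0 whenever A is PSD."""
    k = rbf_features(x, centers, sigma)
    return float(k @ A @ k)
```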
arXiv Detail & Related papers (2024-02-15T08:51:49Z)
- Nonlinear Filtering with Brenier Optimal Transport Maps [4.745059103971596]
This paper is concerned with the problem of nonlinear filtering, i.e., computing the conditional distribution of the state of a dynamical system.
Conventional sequential importance resampling (SIR) particle filters suffer from fundamental limitations in scenarios involving degenerate likelihoods or high-dimensional states.
In this paper, we explore an alternative method, which is based on estimating the Brenier optimal transport (OT) map from the current prior distribution of the state to the posterior distribution at the next time step.
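A minimal sketch of the resulting filtering loop, where `fit_posterior_map` is a hypothetical routine standing in for the paper's estimation of the Brenier OT map from prior samples and the new observation:

```python
import numpy as np

def ot_filter_step(particles, transition_sample, fit_posterior_map, y, rng):
    """One OT-based filtering step: propagate particles through the dynamics to get
    prior samples, then push them to the posterior with an estimated transport map."""
    prior = np.array([transition_sample(x, rng) for x in particles])  # samples from p(x_t | x_{t-1})
    transport = fit_posterior_map(prior, y)   # hypothetical: returns a map prior sample -> posterior sample
    return np.array([transport(x) for x in prior])
```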
arXiv Detail & Related papers (2023-10-21T01:34:30Z)
- Noise-Free Sampling Algorithms via Regularized Wasserstein Proximals [3.4240632942024685]
We consider the problem of sampling from a distribution governed by a potential function.
This work proposes an explicit score-based MCMC method that is deterministic, resulting in a deterministic evolution for the particles.
arXiv Detail & Related papers (2023-08-28T23:51:33Z)
- Sampling with Mollified Interaction Energy Descent [57.00583139477843]
We present a new optimization-based method for sampling called mollified interaction energy descent (MIED).
MIED minimizes a new class of energies on probability measures called mollified interaction energies (MIEs).
We show experimentally that for unconstrained sampling problems our algorithm performs on par with existing particle-based algorithms like SVGD.
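For reference, a minimal sketch of the SVGD baseline named above (not MIED itself), assuming particles `x` of shape (n, d), an RBF kernel with bandwidth `h`, and a known target score `grad_log_p`:

```python
import numpy as np

def svgd_step(x, grad_log_p, step=0.1, h=1.0):
    """One SVGD update: move particles along the kernelized Stein direction."""
    diff = x[:, None, :] - x[None, :, :]              # (n, n, d) pairwise differences x_j - x_i
    sq = np.sum(diff ** 2, axis=-1)                   # (n, n) squared distances
    k = np.exp(-sq / (2.0 * h ** 2))                  # RBF kernel matrix k(x_j, x_i)
    grad_k = -(diff / h ** 2) * k[:, :, None]         # gradient of k(x_j, x_i) w.r.t. x_j
    scores = np.array([grad_log_p(xi) for xi in x])   # (n, d) target scores at the particles
    phi = (k @ scores + grad_k.sum(axis=0)) / x.shape[0]   # averaged update direction
    return x + step * phi
```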
arXiv Detail & Related papers (2022-10-24T16:54:18Z)
- Continuous-time Particle Filtering for Latent Stochastic Differential Equations [37.51802583388233]
We propose continuous latent particle filters, an approach that extends particle filtering to the continuous-time domain.
We demonstrate how continuous latent particle filters can be used as a generic plug-in replacement for inference techniques relying on a learned variational posterior.
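A minimal sketch of the basic mechanism, with Euler-Maruyama propagation of particles through a latent SDE (vectorized drift `f`, diffusion `g`) followed by reweighting and resampling at an observation; the paper's learned variational components are not reproduced here:

```python
import numpy as np

def sde_particle_filter_step(particles, f, g, dt, n_substeps, log_lik, y, rng):
    """Propagate particles through dX = f(X) dt + g(X) dW (Euler-Maruyama),
    then reweight by the observation likelihood of y and resample."""
    x = particles.copy()
    h = dt / n_substeps
    for _ in range(n_substeps):
        x = x + f(x) * h + g(x) * np.sqrt(h) * rng.standard_normal(x.shape)
    logw = np.array([log_lik(y, xi) for xi in x])
    w = np.exp(logw - logw.max())
    w /= w.sum()
    idx = rng.choice(len(x), size=len(x), p=w)        # multinomial resampling
    return x[idx]
```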
arXiv Detail & Related papers (2022-09-01T01:05:31Z)
- Computational Doob's h-transforms for Online Filtering of Discretely Observed Diffusions [65.74069050283998]
We propose a computational framework to approximate Doob's $h$-transforms.
The proposed approach can be orders of magnitude more efficient than state-of-the-art particle filters.
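For context, the classical form of the $h$-transform being approximated: if the latent diffusion is $dX_t = b(X_t)\,dt + \sigma(X_t)\,dW_t$ and $h_t(x)$ denotes the likelihood of the upcoming observation given $X_t = x$, the conditioned process evolves with a corrected drift,

$$ dX_t = \Big[\, b(X_t) + \sigma(X_t)\sigma(X_t)^{\top}\, \nabla_x \log h_t(X_t) \,\Big]\, dt + \sigma(X_t)\, dW_t , $$

so an approximation of $h_t$ yields a guided proposal for the particle filter.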
arXiv Detail & Related papers (2022-06-07T15:03:05Z)
- Variational Transport: A Convergent Particle-Based Algorithm for Distributional Optimization [106.70006655990176]
Distributional optimization problems arise widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective function satisfies a functional version of the Polyak-Lojasiewicz (PL) condition (Polyak, 1963) together with smoothness conditions, variational transport converges linearly.
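In particle form, the Wasserstein gradient descent being approximated moves each particle along the gradient of the first variation of the objective $F$ at the current empirical measure $\mu_t$:

$$ x_i^{t+1} = x_i^{t} - \eta\, \nabla \frac{\delta F}{\delta \mu}(\mu_t)(x_i^{t}), \qquad i = 1, \dots, N, $$

with variational transport supplying a sample-based estimate of this update direction (hence "approximately" above).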
arXiv Detail & Related papers (2020-12-21T18:33:13Z)
- Pathwise Conditioning of Gaussian Processes [72.61885354624604]
Conventional approaches for simulating Gaussian process posteriors view samples as draws from marginal distributions of process values at finite sets of input locations.
This distribution-centric characterization leads to generative strategies that scale cubically in the size of the desired random vector.
We show how this pathwise interpretation of conditioning gives rise to a general family of approximations that lend themselves to efficiently sampling Gaussian process posteriors.
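A minimal sketch of the pathwise update (Matheron's rule) underlying this view: a posterior sample is a prior function draw plus a data-dependent correction. Here `k` is an assumed kernel function returning covariance matrices and the joint prior draw is exact for simplicity; the efficiency discussed above comes from replacing it with cheap approximate prior draws.

```python
import numpy as np

def pathwise_posterior_sample(Xs, X, y, k, noise, rng):
    """Matheron's rule: (f | y)(Xs) = f(Xs) + K(Xs, X) (K(X, X) + noise I)^{-1} (y - f(X) - eps),
    where f is one draw from the GP prior and eps ~ N(0, noise I)."""
    n_s, n = len(Xs), len(X)
    joint = np.concatenate([Xs, X])
    K = k(joint, joint) + 1e-9 * np.eye(n_s + n)            # joint prior covariance (with jitter)
    prior = rng.multivariate_normal(np.zeros(n_s + n), K)   # one joint draw from the prior
    f_s, f_x = prior[:n_s], prior[n_s:]
    eps = rng.normal(0.0, np.sqrt(noise), size=n)           # simulated observation noise
    correction = np.linalg.solve(k(X, X) + noise * np.eye(n), y - f_x - eps)
    return f_s + k(Xs, X) @ correction
```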
arXiv Detail & Related papers (2020-11-08T17:09:37Z)
- Innovative And Additive Outlier Robust Kalman Filtering With A Robust Particle Filter [68.8204255655161]
We propose CE-BASS, a particle mixture Kalman filter which is robust to both innovative and additive outliers, and able to fully capture multi-modality in the distribution of the hidden state.
Furthermore, the particle sampling approach re-samples past states, which enables CE-BASS to handle innovative outliers that are not immediately visible in the observations, such as trend changes.
arXiv Detail & Related papers (2020-07-07T07:11:09Z)
- Approximating Posterior Predictive Distributions by Averaging Output From Many Particle Filters [0.0]
This paper introduces the particle swarm filter (not to be confused with particle swarm optimization).
It targets an approximation to the sequence of posterior predictive distributions by averaging expectation approximations from many particle filters.
A law of large numbers and a central limit theorem are provided, as well as a numerical study of simulated data from a volatility model.
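A minimal sketch of the averaging scheme described above, where `run_pf` is a hypothetical routine returning one particle filter's per-time-step expectation estimates for the data `y`; the law of large numbers and CLT mentioned above concern this average as the number of filters grows:

```python
import numpy as np

def particle_swarm_estimate(run_pf, y, n_filters, rng):
    """Average expectation approximations from many independent particle filters."""
    estimates = np.stack([run_pf(y, rng) for _ in range(n_filters)], axis=0)
    return estimates.mean(axis=0)          # swarm estimate: plain average across filters
```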
arXiv Detail & Related papers (2020-06-27T16:14:02Z)