Ensemble Transport Filter via Optimized Maximum Mean Discrepancy
- URL: http://arxiv.org/abs/2407.11518v1
- Date: Tue, 16 Jul 2024 08:54:12 GMT
- Title: Ensemble Transport Filter via Optimized Maximum Mean Discrepancy
- Authors: Dengfei Zeng, Lijian Jiang
- Abstract summary: We present a new ensemble-based filter method by reconstructing the analysis step of the particle filter through a transport map.
The transport map is constructed through an optimization problem described by the Maximum Mean Discrepancy loss function.
A few numerical examples are presented to illustrate the advantage of the proposed method over the ensemble Kalman filter.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we present a new ensemble-based filter method by reconstructing the analysis step of the particle filter through a transport map, which directly transports prior particles to posterior particles. The transport map is constructed through an optimization problem described by the Maximum Mean Discrepancy loss function, which matches the expectation information of the approximated posterior and reference posterior. The proposed method inherits the accurate estimation of the posterior distribution from particle filtering. To improve the robustness of Maximum Mean Discrepancy, a variance penalty term is used to guide the optimization. It prioritizes minimizing the discrepancy between the expectations of highly informative statistics for the approximated and reference posteriors. The penalty term significantly enhances the robustness of the proposed method and leads to a better approximation of the posterior. A few numerical examples are presented to illustrate the advantage of the proposed method over the ensemble Kalman filter.
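The loss described in the abstract is the Maximum Mean Discrepancy between the transported (approximated) posterior ensemble and a reference posterior ensemble. As a minimal illustration of that quantity only (not the authors' implementation, and without their variance penalty term), the biased empirical squared MMD between two particle sets under an RBF kernel can be computed as:

```python
import numpy as np

def rbf_kernel(a, b, bandwidth=1.0):
    # Pairwise RBF kernel values between rows of a and b.
    d2 = np.sum(a**2, axis=1)[:, None] + np.sum(b**2, axis=1)[None, :] - 2.0 * a @ b.T
    return np.exp(-d2 / (2.0 * bandwidth**2))

def mmd2(x, y, bandwidth=1.0):
    """Biased empirical squared MMD between particle ensembles x and y."""
    kxx = rbf_kernel(x, x, bandwidth)
    kyy = rbf_kernel(y, y, bandwidth)
    kxy = rbf_kernel(x, y, bandwidth)
    return kxx.mean() + kyy.mean() - 2.0 * kxy.mean()

rng = np.random.default_rng(0)
same = mmd2(rng.normal(0, 1, (200, 2)), rng.normal(0, 1, (200, 2)))
diff = mmd2(rng.normal(0, 1, (200, 2)), rng.normal(3, 1, (200, 2)))
# Ensembles from mismatched distributions yield a larger discrepancy.
assert same < diff
```

In the paper's setting this discrepancy would be minimized over the parameters of the transport map; the kernel choice and bandwidth here are placeholder assumptions.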
Related papers
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
arXiv Detail & Related papers (2024-02-15T08:51:49Z) - Improving Diffusion Models for Inverse Problems Using Optimal Posterior Covariance [52.093434664236014]
Recent diffusion models provide a promising zero-shot solution to noisy linear inverse problems without retraining for specific inverse problems.
Inspired by this finding, we propose to improve recent methods by using more principled covariance determined by maximum likelihood estimation.
arXiv Detail & Related papers (2024-02-03T13:35:39Z) - Nonlinear Filtering with Brenier Optimal Transport Maps [4.745059103971596]
This paper is concerned with the problem of nonlinear filtering, i.e., computing the conditional distribution of the state of a dynamical system.
Conventional sequential importance resampling (SIR) particle filters suffer from fundamental limitations in scenarios involving degenerate likelihoods or high-dimensional states.
In this paper, we explore an alternative method, which is based on estimating the Brenier optimal transport (OT) map from the current prior distribution of the state to the posterior distribution at the next time step.
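The weight degeneracy of SIR filters mentioned above can be made concrete with the effective sample size (ESS): under a sharply peaked likelihood, nearly all importance weight concentrates on a few particles. A toy illustration (the model and numbers are hypothetical, not from the cited paper):

```python
import numpy as np

def ess(log_w):
    """Effective sample size of normalized importance weights."""
    w = np.exp(log_w - log_w.max())
    w /= w.sum()
    return 1.0 / np.sum(w**2)

rng = np.random.default_rng(1)
particles = rng.normal(0.0, 1.0, size=1000)  # prior ensemble

# Broad likelihood (obs noise std 1.0): weights stay balanced.
broad = ess(-0.5 * (particles - 0.5) ** 2 / 1.0**2)
# Nearly degenerate likelihood (obs noise std 0.01): weight collapses.
sharp = ess(-0.5 * (particles - 0.5) ** 2 / 0.01**2)
assert sharp < broad
```

Transport-map approaches such as the Brenier OT formulation sidestep this collapse by moving particles rather than reweighting them.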
arXiv Detail & Related papers (2023-10-21T01:34:30Z) - Computational Doob's h-transforms for Online Filtering of Discretely Observed Diffusions [65.74069050283998]
We propose a computational framework to approximate Doob's $h$-transforms.
The proposed approach can be orders of magnitude more efficient than state-of-the-art particle filters.
arXiv Detail & Related papers (2022-06-07T15:03:05Z) - An Optimal Transport Formulation of Bayes' Law for Nonlinear Filtering Algorithms [7.919213739992465]
This paper presents a variational representation of the Bayes' law using optimal transportation theory.
By imposing certain structure on the transport map, the solution to the variational problem is used to construct a Brenier-type map.
The proposed methodology is used to derive the optimal transport form of the feedback particle filter (FPF) in the continuous-time limit.
arXiv Detail & Related papers (2022-03-22T16:43:33Z) - A Dimensionality Reduction Method for Finding Least Favorable Priors with a Focus on Bregman Divergence [108.28566246421742]
This paper develops a dimensionality reduction method that allows us to move the optimization to a finite-dimensional setting with an explicit bound on the dimension.
In order to make progress on the problem, we restrict ourselves to Bayesian risks induced by a relatively large class of loss functions, namely Bregman divergences.
arXiv Detail & Related papers (2022-02-23T16:22:28Z) - Deep Convolutional Correlation Iterative Particle Filter for Visual Tracking [1.1531505895603305]
This work proposes a novel framework for visual tracking based on the integration of an iterative particle filter, a deep convolutional neural network, and a correlation filter.
We employ a novel strategy to assess the likelihood of the particles after the iterations by applying K-means clustering.
Experimental results on two different benchmark datasets show that our tracker performs favorably against state-of-the-art methods.
arXiv Detail & Related papers (2021-07-07T02:44:43Z) - Variational Refinement for Importance Sampling Using the Forward Kullback-Leibler Divergence [77.06203118175335]
Variational Inference (VI) is a popular alternative to exact sampling in Bayesian inference.
Importance sampling (IS) is often used to fine-tune and de-bias the estimates of approximate Bayesian inference procedures.
We propose a novel combination of optimization and sampling techniques for approximate Bayesian inference.
arXiv Detail & Related papers (2021-06-30T11:00:24Z) - Variational Transport: A Convergent Particle-BasedAlgorithm for Distributional Optimization [106.70006655990176]
A distributional optimization problem arises widely in machine learning and statistics.
We propose a novel particle-based algorithm, dubbed as variational transport, which approximately performs Wasserstein gradient descent.
We prove that when the objective function satisfies a functional version of the Polyak-Łojasiewicz (PL) condition (Polyak, 1963) and smoothness conditions, variational transport converges linearly.
arXiv Detail & Related papers (2020-12-21T18:33:13Z) - Approximating Posterior Predictive Distributions by Averaging Output From Many Particle Filters [0.0]
This paper introduces the particle swarm filter (not to be confused with particle swarm optimization).
It targets an approximation to the sequence of posterior predictive distributions by averaging expectation approximations from many particle filters.
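The averaging idea in this summary can be sketched on a toy model: run several independent bootstrap particle filters and average their expectation approximations at each time step. The linear-Gaussian model, parameters, and function names below are illustrative assumptions, not the cited paper's construction:

```python
import numpy as np

def bootstrap_pf_means(ys, n=100, seed=0):
    """Posterior-mean estimates from one bootstrap particle filter for the
    toy model x_t = 0.9 x_{t-1} + N(0, 1), y_t = x_t + N(0, 1)."""
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, n)
    means = []
    for y in ys:
        x = 0.9 * x + rng.normal(0.0, 1.0, n)      # propagate through dynamics
        log_w = -0.5 * (y - x) ** 2                # Gaussian likelihood weights
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        means.append(np.sum(w * x))                # expectation approximation
        x = rng.choice(x, size=n, p=w)             # multinomial resampling
    return np.array(means)

ys = [0.3, 1.1, 0.7, -0.2]
# Average the expectation approximations from many independent filters.
swarm_means = np.mean([bootstrap_pf_means(ys, seed=s) for s in range(50)], axis=0)
```

Averaging over independent filters reduces the Monte Carlo variance of each expectation estimate, which is what the paper's law of large numbers and central limit theorem formalize.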
A law of large numbers and a central limit theorem are provided, as well as a numerical study of simulated data from a volatility model.
arXiv Detail & Related papers (2020-06-27T16:14:02Z) - Multiplicative Gaussian Particle Filter [18.615555573235987]
We propose a new sampling-based approach for approximate inference in filtering problems.
Instead of approximating conditional distributions with a finite set of states, as done in particle filters, our approach approximates the distribution with a weighted sum of functions from a set of continuous functions.
arXiv Detail & Related papers (2020-02-29T09:19:38Z)
This list is automatically generated from the titles and abstracts of the papers in this site.