An energy-based deep splitting method for the nonlinear filtering
problem
- URL: http://arxiv.org/abs/2203.17153v1
- Date: Thu, 31 Mar 2022 16:26:54 GMT
- Title: An energy-based deep splitting method for the nonlinear filtering
problem
- Authors: Kasper Bågmark, Adam Andersson, Stig Larsson
- Abstract summary: The main goal of this paper is to approximately solve the nonlinear filtering problem through deep learning.
This is achieved by solving the Zakai equation by a deep splitting method, previously developed for approximate solution of (stochastic) partial differential equations.
This is combined with an energy-based model for the approximation of functions by a deep neural network.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The main goal of this paper is to approximately solve the nonlinear filtering
problem through deep learning. This is achieved by solving the Zakai equation
by a deep splitting method, previously developed for approximate solution of
(stochastic) partial differential equations. This is combined with an
energy-based model for the approximation of functions by a deep neural network.
This results in a computationally fast filter that takes observations as input
and that does not require re-training when new observations are received. The
method is tested on three examples, one linear Gaussian and two nonlinear. The
method shows promising performance when benchmarked against the Kalman filter
and the bootstrap particle filter.
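As a rough illustration of the energy-based representation described in the abstract, the sketch below models the unnormalised filtering density as exp(-E_theta(x, y_{1:k})) and evaluates it for a fresh observation sequence; because observations enter only as network inputs, no re-training is needed when new data arrive. All names here (the EnergyNet stand-in, the 1-D grid normalisation, the toy observation dimension) are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the authors' code).  A trained energy network
# E_theta(x, y_{1:k}) defines an unnormalised filtering density
#     p(x | y_{1:k}) ~ exp(-E_theta(x, y_{1:k})),
# so a new observation sequence is just a new network input (no re-training).
import numpy as np

rng = np.random.default_rng(0)


class EnergyNet:
    """Tiny stand-in for a trained energy network (hypothetical weights)."""

    def __init__(self, obs_dim, hidden=32):
        self.W1 = rng.standard_normal((1 + obs_dim, hidden)) / np.sqrt(1 + obs_dim)
        self.b1 = np.zeros(hidden)
        self.W2 = rng.standard_normal((hidden, 1)) / np.sqrt(hidden)

    def __call__(self, x, y):
        # x: (n,) grid of 1-D state values, y: (obs_dim,) observation history
        inp = np.concatenate([x[:, None], np.tile(y, (x.size, 1))], axis=1)
        h = np.tanh(inp @ self.W1 + self.b1)
        return (h @ self.W2).ravel()           # energy E_theta(x, y) per grid point


def filtering_density(net, y, x_grid):
    """Normalise exp(-E) on a grid (1-D state kept for readability)."""
    log_p = -net(x_grid, y)
    log_p -= log_p.max()                       # stabilise the exponential
    p = np.exp(log_p)
    return p / np.trapz(p, x_grid)             # trapezoidal-rule normalisation


if __name__ == "__main__":
    net = EnergyNet(obs_dim=3)                 # pretend this was trained offline
    x_grid = np.linspace(-5.0, 5.0, 401)
    y_new = np.array([0.1, -0.4, 0.3])         # a fresh observation sequence
    p = filtering_density(net, y_new, x_grid)
    posterior_mean = np.trapz(x_grid * p, x_grid)
    print(f"posterior mean estimate: {posterior_mean:.3f}")
```

In the paper the network itself is obtained by training with the deep splitting scheme applied to the Zakai equation; the sketch only shows how a trained energy-based model would be queried at inference time.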
Related papers
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
arXiv Detail & Related papers (2024-02-15T08:51:49Z)
- [Re] The Discriminative Kalman Filter for Bayesian Filtering with Nonlinear and Non-Gaussian Observation Models [25.587281577467405]
Kalman filters provide a straightforward and interpretable means to estimate hidden or latent variables.
One such application is neural decoding for neuroprostheses.
This work provides an open-source Python alternative to the authors' algorithm for highly non-linear or non-Gaussian observation models.
arXiv Detail & Related papers (2024-01-24T21:00:42Z)
- Low-rank extended Kalman filtering for online learning of neural networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
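(A minimal sketch of such a low-rank plus diagonal representation is given after this list.)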
arXiv Detail & Related papers (2023-05-31T03:48:49Z)
- Computational Doob's h-transforms for Online Filtering of Discretely Observed Diffusions [65.74069050283998]
We propose a computational framework to approximate Doob's $h$-transforms.
The proposed approach can be orders of magnitude more efficient than state-of-the-art particle filters.
arXiv Detail & Related papers (2022-06-07T15:03:05Z)
- Deep Learning for the Benes Filter [91.3755431537592]
We present a new numerical method based on the mesh-free neural network representation of the density of the solution of the Benes model.
We discuss the role of nonlinearity in the filtering model equations for the choice of the domain of the neural network.
arXiv Detail & Related papers (2022-03-09T14:08:38Z)
- An application of the splitting-up method for the computation of a neural network representation for the solution for the filtering equations [68.8204255655161]
Filtering equations play a central role in many real-life applications, including numerical weather prediction, finance and engineering.
One of the classical approaches to approximate the solution of the filtering equations is to use a PDE inspired method, called the splitting-up method.
We combine this method with a neural network representation to produce an approximation of the unnormalised conditional distribution of the signal process.
arXiv Detail & Related papers (2022-01-10T11:01:36Z)
- Tensor Network Kalman Filtering for Large-Scale LS-SVMs [17.36231167296782]
Least squares support vector machines are used for nonlinear regression and classification.
A framework based on tensor networks and the Kalman filter is presented to alleviate the demanding memory and computational complexities.
Results show that our method can achieve high performance and is particularly useful when alternative methods are computationally infeasible.
arXiv Detail & Related papers (2021-10-26T08:54:03Z)
- Unsharp Mask Guided Filtering [53.14430987860308]
The goal of this paper is guided image filtering, which emphasizes the importance of structure transfer during filtering.
We propose a new and simplified formulation of the guided filter inspired by unsharp masking.
Our formulation enjoys a filtering prior from a low-pass filter and enables explicit structure transfer by estimating a single coefficient.
arXiv Detail & Related papers (2021-06-02T19:15:34Z)
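The low-rank extended Kalman filtering entry above relies on a low-rank plus diagonal decomposition of the posterior matrix. The following is a generic sketch (assumed names, not that paper's algorithm) of why such a structure is cheap to work with: writing S = diag(d) + W W^T, the Woodbury identity reduces a p-dimensional solve to an r-dimensional one, at roughly O(p r^2) cost instead of O(p^3).

```python
# Generic illustration (not the cited paper's algorithm) of a low-rank plus
# diagonal representation S = diag(d) + W @ W.T and a Woodbury-based solve
# that never forms the dense p x p matrix.
import numpy as np

rng = np.random.default_rng(1)


def woodbury_solve(d, W, b):
    """Solve (diag(d) + W W^T) x = b via the Woodbury identity."""
    Dinv_b = b / d                          # diag(d)^{-1} b
    Dinv_W = W / d[:, None]                 # diag(d)^{-1} W
    r = W.shape[1]
    small = np.eye(r) + W.T @ Dinv_W        # r x r capacitance matrix
    correction = Dinv_W @ np.linalg.solve(small, W.T @ Dinv_b)
    return Dinv_b - correction


if __name__ == "__main__":
    p, r = 2000, 10                         # many parameters, small rank
    d = 1.0 + rng.random(p)                 # positive diagonal part
    W = rng.standard_normal((p, r)) / np.sqrt(p)
    b = rng.standard_normal(p)

    x = woodbury_solve(d, W, b)

    # check against a dense solve (feasible here only because p is modest)
    S = np.diag(d) + W @ W.T
    assert np.allclose(S @ x, b, atol=1e-6)
    print("Woodbury solve matches dense solve.")
```

The same identity is what keeps Kalman-type covariance or precision updates affordable when p is the number of network weights and r is a small chosen rank.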