[Re] The Discriminative Kalman Filter for Bayesian Filtering with
Nonlinear and Non-Gaussian Observation Models
- URL: http://arxiv.org/abs/2401.14429v1
- Date: Wed, 24 Jan 2024 21:00:42 GMT
- Title: [Re] The Discriminative Kalman Filter for Bayesian Filtering with
Nonlinear and Non-Gaussian Observation Models
- Authors: Josue Casco-Rodriguez, Caleb Kemere, Richard G. Baraniuk
- Abstract summary: Kalman filters provide a straightforward and interpretable means to estimate hidden or latent variables.
One such application is neural decoding for neuroprostheses.
This work provides an open-source Python alternative to the authors' algorithm for highly non-linear or non-Gaussian observation models.
- Score: 25.587281577467405
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Kalman filters provide a straightforward and interpretable means to estimate
hidden or latent variables, and have found numerous applications in control,
robotics, signal processing, and machine learning. One such application is
neural decoding for neuroprostheses. In 2020, Burkhart et al. thoroughly
evaluated their new version of the Kalman filter that leverages Bayes' theorem
to improve filter performance for highly non-linear or non-Gaussian observation
models. This work provides an open-source Python alternative to the authors'
MATLAB algorithm. Specifically, we reproduce their most salient results for
neuroscientific contexts and further examine the efficacy of their filter using
multiple random seeds and previously unused trials from the authors' dataset.
All experiments were performed offline on a single computer.
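The posterior update at the heart of the Discriminative Kalman Filter can be sketched as follows. This is a minimal illustration assuming linear-Gaussian latent dynamics and the Gaussian discriminative approximation described in Burkhart et al.; the function name, argument names, and the positive-definiteness fallback are our own choices for exposition, not the authors' released code:

```python
import numpy as np

def dkf_update(mu_prev, Sigma_prev, A, Gamma, S, f_x, Q_x):
    """One step of a Discriminative Kalman Filter (sketch).

    State model: z_t = A z_{t-1} + w_t with w_t ~ N(0, Gamma),
    and stationary prior z ~ N(0, S). A discriminative regression
    supplies f_x = E[z_t | x_t] and Q_x = Cov[z_t | x_t].
    """
    # Chapman-Kolmogorov prediction under the linear-Gaussian dynamics
    nu = A @ mu_prev
    M = A @ Sigma_prev @ A.T + Gamma
    # Posterior precision: discriminative term plus prediction term,
    # corrected for the stationary prior
    Sigma_inv = np.linalg.inv(Q_x) + np.linalg.inv(M) - np.linalg.inv(S)
    # Fall back to dropping the prior correction if the result
    # is not positive definite
    if np.any(np.linalg.eigvalsh(Sigma_inv) <= 0):
        Sigma_inv = np.linalg.inv(Q_x) + np.linalg.inv(M)
    Sigma = np.linalg.inv(Sigma_inv)
    mu = Sigma @ (np.linalg.inv(Q_x) @ f_x + np.linalg.inv(M) @ nu)
    return mu, Sigma
```

Because the discriminative model absorbs the observation likelihood, no explicit observation matrix appears in the update; all nonlinearity lives in the learned functions producing `f_x` and `Q_x`.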
Related papers
- Learning Optimal Filters Using Variational Inference [0.3749861135832072]
We present a framework for learning a parameterized analysis map - the map that takes a forecast distribution and observations to the filtering distribution.
We show that this methodology can be used to learn gain matrices for filtering linear and nonlinear dynamical systems.
Future work will apply this framework to learn new filtering algorithms.
arXiv Detail & Related papers (2024-06-26T04:51:14Z)
- Outlier-robust Kalman Filtering through Generalised Bayes [45.51425214486509]
We derive a novel, provably robust, and closed-form Bayesian update rule for online filtering in state-space models.
Our method matches or outperforms other robust filtering methods at a much lower computational cost.
arXiv Detail & Related papers (2024-05-09T09:40:56Z)
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
arXiv Detail & Related papers (2024-02-15T08:51:49Z)
- Implicit Maximum a Posteriori Filtering via Adaptive Optimization [4.767884267554628]
We frame the standard Bayesian filtering problem as optimization over a time-varying objective.
We show that our framework results in filters that are effective, robust, and scalable to high-dimensional systems.
arXiv Detail & Related papers (2023-11-17T15:30:44Z)
- Low-rank extended Kalman filtering for online learning of neural networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
arXiv Detail & Related papers (2023-05-31T03:48:49Z)
- Computational Doob's h-transforms for Online Filtering of Discretely Observed Diffusions [65.74069050283998]
We propose a computational framework to approximate Doob's $h$-transforms.
The proposed approach can be orders of magnitude more efficient than state-of-the-art particle filters.
arXiv Detail & Related papers (2022-06-07T15:03:05Z)
- An energy-based deep splitting method for the nonlinear filtering problem [0.0]
The main goal of this paper is to approximately solve the nonlinear filtering problem through deep learning.
This is achieved by solving the Zakai equation by a deep splitting method, previously developed for approximate solution of (stochastic) partial differential equations.
This is combined with an energy-based model for the approximation of functions by a deep neural network.
arXiv Detail & Related papers (2022-03-31T16:26:54Z)
- Deep Learning for the Benes Filter [91.3755431537592]
We present a new numerical method based on the mesh-free neural network representation of the density of the solution of the Benes model.
We discuss the role of nonlinearity in the filtering model equations for the choice of the domain of the neural network.
arXiv Detail & Related papers (2022-03-09T14:08:38Z)
- Learning Versatile Convolution Filters for Efficient Visual Recognition [125.34595948003745]
This paper introduces versatile filters to construct efficient convolutional neural networks.
We conduct theoretical analysis on network complexity and an efficient convolution scheme is introduced.
Experimental results on benchmark datasets and neural networks demonstrate that our versatile filters are able to achieve comparable accuracy as that of original filters.
arXiv Detail & Related papers (2021-09-20T06:07:14Z)
- Machine learning-based conditional mean filter: a generalization of the ensemble Kalman filter for nonlinear data assimilation [42.60602838972598]
We propose a machine learning-based ensemble conditional mean filter (ML-EnCMF) for tracking possibly high-dimensional non-Gaussian state models with nonlinear dynamics based on sparse observations.
The proposed filtering method is developed based on the conditional expectation and numerically implemented using machine learning (ML) techniques combined with the ensemble method.
arXiv Detail & Related papers (2021-06-15T06:40:32Z)
- Neural Kalman Filtering [62.997667081978825]
We show that a gradient-descent approximation to the Kalman filter requires only local computations with variance weighted prediction errors.
We also show that it is possible under the same scheme to adaptively learn the dynamics model with a learning rule that corresponds directly to Hebbian plasticity.
arXiv Detail & Related papers (2021-02-19T16:43:15Z)
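The gradient-descent view of Kalman filtering summarized in the last entry can be sketched as follows: the posterior mean of one linear-Gaussian step is found by descending a quadratic objective using only precision-weighted prediction errors. This is a minimal illustration under our own notation and step-size choices, not the paper's implementation:

```python
import numpy as np

def kalman_mean_gd(nu, M, C, R, x, lr=0.1, n_steps=500):
    """Posterior mean of one linear-Gaussian filter step via gradient descent.

    Minimizes the negative log posterior
        J(z) = 0.5 (z - nu)^T M^{-1} (z - nu)
             + 0.5 (x - C z)^T R^{-1} (x - C z)
    using only local, precision-weighted prediction errors.
    """
    M_inv, R_inv = np.linalg.inv(M), np.linalg.inv(R)
    z = nu.copy()
    for _ in range(n_steps):
        e_z = M_inv @ (z - nu)          # prior prediction error
        e_x = R_inv @ (x - C @ z)       # observation prediction error
        z = z - lr * (e_z - C.T @ e_x)  # gradient step on J
    return z
```

At convergence the fixed point coincides with the closed-form Kalman update `z* = nu + K (x - C nu)` with gain `K = M C^T (C M C^T + R)^{-1}`, which is what makes the local, error-driven scheme a valid approximation.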
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.