Machine learning-based conditional mean filter: a generalization of the
ensemble Kalman filter for nonlinear data assimilation
- URL: http://arxiv.org/abs/2106.07908v1
- Date: Tue, 15 Jun 2021 06:40:32 GMT
- Authors: Truong-Vinh Hoang (1), Sebastian Krumscheid (1), Hermann G. Matthies
(2) and Raúl Tempone (1 and 3) ((1) Chair of Mathematics for Uncertainty
Quantification, RWTH Aachen University, (2) Technische Universität
Braunschweig, (3) Computer, Electrical and Mathematical Sciences and
Engineering, KAUST, and Alexander von Humboldt Professor in Mathematics of
Uncertainty Quantification, RWTH Aachen University)
- Abstract summary: We propose a machine learning-based ensemble conditional mean filter (ML-EnCMF) for tracking possibly high-dimensional non-Gaussian state models with nonlinear dynamics based on sparse observations.
The proposed filtering method is developed based on the conditional expectation and numerically implemented using machine learning (ML) techniques combined with the ensemble method.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Filtering is a data assimilation technique that performs the sequential
inference of dynamical systems states from noisy observations. Herein, we
propose a machine learning-based ensemble conditional mean filter (ML-EnCMF)
for tracking possibly high-dimensional non-Gaussian state models with nonlinear
dynamics based on sparse observations. The proposed filtering method is
developed based on the conditional expectation and numerically implemented
using machine learning (ML) techniques combined with the ensemble method. The
contribution of this work is twofold. First, we demonstrate that the ensembles
assimilated using the ensemble conditional mean filter (EnCMF) provide an
unbiased estimator of the Bayesian posterior mean, and their variance matches
the expected conditional variance. Second, we implement the EnCMF using
artificial neural networks, which offer a significant advantage in representing
nonlinear functions, such as the conditional mean, over high-dimensional
domains. Finally, we demonstrate the effectiveness of the ML-EnCMF for tracking
the states of the Lorenz-63 and Lorenz-96 systems in the chaotic regime. Numerical
results show that the ML-EnCMF outperforms the ensemble Kalman filter.
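As a minimal illustration of the conditional-mean update described in the abstract, the sketch below uses a linear least-squares fit as a stand-in for the paper's neural network (with a linear map, the EnCMF reduces to the EnKF). The toy setup and all variable names are assumptions for illustration, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy forecast ensemble for a 3-dimensional state, observed through H with noise.
N, d, m = 500, 3, 1                      # ensemble size, state dim, obs dim
H = np.array([[1.0, 0.0, 0.0]])          # linear observation operator
R = 0.25 * np.eye(m)                     # observation-noise covariance

x_f = rng.normal(size=(N, d)) @ np.diag([1.0, 0.5, 0.2]) + 1.0        # forecast members
y_sim = x_f @ H.T + rng.multivariate_normal(np.zeros(m), R, size=N)   # simulated observations

# Approximate the conditional-mean map phi(y) ~ E[X | Y = y] from ensemble pairs.
# The paper trains a neural network; linear regression is the minimal stand-in here.
Y = np.hstack([y_sim, np.ones((N, 1))])            # add intercept column
coef, *_ = np.linalg.lstsq(Y, x_f, rcond=None)
phi = lambda y: np.hstack([y, np.ones((len(y), 1))]) @ coef

# EnCMF analysis step: shift each member by the change in the conditional mean
# when its simulated observation is replaced by the actual one.
y_obs = np.array([[1.3]])
x_a = x_f + phi(np.repeat(y_obs, N, axis=0)) - phi(y_sim)
```

With this transform the analysis ensemble mean estimates the posterior mean, and the spread in the observed coordinate shrinks relative to the forecast, mirroring the unbiasedness and variance-matching properties the abstract claims for the EnCMF.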
Related papers
- Learning Optimal Filters Using Variational Inference [0.3749861135832072]
We present a framework for learning a parameterized analysis map - the map that takes a forecast distribution and observations to the filtering distribution.
We show that this methodology can be used to learn gain matrices for filtering linear and nonlinear dynamical systems.
Future work will apply this framework to learn new filtering algorithms.
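A minimal sketch of the idea of learning a gain for the analysis map x_a = x_f + K (y - H x_f): here the gain is fit by least squares on simulated truth/forecast/observation triples rather than by variational inference, and the setup and names are illustrative assumptions, not the paper's method.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative linear-Gaussian setup: learn a constant gain K for the
# analysis map  x_a = x_f + K (y - H x_f)  from simulated data.
d, m, N = 2, 1, 2000
H = np.array([[1.0, 0.0]])
sig_f, sig_o = 1.0, 0.5                    # forecast / observation noise scales

x_true = rng.normal(size=(N, d))                        # simulated truths
x_f = x_true + sig_f * rng.normal(size=(N, d))          # forecasts
y = x_true @ H.T + sig_o * rng.normal(size=(N, m))      # observations

# The MSE-optimal gain solves a least-squares problem:
# minimize || innov @ K - (x_true - x_f) ||^2 over K.
innov = y - x_f @ H.T                                   # innovations, (N, m)
resid = x_true - x_f                                    # forecast errors, (N, d)
K, *_ = np.linalg.lstsq(innov, resid, rcond=None)       # learned gain, (m, d)

x_a = x_f + innov @ K                                   # analysis ensemble
```

In this linear-Gaussian toy case the learned gain approaches the Kalman gain as N grows; the framework in the paper targets the same object in more general settings.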
arXiv Detail & Related papers (2024-06-26T04:51:14Z)
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
arXiv Detail & Related papers (2024-02-15T08:51:49Z)
- Online Variational Sequential Monte Carlo [49.97673761305336]
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing efficiently, entirely on-the-fly, both parameter estimation and particle proposal adaptation.
arXiv Detail & Related papers (2023-12-19T21:45:38Z)
- Implicit Maximum a Posteriori Filtering via Adaptive Optimization [4.767884267554628]
We frame the standard Bayesian filtering problem as optimization over a time-varying objective.
We show that our framework results in filters that are effective, robust, and scalable to high-dimensional systems.
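To illustrate the optimization view of a single filtering step, the sketch below minimizes a negative log-posterior by plain gradient descent in a linear-Gaussian toy case, where the minimizer coincides with the Kalman update mean. The setup and names are assumptions for illustration, not the paper's method.

```python
import numpy as np

# One Bayesian filtering step posed as optimization over the state x:
# minimize J(x) = 0.5 (y - H x)^T R^{-1} (y - H x) + 0.5 (x - m)^T P^{-1} (x - m)
m = np.array([0.0, 0.0])                  # forecast mean
P = np.diag([1.0, 2.0])                   # forecast covariance
H = np.array([[1.0, 0.0]])
R = np.array([[0.25]])
y = np.array([1.0])

Pinv, Rinv = np.linalg.inv(P), np.linalg.inv(R)

def grad(x):
    # Gradient of the negative log-posterior J.
    return Pinv @ (x - m) - H.T @ Rinv @ (y - H @ x)

x = m.copy()
for _ in range(500):                      # plain gradient descent
    x = x - 0.1 * grad(x)

# In the linear-Gaussian case the MAP point equals the Kalman update mean.
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
x_kf = m + K @ (y - H @ m)
```

The paper's contribution is to make this optimization view practical and robust in nonlinear, high-dimensional settings via adaptive optimizers; this toy case only shows why the framing recovers the classical answer where one exists.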
arXiv Detail & Related papers (2023-11-17T15:30:44Z)
- Nonlinear Filtering with Brenier Optimal Transport Maps [4.745059103971596]
This paper is concerned with the problem of nonlinear filtering, i.e., computing the conditional distribution of the state of a dynamical system.
Conventional sequential importance resampling (SIR) particle filters suffer from fundamental limitations in scenarios involving degenerate likelihoods or high-dimensional states.
In this paper, we explore an alternative method, which is based on estimating the Brenier optimal transport (OT) map from the current prior distribution of the state to the posterior distribution at the next time step.
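In one dimension the Brenier optimal transport map between empirical distributions reduces to the monotone rearrangement, which makes the idea easy to sketch. The setup below is an illustrative assumption; the paper estimates the map in higher dimensions with more sophisticated tools.

```python
import numpy as np

rng = np.random.default_rng(2)

# 1-D illustration: for equal-size samples, the monotone (Brenier) OT map
# simply pairs sorted prior points with sorted posterior points.
prior = rng.normal(loc=0.0, scale=2.0, size=1000)       # samples from current prior
posterior = rng.normal(loc=1.0, scale=0.5, size=1000)   # samples from next posterior

order = np.argsort(prior)
transported = np.empty_like(prior)
transported[order] = np.sort(posterior)   # monotone pairing = 1-D OT map

# Each prior particle has been moved to a posterior location while
# preserving its rank, so the transported ensemble has the posterior's
# empirical distribution.
```

Because the map is deterministic and order-preserving, this avoids the weight degeneracy that limits SIR particle filters; the cost moves into estimating the map itself.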
arXiv Detail & Related papers (2023-10-21T01:34:30Z)
- Computational Doob's h-transforms for Online Filtering of Discretely Observed Diffusions [65.74069050283998]
We propose a computational framework to approximate Doob's $h$-transforms.
The proposed approach can be orders of magnitude more efficient than state-of-the-art particle filters.
arXiv Detail & Related papers (2022-06-07T15:03:05Z)
- Deep Learning for the Benes Filter [91.3755431537592]
We present a new numerical method based on the mesh-free neural network representation of the density of the solution of the Benes model.
We discuss the role of nonlinearity in the filtering model equations for the choice of the domain of the neural network.
arXiv Detail & Related papers (2022-03-09T14:08:38Z)
- Autoregressive Score Matching [113.4502004812927]
We propose autoregressive conditional score models (AR-CSM), where we parameterize the joint distribution in terms of the derivatives of univariate log-conditionals (scores).
For AR-CSM models, this divergence between data and model distributions can be computed and optimized efficiently, requiring no expensive sampling or adversarial training.
We show with extensive experimental results that it can be applied to density estimation on synthetic data, image generation, image denoising, and training latent variable models with implicit encoders.
arXiv Detail & Related papers (2020-10-24T07:01:24Z)
This list is automatically generated from the titles and abstracts of the papers in this site.