Flow-based Bayesian filtering for high-dimensional nonlinear stochastic dynamical systems
- URL: http://arxiv.org/abs/2502.16232v2
- Date: Wed, 05 Mar 2025 08:42:40 GMT
- Title: Flow-based Bayesian filtering for high-dimensional nonlinear stochastic dynamical systems
- Authors: Xintong Wang, Xiaofei Guan, Ling Guo, Hao Wu
- Abstract summary: We propose a flow-based Bayesian filter (FBF) that integrates normalizing flows to construct a novel latent linear state-space model with Gaussian filtering distributions. This framework facilitates efficient density estimation and sampling using invertible transformations provided by normalizing flows. Numerical experiments demonstrate the superior accuracy and efficiency of FBF.
- Score: 4.382988524355736
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Bayesian filtering for high-dimensional nonlinear stochastic dynamical systems is a fundamental yet challenging problem in many fields of science and engineering. Existing methods face significant obstacles: Gaussian-based filters struggle with non-Gaussian distributions, while sequential Monte Carlo methods are computationally intensive and prone to particle degeneracy in high dimensions. Although generative models in machine learning have made significant progress in modeling high-dimensional non-Gaussian distributions, their inefficiency in online updating limits their applicability to filtering problems. To address these challenges, we propose a flow-based Bayesian filter (FBF) that integrates normalizing flows to construct a novel latent linear state-space model with Gaussian filtering distributions. This framework facilitates efficient density estimation and sampling using invertible transformations provided by normalizing flows, and it enables the construction of filters in a data-driven manner, without requiring prior knowledge of system dynamics or observation models. Numerical experiments demonstrate the superior accuracy and efficiency of FBF.
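As an illustration only, the recipe the abstract describes can be sketched as follows: learn an invertible map under which the filtering recursion becomes linear-Gaussian, run a closed-form Kalman update in that latent space, and push filtered samples back through the inverse map. Everything below is a hypothetical stand-in rather than the authors' FBF implementation: the fixed arcsinh/sinh pair plays the role of a trained normalizing flow, and the latent matrices A, Q, H, R are invented placeholders.

```python
# Minimal sketch (not the authors' code): filter in a latent space where the
# dynamics are assumed linear-Gaussian, and move between data space and latent
# space with an invertible map standing in for a trained normalizing flow.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "flow": a fixed elementwise invertible map. A trained normalizing
# flow would replace these two functions.
def flow_forward(y):          # data space -> latent space
    return np.arcsinh(y)

def flow_inverse(z):          # latent space -> data space
    return np.sinh(z)

# Assumed latent linear-Gaussian state-space model (placeholder values).
d = 2
A = np.array([[0.95, 0.10], [0.0, 0.90]])   # latent transition matrix
Q = 0.05 * np.eye(d)                         # latent process noise covariance
H = np.eye(d)                                # latent observation matrix
R = 0.10 * np.eye(d)                         # latent observation noise covariance

def kalman_step(m, P, z_obs):
    """One predict/update cycle of a standard Kalman filter in latent space."""
    m_pred = A @ m
    P_pred = A @ P @ A.T + Q
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.solve(S, np.eye(d))   # Kalman gain
    m_new = m_pred + K @ (z_obs - H @ m_pred)
    P_new = (np.eye(d) - K @ H) @ P_pred
    return m_new, P_new

# Online filtering loop on a toy observation stream.
m, P = np.zeros(d), np.eye(d)
observations = rng.normal(size=(20, d))          # placeholder data stream
for y in observations:
    z = flow_forward(y)                          # lift observation to latent space
    m, P = kalman_step(m, P, z)                  # Gaussian filtering is closed form there
    # Draw filtered samples in latent space and push them back to data space,
    # giving (generally non-Gaussian) filtering samples in the original space.
    latent_samples = rng.multivariate_normal(m, P, size=100)
    data_samples = flow_inverse(latent_samples)

print("final latent mean:", m)
print("data-space sample mean at final step:", data_samples.mean(axis=0))
```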
Related papers
- TrackDiffuser: Nearly Model-Free Bayesian Filtering with Diffusion Model [23.40376181606577]
We present TrackDiffuser, a generative framework addressing both challenges by reformulating Bayesian filtering as a conditional diffusion model.
Our approach implicitly learns system dynamics from data to mitigate the effects of an inaccurate state-space model (SSM).
TrackDiffuser exhibits remarkable robustness to SSM inaccuracies, offering a practical solution for real-world state estimation problems.
arXiv Detail & Related papers (2025-02-08T16:21:18Z)
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- Normalizing Flow-based Differentiable Particle Filters [16.164656853940464]
We present a differentiable particle filtering framework that uses conditional normalizing flows to build its dynamic model, proposal distribution, and measurement model.
We derive the theoretical properties of the proposed filters and evaluate the proposed normalizing flow-based differentiable particle filters' performance through a series of numerical experiments (a toy sketch of the proposal-and-weighting pattern appears after this list).
arXiv Detail & Related papers (2024-03-03T12:23:17Z)
- Closed-form Filtering for Non-linear Systems [83.91296397912218]
We propose a new class of filters based on Gaussian PSD Models, which offer several advantages in terms of density approximation and computational efficiency.
We show that filtering can be efficiently performed in closed form when transitions and observations are Gaussian PSD Models.
Our proposed estimator enjoys strong theoretical guarantees, with estimation error that depends on the quality of the approximation and is adaptive to the regularity of the transition probabilities.
arXiv Detail & Related papers (2024-02-15T08:51:49Z)
- Nonlinear Filtering with Brenier Optimal Transport Maps [4.745059103971596]
This paper is concerned with the problem of nonlinear filtering, i.e., computing the conditional distribution of the state of a dynamical system.
Conventional sequential importance resampling (SIR) particle filters suffer from fundamental limitations, in scenarios involving degenerate likelihoods or high-dimensional states.
In this paper, we explore an alternative method, which is based on estimating the Brenier optimal transport (OT) map from the current prior distribution of the state to the posterior distribution at the next time step.
arXiv Detail & Related papers (2023-10-21T01:34:30Z)
- An Ensemble Score Filter for Tracking High-Dimensional Nonlinear Dynamical Systems [10.997994515823798]
We propose an ensemble score filter (EnSF) for solving high-dimensional nonlinear filtering problems.
Unlike existing diffusion models that train neural networks to approximate the score function, we develop a training-free score estimation.
EnSF delivers surprisingly strong performance compared with the state-of-the-art Local Ensemble Transform Kalman Filter method.
arXiv Detail & Related papers (2023-09-02T16:48:02Z)
- Computational Doob's h-transforms for Online Filtering of Discretely Observed Diffusions [65.74069050283998]
We propose a computational framework to approximate Doob's $h$-transforms.
The proposed approach can be orders of magnitude more efficient than state-of-the-art particle filters.
arXiv Detail & Related papers (2022-06-07T15:03:05Z)
- Deep Learning for the Benes Filter [91.3755431537592]
We present a new numerical method based on the mesh-free neural network representation of the density of the solution of the Benes model.
We discuss the role of nonlinearity in the filtering model equations for the choice of the domain of the neural network.
arXiv Detail & Related papers (2022-03-09T14:08:38Z)
- Nonlinear Isometric Manifold Learning for Injective Normalizing Flows [58.720142291102135]
We use isometries to separate manifold learning and density estimation.
We also employ autoencoders to design embeddings with explicit inverses that do not distort the probability distribution.
arXiv Detail & Related papers (2022-03-08T08:57:43Z)
- Efficient CDF Approximations for Normalizing Flows [64.60846767084877]
We build upon the diffeomorphic properties of normalizing flows to estimate the cumulative distribution function (CDF) over a closed region.
Our experiments on popular flow architectures and UCI datasets show a marked improvement in sample efficiency as compared to traditional estimators.
arXiv Detail & Related papers (2022-02-23T06:11:49Z)
- Deep Variational Models for Collaborative Filtering-based Recommender Systems [63.995130144110156]
Deep learning provides accurate collaborative filtering models to improve recommender system results.
Our proposed models apply the variational concept to inject stochasticity in the latent space of the deep architecture.
Results show the superiority of the proposed approach in scenarios where the variational enrichment exceeds the injected noise effect.
arXiv Detail & Related papers (2021-07-27T08:59:39Z)
- Learning Likelihoods with Conditional Normalizing Flows [54.60456010771409]
Conditional normalizing flows (CNFs) are efficient in sampling and inference.
We present a study of CNFs where the base density to output space mapping is conditioned on an input x, to model conditional densities p(y|x).
arXiv Detail & Related papers (2019-11-29T19:17:58Z)
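The last entry above, "Learning Likelihoods with Conditional Normalizing Flows", models p(y|x) by conditioning the base-to-output mapping of a flow on an input x. A minimal sketch of that idea, assuming the simplest possible conditional flow (a single affine transform whose shift and log-scale come from a small network; dimensions and training data are synthetic placeholders), could look like this:

```python
# Minimal sketch of a conditional normalizing flow for p(y|x):
# a single conditional affine transform y = mu(x) + exp(s(x)) * z, z ~ N(0, I).
import math
import torch
import torch.nn as nn

class ConditionalAffineFlow(nn.Module):
    def __init__(self, x_dim, y_dim, hidden=64):
        super().__init__()
        # The conditioner maps x to the shift mu(x) and log-scale s(x).
        self.net = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * y_dim),
        )

    def log_prob(self, y, x):
        mu, s = self.net(x).chunk(2, dim=-1)
        z = (y - mu) * torch.exp(-s)                       # inverse transform y -> z
        base = -0.5 * (z ** 2 + math.log(2 * math.pi)).sum(-1)  # standard normal log-density
        return base - s.sum(-1)                            # change-of-variables correction

    def sample(self, x):
        mu, s = self.net(x).chunk(2, dim=-1)
        z = torch.randn_like(mu)
        return mu + torch.exp(s) * z                       # forward transform z -> y

# Toy usage: fit p(y|x) on synthetic data by maximum likelihood.
flow = ConditionalAffineFlow(x_dim=3, y_dim=2)
opt = torch.optim.Adam(flow.parameters(), lr=1e-3)
for _ in range(200):
    x = torch.randn(128, 3)
    y = x[:, :2] + 0.1 * torch.randn(128, 2)               # synthetic conditional data
    loss = -flow.log_prob(y, x).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```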
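For the "Normalizing Flow-based Differentiable Particle Filters" entry above, the toy sketch referenced there is given below. The transition, likelihood, and proposal densities are hand-written Gaussians standing in for the paper's conditional normalizing flows; the point of the sketch is only the importance-weighting pattern (likelihood times transition over proposal), not the paper's actual method.

```python
# Minimal sketch: a particle filter with pluggable transition, likelihood,
# and proposal densities. Conditional normalizing flows would replace the
# hand-written Gaussians used here as stand-ins.
import numpy as np

rng = np.random.default_rng(1)

def gauss_logpdf(x, mean, var):
    return -0.5 * (np.log(2 * np.pi * var) + (x - mean) ** 2 / var)

def transition_logpdf(x_new, x_prev):      # log p(x_t | x_{t-1}), stand-in
    return gauss_logpdf(x_new, 0.9 * x_prev, 0.1)

def likelihood_logpdf(y, x):               # log p(y_t | x_t), stand-in
    return gauss_logpdf(y, x, 0.2)

def proposal_sample_and_logpdf(x_prev, y): # q(x_t | x_{t-1}, y_t), stand-in
    mean = 0.5 * (0.9 * x_prev + y)        # crude observation-aware proposal
    x_new = mean + np.sqrt(0.1) * rng.normal(size=x_prev.shape)
    return x_new, gauss_logpdf(x_new, mean, 0.1)

def particle_filter(observations, n_particles=500):
    x = rng.normal(size=n_particles)
    means = []
    for y in observations:
        x_new, log_q = proposal_sample_and_logpdf(x, y)
        # Importance weight: likelihood * transition / proposal (in log space).
        log_w = likelihood_logpdf(y, x_new) + transition_logpdf(x_new, x) - log_q
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        means.append(np.sum(w * x_new))
        # Multinomial resampling to fight weight degeneracy.
        idx = rng.choice(n_particles, size=n_particles, p=w)
        x = x_new[idx]
    return np.array(means)

print(particle_filter(rng.normal(size=10))[:3])
```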