Joint State Estimation and Noise Identification Based on Variational Optimization
- URL: http://arxiv.org/abs/2312.09585v1
- Date: Fri, 15 Dec 2023 07:47:03 GMT
- Authors: Hua Lan and Shijie Zhao and Jinjie Hu and Zengfu Wang and Jing Fu
- Abstract summary: A novel adaptive Kalman filter method based on conjugate-computation variational inference, referred to as CVIAKF, is proposed.
The effectiveness of CVIAKF is validated through synthetic and real-world datasets of maneuvering target tracking.
- Score: 8.536356569523127
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this article, the state estimation problems with unknown process noise and
measurement noise covariances for both linear and nonlinear systems are
considered. By formulating the joint estimation of system state and noise
parameters into an optimization problem, a novel adaptive Kalman filter method
based on conjugate-computation variational inference, referred to as CVIAKF, is
proposed to approximate the joint posterior probability density function of the
latent variables. Unlike the existing adaptive Kalman filter methods utilizing
variational inference in natural-parameter space, CVIAKF performs optimization
in expectation-parameter space, resulting in a faster and simpler solution.
Meanwhile, CVIAKF splits the optimization objective for nonlinear dynamical
models into conjugate and non-conjugate parts, to which conjugate computations
and stochastic mirror descent are applied, respectively. The reparameterization
trick is used to reduce the variance of the stochastic gradients of the
non-conjugate parts. The effectiveness of CVIAKF is
validated through synthetic and real-world datasets of maneuvering target
tracking.
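To make the joint state/noise estimation idea concrete, the sketch below implements a plain variational-Bayes adaptive Kalman filter that alternates coordinate-ascent updates between the state posterior q(x) and a diagonal measurement-noise covariance posterior q(R). This is a minimal illustration of the general VB-AKF pattern, not CVIAKF itself: the expectation-parameter mirror-descent updates and the conjugate/non-conjugate split described in the abstract are replaced here by simple fixed-point iterations, and the inverse-Gamma noise model, forgetting factor, and function name are illustrative assumptions.

```python
import numpy as np

def vb_adaptive_kf_step(m, P, y, F, Q, H, rho=0.99,
                        alpha=None, beta=None, n_iter=5):
    """One step of a variational-Bayes adaptive Kalman filter (sketch).

    Jointly updates the Gaussian state posterior N(m, P) and a diagonal
    measurement-noise covariance R, modeled with independent
    inverse-Gamma(alpha_i, beta_i) factors.
    """
    d = y.shape[0]
    if alpha is None:
        alpha = np.ones(d)
        beta = np.ones(d)

    # Kalman prediction step
    m_pred = F @ m
    P_pred = F @ P @ F.T + Q

    # Heuristic forgetting: lets the noise estimate track slow changes in R
    alpha = rho * alpha
    beta = rho * beta
    alpha_post = alpha + 0.5

    # Coordinate ascent between q(x) and q(R)
    beta_post = beta.copy()
    for _ in range(n_iter):
        R_hat = np.diag(beta_post / alpha_post)      # E[R] under current q(R)
        S = H @ P_pred @ H.T + R_hat                 # innovation covariance
        K = P_pred @ H.T @ np.linalg.solve(S, np.eye(d))
        m_post = m_pred + K @ (y - H @ m_pred)
        P_post = P_pred - K @ H @ P_pred
        # Update q(R) from the posterior residual statistics
        resid = y - H @ m_post
        beta_post = beta + 0.5 * (resid**2 + np.diag(H @ P_post @ H.T))

    return m_post, P_post, alpha_post, beta_post
```

Called once per measurement, the returned `(alpha_post, beta_post)` are fed back in as the noise prior for the next step, so the filter adapts R online while filtering the state.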
Related papers
- Accelerated zero-order SGD under high-order smoothness and overparameterized regime [79.85163929026146]
We present a novel gradient-free algorithm to solve convex optimization problems.
Such problems are encountered in medicine, physics, and machine learning.
We provide convergence guarantees for the proposed algorithm under both types of noise.
arXiv Detail & Related papers (2024-11-21T10:26:17Z) - Semi-Implicit Functional Gradient Flow [30.32233517392456]
We propose a functional gradient ParVI method that uses perturbed particles as the approximation family.
The corresponding functional gradient flow, which can be estimated via denoising score matching, exhibits a strong theoretical convergence guarantee.
arXiv Detail & Related papers (2024-10-23T15:00:30Z) - Variance-Reducing Couplings for Random Features [57.73648780299374]
Random features (RFs) are a popular technique to scale up kernel methods in machine learning.
We find couplings to improve RFs defined on both Euclidean and discrete input spaces.
We reach surprising conclusions about the benefits and limitations of variance reduction as a paradigm.
arXiv Detail & Related papers (2024-05-26T12:25:09Z) - Nonparametric Automatic Differentiation Variational Inference with
Spline Approximation [7.5620760132717795]
We develop a nonparametric approximation approach that enables flexible posterior approximation for distributions with complicated structures.
Compared with widely used nonparametric inference methods, the proposed method is easy to implement and adapts to various data structures.
Experiments demonstrate the efficiency of the proposed method in approximating complex posterior distributions and improving the performance of generative models with incomplete data.
arXiv Detail & Related papers (2024-03-10T20:22:06Z) - Towards stable real-world equation discovery with assessing
differentiating quality influence [52.2980614912553]
We propose alternatives to the commonly used finite differences-based method.
We evaluate these methods in terms of their applicability to problems similar to real-world ones, and their ability to ensure the convergence of equation discovery algorithms.
arXiv Detail & Related papers (2023-11-09T23:32:06Z) - CoLiDE: Concomitant Linear DAG Estimation [12.415463205960156]
We address the problem of learning an acyclic graph structure from observational data generated by a linear structural equation model.
We propose a new convex score function for sparsity-aware learning of DAGs.
arXiv Detail & Related papers (2023-10-04T15:32:27Z) - Learning Unnormalized Statistical Models via Compositional Optimization [73.30514599338407]
Noise-contrastive estimation (NCE) has been proposed by formulating the objective as the logistic loss between the real data and artificial noise.
In this paper, we study a direct approach for optimizing the negative log-likelihood of unnormalized models.
arXiv Detail & Related papers (2023-06-13T01:18:16Z) - Low-rank extended Kalman filtering for online learning of neural
networks from streaming data [71.97861600347959]
We propose an efficient online approximate Bayesian inference algorithm for estimating the parameters of a nonlinear function from a potentially non-stationary data stream.
The method is based on the extended Kalman filter (EKF), but uses a novel low-rank plus diagonal decomposition of the posterior matrix.
In contrast to methods based on variational inference, our method is fully deterministic, and does not require step-size tuning.
arXiv Detail & Related papers (2023-05-31T03:48:49Z) - Variational Nonlinear Kalman Filtering with Unknown Process Noise
Covariance [24.23243651301339]
This paper presents a solution for joint nonlinear state estimation and model parameter identification based on the approximate Bayesian inference principle.
The performance of the proposed method is verified on radar target tracking applications by both simulated and real-world data.
arXiv Detail & Related papers (2023-05-06T03:34:39Z) - Variational Kalman Filtering with Hinf-Based Correction for Robust
Bayesian Learning in High Dimensions [2.294014185517203]
We address the convergence of the sequential variational inference filter (VIF) through a robust variational objective and an Hinf-norm based correction.
A novel VIF-Hinf recursion that alternates variational inference and Hinf-based optimization steps is proposed.
arXiv Detail & Related papers (2022-04-27T17:38:13Z) - A Priori Denoising Strategies for Sparse Identification of Nonlinear
Dynamical Systems: A Comparative Study [68.8204255655161]
We investigate and compare the performance of several local and global smoothing techniques to a priori denoise the state measurements.
We show that, in general, global methods, which use the entire measurement data set, outperform local methods, which employ a neighboring data subset around a local point.
arXiv Detail & Related papers (2022-01-29T23:31:25Z)
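The local-versus-global distinction in the last entry can be illustrated with a toy denoising comparison: a moving average uses only a neighboring data subset around each point (local), while FFT low-pass filtering uses the entire measurement record (global). The sinusoidal signal, noise level, window width, and frequency cutoff below are arbitrary choices for illustration, not the techniques compared in the cited paper.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200, endpoint=False)
x_true = np.sin(2 * np.pi * t)                        # clean state trajectory
x_meas = x_true + 0.3 * rng.standard_normal(t.size)   # noisy measurements

# Local smoothing: centered moving average over a 5-sample neighborhood
w = 5
x_local = np.convolve(x_meas, np.ones(w) / w, mode="same")

# Global smoothing: low-pass filter built from the full record (FFT truncation)
X = np.fft.rfft(x_meas)
X[4:] = 0.0                                           # keep only the lowest modes
x_global = np.fft.irfft(X, n=t.size)

def rmse(a, b):
    return np.sqrt(np.mean((a - b) ** 2))

print(rmse(x_meas, x_true), rmse(x_local, x_true), rmse(x_global, x_true))
```

On this example the global method recovers the signal more accurately, consistent with the paper's finding that methods using the entire data set tend to outperform local neighborhood-based smoothing.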
This list is automatically generated from the titles and abstracts of the papers in this site.