Variational Nonlinear System Identification
- URL: http://arxiv.org/abs/2012.05072v1
- Date: Tue, 8 Dec 2020 05:43:50 GMT
- Title: Variational Nonlinear System Identification
- Authors: Jarrad Courts, Adrian Wills, Thomas Schön, Brett Ninness
- Abstract summary: This paper considers parameter estimation for nonlinear state-space models, which is an important but challenging problem.
We employ a variational inference (VI) approach, which is a principled method that has deep connections to maximum likelihood estimation.
This VI approach ultimately provides estimates of the model as solutions to an optimisation problem, which is deterministic, tractable and can be solved using standard optimisation tools.
- Score: 0.8793721044482611
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper considers parameter estimation for nonlinear state-space models,
which is an important but challenging problem. We address this challenge by
employing a variational inference (VI) approach, which is a principled method
that has deep connections to maximum likelihood estimation. This VI approach
ultimately provides estimates of the model as solutions to an optimisation
problem, which is deterministic, tractable and can be solved using standard
optimisation tools. A specialisation of this approach for systems with additive
Gaussian noise is also detailed. The proposed method is examined numerically on
a range of simulation and real examples with a focus on robustness to parameter
initialisations; we additionally perform favourable comparisons against
state-of-the-art alternatives.
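To make the abstract's central idea concrete — that variational inference turns state and parameter estimation into a deterministic, tractable optimisation problem — here is a minimal sketch, not the paper's actual algorithm. It assumes a 1-D linear-Gaussian state-space model and a fully factorised Gaussian assumed density; all names and settings are illustrative.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

# Simulate a 1-D linear-Gaussian state-space model:
#   x[t+1] = a*x[t] + w[t],  w ~ N(0, q)
#   y[t]   = x[t] + v[t],    v ~ N(0, r)
a_true, q, r, T = 0.9, 0.1, 0.1, 100
x = np.zeros(T)
for t in range(1, T):
    x[t] = a_true * x[t - 1] + rng.normal(0, np.sqrt(q))
y = x + rng.normal(0, np.sqrt(r), T)

def neg_elbo(params):
    # params: model parameter a, then means m[0..T-1] and
    # log-variances of a factorised Gaussian q(x).
    a = params[0]
    m = params[1:T + 1]
    v = np.exp(params[T + 1:])            # variances, kept positive
    # E_q[log p(y|x)] with y[t] ~ N(x[t], r): E_q[(y-x)^2] = (y-m)^2 + v
    ll_obs = -0.5 * np.sum(((y - m) ** 2 + v) / r + np.log(2 * np.pi * r))
    # E_q[log p(x[t+1]|x[t])] for the AR(1) transition
    dm = m[1:] - a * m[:-1]
    ll_trans = -0.5 * np.sum((dm ** 2 + v[1:] + a ** 2 * v[:-1]) / q
                             + np.log(2 * np.pi * q))
    # entropy of the factorised Gaussian q
    ent = 0.5 * np.sum(np.log(2 * np.pi * np.e * v))
    return -(ll_obs + ll_trans + ent)

# A single deterministic optimisation over (a, variational parameters)
init = np.concatenate([[0.5], y, np.full(T, -2.0)])
res = minimize(neg_elbo, init, method="L-BFGS-B")
a_hat = res.x[0]
print(f"true a = {a_true}, VI estimate = {a_hat:.3f}")
```

Because the ELBO expectations are available in closed form here, the whole estimation problem is a standard smooth optimisation, solvable with off-the-shelf tools — the property the abstract emphasises.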
Related papers
- Towards Convexity in Anomaly Detection: A New Formulation of SSLM with Unique Optimal Solutions [12.250410918282615]
An unresolved issue in widely used methods such as Support Vector Data Description (SVDD) and the Small Sphere and Large Margin SVM (SSLM) is their non-convexity.
We introduce a novel SSLM formulation with unique optimal solutions, a property shown to be unattainable with traditional non-convex approaches.
arXiv Detail & Related papers (2024-10-31T09:42:39Z) - End-to-End Learning for Fair Multiobjective Optimization Under Uncertainty [55.04219793298687]
The Predict-Then-Optimize (PtO) paradigm in machine learning aims to maximize downstream decision quality.
This paper extends the PtO methodology to optimization problems with nondifferentiable Ordered Weighted Averaging (OWA) objectives.
It shows how optimization of OWA functions can be effectively integrated with parametric prediction for fair and robust optimization under uncertainty.
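The OWA objective the blurb refers to is easy to state: weights are applied to the *sorted* outcomes rather than to fixed positions, which is what makes it both fairness-oriented and nondifferentiable. A generic sketch (not the paper's method; names are illustrative):

```python
import numpy as np

def owa(values, weights):
    """Ordered Weighted Averaging: weights are applied to the sorted values.

    Sorting ascending and placing the largest weights first emphasises the
    worst outcomes, the usual fairness-oriented convention. The sort makes
    the function piecewise linear and nondifferentiable at ties.
    """
    return np.sort(values) @ weights

outcomes = np.array([3.0, 1.0, 4.0, 2.0])
fair_w = np.array([0.4, 0.3, 0.2, 0.1])   # decreasing weights -> fairness emphasis
print(owa(outcomes, fair_w))               # 0.4*1 + 0.3*2 + 0.2*3 + 0.1*4 = 2.0
```

Note that OWA is symmetric in its arguments: permuting the outcomes leaves the value unchanged, and uniform weights recover the plain mean.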
arXiv Detail & Related papers (2024-02-12T16:33:35Z) - Likelihood Ratio Confidence Sets for Sequential Decision Making [51.66638486226482]
We revisit the likelihood-based inference principle and propose to use likelihood ratios to construct valid confidence sequences.
Our method is especially suitable for problems with well-specified likelihoods.
We show how to provably choose the best sequence of estimators and shed light on connections to online convex optimization.
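A generic illustration of likelihood-ratio confidence sequences (not this paper's construction): with a predictable plug-in likelihood in the numerator, Ville's inequality makes the set of parameters whose likelihood ratio stays below 1/α a valid anytime confidence set. The sketch below uses Bernoulli data and the Krichevsky–Trofimov plug-in; all choices are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
theta_true, n, alpha = 0.3, 500, 0.05
x = rng.binomial(1, theta_true, n)

# Predictable plug-in (Krichevsky-Trofimov) log-likelihood:
# the estimate used at step i depends only on x[0..i-1].
k = 0          # running count of ones
log_plugin = 0.0
for i, xi in enumerate(x):
    p1 = (k + 0.5) / (i + 1)               # KT predictive P(x_i = 1)
    log_plugin += np.log(p1 if xi == 1 else 1.0 - p1)
    k += xi

# Confidence set: all theta whose likelihood ratio is below 1/alpha.
grid = np.linspace(0.001, 0.999, 999)
log_lik = k * np.log(grid) + (n - k) * np.log(1.0 - grid)
conf_set = grid[log_plugin - log_lik < np.log(1.0 / alpha)]
print(f"confidence set: [{conf_set.min():.3f}, {conf_set.max():.3f}]")
```

Because the Bernoulli log-likelihood is concave in theta, the set is an interval, and it always contains (a grid neighbour of) the maximum-likelihood estimate k/n.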
arXiv Detail & Related papers (2023-11-08T00:10:21Z) - An Optimization-based Deep Equilibrium Model for Hyperspectral Image Deconvolution with Convergence Guarantees [71.57324258813675]
We propose a novel methodology for addressing the hyperspectral image deconvolution problem.
A new optimization problem is formulated, leveraging a learnable regularizer in the form of a neural network.
The derived iterative solver is then expressed as a fixed-point calculation problem within the Deep Equilibrium framework.
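The Deep Equilibrium idea mentioned here is that the layer's output is defined implicitly as a fixed point z* = f(z*) and the forward pass is a fixed-point computation. A minimal generic sketch (not the paper's deconvolution solver; the map and sizes are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8
W = rng.normal(size=(d, d))
W *= 0.5 / np.linalg.norm(W, 2)   # spectral norm 0.5 => f is a contraction
x = rng.normal(size=d)            # layer input

def f(z):
    """Illustrative equilibrium map; contraction guarantees convergence."""
    return np.tanh(W @ z + x)

# Forward pass of a deep equilibrium layer: iterate to the fixed point.
z = np.zeros(d)
for _ in range(100):
    z = f(z)

print("fixed-point residual:", np.linalg.norm(z - f(z)))
```

Keeping the map contractive (here via the spectral-norm rescaling) is what gives the convergence guarantee the title refers to.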
arXiv Detail & Related papers (2023-06-10T08:25:16Z) - Robust identification of non-autonomous dynamical systems using stochastic dynamics models [0.0]
This paper considers the problem of system identification (ID) of linear and nonlinear non-autonomous systems from noisy and sparse data.
We propose and analyze an objective function derived from a Bayesian formulation for learning a hidden Markov model.
We show that our proposed approach has improved smoothness and inherent regularization that make it well-suited for system ID.
arXiv Detail & Related papers (2022-12-20T16:36:23Z) - A Variational Inference Approach to Inverse Problems with Gamma Hyperpriors [60.489902135153415]
This paper introduces a variational iterative alternating scheme for hierarchical inverse problems with gamma hyperpriors.
The proposed variational inference approach yields accurate reconstruction, provides meaningful uncertainty quantification, and is easy to implement.
arXiv Detail & Related papers (2021-11-26T06:33:29Z) - Linear-Time Probabilistic Solutions of Boundary Value Problems [27.70274403550477]
We introduce a Gauss–Markov prior and tailor it specifically to BVPs.
This allows computing a posterior distribution over the solution in linear time, at a quality and cost comparable to that of well-established, non-probabilistic methods.
arXiv Detail & Related papers (2021-06-14T21:19:17Z) - Variational State and Parameter Estimation [0.8049701904919515]
This paper considers the problem of computing Bayesian estimates of both states and model parameters for nonlinear state-space models.
A variational approach is used to provide an assumed density which approximates the desired, intractable, distribution.
The proposed method is compared against state-of-the-art Hamiltonian Monte Carlo in two numerical examples.
arXiv Detail & Related papers (2020-12-14T05:35:29Z) - Robust, Accurate Stochastic Optimization for Variational Inference [68.83746081733464]
We show that common optimization methods lead to poor variational approximations if the problem is moderately large.
Motivated by these findings, we develop a more robust and accurate optimization framework by viewing the underlying algorithm as producing a Markov chain.
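The "algorithm as a Markov chain" viewpoint suggests averaging the chain's stationary phase rather than trusting the last iterate. A generic sketch of that idea (not this paper's framework), using noisy gradients of a simple quadratic as a stand-in for a stochastic variational objective:

```python
import numpy as np

rng = np.random.default_rng(0)
mu = np.array([1.0, -2.0])        # true optimum

def noisy_grad(x):
    """Stochastic gradient of f(x) = 0.5*||x - mu||^2 (illustrative)."""
    return (x - mu) + rng.normal(scale=1.0, size=x.shape)

# Constant-step SGD: the iterates form a Markov chain that hovers
# around the optimum rather than converging to it.
x = np.zeros(2)
step, T = 0.1, 5000
iterates = np.empty((T, 2))
for t in range(T):
    x = x - step * noisy_grad(x)
    iterates[t] = x

last_err = np.linalg.norm(x - mu)
avg_err = np.linalg.norm(iterates[T // 2:].mean(axis=0) - mu)  # average the tail
print(f"last-iterate error {last_err:.3f}, tail-averaged error {avg_err:.3f}")
```

Averaging the tail of the chain cancels much of the stationary noise, which is one route to the more robust and accurate estimates the abstract describes.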
arXiv Detail & Related papers (2020-09-01T19:12:11Z) - Variable selection for Gaussian process regression through a sparse projection [0.802904964931021]
This paper presents a new variable selection approach integrated with Gaussian process (GP) regression.
The choice of tuning parameters and the accuracy of the estimation are evaluated in simulation studies against several chosen benchmark approaches.
arXiv Detail & Related papers (2020-08-25T01:06:10Z) - Implicit differentiation of Lasso-type models for hyperparameter optimization [82.73138686390514]
We introduce an efficient implicit differentiation algorithm, without matrix inversion, tailored for Lasso-type problems.
Our approach scales to high-dimensional data by leveraging the sparsity of the solutions.
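The core trick this blurb alludes to can be sketched generically (this is not the paper's algorithm): on the Lasso's active set S with signs s, the solution satisfies b_S = (X_S^T X_S)^{-1}(X_S^T y - lam*s), so db_S/dlam = -(X_S^T X_S)^{-1} s, giving a hyperparameter gradient without differentiating through the solver. All problem sizes and names below are illustrative, and a basic proximal-gradient solver stands in for a production Lasso.

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 50, 10
X = rng.normal(size=(n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.1 * rng.normal(size=n)
X_val = rng.normal(size=(n, p))
y_val = X_val @ beta_true + 0.1 * rng.normal(size=n)

def lasso_ista(X, y, lam, n_iter=20000):
    """Solve min_b 0.5*||y - X b||^2 + lam*||b||_1 by proximal gradient."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the smooth part
    b = np.zeros(X.shape[1])
    for _ in range(n_iter):
        z = b - X.T @ (X @ b - y) / L
        b = np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)
    return b

def val_loss_and_grad(lam):
    b = lasso_ista(X, y, lam)
    S = np.flatnonzero(b)                  # active set (exact zeros elsewhere)
    s = np.sign(b[S])
    # Implicit differentiation restricted to the active set:
    db = -np.linalg.solve(X[:, S].T @ X[:, S], s)
    resid = X_val @ b - y_val
    loss = 0.5 * resid @ resid
    grad = resid @ X_val[:, S] @ db        # chain rule through b_S only
    return loss, grad

lam = 2.0
loss, grad = val_loss_and_grad(lam)
eps = 1e-3
fd = (val_loss_and_grad(lam + eps)[0] - val_loss_and_grad(lam - eps)[0]) / (2 * eps)
print(f"implicit gradient = {grad:.4f}, finite difference = {fd:.4f}")
```

Restricting the linear solve to the active set is what exploits sparsity: only a |S| x |S| system is inverted, however large p is, which is the scaling behaviour the blurb highlights.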
arXiv Detail & Related papers (2020-02-20T18:43:42Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.