Error Bounds for Kernel-Based Linear System Identification with Unknown
Hyperparameters
- URL: http://arxiv.org/abs/2303.09842v1
- Date: Fri, 17 Mar 2023 08:52:16 GMT
- Title: Error Bounds for Kernel-Based Linear System Identification with Unknown
Hyperparameters
- Authors: Mingzhou Yin, Roy S. Smith
- Abstract summary: The kernel-based method has been successfully applied in linear system identification using stable kernel designs.
From a Gaussian process perspective, it automatically provides probabilistic error bounds for the identified models.
- Score: 0.38073142980733
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The kernel-based method has been successfully applied in linear system
identification using stable kernel designs. From a Gaussian process
perspective, it automatically provides probabilistic error bounds for the
identified models from the posterior covariance, which are useful in robust and
stochastic control. However, the error bounds require knowledge of the true
hyperparameters in the kernel design and are demonstrated to be inaccurate with
estimated hyperparameters for lightly damped systems or in the presence of high
noise. In this work, we provide reliable quantification of the estimation error
when the hyperparameters are unknown. The bounds are obtained by first
constructing a high-probability set for the true hyperparameters from the
marginal likelihood function and then finding the worst-case posterior
covariance within the set. The proposed bound is proven to contain the true
model with a high probability and its validity is verified in numerical
simulation.
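Below is a minimal Python sketch of the idea described in the abstract, for illustration only and not the authors' implementation: an FIR model is identified with a first-order stable (TC) kernel, a high-probability hyperparameter set is formed by thresholding the negative log marginal likelihood at a chi-square quantile, and the worst-case posterior standard deviation over that set is used as a pointwise error bound. The TC kernel, the hyperparameter grid, the assumed known noise variance, and the threshold level are all simplifying assumptions.

```python
# Illustrative sketch only (not the authors' implementation): FIR identification with a
# stable TC kernel, a marginal-likelihood confidence set over hyperparameters, and a
# worst-case posterior bound over that set. Hyperparameter grid, kernel choice, known
# noise variance and the chi-square threshold are simplifying assumptions.
import numpy as np
from scipy.linalg import toeplitz
from scipy.stats import chi2

rng = np.random.default_rng(0)
n, N, sigma2 = 30, 120, 0.05                # FIR length, data length, noise variance

def tc_kernel(c, lam, n):
    """Tuned/Correlated (TC) kernel: K[i, j] = c * lam**max(i, j)."""
    idx = np.arange(1, n + 1)
    return c * lam ** np.maximum.outer(idx, idx)

# Simulated data: y = Phi @ g_true + noise, with Phi built from a random input signal.
g_true = 0.8 ** np.arange(1, n + 1) * np.cos(0.5 * np.arange(n))
u = rng.standard_normal(N + n)
Phi = toeplitz(u[n - 1:N + n - 1], u[n - 1::-1])
y = Phi @ g_true + np.sqrt(sigma2) * rng.standard_normal(N)

def neg_log_marglik(theta):
    """Negative log marginal likelihood of y for hyperparameters theta = (c, lam)."""
    S = Phi @ tc_kernel(*theta, n) @ Phi.T + sigma2 * np.eye(N)
    _, logdet = np.linalg.slogdet(S)
    return 0.5 * (logdet + y @ np.linalg.solve(S, y))

def posterior(theta):
    """GP posterior mean and covariance of the impulse response for given theta."""
    K = tc_kernel(*theta, n)
    S = Phi @ K @ Phi.T + sigma2 * np.eye(N)
    G = K @ Phi.T @ np.linalg.inv(S)
    return G @ y, K - G @ Phi @ K

# Step 1: high-probability hyperparameter set from the marginal likelihood, keeping all
# grid points whose likelihood-ratio statistic stays below a chi-square quantile.
grid = [(c, lam) for c in np.linspace(0.1, 2.0, 15) for lam in np.linspace(0.5, 0.95, 15)]
nll = np.array([neg_log_marglik(th) for th in grid])
threshold = nll.min() + 0.5 * chi2.ppf(0.95, df=2)       # two hyperparameters
conf_set = [th for th, v in zip(grid, nll) if v <= threshold]

# Step 2: worst-case posterior standard deviation over the set gives the error bound.
theta_hat = grid[int(np.argmin(nll))]
g_hat, _ = posterior(theta_hat)
worst_sd = np.max([np.sqrt(np.maximum(np.diag(posterior(th)[1]), 0.0))
                   for th in conf_set], axis=0)
bound = 2.0 * worst_sd                                   # illustrative ~95% pointwise bound
print("fraction of true coefficients inside the bound:",
      np.mean(np.abs(g_true - g_hat) <= bound))
```

In the paper's setting the bound is proven to contain the true model with high probability; the sketch above only checks the empirical coverage on simulated data.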
Related papers
- Stochastic Marginal Likelihood Gradients using Neural Tangent Kernels [78.6096486885658]
We introduce lower bounds to the linearized Laplace approximation of the marginal likelihood.
These bounds are amenable to gradient-based optimization and allow estimation accuracy to be traded off against computational complexity.
arXiv Detail & Related papers (2023-06-06T19:02:57Z) - Probabilities Are Not Enough: Formal Controller Synthesis for Stochastic
Dynamical Models with Epistemic Uncertainty [68.00748155945047]
Capturing uncertainty in models of complex dynamical systems is crucial to designing safe controllers.
Several approaches use formal abstractions to synthesize policies that satisfy temporal specifications related to safety and reachability.
Our contribution is a novel abstraction-based controller synthesis method for continuous-state models with noise, uncertain parameters, and external disturbances.
arXiv Detail & Related papers (2022-10-12T07:57:03Z) - Scalable Gaussian Process Hyperparameter Optimization via Coverage
Regularization [0.0]
We present a novel algorithm which estimates the smoothness and length-scale parameters of the Matérn kernel in order to improve the robustness of the resulting prediction uncertainties.
We achieve improved uncertainty quantification over the leave-one-out likelihood while maintaining a high degree of scalability, as demonstrated in numerical experiments.
arXiv Detail & Related papers (2022-09-22T19:23:37Z) - Off-the-grid learning of sparse mixtures from a continuous dictionary [0.0]
We consider a general non-linear model where the signal is a finite mixture of an unknown, possibly growing, number of features issued from a continuous dictionary parameterized by a real non-linear parameter.
We propose an off-the-grid optimization method, that is, a method which does not use any discretization scheme on the parameter space.
We use recent results on the geometry of off-the-grid methods to give a minimal separation condition on the true underlying non-linear parameters under which interpolating certificate functions can be constructed.
arXiv Detail & Related papers (2022-06-29T07:55:20Z) - Experimental Design for Linear Functionals in Reproducing Kernel Hilbert
Spaces [102.08678737900541]
We provide algorithms for constructing bias-aware designs for linear functionals.
We derive non-asymptotic confidence sets for fixed and adaptive designs under sub-Gaussian noise.
arXiv Detail & Related papers (2022-05-26T20:56:25Z) - Hybrid Random Features [60.116392415715275]
We propose a new class of random feature methods for linearizing softmax and Gaussian kernels called hybrid random features (HRFs).
HRFs automatically adapt the quality of kernel estimation to provide the most accurate approximation in the defined regions of interest.
arXiv Detail & Related papers (2021-10-08T20:22:59Z) - Gaussian Process Uniform Error Bounds with Unknown Hyperparameters for
Safety-Critical Applications [71.23286211775084]
We introduce robust Gaussian process uniform error bounds in settings with unknown hyperparameters.
Our approach computes a confidence region in the space of hyperparameters, which enables us to obtain a probabilistic upper bound for the model error.
Experiments show that the bound performs significantly better than vanilla and fully Bayesian Gaussian processes.
arXiv Detail & Related papers (2021-09-06T17:10:01Z) - Uniform Error and Posterior Variance Bounds for Gaussian Process
Regression with Application to Safe Control [11.42419064387567]
We present a novel uniform error bound using Lipschitz continuity and an analysis of the posterior variance function for a large class of kernels.
We show how these results can be used to guarantee safe control of an unknown dynamical system.
arXiv Detail & Related papers (2021-01-13T20:06:30Z) - Gaussian Process-based Min-norm Stabilizing Controller for
Control-Affine Systems with Uncertain Input Effects and Dynamics [90.81186513537777]
We propose a novel compound kernel that captures the control-affine nature of the problem.
We show that the resulting optimization problem is convex, and we call it the Gaussian Process-based Control Lyapunov Function Second-Order Cone Program (GP-CLF-SOCP).
arXiv Detail & Related papers (2020-11-14T01:27:32Z) - Marginalised Gaussian Processes with Nested Sampling [10.495114898741203]
Gaussian process (GP) models provide a rich distribution over functions with inductive biases controlled by a kernel function.
This work presents an alternative learning procedure where the hyperparameters of the kernel function are marginalised using Nested Sampling (NS); a simplified sketch of hyperparameter marginalisation is given after this list.
arXiv Detail & Related papers (2020-10-30T16:04:35Z)
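As a companion to the last entry above (Marginalised Gaussian Processes with Nested Sampling), the following sketch illustrates hyperparameter marginalisation for GP regression in its simplest form: predictions are averaged over a grid of kernel hyperparameters weighted by their marginal likelihood under a flat prior. This is a simplified stand-in rather than the paper's nested-sampling procedure; the squared-exponential kernel, the grid, and the fixed noise level are assumptions made only for this illustration.

```python
# Minimal stand-in for hyperparameter marginalisation in GP regression (not the paper's
# nested-sampling procedure): average predictions over a hyperparameter grid weighted by
# the marginal likelihood, assuming a flat prior and a fixed noise level.
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 40)
y = np.sin(6.0 * x) + 0.1 * rng.standard_normal(x.size)
x_star = np.linspace(0.0, 1.0, 200)
noise = 0.01

def rbf(a, b, ell, s2):
    """Squared-exponential kernel with lengthscale ell and signal variance s2."""
    return s2 * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / ell ** 2)

def log_marglik(ell, s2):
    """Log marginal likelihood of the data under hyperparameters (ell, s2)."""
    K = rbf(x, x, ell, s2) + noise * np.eye(x.size)
    _, logdet = np.linalg.slogdet(K)
    return -0.5 * (logdet + y @ np.linalg.solve(K, y))

def predict(ell, s2):
    """GP predictive mean at the test inputs for fixed hyperparameters."""
    K = rbf(x, x, ell, s2) + noise * np.eye(x.size)
    return rbf(x_star, x, ell, s2) @ np.linalg.solve(K, y)

# Grid over hyperparameters; weights proportional to the marginal likelihood.
ells = np.linspace(0.05, 0.5, 20)
s2s = np.linspace(0.2, 2.0, 20)
grid = [(l, s) for l in ells for s in s2s]
logw = np.array([log_marglik(l, s) for l, s in grid])
w = np.exp(logw - logw.max())
w /= w.sum()

# Marginalised predictive mean: a weighted average of per-hyperparameter predictions,
# instead of plugging in a single maximum-marginal-likelihood estimate.
mu_marginal = sum(wi * predict(l, s) for wi, (l, s) in zip(w, grid))
print("marginalised prediction at x = 0.5:",
      mu_marginal[np.argmin(np.abs(x_star - 0.5))])
```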
This list is automatically generated from the titles and abstracts of the papers on this site.