Sample Complexity Bounds for Linear System Identification from a Finite Set
- URL: http://arxiv.org/abs/2409.11141v2
- Date: Mon, 02 Dec 2024 14:03:32 GMT
- Title: Sample Complexity Bounds for Linear System Identification from a Finite Set
- Authors: Nicolas Chatzikiriakos, Andrea Iannelli
- Abstract summary: We use the maximum likelihood estimator to identify the true system. We leverage tools from information theory to provide a lower bound on the sample complexity. The derived sample complexity bounds are analyzed analytically and numerically.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper considers a finite sample perspective on the problem of identifying an LTI system from a finite set of possible systems using trajectory data. To this end, we use the maximum likelihood estimator to identify the true system and provide an upper bound for its sample complexity. Crucially, the derived bound does not rely on a potentially restrictive stability assumption. Additionally, we leverage tools from information theory to provide a lower bound on the sample complexity that holds independently of the estimator used. The derived sample complexity bounds are analyzed analytically and numerically.
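As a concrete illustration of this setting, the sketch below (all names, noise model, and constants are illustrative assumptions, not taken from the paper) shows how, under i.i.d. Gaussian process noise, maximum likelihood selection from a finite set of candidate LTI systems reduces to picking the candidate with the smallest summed squared one-step prediction residuals:

```python
# Hypothetical sketch: MLE selection of an LTI system x_{t+1} = A x_t + w_t,
# w_t ~ N(0, I), from a finite candidate set. With Gaussian noise, the MLE
# is the candidate minimizing the summed squared one-step residuals.
import numpy as np

def mle_select(candidates, trajectory):
    """Return the index of the candidate A with the smallest residual loss."""
    losses = []
    for A in candidates:
        residuals = trajectory[1:] - trajectory[:-1] @ A.T
        losses.append(np.sum(residuals ** 2))
    return int(np.argmin(losses))

# Toy usage: two candidate systems, data generated by the first one.
rng = np.random.default_rng(0)
A_true = np.array([[0.9, 0.1], [0.0, 0.8]])
A_other = np.array([[0.5, 0.0], [0.2, 0.5]])
x = np.zeros((200, 2))
for t in range(199):
    x[t + 1] = A_true @ x[t] + rng.normal(size=2)
print(mle_select([A_true, A_other], x))  # selects index 0 (the true system)
```

With enough samples the residual gap between the true system and any other candidate grows linearly in the trajectory length, which is the mechanism the paper's upper bound quantifies.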
Related papers
- High Effort, Low Gain: Fundamental Limits of Active Learning for Linear Dynamical Systems [1.530715277464342]
We consider the problem of identifying an unknown linear dynamical system given a finite hypothesis class.
We present sample complexity lower bounds that capture the choice of the selected excitation input.
We propose an active learning algorithm that sequentially excites the system optimally with respect to the current estimate.
arXiv Detail & Related papers (2025-09-15T13:29:24Z)
- Outlier-Robust Linear System Identification Under Heavy-tailed Noise [2.07180164747172]
We consider the problem of estimating the state transition matrix of a linear time-invariant (LTI) system.
We develop a novel robust system identification algorithm that relies on constructing multiple weakly-concentrated estimators.
We show that our algorithm and analysis technique can be easily extended to account for scenarios where an adversary can arbitrarily corrupt a small fraction of the collected trajectory data.
arXiv Detail & Related papers (2024-12-31T12:53:02Z)
- Unveiling the Statistical Foundations of Chain-of-Thought Prompting Methods [59.779795063072655]
Chain-of-Thought (CoT) prompting and its variants have gained popularity as effective methods for solving multi-step reasoning problems.
We analyze CoT prompting from a statistical estimation perspective, providing a comprehensive characterization of its sample complexity.
arXiv Detail & Related papers (2024-08-25T04:07:18Z)
- A least-square method for non-asymptotic identification in linear switching control [17.938732931331064]
It is known that the underlying partially-observed linear dynamical system lies within a finite collection of known candidate models.
We characterize the finite-time sample complexity of this problem by leveraging recent advances in the non-asymptotic analysis of linear least-square methods.
We propose a data-driven switching strategy that identifies the unknown parameters of the underlying system.
arXiv Detail & Related papers (2024-04-11T20:55:38Z)
- Sample Complexity of the Sign-Perturbed Sums Identification Method: Scalar Case [0.0]
Sign-Perturbed Sum (SPS) is a powerful finite-sample system identification algorithm.
This paper studies the behaviour of SPS confidence intervals.
arXiv Detail & Related papers (2024-01-28T22:44:41Z)
- Fast Shapley Value Estimation: A Unified Approach [71.92014859992263]
We propose a straightforward and efficient Shapley estimator, SimSHAP, by eliminating redundant techniques.
In our analysis of existing approaches, we observe that estimators can be unified as a linear transformation of randomly summed values from feature subsets.
Our experiments validate the effectiveness of our SimSHAP, which significantly accelerates the computation of accurate Shapley values.
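The observation that estimators can be unified as averages over randomly sampled feature subsets can be illustrated with a basic permutation-sampling Shapley estimator (a generic sketch, not the SimSHAP algorithm itself; all names and the toy game are assumptions):

```python
# Hypothetical sketch (not SimSHAP): estimate Shapley values by averaging
# marginal contributions over randomly sampled feature orderings.
import numpy as np

def shapley_permutation(value_fn, n_features, n_samples=200, seed=0):
    """Monte Carlo Shapley estimate from random feature orderings."""
    rng = np.random.default_rng(seed)
    phi = np.zeros(n_features)
    for _ in range(n_samples):
        order = rng.permutation(n_features)
        subset = set()
        prev = value_fn(subset)
        for j in order:
            subset.add(j)
            cur = value_fn(subset)
            phi[j] += cur - prev   # marginal contribution of feature j
            prev = cur
    return phi / n_samples

# Toy usage: an additive game v(S) = sum of weights in S, whose exact
# Shapley values are the weights themselves.
weights = np.array([1.0, 2.0, 3.0])
v = lambda S: sum(weights[j] for j in S)
print(shapley_permutation(v, 3))  # [1. 2. 3.] exactly for an additive game
```

For additive games every marginal contribution of feature j equals its weight, so the estimator is exact here; for general value functions the variance of such sampling schemes is what faster estimators aim to reduce.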
arXiv Detail & Related papers (2023-11-02T06:09:24Z)
- On the Benefits of Leveraging Structural Information in Planning Over the Learned Model [3.3512508970931236]
We investigate the benefits of leveraging structural information about the system in terms of reducing sample complexity.
Our analysis shows that there can be a significant saving in sample complexity by leveraging structural information about the model.
arXiv Detail & Related papers (2023-03-15T18:18:01Z)
- On the Fundamental Limits of Matrix Completion: Leveraging Hierarchical Similarity Graphs [39.00971122472004]
We study the matrix completion problem that leverages hierarchical similarity graphs as side information in recommender systems.
Under a hierarchical block model, we characterize the exact information-theoretic limit on the number of observed matrix entries.
We analyze the optimal sample complexity and identify different regimes whose characteristics rely on quality metrics of side information of the hierarchical similarity graph.
arXiv Detail & Related papers (2021-09-12T02:38:54Z)
- Stochastic Approximation for Online Tensorial Independent Component Analysis [98.34292831923335]
Independent component analysis (ICA) has been a popular dimension reduction tool in statistical machine learning and signal processing.
In this paper, we present a by-product online tensorial algorithm that estimates for each independent component.
arXiv Detail & Related papers (2020-12-28T18:52:37Z)
- Revisiting the Sample Complexity of Sparse Spectrum Approximation of Gaussian Processes [60.479499225746295]
We introduce a new scalable approximation for Gaussian processes with provable guarantees which hold simultaneously over its entire parameter space.
Our approximation is obtained from an improved sample complexity analysis for sparse spectrum Gaussian processes (SSGPs).
arXiv Detail & Related papers (2020-11-17T05:41:50Z)
- Finite-time Identification of Stable Linear Systems: Optimality of the Least-Squares Estimator [79.3239137440876]
We present a new finite-time analysis of the estimation error of the Ordinary Least Squares (OLS) estimator for stable linear time-invariant systems.
We characterize the number of observed samples sufficient for the OLS estimator to be $(\varepsilon,\delta)$-PAC, i.e., to yield an estimation error less than $\varepsilon$ with probability at least $1-\delta$.
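The $(\varepsilon,\delta)$-PAC criterion can be made concrete with a small simulation (a hedged sketch under illustrative assumptions; the system, trajectory length, and $\varepsilon$ are invented for the example, not taken from the paper):

```python
# Hypothetical sketch: OLS identification of x_{t+1} = A x_t + w_t and an
# empirical check of the (eps, delta)-PAC property, i.e. how often the
# spectral-norm estimation error exceeds eps across independent trials.
import numpy as np

def ols_estimate(x):
    """OLS estimate of A from one trajectory of shape (T+1, n)."""
    X, Y = x[:-1], x[1:]
    # Solve min_B ||Y - X B||_F^2; B is A^T, so transpose back.
    return np.linalg.lstsq(X, Y, rcond=None)[0].T

rng = np.random.default_rng(0)
A = np.array([[0.9, 0.1], [0.0, 0.8]])  # stable: spectral radius < 1
eps, trials, errors = 0.3, 100, []
for _ in range(trials):
    x = np.zeros((300, 2))
    for t in range(299):
        x[t + 1] = A @ x[t] + rng.normal(size=2)
    errors.append(np.linalg.norm(ols_estimate(x) - A, 2))
delta_hat = np.mean(np.array(errors) > eps)  # empirical failure probability
print(delta_hat)
```

A PAC-style sample complexity bound answers the converse question: how large the trajectory length must be so that the failure probability `delta_hat` estimates is provably below a target $\delta$.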
arXiv Detail & Related papers (2020-03-17T20:59:17Z)
- The Simulator: Understanding Adaptive Sampling in the Moderate-Confidence Regime [52.38455827779212]
We propose a novel technique for analyzing adaptive sampling called the Simulator.
We prove the first instance-based lower bounds for the top-k problem that incorporate the appropriate log-factors.
Our new analysis inspires a simple and near-optimal algorithm for best-arm and top-k identification, the first practical algorithm of its kind for the latter problem.
arXiv Detail & Related papers (2017-02-16T23:42:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.