From Spectral Theorem to Statistical Independence with Application to
System Identification
- URL: http://arxiv.org/abs/2310.10523v1
- Date: Mon, 16 Oct 2023 15:40:43 GMT
- Title: From Spectral Theorem to Statistical Independence with Application to
System Identification
- Authors: Muhammad Abdullah Naeem, Amir Khazraei and Miroslav Pajic
- Abstract summary: We provide the first quantitative handle on the decay rate of finite powers of the state transition matrix $\|A^{k}\|$.
It is shown that when a stable dynamical system has only one distinct eigenvalue and a discrepancy of $n-1$, $\|A\|$ has a dependence on $n$ and the resulting dynamics are spatially inseparable.
We show that the element-wise error is essentially a variant of the well-known Littlewood-Offord problem.
- Score: 11.98319841778396
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: High dimensional random dynamical systems are ubiquitous, including -- but
not limited to -- cyber-physical systems, daily returns on different stocks of the
S&P 1500 and velocity profiles of interacting particle systems around the
McKean-Vlasov limit. Mathematically, the underlying phenomenon can be captured via a
stable $n$-dimensional linear transformation `$A$' and additive randomness.
System identification aims at extracting useful information about the underlying
dynamical system, given a length-$N$ trajectory from it (corresponding to an $n
\times N$ dimensional data matrix). We use the spectral theorem for non-Hermitian
operators to show that spatio-temporal correlations are dictated by the
discrepancy between the algebraic and geometric multiplicity of the distinct
eigenvalues of the state transition matrix. Small discrepancies imply
that the original trajectory essentially comprises multiple lower dimensional
random dynamical systems living on $A$-invariant subspaces that are
statistically independent of each other. In the process, we provide the first
quantitative handle on the decay rate of finite powers of the state transition matrix
$\|A^{k}\|$. It is shown that when a stable dynamical system has only one
distinct eigenvalue and a discrepancy of $n-1$, $\|A\|$ has a dependence on $n$,
the resulting dynamics are spatially inseparable, and consequently there exists at
least one row with covariates of typical size $\Theta\big(\sqrt{N-n+1}\,e^{n}\big)$,
i.e., even under the stability assumption, covariates can suffer from the
curse of dimensionality. In light of these findings we set the stage for
non-asymptotic error analysis in the estimation of the state transition matrix $A$ via
least squares regression on the observed trajectory, by showing that the element-wise
error is essentially a variant of the well-known Littlewood-Offord problem.
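The estimation setting the abstract describes can be sketched numerically. The following is my own minimal illustration (not code from the paper): simulate a stable linear system $x_{t+1} = A x_t + w_t$ with isotropic Gaussian noise and recover $A$ by least squares on the observed trajectory; the dimensions `n`, `N` and the spectral-radius rescaling are arbitrary choices for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n, N = 4, 2000  # state dimension and trajectory length (illustrative values)

# Draw a random A and rescale it so its spectral radius is below 1 (stability).
A = rng.standard_normal((n, n))
A *= 0.9 / max(abs(np.linalg.eigvals(A)))

# Simulate a length-N trajectory with additive isotropic Gaussian noise.
X = np.zeros((n, N))
for t in range(N - 1):
    X[:, t + 1] = A @ X[:, t] + rng.standard_normal(n)

# Least squares estimate: A_hat = argmin_A ||X_+ - A X_-||_F^2,
# solved via the pseudo-inverse of the covariate matrix.
X_minus, X_plus = X[:, :-1], X[:, 1:]
A_hat = X_plus @ np.linalg.pinv(X_minus)

# The element-wise error ||A_hat - A|| shrinks as N grows for stable systems.
print(np.linalg.norm(A_hat - A))
```

This sketch only shows the estimator itself; the paper's contribution is the non-asymptotic analysis of how this error behaves when the eigenvalue discrepancy couples the coordinates.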
Related papers
- Joint Learning of Linear Dynamical Systems under Smoothness Constraints [5.2395896768723045]
We consider the problem of joint learning of multiple linear dynamical systems.
In particular, we show conditions under which the mean-squared error (MSE) bounds converge to zero as $m$ increases.
arXiv Detail & Related papers (2024-06-03T08:29:42Z) - Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - Computational-Statistical Gaps in Gaussian Single-Index Models [77.1473134227844]
Single-Index Models are high-dimensional regression problems with planted structure.
We show that computationally efficient algorithms, both within the Statistical Query (SQ) and the Low-Degree Polynomial (LDP) frameworks, necessarily require $\Omega(d^{k^{\star}/2})$ samples.
arXiv Detail & Related papers (2024-03-08T18:50:19Z) - Effective Minkowski Dimension of Deep Nonparametric Regression: Function
Approximation and Statistical Theories [70.90012822736988]
Existing theories on deep nonparametric regression have shown that when the input data lie on a low-dimensional manifold, deep neural networks can adapt to intrinsic data structures.
This paper introduces a relaxed assumption that input data are concentrated around a subset of $\mathbb{R}^d$ denoted by $\mathcal{S}$, and that the intrinsic dimension of $\mathcal{S}$ can be characterized by a new complexity notion -- effective Minkowski dimension.
arXiv Detail & Related papers (2023-06-26T17:13:31Z) - Learning and Concentration for High Dimensional Linear Gaussians: an
Invariant Subspace Approach [0.0]
We study non-asymptotic bounds on correlation between two time realizations of stable linear systems with isotropic Gaussian noise.
Our analysis provides the first interpretable and geometric explanation of the intricacies of learning and concentration for random dynamical systems.
arXiv Detail & Related papers (2023-04-04T11:11:26Z) - Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects, e.g., identifiability and properties of statistical estimation, are still obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
arXiv Detail & Related papers (2022-10-12T06:46:38Z) - Learning the Dynamics of Autonomous Linear Systems From Multiple
Trajectories [2.2268031040603447]
Existing results on learning rate and consistency of autonomous linear system identification rely on observations of steady state behaviors from a single long trajectory.
We consider the scenario of learning system dynamics based on multiple short trajectories, where there are no easily observed steady state behaviors.
We show that one can adjust the length of the trajectories to achieve a learning rate of $\mathcal{O}\big(\sqrt{\frac{\log N}{N}}\big)$ for strictly stable systems and a learning rate of $\mathcal{O}\big(\frac{(\log N)^{d}}{\sqrt{N}}\big)$ for marginally stable systems.
arXiv Detail & Related papers (2022-03-24T01:29:53Z) - Many-Body Quantum Chaos and Space-time Translational Invariance [0.0]
We study the consequences of having translational invariance in space and in time in many-body quantum chaotic systems.
We consider an ensemble of random quantum circuits, composed of single-site random unitaries and nearest neighbour couplings.
We numerically demonstrate, with simulations of two distinct circuit models, that in such a scaling limit, most microscopic details become unimportant.
arXiv Detail & Related papers (2021-09-09T18:00:00Z) - Improved rates for prediction and identification of partially observed
linear dynamical systems [4.68299658663016]
Identification of a linear time-invariant dynamical system from partial observations is a fundamental problem in control theory.
We propose an algorithm that learns such systems with non-asymptotic statistical rates depending on the inherent order $d$ of the system.
Our algorithm is based on multi-scale low-rank approximation: SVD applied to Hankel matrices of increasing sizes.
arXiv Detail & Related papers (2020-11-19T18:04:18Z) - A Random Matrix Analysis of Random Fourier Features: Beyond the Gaussian
Kernel, a Precise Phase Transition, and the Corresponding Double Descent [85.77233010209368]
This article characterizes the exact asymptotics of random Fourier feature (RFF) regression, in the realistic setting where the number of data samples $n$, their dimension $p$, and the number of random features $N$ are all large and comparable.
This analysis also provides accurate estimates of training and test regression errors for large $n,p,N$.
arXiv Detail & Related papers (2020-06-09T02:05:40Z) - Sparse Identification of Nonlinear Dynamical Systems via Reweighted
$\ell_1$-regularized Least Squares [62.997667081978825]
This work proposes an iterative sparse-regularized regression method to recover governing equations of nonlinear systems from noisy state measurements.
The aim of this work is to improve the accuracy and robustness of the method in the presence of state measurement noise.
arXiv Detail & Related papers (2020-05-27T08:30:15Z)
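The reweighted $\ell_1$ idea in the last entry can be illustrated with a short sketch. This is my own toy implementation, not the paper's code: a weighted lasso solved by proximal gradient (ISTA), with weights updated as $w_i = 1/(|\xi_i| + \epsilon)$ so that small coefficients are penalized more harshly on each round; the library `Phi`, the sparsity pattern, and all constants are arbitrary demo choices.

```python
import numpy as np

def weighted_lasso_ista(Phi, y, w, lam, iters=500):
    """Minimize 0.5*||y - Phi @ xi||^2 + lam * sum_i w_i * |xi_i| via ISTA."""
    step = 1.0 / np.linalg.norm(Phi, 2) ** 2   # 1/L with L the gradient Lipschitz constant
    xi = np.zeros(Phi.shape[1])
    for _ in range(iters):
        grad = Phi.T @ (Phi @ xi - y)
        z = xi - step * grad
        # Coordinate-wise soft-thresholding with per-coordinate weights.
        xi = np.sign(z) * np.maximum(np.abs(z) - step * lam * w, 0.0)
    return xi

def reweighted_l1(Phi, y, lam=0.1, rounds=5, eps=1e-3):
    w = np.ones(Phi.shape[1])
    for _ in range(rounds):
        xi = weighted_lasso_ista(Phi, y, w, lam)
        w = 1.0 / (np.abs(xi) + eps)           # small coefficients get penalized harder
    return xi

rng = np.random.default_rng(1)
Phi = rng.standard_normal((200, 20))           # candidate-function library (toy)
xi_true = np.zeros(20)
xi_true[[2, 7]] = [1.5, -2.0]                  # sparse ground-truth coefficients
y = Phi @ xi_true + 0.01 * rng.standard_normal(200)

xi_hat = reweighted_l1(Phi, y)
print(np.flatnonzero(np.abs(xi_hat) > 0.1))    # recovered support
```

The reweighting step is what distinguishes this from a plain lasso: after each round, surviving large coefficients are nearly unpenalized while near-zero ones are driven exactly to zero, which improves robustness to noise in the measurements.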
This list is automatically generated from the titles and abstracts of the papers in this site.