PAC bounds of continuous Linear Parameter-Varying systems related to
neural ODEs
- URL: http://arxiv.org/abs/2307.03630v1
- Date: Fri, 7 Jul 2023 14:39:18 GMT
- Title: PAC bounds of continuous Linear Parameter-Varying systems related to
neural ODEs
- Authors: Dániel Rácz, Mihály Petreczky, and Bálint Daróczy
- Abstract summary: We consider the problem of learning Neural Ordinary Differential Equations (neural ODEs) within the context of Linear Parameter-Varying (LPV) systems in continuous-time.
We provide Probably Approximately Correct (PAC) bounds under stability for LPV systems related to neural ODEs.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider the problem of learning Neural Ordinary Differential Equations
(neural ODEs) within the context of Linear Parameter-Varying (LPV) systems in
continuous-time. LPV systems contain bilinear systems which are known to be
universal approximators for non-linear systems. Moreover, a large class of
neural ODEs can be embedded into LPV systems. As our main contribution we
provide Probably Approximately Correct (PAC) bounds under stability for LPV
systems related to neural ODEs. The resulting bounds have the advantage that
they do not depend on the integration interval.
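The LPV setting described in the abstract can be illustrated with a toy continuous-time simulation. In the sketch below, the matrices `A0` and `A1`, the affine scheduling `A(p) = A0 + p*A1`, the sinusoidal scheduling signal, and the forward-Euler integrator are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def simulate_lpv(A_fn, x0, p_traj, dt):
    """Forward-Euler simulation of the LPV system dx/dt = A(p(t)) x."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for p in p_traj:
        x = x + dt * A_fn(p) @ x   # one Euler step with the scheduled matrix
        traj.append(x.copy())
    return np.array(traj)

# Hypothetical scheduling: A(p) = A0 + p * A1, affine in the parameter p
A0 = np.array([[-1.0, 0.5], [0.0, -2.0]])   # stable base dynamics
A1 = np.array([[0.0, 0.1], [-0.1, 0.0]])    # parameter-dependent perturbation

traj = simulate_lpv(lambda p: A0 + p * A1, [1.0, 0.0],
                    p_traj=np.sin(np.linspace(0, 5, 500)), dt=0.01)
```

For a stable base matrix and a small scheduled perturbation, the state decays along the trajectory, which is the kind of stability assumption under which the PAC bounds above are stated.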
Related papers
- ControlSynth Neural ODEs: Modeling Dynamical Systems with Guaranteed Convergence [1.1720409777196028]
Neural ODEs (NODEs) are continuous-time neural networks (NNs) that can process data without being restricted to fixed time intervals.
We show that despite their highly nonlinear nature, convergence can be guaranteed via tractable linear inequalities.
In the composition of ControlSynth Neural ODEs (CSODEs), we introduce an extra control term to allow dynamics at different scales to be captured simultaneously.
arXiv Detail & Related papers (2024-11-04T17:20:42Z)
- Learning Controlled Stochastic Differential Equations
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for L², L∞, and risk metrics, with learning rates adaptive to coefficients' regularity.
Our method is available as an open-source Python library.
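A controlled SDE of the kind targeted by this estimator can be simulated with a basic Euler-Maruyama scheme. The scalar drift, the state-dependent diffusion, and the step size below are hypothetical choices for illustration; they are not the paper's model or method.

```python
import numpy as np

rng = np.random.default_rng(0)

def euler_maruyama(drift, diffusion, x0, u_traj, dt):
    """Euler-Maruyama simulation of dX = drift(X, u) dt + diffusion(X) dW."""
    x = x0
    xs = [x]
    for u in u_traj:
        dw = rng.normal(0.0, np.sqrt(dt))      # Brownian increment
        x = x + drift(x, u) * dt + diffusion(x) * dw
        xs.append(x)
    return np.array(xs)

# Hypothetical scalar example: mean-reverting drift, non-uniform diffusion
xs = euler_maruyama(drift=lambda x, u: -x + u,
                    diffusion=lambda x: 0.2 * np.sqrt(1 + x**2),
                    x0=1.0, u_traj=np.zeros(1000), dt=0.01)
```

Estimating the drift and diffusion functions from such sample paths is the inverse problem the paper addresses.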
arXiv Detail & Related papers (2024-11-04T11:09:58Z)
- Metric-Entropy Limits on Nonlinear Dynamical System Learning [4.069144210024563]
We show that recurrent neural networks (RNNs) are capable of learning nonlinear systems that satisfy a Lipschitz property and forget past inputs fast enough in a metric-entropy optimal manner.
As the sets of sequence-to-sequence maps we consider are significantly more massive than function classes generally considered in deep neural network approximation theory, a refined metric-entropy characterization is needed.
arXiv Detail & Related papers (2024-07-01T12:57:03Z)
- A finite-sample generalization bound for stable LPV systems [0.0]
We derive a PAC bound for stable continuous-time linear parameter-varying (LPV) systems.
Our bound depends on the H2 norm of the chosen class of LPV systems, but does not depend on the time interval for which the signals are considered.
arXiv Detail & Related papers (2024-05-16T12:42:36Z)
- A Constructive Approach to Function Realization by Neural Stochastic Differential Equations [8.04975023021212]
We introduce structural restrictions on system dynamics and characterize the class of functions that can be realized by such a system.
The systems are implemented as a cascade interconnection of a neural differential equation (Neural SDE), a deterministic dynamical system, and a readout map.
arXiv Detail & Related papers (2023-07-01T03:44:46Z)
- Identifiability and Asymptotics in Learning Homogeneous Linear ODE Systems from Discrete Observations [114.17826109037048]
Ordinary Differential Equations (ODEs) have recently gained a lot of attention in machine learning.
However, theoretical aspects, e.g., identifiability and the properties of statistical estimation, remain obscure.
This paper derives a sufficient condition for the identifiability of homogeneous linear ODE systems from a sequence of equally-spaced error-free observations sampled from a single trajectory.
arXiv Detail & Related papers (2022-10-12T06:46:38Z)
- Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns these basis functions via supervised learning.
arXiv Detail & Related papers (2021-09-06T04:39:06Z)
- Metric Entropy Limits on Recurrent Neural Network Learning of Linear Dynamical Systems [0.0]
We show that RNNs can optimally learn (or identify, in system-theory parlance) stable LTI systems.
For LTI systems whose input-output relation is characterized through a difference equation, this means that RNNs can learn the difference equation from input-output traces in a metric-entropy optimal manner.
arXiv Detail & Related papers (2021-05-06T10:12:30Z)
- Linear embedding of nonlinear dynamical systems and prospects for efficient quantum algorithms [74.17312533172291]
We describe a method for mapping any finite nonlinear dynamical system to an infinite linear dynamical system (embedding).
We then explore an approach for approximating the resulting infinite linear system with finite linear systems (truncation).
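The embedding-plus-truncation idea can be sketched for a scalar example via a Carleman linearization; the particular system dx/dt = -x + x², the truncation order, and the Euler integrator below are illustrative assumptions, not the paper's construction.

```python
import numpy as np

def carleman_matrix(n):
    """Truncated Carleman embedding of dx/dt = -x + x^2 into a linear
    system dy/dt = A y, where y_k = x^k for k = 1..n.  Differentiating
    y_k gives dy_k/dt = -k y_k + k y_{k+1}; the y_{n+1} term is dropped."""
    A = np.zeros((n, n))
    for k in range(1, n + 1):
        A[k - 1, k - 1] = -k        # -k x^k contribution
        if k < n:
            A[k - 1, k] = k         # +k x^{k+1} contribution
    return A

def simulate(n, x0, dt, steps):
    y = np.array([x0 ** k for k in range(1, n + 1)], dtype=float)
    A = carleman_matrix(n)
    for _ in range(steps):
        y = y + dt * A @ y          # forward Euler on the linear system
    return y[0]                     # first coordinate approximates x(t)

# Compare against the exact solution x(t) = x0 e^{-t} / (1 - x0 + x0 e^{-t})
x0, t = 0.2, 1.0
exact = x0 * np.exp(-t) / (1 - x0 + x0 * np.exp(-t))
approx = simulate(n=6, x0=x0, dt=1e-3, steps=1000)
```

For small initial states, a modest truncation order already tracks the exact nonlinear solution closely, which is the behavior the truncation analysis formalizes.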
arXiv Detail & Related papers (2020-12-12T00:01:10Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these networks using gradient descent.
For the first time, we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Active Learning for Nonlinear System Identification with Guarantees [102.43355665393067]
We study a class of nonlinear dynamical systems whose state transitions depend linearly on a known feature embedding of state-action pairs.
We propose an active learning approach that achieves this by repeating three steps: trajectory planning, trajectory tracking, and re-estimation of the system from all available data.
We show that our method estimates nonlinear dynamical systems at a parametric rate, similar to the statistical rate of standard linear regression.
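The linear-in-features setting can be sketched in a few lines: roll out a trajectory under random excitation and recover the parameters by ordinary least squares. This uses passive random inputs rather than the paper's active trajectory-planning loop, and the feature map and true parameters below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical known feature embedding of state-action pairs
def phi(x, u):
    return np.array([x, u, x * u])

W_true = np.array([0.8, 0.5, -0.1])   # unknown parameters to recover

# Roll out one trajectory: x_{t+1} = W phi(x_t, u_t) + noise
x, rows, targets = 0.0, [], []
for _ in range(2000):
    u = rng.uniform(-1, 1)                          # random excitation
    x_next = W_true @ phi(x, u) + 0.01 * rng.normal()
    rows.append(phi(x, u))
    targets.append(x_next)
    x = x_next

# Ordinary least squares recovers W at a parametric rate
W_hat, *_ = np.linalg.lstsq(np.array(rows), np.array(targets), rcond=None)
```

Because the transitions are linear in the known features, estimation reduces to linear regression once the inputs are sufficiently exciting, which is why a parametric rate is achievable.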
arXiv Detail & Related papers (2020-06-18T04:54:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site makes no guarantees about the quality of the information it contains and accepts no responsibility for any consequences of its use.