A Low-complexity Structured Neural Network to Realize States of Dynamical Systems
- URL: http://arxiv.org/abs/2503.23697v1
- Date: Mon, 31 Mar 2025 03:52:38 GMT
- Title: A Low-complexity Structured Neural Network to Realize States of Dynamical Systems
- Authors: Hansaka Aluvihare, Levi Lingsch, Xianqi Li, Sirani M. Perera
- Abstract summary: This paper stems from data-driven learning to advance states of dynamical systems utilizing a structured neural network (StNN). We present numerical simulations to solve dynamical systems utilizing the StNN based on the Hankel operator. We show that the proposed StNN paves the way for realizing state-space dynamical systems with a low-complexity learning algorithm, enabling prediction and understanding of future states.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Data-driven learning is rapidly evolving and places a new perspective on realizing state-space dynamical systems. However, dynamical systems derived from nonlinear ordinary differential equations (ODEs) suffer from limitations in computational efficiency. Thus, this paper stems from data-driven learning to advance states of dynamical systems utilizing a structured neural network (StNN). The proposed learning technique also seeks to identify an optimal, low-complexity operator to solve dynamical systems, the so-called Hankel operator, derived from time-delay measurements. Thus, we utilize the StNN based on the Hankel operator to solve dynamical systems as an alternative to existing data-driven techniques. We show that the proposed StNN reduces the number of parameters and the computational complexity compared with conventional neural networks and with classical data-driven techniques such as Sparse Identification of Nonlinear Dynamics (SINDy) and the Hankel Alternative View Of Koopman (HAVOK), commonly known as delay-Dynamic Mode Decomposition (DMD) or Hankel-DMD. More specifically, we present numerical simulations to solve dynamical systems utilizing the StNN based on the Hankel operator, beginning from the fundamental Lotka-Volterra model, where we compare the StNN with LEarning Across Dynamical Systems (LEADS), and extend our analysis to the highly nonlinear and chaotic Lorenz system, comparing the StNN with conventional neural networks, SINDy, and HAVOK. Hence, we show that the proposed StNN paves the way for realizing state-space dynamical systems with a low-complexity learning algorithm, enabling prediction and understanding of future states.
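The time-delay (Hankel) structure at the heart of the abstract can be illustrated with a generic sketch. This is not the authors' StNN code; `hankel_matrix` is a hypothetical helper showing how delayed copies of a measured signal are stacked into a Hankel matrix whose SVD yields low-dimensional delay coordinates, as in HAVOK / Hankel-DMD.

```python
import numpy as np

def hankel_matrix(x, rows):
    """Stack time-delayed windows of a scalar series x into a Hankel matrix.

    Each column is a length-`rows` window of consecutive measurements,
    i.e. the time-delay embedding used by HAVOK / Hankel-DMD.
    """
    cols = len(x) - rows + 1
    return np.column_stack([x[i:i + rows] for i in range(cols)])

# Toy series: a sampled sine wave standing in for time-delay measurements
t = np.linspace(0, 10, 200)
x = np.sin(t)

H = hankel_matrix(x, rows=20)                    # shape (20, 181)
U, s, Vt = np.linalg.svd(H, full_matrices=False)

# The leading singular vectors give low-dimensional delay coordinates;
# a learned operator (linear in Hankel-DMD, structured in the StNN) then
# advances these coordinates in time.
print(H.shape, s[:3])
```

Note the defining Hankel property: entries are constant along anti-diagonals, since `H[i, j] = x[i + j]`.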
Related papers
- SINDyG: Sparse Identification of Nonlinear Dynamical Systems from Graph-Structured Data [0.27624021966289597]
We develop a new method called Sparse Identification of Dynamical Systems from Graph-structured data (SINDyG). SINDyG incorporates the network structure into sparse regression to identify model parameters that explain the underlying network dynamics. Our experiments validate the improved accuracy and simplicity of the discovered network dynamics.
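The sparse regression underlying SINDy-style methods can be sketched generically (this is not the SINDyG implementation): sequentially thresholded least squares over a candidate function library recovers a sparse coefficient vector. The toy system, library, and threshold below are illustrative choices.

```python
import numpy as np

# Recover dx/dt = -2x from samples using a small candidate library.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100)
dx = -2.0 * x                       # exact derivatives for the toy system

# Candidate library Theta = [1, x, x^2]
Theta = np.column_stack([np.ones_like(x), x, x**2])
xi, *_ = np.linalg.lstsq(Theta, dx, rcond=None)

# Sequential thresholding: zero out small coefficients, refit on the rest
for _ in range(5):
    small = np.abs(xi) < 0.1
    xi[small] = 0.0
    big = ~small
    xi[big], *_ = np.linalg.lstsq(Theta[:, big], dx, rcond=None)

print(xi)   # only the linear term survives, with coefficient close to -2
```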
arXiv Detail & Related papers (2024-09-02T17:51:37Z) - Systematic construction of continuous-time neural networks for linear dynamical systems [0.0]
We discuss a systematic approach to constructing neural architectures for modeling a subclass of dynamical systems.
We use a variant of continuous-time neural networks in which the output of each neuron evolves continuously as a solution of a first-order or second-order Ordinary Differential Equation (ODE).
Instead of deriving the network architecture and parameters from data, we propose a gradient-free algorithm to compute sparse architecture and network parameters directly from the given LTI system.
arXiv Detail & Related papers (2024-03-24T16:16:41Z) - Latent Dynamics Networks (LDNets): learning the intrinsic dynamics of spatio-temporal processes [2.3694122563610924]
Latent Dynamics Network (LDNet) is able to discover low-dimensional intrinsic dynamics of possibly non-Markovian dynamical systems.
LDNets are lightweight and easy-to-train, with excellent accuracy and generalization properties, even in time-extrapolation regimes.
arXiv Detail & Related papers (2023-04-28T21:11:13Z) - ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction [82.81767856234956]
This paper proposes a new learning framework named ConCerNet to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z) - Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates, noisy and irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z) - Recurrent Neural Networks for Dynamical Systems: Applications to Ordinary Differential Equations, Collective Motion, and Hydrological Modeling [0.20999222360659606]
We train and test RNNs separately on each task to demonstrate the broad applicability of RNNs in reconstructing and forecasting the dynamics of dynamical systems.
We analyze the performance of RNNs applied to three tasks: reconstruction of correct Lorenz solutions for a system with an error formulation, reconstruction of corrupted collective-motion trajectories, and forecasting of streamflow time series possessing spikes.
arXiv Detail & Related papers (2022-02-14T20:34:49Z) - Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z) - A novel Deep Neural Network architecture for non-linear system identification [78.69776924618505]
We present a novel Deep Neural Network (DNN) architecture for non-linear system identification.
Inspired by fading memory systems, we introduce inductive bias (on the architecture) and regularization (on the loss function).
This architecture allows for automatic complexity selection based solely on available data.
arXiv Detail & Related papers (2021-06-06T10:06:07Z) - Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized Structural Equation Models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z) - Neural Dynamical Systems: Balancing Structure and Flexibility in Physical Prediction [14.788494279754481]
We introduce Neural Dynamical Systems (NDS), a method of learning dynamical models in various gray-box settings.
NDS uses neural networks to estimate free parameters of the system, predicts residual terms, and numerically integrates over time to predict future states.
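The gray-box pattern described above can be sketched generically (this is not the NDS code): once a learner has produced estimates of the free parameters of a known ODE form, future states follow from numerical integration. The sketch below uses a fixed-step RK4 integrator on the Lotka-Volterra equations with hypothetical parameter values standing in for learned estimates.

```python
import numpy as np

def lotka_volterra(z, alpha, beta, gamma, delta):
    """Right-hand side of the Lotka-Volterra predator-prey ODEs."""
    x, y = z
    return np.array([alpha * x - beta * x * y, delta * x * y - gamma * y])

def rk4_rollout(f, z0, dt, steps, *theta):
    """Integrate dz/dt = f(z, *theta) with classical fixed-step RK4."""
    traj = [np.asarray(z0, dtype=float)]
    for _ in range(steps):
        z = traj[-1]
        k1 = f(z, *theta)
        k2 = f(z + 0.5 * dt * k1, *theta)
        k3 = f(z + 0.5 * dt * k2, *theta)
        k4 = f(z + dt * k3, *theta)
        traj.append(z + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4))
    return np.stack(traj)

theta = (1.1, 0.4, 0.4, 0.1)     # hypothetical "learned" parameter estimates
traj = rk4_rollout(lotka_volterra, [10.0, 5.0], 0.01, 2000, *theta)
print(traj[-1])                   # predicted future state at t = 20
```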
arXiv Detail & Related papers (2020-06-23T00:50:48Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
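The "networks of linear first-order dynamical systems" idea can be illustrated with a minimal leaky-integrator sketch (this is not the liquid time-constant implementation); the weights, time constant, and input below are hypothetical.

```python
import numpy as np

# Each hidden unit is a linear first-order system dh/dt = (-h + W x) / tau,
# stepped with forward Euler. Under a constant input, the state converges to
# the fixed point h* = W x, illustrating the stable, bounded behavior.
rng = np.random.default_rng(1)
W = rng.normal(size=(4, 2))          # hypothetical input weights, 2 -> 4 units
tau, dt = 0.5, 0.01

h = np.zeros(4)
x = np.array([1.0, -1.0])            # constant input drive
for _ in range(5000):
    h = h + dt * (-h + W @ x) / tau

print(h)                             # close to the fixed point W @ x
```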
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.