Explicit construction of recurrent neural networks effectively approximating discrete dynamical systems
- URL: http://arxiv.org/abs/2409.19278v1
- Date: Sat, 28 Sep 2024 07:59:45 GMT
- Title: Explicit construction of recurrent neural networks effectively approximating discrete dynamical systems
- Authors: Chikara Nakayama, Tsuyoshi Yoneda
- Abstract summary: We consider arbitrary bounded discrete time series originating from a dynamical system with recursivity.
We provide an explicit construction of recurrent neural networks which effectively approximate the corresponding discrete dynamical systems.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider arbitrary bounded discrete time series originating from a dynamical system with recursivity. More precisely, we provide an explicit construction of recurrent neural networks which effectively approximate the corresponding discrete dynamical systems.
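The idea of an "explicit construction" can be illustrated with a toy sketch (not the paper's actual construction): set the weights of a one-hidden-layer ReLU network directly so that it reproduces a piecewise-linear interpolant of the transition map f, then iterate the network as a discrete dynamical system x_{t+1} = net(x_t). The logistic map and knot count below are arbitrary illustrative choices.

```python
import numpy as np

def build_relu_approximant(f, n_knots=64, lo=0.0, hi=1.0):
    """Explicitly set the weights of a one-hidden-layer ReLU network so it
    equals the piecewise-linear interpolant of f on [lo, hi].
    Illustrative only; not the construction from the paper."""
    xs = np.linspace(lo, hi, n_knots)
    ys = f(xs)
    slopes = np.diff(ys) / np.diff(xs)
    biases = -xs[:-1]                                  # hidden units ReLU(x - x_k)
    w_out = np.concatenate([[slopes[0]], np.diff(slopes)])  # slope increments
    b_out = ys[0]
    def net(x):
        h = np.maximum(x + biases, 0.0)                # hidden activations
        return b_out + h @ w_out
    return net

logistic = lambda x: 3.7 * x * (1.0 - x)               # example bounded map on [0, 1]
net = build_relu_approximant(logistic)

# Iterate the network as a discrete dynamical system x_{t+1} = net(x_t).
x_true, x_net = 0.2, 0.2
for _ in range(5):
    x_true, x_net = logistic(x_true), float(net(x_net))
```

One step of the network tracks one step of the true map to within the interpolation error; for chaotic parameter values the orbits still separate over long horizons, which is why the paper's notion of "effective" approximation matters.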
Related papers
- Neural Symbolic Regression of Complex Network Dynamics [28.356824329954495]
We propose Physically Inspired Neural Dynamics Regression (PI-NDSR) to automatically learn the symbolic expression of dynamics.
We evaluate our method on synthetic datasets generated by various dynamics and real datasets on disease spreading.
arXiv Detail & Related papers (2024-10-15T02:02:30Z)
- Systematic construction of continuous-time neural networks for linear dynamical systems [0.0]
We discuss a systematic approach to constructing neural architectures for modeling a subclass of dynamical systems.
We use a variant of continuous-time neural networks in which the output of each neuron evolves continuously as the solution of a first-order or second-order ordinary differential equation (ODE).
Instead of deriving the network architecture and parameters from data, we propose a gradient-free algorithm to compute sparse architecture and network parameters directly from the given LTI system.
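A minimal sketch of deriving a (linear) recurrent network directly from a given LTI system rather than from data: below, the "network" is simply a forward-Euler discretization whose weights are copied from the system matrices (A, B, C, D). The paper's gradient-free, sparsity-aware algorithm is more involved; the damped-oscillator system here is an arbitrary example.

```python
import numpy as np

def lti_to_recurrent(A, B, C, D, dt):
    """Build a linear recurrent step function with weights taken
    directly from an LTI system x' = Ax + Bu, y = Cx + Du."""
    W_rec = np.eye(A.shape[0]) + dt * A   # recurrent weights (forward Euler)
    W_in = dt * B                         # input weights
    def step(x, u):
        x_next = W_rec @ x + W_in @ u
        y = C @ x + D @ u
        return x_next, y
    return step

# Example: damped oscillator x'' + 0.5 x' + x = u, output is position.
A = np.array([[0.0, 1.0], [-1.0, -0.5]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])
D = np.zeros((1, 1))

step = lti_to_recurrent(A, B, C, D, dt=0.01)
x = np.array([1.0, 0.0])
for _ in range(1000):
    x, y = step(x, np.zeros(1))           # unforced response decays
```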
arXiv Detail & Related papers (2024-03-24T16:16:41Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Learning Trajectories of Hamiltonian Systems with Neural Networks [81.38804205212425]
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially with low sampling rates, noisy and irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z)
- Recurrent Neural Networks for Dynamical Systems: Applications to Ordinary Differential Equations, Collective Motion, and Hydrological Modeling [0.20999222360659606]
We train and test RNNs separately on each task to demonstrate their broad applicability in reconstructing and forecasting the dynamics of dynamical systems.
We analyze the performance of RNNs applied to three tasks: reconstruction of correct Lorenz solutions for a system with an error formulation, reconstruction of corrupted collective-motion trajectories, and forecasting of streamflow time series possessing spikes.
arXiv Detail & Related papers (2022-02-14T20:34:49Z)
- Reconstructing a dynamical system and forecasting time series by self-consistent deep learning [4.947248396489835]
We introduce a self-consistent deep-learning framework for a noisy deterministic time series.
It provides unsupervised filtering, state-space reconstruction, identification of the underlying differential equations and forecasting.
arXiv Detail & Related papers (2021-08-04T06:10:58Z)
- Spatio-Temporal Activation Function To Map Complex Dynamical Systems [0.0]
Reservoir computing, which is a subset of recurrent neural networks, is actively used to simulate complex dynamical systems.
The inclusion of a temporal term alters the fundamental nature of an activation function: it provides the capability to capture the complex dynamics of time series data.
arXiv Detail & Related papers (2020-09-06T23:08:25Z)
- Continuous-in-Depth Neural Networks [107.47887213490134]
We first show that ResNets fail to be meaningful dynamical systems in this richer sense.
We then demonstrate that neural network models can learn to represent continuous dynamical systems.
We introduce ContinuousNet as a continuous-in-depth generalization of ResNet architectures.
arXiv Detail & Related papers (2020-08-05T22:54:09Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game, where both players are parameterized by neural networks (NNs), and learn the parameters of these neural networks using gradient descent.
For the first time we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
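The optimization pattern described above, simultaneous gradient descent for the minimizing player and gradient ascent for the maximizing player, can be sketched on a toy convex-concave objective (this is not the paper's SEM estimator; the objective and learning rate are arbitrary illustrative choices).

```python
# Toy min-max problem: min over theta, max over phi of
#   f(theta, phi) = theta*phi + 0.5*theta**2 - 0.5*phi**2,
# whose unique saddle point is (0, 0).
def f_grad(theta, phi):
    return phi + theta, theta - phi   # (df/dtheta, df/dphi)

theta, phi, lr = 1.0, 1.0, 0.05
for _ in range(500):
    g_t, g_p = f_grad(theta, phi)
    theta -= lr * g_t                 # minimizer descends
    phi += lr * g_p                   # maximizer ascends
```

When both players are neural networks, theta and phi are the two networks' parameter vectors and the gradients come from backpropagation, but the update pattern is the same.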
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
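A minimal sketch of a liquid-time-constant-style update, assuming the usual LTC formulation: each neuron is a linear first-order ODE whose effective time constant depends on the input through a nonlinear gate. Parameter names (tau, A) follow that formulation; the sizes, weights, and Euler step here are arbitrary illustrative choices, not the paper's trained model.

```python
import numpy as np

def ltc_step(x, I, W, tau=1.0, A=1.0, dt=0.1):
    """One forward-Euler step of LTC-style dynamics: a leaky linear ODE
    whose leak toward A is gated by a sigmoid of state and input."""
    f = 1.0 / (1.0 + np.exp(-(W @ np.concatenate([x, I]))))  # gate in (0, 1)
    dxdt = -x / tau + f * (A - x)      # input-modulated first-order dynamics
    return x + dt * dxdt

rng = np.random.default_rng(0)
W = 0.5 * rng.standard_normal((4, 6))  # 4 neurons, 2 external inputs
x = np.zeros(4)
for t in range(200):
    I = np.array([np.sin(0.1 * t), 1.0])
    x = ltc_step(x, I, W)
```

Because the gate f stays in (0, 1), each state component remains bounded between 0 and A regardless of the input, which mirrors the stable, bounded behavior claimed for this network family.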
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.