OS-net: Orbitally Stable Neural Networks
- URL: http://arxiv.org/abs/2309.14822v1
- Date: Tue, 26 Sep 2023 10:40:04 GMT
- Title: OS-net: Orbitally Stable Neural Networks
- Authors: Marieme Ngom and Carlo Graziani
- Abstract summary: We introduce OS-net, a new family of neural network architectures specifically designed for periodic dynamical data.
We derive conditions on the network weights to ensure stability of the resulting dynamics.
We demonstrate the efficacy of our approach by applying OS-net to discover the dynamics underlying the Rössler and Sprott systems.
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: We introduce OS-net (Orbitally Stable neural NETworks), a new family of
neural network architectures specifically designed for periodic dynamical data.
OS-net is a special case of Neural Ordinary Differential Equations (NODEs) and
takes full advantage of adjoint-based backpropagation.
Utilizing ODE theory, we derive conditions on the network weights to ensure
stability of the resulting dynamics. We demonstrate the efficacy of our
approach by applying OS-net to discover the dynamics underlying the Rössler
and Sprott systems, two dynamical systems known for their period-doubling
attractors and chaotic behavior.
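The paper's specific weight conditions are not reproduced in the abstract, but the underlying idea, constraining a neural ODE's weights so the learned dynamics cannot blow up, can be sketched as follows. This is a hypothetical illustration, not the authors' code: it uses a standard skew-symmetric-plus-damping parameterization and a textbook Lyapunov argument, not OS-net's actual conditions.

```python
import math

# Hypothetical sketch (not the OS-net code): a tiny neural ODE
#   dx/dt = W @ tanh(x)
# where W is parameterized so that tanh(x)^T W tanh(x) = -d * |tanh(x)|^2,
# which by a Lyapunov argument forces trajectories toward the origin.

def stable_weight(raw, damping=1.0):
    """W = raw - raw^T - damping*I: skew-symmetric part plus negative diagonal."""
    n = len(raw)
    return [[raw[i][j] - raw[j][i] - (damping if i == j else 0.0)
             for j in range(n)] for i in range(n)]

def vector_field(W, x):
    return [sum(W[i][j] * math.tanh(x[j]) for j in range(len(x)))
            for i in range(len(x))]

def rk4_step(W, x, h):
    def shift(u, v, s):
        return [ui + s * vi for ui, vi in zip(u, v)]
    k1 = vector_field(W, x)
    k2 = vector_field(W, shift(x, k1, h / 2))
    k3 = vector_field(W, shift(x, k2, h / 2))
    k4 = vector_field(W, shift(x, k3, h))
    return [x[i] + h / 6 * (k1[i] + 2 * k2[i] + 2 * k3[i] + k4[i])
            for i in range(len(x))]

raw = [[0.5, 2.0], [-1.0, 0.3]]   # stand-in for "learned" parameters
W = stable_weight(raw)             # here W = [[-1, 3], [-3, -1]]
x = [3.0, -2.0]                    # initial state far from the origin
for _ in range(2000):
    x = rk4_step(W, x, 0.01)
print(max(abs(v) for v in x))      # decays toward 0: dynamics are stable
```

The point mirrors the abstract's claim: stability here holds by construction for any values of the free parameters `raw`, so training cannot push the dynamics into instability under this parameterization.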
Related papers
- Explicit construction of recurrent neural networks effectively approximating discrete dynamical systems
We consider arbitrary bounded discrete time series originating from recursive dynamical systems.
We provide an explicit construction of recurrent neural networks which effectively approximate the corresponding discrete dynamical systems.
arXiv Detail & Related papers (2024-09-28T07:59:45Z) - Learning Governing Equations of Unobserved States in Dynamical Systems
We employ a hybrid neural ODE structure to learn governing equations of partially-observed dynamical systems.
We demonstrate that the method is capable of successfully learning the true underlying governing equations of unobserved states within these systems.
arXiv Detail & Related papers (2024-04-29T10:28:14Z) - Systematic construction of continuous-time neural networks for linear dynamical systems
We discuss a systematic approach to constructing neural architectures for modeling a subclass of dynamical systems.
We use a variant of continuous-time neural networks in which the output of each neuron evolves continuously as a solution of a first-order or second-order Ordinary Differential Equation (ODE)
Instead of deriving the network architecture and parameters from data, we propose a gradient-free algorithm to compute a sparse architecture and network parameters directly from a given linear time-invariant (LTI) system.
arXiv Detail & Related papers (2024-03-24T16:16:41Z) - Mechanistic Neural Networks for Scientific Machine Learning
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z) - How neural networks learn to classify chaotic time series
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key to the performance of large-kernel convolutional neural network (LKCNN) models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - ConCerNet: A Contrastive Learning Based Framework for Automated Conservation Law Discovery and Trustworthy Dynamical System Prediction
This paper proposes a new learning framework, ConCerNet, to improve the trustworthiness of DNN-based dynamics modeling.
We show that our method consistently outperforms the baseline neural networks in both coordinate error and conservation metrics.
arXiv Detail & Related papers (2023-02-11T21:07:30Z) - Learning Trajectories of Hamiltonian Systems with Neural Networks
We propose to enhance Hamiltonian neural networks with an estimation of a continuous-time trajectory of the modeled system.
We demonstrate that the proposed integration scheme works well for HNNs, especially at low sampling rates and with noisy, irregular observations.
arXiv Detail & Related papers (2022-04-11T13:25:45Z) - Constructing Neural Network-Based Models for Simulating Dynamical Systems
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z) - DyNODE: Neural Ordinary Differential Equations for Dynamics Modeling in Continuous Control
We present a novel approach that captures the underlying dynamics of a system by incorporating control in a neural ordinary differential equation framework.
Results indicate that a simple DyNODE architecture when combined with an actor-critic reinforcement learning algorithm outperforms canonical neural networks.
arXiv Detail & Related papers (2020-09-09T12:56:58Z) - Neural Dynamical Systems: Balancing Structure and Flexibility in Physical Prediction
We introduce Neural Dynamical Systems (NDS), a method of learning dynamical models in various gray-box settings.
NDS uses neural networks to estimate free parameters of the system, predicts residual terms, and numerically integrates over time to predict future states.
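The gray-box recipe just described, known structure, estimated free parameters, numerical integration forward in time, can be sketched minimally as follows. This is hypothetical illustration code, not the NDS implementation: the neural estimator is replaced by a one-parameter least-squares fit so the example stays self-contained, and the residual term is stubbed out.

```python
import math

# Hypothetical gray-box sketch (not the NDS code): the physical structure
#   dx/dt = -theta * x + residual(x)
# is assumed known; theta is a free parameter estimated from data, and
# residual() stands in for a learned correction term (stubbed to zero here).

def simulate(theta, residual, x0, h, steps):
    """Forward-Euler integration of dx/dt = -theta*x + residual(x)."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x + h * (-theta * x + residual(x)))
    return xs

# Synthetic observations of the true system dx/dt = -0.7 * x
h, steps = 0.1, 50
true = [2.0 * math.exp(-0.7 * h * k) for k in range(steps + 1)]

# "Estimate" theta by least squares on finite-difference slopes:
# slope_k ~= -theta * x_k  =>  theta = -sum(slope * x) / sum(x * x)
slopes = [(true[k + 1] - true[k]) / h for k in range(steps)]
theta = -sum(s * x for s, x in zip(slopes, true)) / sum(x * x for x in true[:-1])

pred = simulate(theta, lambda x: 0.0, 2.0, h, steps)
print(theta, abs(pred[-1] - true[-1]))
```

In the paper's setting the parameter estimator and the residual are neural networks trained end to end through the integrator; the least-squares step above only stands in for that learning loop.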
arXiv Detail & Related papers (2020-06-23T00:50:48Z) - Liquid Time-constant Networks
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
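A single unit of this "linear dynamics plus nonlinear gating" construction can be sketched as follows. This is a hypothetical, heavily simplified rendering in the spirit of the liquid time-constant equation, not the paper's model: the gate is an arbitrary sigmoid with made-up parameters, and the network has one neuron.

```python
import math

# Hypothetical one-neuron sketch in the spirit of a liquid time-constant unit:
#   dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A
# The effective time constant 1 / (1/tau + f) varies with the input I, and
# because the gate f lies in (0, 1) the state stays bounded in [0, A).

def gate(x, inp, w=1.0, b=0.0):
    """Sigmoid gate in (0, 1); w and b stand in for learned parameters."""
    return 1.0 / (1.0 + math.exp(-(w * inp + b - x)))

def ltc_step(x, inp, h, tau=1.0, A=2.0):
    f = gate(x, inp)
    return x + h * (-(1.0 / tau + f) * x + f * A)

x = 0.0
trace = []
for k in range(400):
    inp = math.sin(0.05 * k)          # slowly varying input drive
    x = ltc_step(x, inp, h=0.05)
    trace.append(x)
print(min(trace), max(trace))         # stays bounded between 0 and A = 2
```

The boundedness claimed in the summary shows up here by construction: the leak term always pulls the state toward 0 and the gated drive can push it no higher than A.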
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.