On the effectiveness of neural priors in modeling dynamical systems
- URL: http://arxiv.org/abs/2303.05728v1
- Date: Fri, 10 Mar 2023 06:21:24 GMT
- Title: On the effectiveness of neural priors in modeling dynamical systems
- Authors: Sameera Ramasinghe, Hemanth Saratchandran, Violetta Shevchenko, Simon
Lucey
- Abstract summary: We discuss the architectural regularization that neural networks offer when learning such systems.
We show that simple coordinate networks with few layers can be used to solve multiple problems in modelling dynamical systems.
- Score: 28.69155113611877
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Modelling dynamical systems is an integral part of understanding the
natural world. To this end, neural networks are becoming an increasingly
popular candidate owing to their ability to learn complex functions from large
amounts of data. Despite this recent progress, there has not been an adequate
discussion of the architectural regularization that neural networks offer when
learning such systems, which hinders their efficient usage. In this paper, we
initiate a discussion in this direction using coordinate networks as a test
bed. We interpret dynamical systems and coordinate networks from a signal
processing lens, and show that simple coordinate networks with few layers can
be used to solve multiple problems in modelling dynamical systems, without any
explicit regularizers.
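As a minimal sketch of the coordinate-network idea in the abstract — a shallow network that maps an input coordinate (here, time) directly to a signal value, with no explicit regularizer — the following toy example is illustrative only: the single random sine-feature layer with a closed-form linear readout is an assumption chosen for brevity, not the authors' architecture or training procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dynamical signal: a damped oscillation sampled on [0, 1].
t = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y = np.exp(-2.0 * t) * np.cos(4.0 * np.pi * t)

# Shallow "coordinate network": one hidden layer of random sine features
# (a crude spectral basis), plus a linear readout solved in closed form.
n_hidden = 64
W1 = rng.uniform(-30.0, 30.0, (1, n_hidden))   # random frequencies
b1 = rng.uniform(0.0, 2.0 * np.pi, n_hidden)   # random phases
H = np.sin(t @ W1 + b1)                        # hidden activations

# Least-squares readout: no explicit regularizer is applied.
W2, *_ = np.linalg.lstsq(H, y, rcond=None)
mse = float(np.mean((H @ W2 - y) ** 2))
print(f"fit MSE: {mse:.2e}")
```

The sine activations make the signal-processing lens explicit: the hidden layer acts as a random spectral basis over the time coordinate, and the few layers suffice to fit a smooth dynamical signal.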
Related papers
- From Lazy to Rich: Exact Learning Dynamics in Deep Linear Networks [47.13391046553908]
In artificial networks, the effectiveness of these models relies on their ability to build task-specific representations.
Prior studies highlight that different initializations can place networks in either a lazy regime, where representations remain static, or a rich/feature learning regime, where representations evolve dynamically.
The paper derives exact solutions that capture the evolution of representations and of the Neural Tangent Kernel across the spectrum from the rich to the lazy regime.
arXiv Detail & Related papers (2024-09-22T23:19:04Z) - Efficient PAC Learnability of Dynamical Systems Over Multilayer Networks [30.424671907681688]
We study the learnability of dynamical systems over multilayer networks, which are more realistic, and more challenging to learn, than their single-layer counterparts.
We present an efficient PAC learning algorithm with provable guarantees to show that the learner only requires a small number of training examples to infer an unknown system.
arXiv Detail & Related papers (2024-05-11T02:35:08Z) - Systematic construction of continuous-time neural networks for linear dynamical systems [0.0]
We discuss a systematic approach to constructing neural architectures for modeling a subclass of dynamical systems.
We use a variant of continuous-time neural networks in which the output of each neuron evolves continuously as the solution of a first-order or second-order Ordinary Differential Equation (ODE).
Instead of deriving the network architecture and parameters from data, we propose a gradient-free algorithm that computes a sparse architecture and the network parameters directly from the given LTI system.
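The LTI systems referenced above take the standard state-space form dx/dt = Ax + Bu. As a hedged sketch of what such a system looks like in code — the matrices and the forward-Euler integrator below are illustrative choices, not the paper's construction — one can simulate a stable LTI model directly:

```python
import numpy as np

# Illustrative LTI system dx/dt = A x + B u: a damped harmonic oscillator.
A = np.array([[0.0, 1.0],
              [-4.0, -0.5]])   # position/velocity coupling with damping
B = np.array([[0.0],
              [1.0]])

def simulate(x0, u, dt=1e-3, steps=5000):
    """Forward-Euler integration of the LTI state-space model."""
    x = np.array(x0, dtype=float)
    traj = [x.copy()]
    for _ in range(steps):
        x = x + dt * (A @ x + B @ u)
        traj.append(x.copy())
    return np.array(traj)

traj = simulate(x0=[1.0, 0.0], u=np.array([0.0]))
print(traj[-1])  # the state decays toward the origin for this stable A
```

Both eigenvalues of this A have negative real part, so the unforced trajectory contracts — the kind of first- and second-order ODE behaviour the continuous-time neurons above are built to reproduce.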
arXiv Detail & Related papers (2024-03-24T16:16:41Z) - Mechanistic Neural Networks for Scientific Machine Learning [58.99592521721158]
We present Mechanistic Neural Networks, a neural network design for machine learning applications in the sciences.
It incorporates a new Mechanistic Block in standard architectures to explicitly learn governing differential equations as representations.
Central to our approach is a novel Relaxed Linear Programming solver (NeuRLP) inspired by a technique that reduces solving linear ODEs to solving linear programs.
arXiv Detail & Related papers (2024-02-20T15:23:24Z) - How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z) - Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
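As one minimal, hedged instance of the data-driven paradigm surveyed above — using a linear least-squares fit in place of a neural network, purely for brevity — a discrete-time map x_{k+1} ≈ M x_k can be estimated directly from observed snapshots of the true system:

```python
import numpy as np

# Observations from an illustrative "true" discrete-time linear system.
M_true = np.array([[0.95, 0.10],
                   [-0.10, 0.95]])   # slowly decaying rotation
X = [np.array([1.0, 0.0])]
for _ in range(100):
    X.append(M_true @ X[-1])
X = np.array(X)

# Data-driven step: regress next states on current states (least squares).
W, *_ = np.linalg.lstsq(X[:-1], X[1:], rcond=None)
M_fit = W.T  # lstsq solves X[:-1] @ M^T = X[1:], so transpose back

print(np.round(M_fit, 3))  # recovers M_true from observations alone
```

Replacing the linear regression with a neural network trained on the same (x_k, x_{k+1}) pairs gives the basic recipe that the survey's model constructions elaborate on.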
arXiv Detail & Related papers (2021-11-02T10:51:42Z) - Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework that models these systems in a compact and comprehensive representation for prediction and control.
The approach learns the Koopman basis functions using supervised learning.
arXiv Detail & Related papers (2021-09-06T04:39:06Z) - Learning Contact Dynamics using Physically Structured Neural Networks [81.73947303886753]
We use connections between deep neural networks and differential equations to design a family of deep network architectures for representing contact dynamics between objects.
We show that these networks can learn discontinuous contact events in a data-efficient manner from noisy observations.
Our results indicate that an idealised form of touch feedback is a key component of making this learning problem tractable.
arXiv Detail & Related papers (2021-02-22T17:33:51Z) - Physical deep learning based on optimal control of dynamical systems [0.0]
In this study, we perform pattern recognition based on the optimal control of continuous-time dynamical systems.
As a key example, we apply the dynamics-based recognition approach to an optoelectronic delay system.
The approach thus avoids training a large number of weight parameters, in contrast to conventional multilayer neural networks.
arXiv Detail & Related papers (2020-12-16T06:38:01Z) - Foundations and modelling of dynamic networks using Dynamic Graph Neural Networks: A survey [11.18312489268624]
We establish a foundation of dynamic networks with consistent, detailed terminology and notation.
We present a comprehensive survey of dynamic graph neural network models using the proposed terminology.
arXiv Detail & Related papers (2020-05-13T23:56:38Z)
This list is automatically generated from the titles and abstracts of the papers on this site.