Learning Fine Scale Dynamics from Coarse Observations via Inner Recurrence
- URL: http://arxiv.org/abs/2206.01807v1
- Date: Fri, 3 Jun 2022 20:28:52 GMT
- Title: Learning Fine Scale Dynamics from Coarse Observations via Inner Recurrence
- Authors: Victor Churchill, Dongbin Xiu
- Abstract summary: Recent work has focused on data-driven learning of the evolution of unknown systems via deep neural networks (DNNs).
This paper presents a computational technique to learn the fine-scale dynamics from such coarsely observed data.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent work has focused on data-driven learning of the evolution of unknown
systems via deep neural networks (DNNs), with the goal of conducting long-term
prediction of the dynamics of the unknown system. In many real-world
applications, data from time-dependent systems are often collected on a time
scale that is coarser than desired, due to various restrictions during the data
acquisition process. Consequently, the observed dynamics can be severely
under-sampled and do not reflect the true dynamics of the underlying system.
This paper presents a computational technique to learn the fine-scale dynamics
from such coarsely observed data. The method employs inner recurrence of a DNN
to recover the fine-scale evolution operator of the underlying system. In
addition to mathematical justification, several challenging numerical examples,
including unknown systems of both ordinary and partial differential equations,
are presented to demonstrate the effectiveness of the proposed method.
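As a concrete reading of the abstract (and only the abstract): let N_theta be a DNN approximating the fine-scale evolution operator over a step delta = Delta / k. Inner recurrence composes N_theta with itself k times inside the loss, so that k fine steps reproduce one coarsely observed step Delta. The sketch below illustrates that training setup; the architecture, the value of k, and the toy linear-oscillator data are illustrative assumptions, not the authors' configuration.

```python
# Sketch of inner recurrence: compose a one-step network k times so that
# k fine steps of size delta = Delta / k match one coarse observation step.
# Network size, k, and the toy data are illustrative assumptions.
import torch
import torch.nn as nn

d, k = 2, 8                                  # state dimension, inner recurrence depth
net = nn.Sequential(                         # fine-scale one-step map N_theta
    nn.Linear(d, 64), nn.Tanh(),
    nn.Linear(64, 64), nn.Tanh(),
    nn.Linear(64, d),
)

def coarse_step(u):
    """One coarse step Delta as k recurrent applications of the fine map."""
    for _ in range(k):
        u = u + net(u)                       # residual one-step (flow-map) form
    return u

# Toy coarse data: snapshot pairs (u(t), u(t + Delta)) of a damped oscillator.
A = torch.tensor([[0.0, 1.0], [-1.0, -0.1]])
Delta = 0.8
expA = torch.linalg.matrix_exp(Delta * A)
u0 = torch.randn(1024, d)
u1 = u0 @ expA.T

opt = torch.optim.Adam(net.parameters(), lr=1e-3)
for _ in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(coarse_step(u0), u1)
    loss.backward()
    opt.step()

# After training, a single application of `net` approximates the fine-scale
# evolution operator at step delta = Delta / k, even though only coarse
# (step-Delta) pairs were ever observed.
```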
Related papers
- Learning System Dynamics without Forgetting [60.08612207170659]
Predicting trajectories of systems with unknown dynamics is crucial in various research fields, including physics and biology.
We present Mode-switching Graph ODE (MS-GODE), a novel framework that can continually learn varying dynamics.
We construct a novel benchmark of biological dynamic systems, featuring diverse systems with disparate dynamics.
arXiv Detail & Related papers (2024-06-30T14:55:18Z)
- Inferring stochastic low-rank recurrent neural networks from neural data [5.179844449042386]
A central aim in computational neuroscience is to relate the activity of large populations of neurons to an underlying dynamical system.
Low-rank recurrent neural networks (RNNs) exhibit such interpretability by having tractable dynamics.
Here, we propose to fit low-rank RNNs with variational sequential Monte Carlo methods.
arXiv Detail & Related papers (2024-06-24T15:57:49Z)
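A minimal sketch of the generative side of this setting, assuming the standard low-rank RNN form x_{t+1} = x_t + dt * (-x_t + U V^T phi(x_t)) + noise with rank r much smaller than N; sizes, scaling, and noise level are illustrative, and the paper's variational sequential Monte Carlo inference is not reproduced here.

```python
# Toy stochastic low-rank RNN: connectivity J = U @ V.T has rank r << N.
# All sizes and scalings are illustrative.
import numpy as np

rng = np.random.default_rng(0)
N, r, T, dt, sigma = 100, 2, 500, 0.1, 0.05
U = rng.normal(size=(N, r))
V = rng.normal(size=(N, r)) / N              # 1/N scaling as in low-rank RNN theory
x = np.zeros(N)
traj = np.empty((T, N))
for t in range(T):
    drift = -x + U @ (V.T @ np.tanh(x))      # low-rank recurrent drift
    x = x + dt * drift + sigma * np.sqrt(dt) * rng.normal(size=N)
    traj[t] = x

# Fitting U, V, and sigma from noisy observations of traj is where the
# paper's variational sequential Monte Carlo machinery would come in.
```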
- Leveraging Neural Koopman Operators to Learn Continuous Representations of Dynamical Systems from Scarce Data [0.0]
We propose a new deep Koopman framework that represents dynamics in an intrinsically continuous way.
This framework leads to better performance on limited training data.
arXiv Detail & Related papers (2023-03-13T10:16:19Z)
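A hedged sketch of one common way to make a Koopman representation intrinsically continuous: learn a generator matrix L and advance encoded states with matrix exponentials, so predictions exist for arbitrary (non-uniform) times. The architecture below is an assumption for illustration, not the paper's model.

```python
# Continuous-time Koopman sketch: a learned generator L advances encoded
# states via matrix exponentials, z(t) = expm(t * L) @ z(0), so prediction
# is defined for any real t. Sizes are illustrative.
import torch
import torch.nn as nn

d, m = 3, 16
encoder = nn.Sequential(nn.Linear(d, 64), nn.Tanh(), nn.Linear(64, m))
decoder = nn.Sequential(nn.Linear(m, 64), nn.Tanh(), nn.Linear(64, d))
L = nn.Parameter(0.01 * torch.randn(m, m))   # Koopman generator (learned)

def predict(u0, t):
    """Predict u(t) from u(0) at an arbitrary, possibly non-uniform, time t."""
    z0 = encoder(u0)
    zt = z0 @ torch.linalg.matrix_exp(t * L).T
    return decoder(zt)

u_pred = predict(torch.randn(5, d), 0.7)     # batch of 5 states, t = 0.7
```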
- Decomposed Linear Dynamical Systems (dLDS) for learning the latent components of neural dynamics [6.829711787905569]
We propose a new decomposed dynamical system model that represents complex non-stationary and nonlinear dynamics of time series data.
Our model is trained through a dictionary learning procedure, where we leverage recent results in tracking sparse vectors over time.
In both continuous-time and discrete-time instructional examples, we demonstrate that our model can approximate the original system well.
arXiv Detail & Related papers (2022-06-07T02:25:38Z)
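A minimal sketch of the decomposed-LDS generative model this entry describes: the state is advanced by a sparse, time-varying combination of a small dictionary of linear operators. The dictionary, switching pattern, and sizes below are illustrative; the paper's dictionary-learning fit is not reproduced.

```python
# Toy decomposed LDS: x_{t+1} = (sum_j c_j(t) * A_j) @ x_t with sparse,
# time-varying coefficients c. Dictionary and switching are hard-coded
# here purely to illustrate the generative model.
import numpy as np

theta = 0.1
A = [np.array([[np.cos(theta), -np.sin(theta)],
               [np.sin(theta),  np.cos(theta)]]),   # rotation mode
     np.array([[0.99, 0.00],
               [0.00, 0.95]])]                      # contraction mode
x = np.array([1.0, 0.0])
traj = [x]
for t in range(200):
    c = [1.0, 0.0] if t < 100 else [0.0, 1.0]       # sparse, non-stationary mixture
    F = sum(cj * Aj for cj, Aj in zip(c, A))
    x = F @ x
    traj.append(x)
```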
- Learning effective dynamics from data-driven stochastic systems [2.4578723416255754]
This work is devoted to investigating the effective dynamics of slow-fast stochastic dynamical systems.
We propose a novel algorithm, including a neural network called Auto-SDE, to learn an invariant slow manifold.
arXiv Detail & Related papers (2022-05-09T09:56:58Z)
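For context on the slow-fast setting this entry targets, a toy simulation: the fast variable y relaxes quickly onto a slow manifold y ~ h(x), and the sought "effective dynamics" is a reduced equation in x alone. The system, scales, and noise below are illustrative assumptions; Auto-SDE itself is not sketched here.

```python
# Toy slow-fast SDE: y relaxes on the fast scale (eps) onto y ~ x**2, so
# the effective slow dynamics is approximately dx/dt = -x + x**2.
import numpy as np

rng = np.random.default_rng(1)
eps, dt, T = 0.01, 1e-4, 50_000
x, y = 0.5, 0.25
xs = np.empty(T)
for t in range(T):
    x += dt * (-x + y)                                                   # slow variable
    y += dt / eps * (x**2 - y) + np.sqrt(dt / eps) * 0.1 * rng.normal()  # fast variable
    xs[t] = x
# A method in this family would learn the reduced equation (and the slow
# manifold y ~ h(x)) from sampled trajectory data like xs.
```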
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Leveraging the structure of dynamical systems for data-driven modeling [111.45324708884813]
We consider the impact of the training set and its structure on the quality of the long-term prediction.
We show how an informed design of the training set, based on invariants of the system and the structure of the underlying attractor, significantly improves the resulting models.
arXiv Detail & Related papers (2021-12-15T20:09:20Z)
- Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns the basis functions for such a representation via supervised learning.
arXiv Detail & Related papers (2021-09-06T04:39:06Z)
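A hedged sketch of the generic deep-Koopman recipe that "basis functions" refers to here: a network psi lifts the state so that lifted coordinates evolve approximately linearly. This is the standard pattern, not the specific DKRC model.

```python
# Deep-Koopman basis sketch: learn a lifting psi so lifted coordinates
# evolve approximately linearly, psi(x_next) ~ K @ psi(x_t).
import torch
import torch.nn as nn

d, m = 2, 10
psi = nn.Sequential(nn.Linear(d, 64), nn.Tanh(), nn.Linear(64, m))  # basis functions
K = nn.Parameter(torch.eye(m))                                      # lifted linear map

def koopman_loss(x_t, x_next):
    """Penalize deviation from linear evolution in the lifted space."""
    return nn.functional.mse_loss(psi(x_t) @ K.T, psi(x_next))
```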
- Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z)
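A hedged sketch of one way penalized Neural-ODE causal discovery can work: give each variable its own drift network and apply a group-lasso penalty to the input columns of its first layer, so non-parent inputs are driven to zero and the surviving groups define the graph. The penalty form and sizes are illustrative assumptions.

```python
# Sparsity-penalized Neural-ODE sketch for causal discovery: per-variable
# drift networks plus a group lasso over each network's input columns.
import torch
import torch.nn as nn

d = 4
drifts = nn.ModuleList(
    [nn.Sequential(nn.Linear(d, 32), nn.Tanh(), nn.Linear(32, 1)) for _ in range(d)]
)

def group_lasso_penalty():
    """Sum_i sum_j ||input column j of the first layer of f_i||_2."""
    total = torch.tensor(0.0)
    for f in drifts:
        W = f[0].weight                       # shape (32, d)
        total = total + W.norm(dim=0).sum()   # one group per input variable
    return total

# Training would minimize trajectory-fit loss + lam * group_lasso_penalty()
# while integrating dx_i/dt = drifts[i](x); inputs whose columns survive
# the penalty are read off as causal parents of x_i.
```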
- Learning Continuous System Dynamics from Irregularly-Sampled Partial Observations [33.63818978256567]
We present LG-ODE, a latent ordinary differential equation generative model for modeling multi-agent dynamic systems with known graph structure.
It can simultaneously learn the embedding of high dimensional trajectories and infer continuous latent system dynamics.
Our model employs a novel encoder parameterized by a graph neural network that can infer initial states in an unsupervised way.
arXiv Detail & Related papers (2020-11-08T01:02:22Z)
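A compact sketch of the latent-ODE pattern behind models like LG-ODE: encode (irregular) observations to an initial latent state, evolve it with a learned ODE, and decode at arbitrary query times. The real model uses a graph neural network encoder over interacting agents; the per-agent MLP and Euler integration below are illustrative simplifications.

```python
# Latent-ODE sketch: encode an initial observation to a latent state,
# evolve it with a learned vector field, decode at query times.
import torch
import torch.nn as nn

obs_dim, lat_dim = 2, 8
encoder = nn.Sequential(nn.Linear(obs_dim + 1, 32), nn.Tanh(), nn.Linear(32, lat_dim))
dynamics = nn.Sequential(nn.Linear(lat_dim, 32), nn.Tanh(), nn.Linear(32, lat_dim))
decoder = nn.Linear(lat_dim, obs_dim)

def predict(first_obs, first_time, query_times, dt=0.01):
    """Euler-integrate the latent ODE dz/dt = dynamics(z) between query times."""
    z = encoder(torch.cat([first_obs, first_time.view(1)]))
    t, out = first_time.item(), []
    for tq in query_times.tolist():
        while t < tq:
            z = z + dt * dynamics(z)
            t += dt
        out.append(decoder(z))
    return torch.stack(out)

u = predict(torch.randn(obs_dim), torch.tensor(0.0), torch.tensor([0.5, 1.0, 1.5]))
```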
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
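A hedged sketch of a liquid time-constant cell, following the state equation reported in the LTC paper, dx/dt = -(1/tau + f(x, I)) * x + f(x, I) * A: a linear first-order system whose effective time constant is modulated by a learned gate f. The gate parameterization and the explicit Euler step are illustrative choices.

```python
# Liquid time-constant cell: a linear first-order system whose effective
# time constant is modulated by a learned gate f.
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    def __init__(self, in_dim, hid_dim, tau=1.0, A=1.0):
        super().__init__()
        self.gate = nn.Sequential(nn.Linear(in_dim + hid_dim, hid_dim), nn.Sigmoid())
        self.tau, self.A = tau, A

    def forward(self, x, I, dt=0.05):
        f = self.gate(torch.cat([x, I], dim=-1))
        dx = -(1.0 / self.tau + f) * x + f * self.A   # bounded, stable drift
        return x + dt * dx                            # explicit Euler step (illustrative)

cell = LTCCell(in_dim=3, hid_dim=16)
x = torch.zeros(1, 16)
for I in torch.randn(20, 1, 3):                       # 20 input steps
    x = cell(x, I)
```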
This list is automatically generated from the titles and abstracts of the papers on this site.