Reconstructing a dynamical system and forecasting time series by
self-consistent deep learning
- URL: http://arxiv.org/abs/2108.01862v1
- Date: Wed, 4 Aug 2021 06:10:58 GMT
- Title: Reconstructing a dynamical system and forecasting time series by
self-consistent deep learning
- Authors: Zhe Wang and Claude Guet
- Abstract summary: We introduce a self-consistent deep-learning framework for a noisy deterministic time series.
It provides unsupervised filtering, state-space reconstruction, identification of the underlying differential equations and forecasting.
- Score: 4.947248396489835
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We introduce a self-consistent deep-learning framework which, for a noisy
deterministic time series, provides unsupervised filtering, state-space
reconstruction, identification of the underlying differential equations and
forecasting. Without a priori information on the signal, we embed the time
series in a state space, where deterministic structures, i.e. attractors, are
revealed. Under the assumption that the evolution of solution trajectories is
described by an unknown dynamical system, we filter out stochastic outliers.
The embedding function, the solution trajectories and the dynamical system are
each represented by a deep neural network. By exploiting the
differentiability of the neural solution trajectory, the neural dynamical
system is defined locally at each time, mitigating the need for propagating
gradients through numerical solvers. On a chaotic time series masked by
additive Gaussian noise, we demonstrate the filtering ability and the
predictive power of the proposed framework.
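To make the training idea concrete, here is a minimal sketch (assuming Python with PyTorch; names such as MLP, trajectory and vector_field are illustrative, not the authors' code) of a collocation-style loop in the spirit of the abstract: a neural trajectory x_theta(t) is fitted to the embedded noisy observations, while a neural vector field f_phi is trained to match dx_theta/dt obtained by automatic differentiation, so no gradients need to propagate through a numerical ODE solver.

```python
# Hypothetical sketch of the self-consistent idea described in the abstract:
# a neural trajectory x_theta(t) is fitted to noisy embedded observations,
# and a neural vector field f_phi is trained to match the trajectory's time
# derivative, obtained by automatic differentiation (no ODE solver in training).
import torch
import torch.nn as nn

class MLP(nn.Module):
    def __init__(self, d_in, d_out, width=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(d_in, width), nn.Tanh(),
            nn.Linear(width, width), nn.Tanh(),
            nn.Linear(width, d_out),
        )
    def forward(self, x):
        return self.net(x)

d = 3                      # assumed embedding (state-space) dimension
trajectory = MLP(1, d)     # x_theta : t -> state
vector_field = MLP(d, d)   # f_phi   : state -> dstate/dt

t_obs = torch.linspace(0.0, 10.0, 500).unsqueeze(-1)   # observation times
# Placeholder data: in practice this would be the noisy time series embedded
# into a d-dimensional state space by a learned embedding network (omitted here).
y_obs = torch.randn(500, d)

opt = torch.optim.Adam(
    list(trajectory.parameters()) + list(vector_field.parameters()), lr=1e-3)

for step in range(2000):
    opt.zero_grad()
    t = t_obs.clone().requires_grad_(True)
    x = trajectory(t)                                   # neural solution trajectory x_theta(t)

    # dx/dt via autodiff, one state dimension at a time; the dynamical system
    # is thus defined locally at each time, with no numerical integrator.
    dxdt = torch.stack(
        [torch.autograd.grad(x[:, i].sum(), t, create_graph=True)[0].squeeze(-1)
         for i in range(d)], dim=-1)

    data_loss = ((x - y_obs) ** 2).mean()               # smooth fit acts as the filter
    ode_loss = ((dxdt - vector_field(x)) ** 2).mean()   # self-consistency: dx/dt = f_phi(x)
    loss = data_loss + ode_loss
    loss.backward()
    opt.step()
```

Forecasting would then amount to integrating the learned vector field forward from the last reconstructed state with any standard ODE solver, which is needed only at inference time rather than during training.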
Related papers
- Deep Recurrent Stochastic Configuration Networks for Modelling Nonlinear Dynamic Systems [3.8719670789415925]
This paper proposes a novel deep reservoir computing framework, termed the deep recurrent stochastic configuration network (DeepRSCN).
DeepRSCNs are incrementally constructed, with all reservoir nodes directly linked to the final output.
Given a set of training samples, DeepRSCNs can quickly generate learning representations, which consist of random basis functions with cascaded inputs and readout weights.
arXiv Detail & Related papers (2024-10-28T10:33:15Z) - Neural Harmonium: An Interpretable Deep Structure for Nonlinear Dynamic
System Identification with Application to Audio Processing [4.599180419117645]
Interpretability helps us understand a model's ability to generalize and reveal its limitations.
We introduce a causal interpretable deep structure for modeling dynamic systems.
Our proposed model makes use of harmonic analysis by modeling the system in the time-frequency domain.
arXiv Detail & Related papers (2023-10-10T21:32:15Z) - Spectral learning of Bernoulli linear dynamical systems models [21.3534487101893]
We develop a learning method for fast, efficient fitting of latent linear dynamical system models.
Our approach extends traditional subspace identification methods to the Bernoulli setting.
We show that the estimator performs well in real-world settings by analyzing data from mice performing a sensory decision-making task.
arXiv Detail & Related papers (2023-03-03T16:29:12Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical
Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the time of supervision and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Supervised DKRC with Images for Offline System Identification [77.34726150561087]
Modern dynamical systems are becoming increasingly non-linear and complex.
There is a need for a framework to model these systems in a compact and comprehensive representation for prediction and control.
Our approach learns the basis functions of such a representation using supervised learning.
arXiv Detail & Related papers (2021-09-06T04:39:06Z) - Consistency of mechanistic causal discovery in continuous-time using
Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
arXiv Detail & Related papers (2021-05-06T08:48:02Z) - Neural Dynamic Mode Decomposition for End-to-End Modeling of Nonlinear
Dynamics [49.41640137945938]
We propose a neural dynamic mode decomposition for estimating a lift function based on neural networks.
With our proposed method, the forecast error is backpropagated through the neural networks and the spectral decomposition.
Our experiments demonstrate the effectiveness of our proposed method in terms of eigenvalue estimation and forecast performance.
arXiv Detail & Related papers (2020-12-11T08:34:26Z) - Stochastically forced ensemble dynamic mode decomposition for
forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid; a minimal illustrative sketch of the forced-linear-system regression appears after this list.
arXiv Detail & Related papers (2020-10-08T20:25:52Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z) - Data-driven learning of non-autonomous systems [4.459185142332526]
We present a numerical framework for recovering unknown non-autonomous dynamical systems with time-dependent inputs.
To circumvent the difficulty presented by the non-autonomous nature of the system, our method transforms the solution state into piecewise integration of the system over a discrete set of time instances.
arXiv Detail & Related papers (2020-06-02T15:33:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.