Deep learning delay coordinate dynamics for chaotic attractors from
partial observable data
- URL: http://arxiv.org/abs/2211.11061v1
- Date: Sun, 20 Nov 2022 19:25:02 GMT
- Title: Deep learning delay coordinate dynamics for chaotic attractors from
partial observable data
- Authors: Charles D. Young and Michael D. Graham
- Abstract summary: We utilize deep artificial neural networks to learn discrete-time maps and continuous-time flows of the partial state.
We demonstrate the capacity of deep ANNs to predict chaotic behavior from a scalar observation on a manifold of dimension three via the Lorenz system.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: A common problem in time series analysis is to predict dynamics with only
scalar or partial observations of the underlying dynamical system. For data on
a smooth compact manifold, Takens' theorem proves that a time-delayed embedding of
the partial state is diffeomorphic to the attractor, although for chaotic and
highly nonlinear systems learning these delay coordinate mappings is
challenging. We utilize deep artificial neural networks (ANNs) to learn
discrete-time maps and continuous-time flows of the partial state.
Given training data for the full state, we also learn a reconstruction map.
Thus, predictions of a time series can be made from the current state and
several previous observations with embedding parameters determined from time
series analysis. The state space for time evolution is of comparable dimension
to that of reduced-order manifold models. These are advantages over recurrent neural
network models, which require a high-dimensional internal state or additional
memory terms and hyperparameters. We demonstrate the capacity of deep ANNs to
predict chaotic behavior from a scalar observation on a manifold of dimension
three via the Lorenz system. We also consider multivariate observations on the
Kuramoto-Sivashinsky equation, where the observation dimension required for
accurately reproducing dynamics increases with the manifold dimension via the
spatial extent of the system.
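The following is a hedged sketch of the idea the abstract describes, not the authors' implementation: a small MLP learns the discrete-time map that advances a delay-coordinate vector of a scalar Lorenz observation. The embedding dimension, network size, and training setup are all assumptions; the paper determines embedding parameters from time series analysis and also learns a separate reconstruction map, omitted here.

```python
# Minimal delay-coordinate dynamics sketch: predict the next scalar Lorenz
# observation from a window of past observations with a small MLP.
import numpy as np
import torch
from scipy.integrate import solve_ivp

def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

# Scalar observation: the x-component, sampled at a fixed interval.
t = np.arange(0, 200, 0.02)
sol = solve_ivp(lorenz, (t[0], t[-1]), [1.0, 1.0, 1.0], t_eval=t)
obs = sol.y[0]

d = 4  # delay window length (an assumption, not the paper's choice)
X = np.stack([obs[i:len(obs) - d + i] for i in range(d)], axis=1)
y = obs[d:]  # target: the observation one step ahead

model = torch.nn.Sequential(
    torch.nn.Linear(d, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 1))
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
Xt = torch.tensor(X, dtype=torch.float32)
yt = torch.tensor(y, dtype=torch.float32)

for epoch in range(500):  # full-batch training, kept deliberately simple
    opt.zero_grad()
    loss = torch.nn.functional.mse_loss(model(Xt).squeeze(-1), yt)
    loss.backward()
    opt.step()

# Free-running forecast: feed predictions back into the delay window.
with torch.no_grad():
    window = Xt[0].clone()
    preds = []
    for _ in range(1000):
        nxt = model(window.unsqueeze(0)).squeeze().item()
        preds.append(nxt)
        window = torch.cat([window[1:], torch.tensor([nxt])])
```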
Related papers
- Autoregressive with Slack Time Series Model for Forecasting a
Partially-Observed Dynamical Time Series [3.0232957374216953]
We introduce the autoregressive with slack time series (ARS) model, which simultaneously estimates the evolution function and imputes missing variables as a slack time series.
From a theoretical perspective, we prove that a 2-dimensional, time-invariant linear system can be reconstructed by utilizing observations from a single, partially observed dimension of the system.
arXiv Detail & Related papers (2023-06-28T23:07:43Z)
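The 2-dimensional reconstruction result above has a classical counterpart in linear observability; the worked example below (not the ARS algorithm itself, and with an assumed system matrix) shows how two consecutive scalar observations pin down the full hidden state of a 2-D time-invariant linear system.

```python
# Observability example: recover the hidden state of x_{t+1} = A x_t from
# scalar observations y_t = C x_t, using the observability matrix [C; C A].
import numpy as np

A = np.array([[0.9, -0.4],
              [0.4,  0.9]])      # assumed 2-D time-invariant linear system
C = np.array([[1.0, 0.0]])       # observe only the first coordinate

O = np.vstack([C, C @ A])        # observability matrix
assert np.linalg.matrix_rank(O) == 2  # the pair (A, C) is observable

x0 = np.array([1.0, -2.0])                   # hidden initial state
y = np.array([C @ x0, C @ A @ x0]).ravel()   # two consecutive observations

x0_hat = np.linalg.solve(O, y)   # reconstruct the full state
print(np.allclose(x0_hat, x0))   # True
```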
- STONet: A Neural-Operator-Driven Spatio-temporal Network [38.5696882090282]
Graph-based spatio-temporal neural networks are effective at modeling spatial dependencies among discrete points sampled irregularly.
We propose a spatio-temporal framework based on neural operators for PDEs, which learns the mechanisms governing the dynamics of spatially-continuous physical quantities.
Experiments show our model's performance in forecasting spatially-continuous physical quantities, its generalization to unseen spatial points, and its ability to handle temporally-irregular data.
arXiv Detail & Related papers (2022-04-18T17:20:12Z)
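The summary above gives no architectural detail, so the block below is only a generic neural-operator building block in the spirit of Fourier neural operators, not STONet's actual design: a spectral convolution that acts on a spatially-continuous quantity sampled on a grid, independent of the grid resolution.

```python
# One FNO-style spectral convolution layer: transform to Fourier space, mix
# a fixed number of low-frequency modes with learned complex weights, and
# transform back.
import torch

class SpectralConv1d(torch.nn.Module):
    def __init__(self, channels, modes):
        super().__init__()
        self.modes = modes
        scale = 1.0 / channels
        self.weights = torch.nn.Parameter(
            scale * torch.randn(channels, channels, modes, dtype=torch.cfloat))

    def forward(self, x):                      # x: (batch, channels, grid)
        x_ft = torch.fft.rfft(x)               # to Fourier space
        out_ft = torch.zeros_like(x_ft)
        out_ft[:, :, :self.modes] = torch.einsum(
            "bcm,com->bom", x_ft[:, :, :self.modes], self.weights)
        return torch.fft.irfft(out_ft, n=x.size(-1))

layer = SpectralConv1d(channels=8, modes=12)
u = torch.randn(4, 8, 64)   # batch of fields on a 64-point grid
v = layer(u)                # same shape; the grid size may change at test time
```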
- Multivariate Time Series Forecasting with Dynamic Graph Neural ODEs [65.18780403244178]
We propose a continuous model to forecast Multivariate Time series with dynamic Graph neural Ordinary Differential Equations (MTGODE).
Specifically, we first abstract multivariate time series into dynamic graphs with time-evolving node features and unknown graph structures.
Then, we design and solve a neural ODE to complement missing graph topologies and unify both spatial and temporal message passing.
arXiv Detail & Related papers (2022-02-17T02:17:31Z)
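A minimal sketch of the mechanism this entry describes, under loud assumptions: MTGODE learns the missing graph and uses proper ODE solvers, whereas below the adjacency is random and fixed and the integrator is plain Euler; only the idea of one vector field combining spatial and temporal message passing is kept.

```python
# Graph-ODE sketch: node features evolve under dX/dt = ReLU(A_hat X W) - X,
# fusing spatial (graph) and temporal (ODE) message passing in one field.
import torch

n_nodes, n_feat = 5, 8
A = torch.rand(n_nodes, n_nodes)          # stands in for a learned graph
A_hat = A / A.sum(dim=1, keepdim=True)    # row-normalized propagation matrix
W = torch.nn.Parameter(torch.randn(n_feat, n_feat) * 0.1)

def vector_field(X):
    # one graph-convolution step as the ODE right-hand side
    return torch.relu(A_hat @ X @ W) - X

X = torch.randn(n_nodes, n_feat)          # multivariate series at time t
dt, steps = 0.1, 20
for _ in range(steps):                    # explicit Euler integration
    X = X + dt * vector_field(X)
```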
- Time Series Forecasting with Ensembled Stochastic Differential Equations Driven by Lévy Noise [2.3076895420652965]
We use a collection of SDEs equipped with neural networks to predict the long-term trend of noisy time series.
Our contributions are, first, we use the phase space reconstruction method to extract the intrinsic dimension of the time series data.
Second, we explore SDEs driven by $\alpha$-stable Lévy motion to model the time series data and solve the problem through neural network approximation.
arXiv Detail & Related papers (2021-11-25T16:49:01Z)
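To make the modeling ingredient above concrete, here is a hedged simulation sketch of an SDE driven by $\alpha$-stable Lévy motion; the paper pairs this with neural-network approximation of the dynamics, while the drift below is a fixed double-well stand-in and the value of $\alpha$ is assumed.

```python
# Euler scheme for dX_t = f(X_t) dt + dL_t, with L an alpha-stable Levy
# motion; over a step dt the stable increment scales as dt**(1/alpha).
import numpy as np
from scipy.stats import levy_stable

alpha = 1.5                        # stability index in (0, 2]; assumed
dt, n_steps = 1e-3, 5000
rng = np.random.default_rng(0)

def drift(x):                      # stand-in for a learned neural drift
    return x - x ** 3              # double-well dynamics

x = np.empty(n_steps)
x[0] = 0.0
jumps = levy_stable.rvs(alpha, 0.0, size=n_steps - 1, random_state=rng)
for i in range(n_steps - 1):
    x[i + 1] = x[i] + drift(x[i]) * dt + dt ** (1.0 / alpha) * jumps[i]
```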
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
- Stochastically forced ensemble dynamic mode decomposition for forecasting and analysis of near-periodic systems [65.44033635330604]
We introduce a novel load forecasting method in which observed dynamics are modeled as a forced linear system.
We show that its use of intrinsic linear dynamics offers a number of desirable properties in terms of interpretability and parsimony.
Results are presented for a test case using load data from an electrical grid.
arXiv Detail & Related papers (2020-10-08T20:25:52Z)
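As background for this entry, a minimal exact-DMD sketch: the paper's stochastically forced, ensembled variant adds structure beyond this, so the code shows only the core fit of a linear operator to snapshot pairs, and the synthetic data standing in for grid-load measurements is assumed.

```python
# Exact DMD: given snapshot matrices X and X' with X' ~ A X, fit a rank-r
# linear operator via the SVD and recover its eigenvalues and modes.
import numpy as np

def dmd(X, Xp, r):
    U, s, Vh = np.linalg.svd(X, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r]
    A_tilde = U.conj().T @ Xp @ Vh.conj().T / s      # reduced operator
    eigvals, w = np.linalg.eig(A_tilde)
    modes = Xp @ Vh.conj().T @ np.diag(1.0 / s) @ w  # exact DMD modes
    return eigvals, modes

# near-periodic synthetic signals: 32 channels of phase-shifted sinusoids
t = np.linspace(0, 8 * np.pi, 400)
data = np.array([np.sin(t + p) + 0.05 * np.random.randn(t.size)
                 for p in np.linspace(0, 1, 32)])
eigvals, modes = dmd(data[:, :-1], data[:, 1:], r=4)
```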
- Time-Reversal Symmetric ODE Network [138.02741983098454]
Time-reversal symmetry is a fundamental property that frequently holds in classical and quantum mechanics.
We propose a novel loss function that measures how well our ordinary differential equation (ODE) networks comply with this time-reversal symmetry.
We show that, even for systems that do not possess the full time-reversal symmetry, TRS-ODENs can achieve better predictive performance than baselines.
arXiv Detail & Related papers (2020-07-22T12:19:40Z)
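A hedged sketch of what such a loss can look like: the paper defines its measure on ODE trajectories, while the simpler vector-field version below, with reversing operator R(q, p) = (q, -p), only conveys the idea that a time-reversal-symmetric field satisfies f(R(x)) = -R(f(x)).

```python
# Time-reversal-symmetry penalty for an ODE network f on states (q, p):
# penalize || f(R(x)) + R(f(x)) ||^2, which vanishes when f is TRS.
import torch

f = torch.nn.Sequential(torch.nn.Linear(2, 32), torch.nn.Tanh(),
                        torch.nn.Linear(32, 2))   # learned vector field

def reverse(x):                     # R: keep position, flip momentum
    return torch.stack([x[..., 0], -x[..., 1]], dim=-1)

def trs_penalty(x):
    return ((f(reverse(x)) + reverse(f(x))) ** 2).sum(-1).mean()

x = torch.randn(128, 2)             # sample states
loss = trs_penalty(x)               # weight by some lambda and add to the
                                    # usual prediction loss during training
```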
- Supporting Optimal Phase Space Reconstructions Using Neural Network Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the properties of the phase space.
Our approach is either as competitive as or better than most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
- Learning Continuous-Time Dynamics by Stochastic Differential Networks [32.63114111531396]
We propose a flexible continuous-time recurrent neural network named Variational Stochastic Differential Networks (VSDN).
VSDN embeds the complicated dynamics of sporadic time series via neural stochastic differential equations (SDEs).
We show that VSDNs outperform state-of-the-art continuous-time deep learning models and achieve remarkable performance on prediction and interpolation tasks for sporadic time series.
arXiv Detail & Related papers (2020-06-11T01:40:34Z)
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
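A hedged sketch of a liquid time-constant unit consistent with this description; the precise update rule and gating network below are assumptions, not the paper's equations: each unit is a linear first-order system whose effective time constant is modulated by a learned nonlinearity.

```python
# LTC-style cell: dx/dt = -x / tau + f(x, I) * (A - x), so the gate f makes
# the effective time constant depend on state and input; explicit Euler step.
import torch

n_units, n_in = 16, 4
tau = torch.ones(n_units)                      # base time constants
A = torch.nn.Parameter(torch.randn(n_units))   # learned target biases
f = torch.nn.Sequential(torch.nn.Linear(n_units + n_in, n_units),
                        torch.nn.Sigmoid())    # gating nonlinearity

def ltc_step(x, I, dt=0.05):
    gate = f(torch.cat([x, I], dim=-1))        # state- and input-dependent
    return x + dt * (-x / tau + gate * (A - x))

x = torch.zeros(1, n_units)
for step in range(100):                        # drive with a toy input
    I = torch.sin(torch.full((1, n_in), step * 0.1))
    x = ltc_step(x, I)
```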
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information above and is not responsible for any consequences of its use.