Data-Driven Reduced-Order Modeling of Spatiotemporal Chaos with Neural
Ordinary Differential Equations
- URL: http://arxiv.org/abs/2109.00060v1
- Date: Tue, 31 Aug 2021 20:00:33 GMT
- Title: Data-Driven Reduced-Order Modeling of Spatiotemporal Chaos with Neural
Ordinary Differential Equations
- Authors: Alec J. Linot and Michael D. Graham
- Abstract summary: We present a data-driven reduced-order modeling method that capitalizes on the fact that chaotic dissipative partial differential equations evolve to attractors on finite-dimensional manifolds.
We find that dimension reduction improves performance relative to predictions in the ambient space.
With the low-dimensional model, we find excellent short- and long-time statistical recreation of the true dynamics for widely spaced data.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dissipative partial differential equations that exhibit chaotic dynamics tend
to evolve to attractors that exist on finite-dimensional manifolds. We present
a data-driven reduced order modeling method that capitalizes on this fact by
finding the coordinates of this manifold and finding an ordinary differential
equation (ODE) describing the dynamics in this coordinate system. The manifold
coordinates are discovered using an undercomplete autoencoder -- a neural
network (NN) that reduces then expands dimension. Then the ODE, in these
coordinates, is approximated by a NN using the neural ODE framework. Both of
these methods only require snapshots of data to learn a model, and the data can
be widely and/or unevenly spaced. We apply this framework to the
Kuramoto-Sivashinsky equation for different domain sizes that exhibit chaotic dynamics.
With this system, we find that dimension reduction improves performance
relative to predictions in the ambient space, where artifacts arise. Then, with
the low-dimensional model, we vary the training data spacing and find excellent
short- and long-time statistical recreation of the true dynamics for widely
spaced data (spacing of ~0.7 Lyapunov times). We end by comparing performance
with various degrees of dimension reduction, and find a "sweet spot" in terms
of performance vs. dimension.
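To make the two-stage method in the abstract concrete, the following is a minimal sketch (not the authors' code) of an undercomplete autoencoder plus a neural ODE in the latent coordinates, assuming PyTorch and the torchdiffeq package; all layer sizes, activations, and the snapshot spacing below are illustrative placeholders.

import torch
import torch.nn as nn
from torchdiffeq import odeint_adjoint as odeint

class Autoencoder(nn.Module):
    # Undercomplete autoencoder: contracts the ambient state (e.g., the
    # Kuramoto-Sivashinsky solution on a grid) to manifold coordinates h,
    # then expands back for reconstruction.
    def __init__(self, ambient_dim=64, latent_dim=8):
        super().__init__()
        self.encode = nn.Sequential(nn.Linear(ambient_dim, 128), nn.ReLU(),
                                    nn.Linear(128, latent_dim))
        self.decode = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                                    nn.Linear(128, ambient_dim))

class LatentODE(nn.Module):
    # Neural ODE right-hand side: dh/dt = f(h), with f a small network.
    def __init__(self, latent_dim=8):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                               nn.Linear(128, latent_dim))
    def forward(self, t, h):
        return self.f(h)

ae, rhs = Autoencoder(), LatentODE()
u_t, u_next = torch.randn(32, 64), torch.randn(32, 64)  # placeholder snapshot pairs
tau = torch.tensor([0.0, 0.25])  # snapshot spacing; need not be small or uniform

# Stage 1: fit the autoencoder by minimizing reconstruction error.
recon_loss = ((ae.decode(ae.encode(u_t)) - u_t) ** 2).mean()

# Stage 2: integrate the latent state forward by tau and match the encoded
# next snapshot; gradients flow through the differentiable ODE solve.
h_pred = odeint(rhs, ae.encode(u_t), tau)[-1]
ode_loss = ((h_pred - ae.encode(u_next)) ** 2).mean()

Both stages use only snapshots of data, consistent with the abstract's statement that the data can be widely and/or unevenly spaced.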
Related papers
- On the Trajectory Regularity of ODE-based Diffusion Sampling [79.17334230868693]
Diffusion-based generative models use differential equations to establish a smooth connection between a complex data distribution and a tractable prior distribution.
In this paper, we identify several intriguing trajectory properties in the ODE-based sampling process of diffusion models.
arXiv Detail & Related papers (2024-05-18T15:59:41Z)
- A Geometric Perspective on Diffusion Models [57.27857591493788]
We inspect the ODE-based sampling of a popular variance-exploding SDE.
We establish a theoretical relationship between the optimal ODE-based sampling and the classic mean-shift (mode-seeking) algorithm.
arXiv Detail & Related papers (2023-05-31T15:33:16Z)
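The two diffusion entries above concern ODE-based sampling; below is a minimal, hypothetical Euler sketch of probability-flow-ODE sampling for a variance-exploding SDE. The score_fn(x, sigma) is a stand-in for a pretrained score network and is not defined here; the geometric noise schedule and step count are illustrative.

import torch

def ve_sigma(t, sigma_min=0.01, sigma_max=50.0):
    # Geometric noise schedule sigma(t) of a variance-exploding SDE.
    return sigma_min * (sigma_max / sigma_min) ** t

def sample_probability_flow(score_fn, shape, n_steps=500):
    # Euler integration of the probability-flow ODE
    #   dx/dt = -0.5 * d[sigma^2(t)]/dt * score(x, sigma(t))
    # backwards from t = 1 (pure noise) to t = 0 (data).
    ts = torch.linspace(1.0, 0.0, n_steps + 1)
    x = ve_sigma(ts[0]) * torch.randn(shape)
    for t0, t1 in zip(ts[:-1], ts[1:]):
        dsigma2 = ve_sigma(t1) ** 2 - ve_sigma(t0) ** 2  # negative when integrating backward
        x = x - 0.5 * dsigma2 * score_fn(x, ve_sigma(t0))
    return x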
- Autoencoders for discovering manifold dimension and coordinates in data from complex dynamical systems [0.0]
The autoencoder framework combines implicit regularization with internal linear layers and $L_2$ regularization (weight decay).
We show that this framework can be naturally extended for applications of state-space modeling and forecasting.
arXiv Detail & Related papers (2023-05-01T21:14:47Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z)
- Data-driven low-dimensional dynamic model of Kolmogorov flow [0.0]
Reduced order models (ROMs) that capture flow dynamics are of interest for decreasing computational costs for simulation.
This work presents a data-driven framework for minimal-dimensional models that effectively capture the dynamics and properties of the flow.
We apply this to Kolmogorov flow in a regime consisting of chaotic and intermittent behavior.
arXiv Detail & Related papers (2022-10-29T23:05:39Z)
- Optimizing differential equations to fit data and predict outcomes [0.0]
Recent technical advances in automatic differentiation through numerical differential equation solvers potentially change the fitting process into a relatively easy problem.
This article illustrates how to overcome a variety of common challenges, using the classic ecological data for oscillations in hare and lynx populations.
arXiv Detail & Related papers (2022-04-16T16:08:08Z)
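As a sketch of the fitting process the entry above describes, one can differentiate through a numerical ODE solver and update parameters by gradient descent. The example below assumes PyTorch and torchdiffeq; the Lotka-Volterra form matches the hare/lynx setting, but the observations and initial parameter values are placeholders.

import torch
from torchdiffeq import odeint

class LotkaVolterra(torch.nn.Module):
    # Predator-prey ODE with learnable rate parameters (a, b, c, d).
    def __init__(self):
        super().__init__()
        self.theta = torch.nn.Parameter(torch.tensor([0.5, 0.02, 0.5, 0.02]))
    def forward(self, t, z):
        hare, lynx = z[..., 0], z[..., 1]
        a, b, c, d = self.theta
        return torch.stack([a * hare - b * hare * lynx,
                            -c * lynx + d * hare * lynx], dim=-1)

model = LotkaVolterra()
opt = torch.optim.Adam(model.parameters(), lr=0.02)
t_obs = torch.linspace(0.0, 20.0, 40)
z_obs = torch.rand(40, 2) * 30  # placeholder for the real census data
for step in range(500):
    opt.zero_grad()
    z_pred = odeint(model, z_obs[0], t_obs)  # differentiable ODE solve
    loss = ((z_pred - z_obs) ** 2).mean()
    loss.backward()  # gradients flow through the solver
    opt.step()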
- Path Development Network with Finite-dimensional Lie Group Representation [3.9983665898166425]
We propose a novel, trainable path development layer, which exploits representations of sequential data through finite-dimensional Lie groups.
Our proposed layer, analogous to recurrent neural networks (RNN), possesses an explicit, simple recurrent unit that alleviates the gradient issues.
Empirical results on a range of datasets show that the development layer consistently and significantly outperforms signature features on accuracy and dimensionality.
arXiv Detail & Related papers (2022-04-02T02:01:00Z)
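The path development entry above can be illustrated with a minimal sketch of the general idea (not the authors' implementation): a trainable linear map sends each path increment into the Lie algebra so(n) of skew-symmetric matrices, and the layer multiplies the corresponding matrix exponentials in SO(n). Shapes and initialization are placeholders.

import torch

class PathDevelopment(torch.nn.Module):
    # Develops a sequence into SO(n): each increment is mapped into so(n)
    # (skew-symmetric matrices) and the matrix exponentials are multiplied
    # along the sequence, acting like a simple recurrent unit.
    def __init__(self, input_dim, n):
        super().__init__()
        self.n = n
        self.A = torch.nn.Parameter(0.1 * torch.randn(input_dim, n, n))
    def forward(self, x):  # x: [batch, length, input_dim]
        dx = x[:, 1:] - x[:, :-1]  # path increments
        M = torch.einsum('bld,dij->blij', dx, self.A)
        M = M - M.transpose(-1, -2)  # project onto so(n)
        z = torch.eye(self.n).expand(x.shape[0], -1, -1).clone()
        for step in range(M.shape[1]):
            z = z @ torch.linalg.matrix_exp(M[:, step])
        return z.flatten(1)  # flattened group element as features

layer = PathDevelopment(input_dim=3, n=4)
features = layer(torch.randn(8, 50, 3))  # -> [8, 16]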
- Charts and atlases for nonlinear data-driven models of dynamics on manifolds [0.0]
We introduce a method for learning minimal-dimensional dynamical models from high-dimensional time series data that lie on a low-dimensional manifold.
We apply this method to examples ranging from simple periodic dynamics to complex, nominally high-dimensional non-periodic bursting dynamics of the Kuramoto-Sivashinsky equation.
arXiv Detail & Related papers (2021-08-12T19:06:08Z)
- Mix Dimension in Poincaré Geometry for 3D Skeleton-based Action Recognition [57.98278794950759]
Graph Convolutional Networks (GCNs) have already demonstrated their powerful ability to model irregular data.
We present a novel spatial-temporal GCN architecture defined via Poincaré geometry.
We evaluate our method on two of the current largest-scale 3D datasets.
arXiv Detail & Related papers (2020-07-30T18:23:18Z)
- Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data into the structure these methods require.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.