Learning effective dynamics from data-driven stochastic systems
- URL: http://arxiv.org/abs/2205.04151v3
- Date: Sat, 30 Dec 2023 03:27:49 GMT
- Title: Learning effective dynamics from data-driven stochastic systems
- Authors: Lingyu Feng, Ting Gao, Min Dai and Jinqiao Duan
- Abstract summary: This work is devoted to investigating the effective dynamics of slow-fast stochastic dynamical systems.
We propose a novel algorithm, including a neural network called Auto-SDE, to learn the invariant slow manifold.
- Score: 2.4578723416255754
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Multiscale stochastic dynamical systems have been widely applied to a
variety of scientific and engineering problems due to their capability of
depicting complex phenomena in many real-world applications. This work is
devoted to investigating the effective dynamics of slow-fast stochastic
dynamical systems. Given short-term observation data satisfying some unknown
slow-fast stochastic system, we propose a novel algorithm, including a neural
network called Auto-SDE, to learn the invariant slow manifold. Our approach
captures the evolutionary nature of a series of time-dependent autoencoder
neural networks, with the loss constructed from a discretized stochastic
differential equation. Our algorithm is also validated to be accurate, stable,
and effective through numerical experiments under various evaluation metrics.
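As a rough illustration of the loss the abstract describes, the minimal PyTorch sketch below pairs an autoencoder reconstruction term with an Euler-Maruyama residual on the latent trajectory. All shapes, names (AutoSDESketch, drift), and the drift-only residual are assumptions for illustration, not the authors' code.

```python
# Minimal sketch, not the authors' code: an autoencoder whose latent
# path is penalized by the residual of an Euler-Maruyama step of a
# learned SDE. Shapes, names, and the noise treatment are assumptions.
import torch
import torch.nn as nn

class AutoSDESketch(nn.Module):
    def __init__(self, obs_dim=10, latent_dim=2):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(obs_dim, 64), nn.Tanh(),
                                     nn.Linear(64, latent_dim))
        self.decoder = nn.Sequential(nn.Linear(latent_dim, 64), nn.Tanh(),
                                     nn.Linear(64, obs_dim))
        self.drift = nn.Sequential(nn.Linear(latent_dim, 64), nn.Tanh(),
                                   nn.Linear(64, latent_dim))

    def loss(self, x_t, x_next, dt):
        z_t, z_next = self.encoder(x_t), self.encoder(x_next)
        recon = ((self.decoder(z_t) - x_t) ** 2).mean()
        # Euler-Maruyama residual: z_{t+dt} should be close to z_t + f(z_t) dt
        sde = ((z_next - z_t - self.drift(z_t) * dt) ** 2).mean()
        return recon + sde
```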
Related papers
- Learning Controlled Stochastic Differential Equations [61.82896036131116]
This work proposes a novel method for estimating both drift and diffusion coefficients of continuous, multidimensional, nonlinear controlled stochastic differential equations with non-uniform diffusion.
We provide strong theoretical guarantees, including finite-sample bounds for $L^2$, $L^\infty$, and risk metrics, with learning rates adaptive to the coefficients' regularity.
Our method is available as an open-source Python library.
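For intuition about this class of estimators, the sketch below shows the classical increment-based recipe in a scalar, uncontrolled simplification (function names and the binning scheme are assumptions, not the library's API): for dX = b(X) dt + s(X) dW observed on a grid, conditional moments of increments recover b and s^2.

```python
# Scalar, uncontrolled simplification of the classical increment-based
# estimators (names and binning are assumptions):
# b(x) ~ E[dX | X=x] / dt,  s^2(x) ~ E[dX^2 | X=x] / dt.
import numpy as np

def estimate_drift_diffusion(x, dt, bins=20):
    dx = np.diff(x)
    edges = np.linspace(x.min(), x.max(), bins + 1)
    idx = np.clip(np.digitize(x[:-1], edges) - 1, 0, bins - 1)
    centers = 0.5 * (edges[:-1] + edges[1:])
    drift = np.array([dx[idx == i].mean() / dt if (idx == i).any() else np.nan
                      for i in range(bins)])
    diff2 = np.array([(dx[idx == i] ** 2).mean() / dt if (idx == i).any() else np.nan
                      for i in range(bins)])
    return centers, drift, diff2
```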
arXiv Detail & Related papers (2024-11-04T11:09:58Z) - Modeling Randomly Observed Spatiotemporal Dynamical Systems [7.381752536547389]
Currently available neural network-based modeling approaches fall short when faced with data collected randomly over time and space.
In response, we developed a new method that effectively handles such randomly sampled data.
Our model integrates techniques from amortized variational inference, neural differential equations, neural point processes, and implicit neural representations to predict both the dynamics of the system and the timings and locations of future observations.
arXiv Detail & Related papers (2024-06-01T09:03:32Z) - Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories instead of just next-step prediction.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
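A very loose sketch of the trajectory-as-function idea only; EGNO's equivariance and temporal operator are omitted, so everything here is an assumption: one network maps (initial state, t) to the state at time t, and the loss covers a whole window of target times at once.

```python
# Non-equivariant toy version of "learn the trajectory as a function of
# time" (an assumption, not EGNO's architecture).
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(3 + 1, 128), nn.SiLU(), nn.Linear(128, 3))

def trajectory_loss(x0, traj, times):
    """x0: (3,) start state; traj: (T, 3) targets at the (T,) times."""
    preds = torch.stack([net(torch.cat([x0, t.view(1)])) for t in times])
    return ((preds - traj) ** 2).mean()
```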
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - Newton-Cotes Graph Neural Networks: On the Time Evolution of Dynamic Systems [49.50674348130157]
We propose a new approach to predict the integration based on several velocity estimations with Newton-Cotes formulas.
Experiments on several benchmarks empirically demonstrate consistent and significant improvement compared with the state-of-the-art methods.
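The quadrature idea itself is classical, sketched below with SciPy's Newton-Cotes weights; the GNN that produces the velocity estimates is omitted, and the function name is an assumption.

```python
# Given n+1 equally spaced velocity estimates, a Newton-Cotes rule
# approximates the displacement integral to high order.
import numpy as np
from scipy.integrate import newton_cotes

def nc_displacement(velocities, dt):
    """Displacement over n*dt from n+1 equally spaced velocity samples."""
    n = len(velocities) - 1
    weights, _ = newton_cotes(n, equal=1)
    return dt * np.dot(weights, velocities)

# v(t) = 3t^2 on [0, 1]: the exact displacement is 1.
t = np.linspace(0.0, 1.0, 5)
print(nc_displacement(3 * t ** 2, t[1] - t[0]))  # ~1.0
```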
arXiv Detail & Related papers (2023-05-24T02:23:00Z) - Learning Fine Scale Dynamics from Coarse Observations via Inner Recurrence [0.0]
Recent work has focused on data-driven learning of the evolution of unknown systems via deep neural networks (DNNs).
This paper presents a computational technique to learn the fine-scale dynamics from such coarsely observed data.
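One plausible reading of "inner recurrence", sketched under assumptions: a single network advances the state by a fine step dt and is applied k times between coarse observations spaced k*dt apart, so the loss only ever sees coarse data.

```python
# Sketch under assumptions, not the paper's exact technique: recurrent
# application of a fine-step network between coarse observations.
import torch
import torch.nn as nn

fine_step = nn.Sequential(nn.Linear(3, 64), nn.Tanh(), nn.Linear(64, 3))

def coarse_loss(x_coarse, x_next_coarse, k):
    x = x_coarse
    for _ in range(k):              # k inner (unobserved) fine steps
        x = x + fine_step(x)        # residual fine-scale update
    return ((x - x_next_coarse) ** 2).mean()
```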
arXiv Detail & Related papers (2022-06-03T20:28:52Z) - Stochastic Physics-Informed Neural Networks (SPINN): A Moment-Matching Framework for Learning Hidden Physics within Stochastic Differential Equations [4.482886054198202]
We propose a framework for training deep neural networks to learn equations that represent hidden physics within stochastic differential equations (SDEs).
The proposed framework relies on uncertainty propagation and moment-matching techniques along with state-of-the-art deep learning strategies.
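A hedged illustration of moment matching for a scalar SDE dX = f(X) dt + g dW with linearized variance propagation; f_net, g, and the Euler stepping below are assumptions, not the SPINN implementation.

```python
# Approximate moment dynamics: d mu/dt ~ f(mu), d var/dt ~ 2 f'(mu) var + g^2,
# with the loss matching predicted moments to empirical data moments.
import torch
import torch.nn as nn

f_net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 1))
g = 0.5  # assumed constant diffusion

def propagate_moments(mu, var, dt):
    mu = mu.detach().requires_grad_(True)
    f = f_net(mu)
    (df_dmu,) = torch.autograd.grad(f.sum(), mu, create_graph=True)
    return mu + f * dt, var + (2.0 * df_dmu * var + g ** 2) * dt

def moment_loss(mu_pred, var_pred, mu_data, var_data):
    return (mu_pred - mu_data) ** 2 + (var_pred - var_data) ** 2
```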
arXiv Detail & Related papers (2021-09-03T16:59:12Z) - Consistency of mechanistic causal discovery in continuous-time using Neural ODEs [85.7910042199734]
We consider causal discovery in continuous-time for the study of dynamical systems.
We propose a causal discovery algorithm based on penalized Neural ODEs.
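A minimal sketch, assuming a group-lasso penalty on the drift network's first-layer input columns as the "penalization"; the dimension d, penalty form, and rollout are illustrative assumptions, not the paper's exact model.

```python
# If the penalty drives input column j of the first layer to zero,
# variable j has no influence on the learned drift (no outgoing edge).
import torch
import torch.nn as nn

d = 5  # number of observed variables (assumed)
drift = nn.Sequential(nn.Linear(d, 64), nn.Tanh(), nn.Linear(64, d))

def sparsity_penalty(lmbda=1e-2):
    w = drift[0].weight                 # (64, d): column j reads variable j
    return lmbda * w.norm(dim=0).sum()  # group lasso over input columns

def euler_rollout(x0, dt, steps):
    xs = [x0]
    for _ in range(steps):
        xs.append(xs[-1] + dt * drift(xs[-1]))
    return torch.stack(xs)
```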
arXiv Detail & Related papers (2021-05-06T08:48:02Z) - Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z) - Hierarchical Deep Learning of Multiscale Differential Equation Time-Steppers [5.6385744392820465]
We develop a hierarchy of deep neural network time-steppers to approximate the flow map of the dynamical system over a disparate range of time-scales.
The resulting model is purely data-driven and leverages features of the multiscale dynamics.
We benchmark our algorithm against state-of-the-art methods, such as LSTM, reservoir computing, and clockwork RNN.
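A toy sketch of the hierarchy idea; the scales, sizes, and greedy composition rule are assumptions, not the paper's configuration: residual flow maps trained at different step sizes are composed from coarsest to finest to cover a horizon.

```python
# Flow-map networks at base steps 1, 10, and 100, composed greedily so
# the coarse net covers most of the horizon and fine nets finish the rest.
import torch
import torch.nn as nn

def make_stepper(dim=3):
    return nn.Sequential(nn.Linear(dim, 64), nn.Tanh(), nn.Linear(64, dim))

steppers = {1: make_stepper(), 10: make_stepper(), 100: make_stepper()}

def predict(x, n_steps):
    for scale in sorted(steppers, reverse=True):
        while n_steps >= scale:
            x = x + steppers[scale](x)  # flow map over `scale` base steps
            n_steps -= scale
    return x
```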
arXiv Detail & Related papers (2020-08-22T07:16:53Z) - Learning Continuous-Time Dynamics by Stochastic Differential Networks [32.63114111531396]
We propose a flexible continuous-time recurrent neural network named Variational Stochastic Differential Networks (VSDN).
VSDN embeds the complicated dynamics of sporadic time series via neural stochastic differential equations (SDEs).
We show that VSDNs outperform state-of-the-art continuous-time deep learning models and achieve remarkable performance on prediction and interpolation tasks for sporadic time series.
arXiv Detail & Related papers (2020-06-11T01:40:34Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
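A sketch of one Euler step of the liquid time-constant state equation dx/dt = -[1/tau + f(x, I)] x + f(x, I) A; the layer sizes and single-step solver below are simplifying assumptions (the paper uses a fused ODE solver).

```python
# The gate f(x, I) makes the effective time constant input-dependent,
# which is the "liquid" part of the model.
import torch
import torch.nn as nn

class LTCCellSketch(nn.Module):
    def __init__(self, in_dim=4, hidden=16):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(in_dim + hidden, hidden), nn.Sigmoid())
        self.tau = nn.Parameter(torch.ones(hidden))
        self.A = nn.Parameter(torch.zeros(hidden))

    def forward(self, x, inp, dt=0.1):
        gate = self.f(torch.cat([x, inp], dim=-1))
        dx = -(1.0 / self.tau + gate) * x + gate * self.A
        return x + dt * dx  # one Euler step of the LTC state equation
```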