Learning POD of Complex Dynamics Using Heavy-ball Neural ODEs
- URL: http://arxiv.org/abs/2202.12373v1
- Date: Thu, 24 Feb 2022 22:00:25 GMT
- Title: Learning POD of Complex Dynamics Using Heavy-ball Neural ODEs
- Authors: Justin Baker and Elena Cherkaev and Akil Narayan and Bao Wang
- Abstract summary: We leverage the recently proposed heavy-ball neural ODEs (HBNODEs) for learning data-driven reduced-order models.
HBNODE enjoys several practical advantages for learning POD-based ROMs with theoretical guarantees.
- Score: 7.388910452780173
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: Proper orthogonal decomposition (POD) allows reduced-order modeling of
complex dynamical systems at a substantial level, while maintaining a high
degree of accuracy in modeling the underlying dynamical systems. Advances in
machine learning algorithms enable learning POD-based dynamics from data and
making accurate and fast predictions of dynamical systems. In this paper, we
leverage the recently proposed heavy-ball neural ODEs (HBNODEs) [Xia et al.
NeurIPS, 2021] for learning data-driven reduced-order models (ROMs) in the POD
context, in particular, for learning dynamics of time-varying coefficients
generated by the POD analysis on training snapshots generated from solving full
order models. HBNODE enjoys several practical advantages for learning POD-based
ROMs with theoretical guarantees, including 1) HBNODE can learn long-term
dependencies effectively from sequential observations and 2) HBNODE is
computationally efficient in both training and testing. We compare HBNODE with
other popular ROMs on several complex dynamical systems, including the von
Kármán Street flow, the Kurganov-Petrova-Popov equation, and the
one-dimensional Euler equations for fluids modeling.
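The pipeline described in the abstract — POD of full-order snapshots followed by learning the dynamics of the time-varying POD coefficients with a heavy-ball neural ODE — can be sketched as follows. This is a minimal illustration with toy snapshot data, assumed network sizes, and a fixed damping parameter gamma; it is not the authors' implementation.

```python
# Minimal sketch (assumptions throughout): POD via SVD, then a heavy-ball
# neural ODE fit to the POD coefficients.
import numpy as np
import torch
import torch.nn as nn

# --- POD: truncated SVD of a snapshot matrix X (n_dof x n_time) --------------
n_dof, n_time, r = 200, 100, 8                    # toy sizes (assumed)
t = np.linspace(0.0, 1.0, n_time)
X = np.array([np.sin((k + 1) * np.pi * t) for k in range(n_dof)])  # stand-in snapshots
U, S, Vt = np.linalg.svd(X, full_matrices=False)
Phi = U[:, :r]                                    # POD basis (n_dof x r)
A = Phi.T @ X                                     # time-varying POD coefficients (r x n_time)

# --- Heavy-ball neural ODE on the coefficients a(t) --------------------------
# First-order form of a'' + gamma a' = f_theta(a):  da/dt = m,  dm/dt = -gamma m + f_theta(a)
class HBNODEFunc(nn.Module):
    def __init__(self, dim, hidden=64, gamma=0.5):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(dim, hidden), nn.Tanh(), nn.Linear(hidden, dim))
        self.gamma = gamma

    def forward(self, a, m):
        return m, -self.gamma * m + self.f(a)

def rollout(func, a0, m0, dt, steps):
    """Explicit Euler integration; returns the trajectory of a."""
    a, m, traj = a0, m0, [a0]
    for _ in range(steps):
        da, dm = func(a, m)
        a, m = a + dt * da, m + dt * dm
        traj.append(a)
    return torch.stack(traj)

coeffs = torch.tensor(A.T, dtype=torch.float32)   # (n_time, r)
func = HBNODEFunc(r)
opt = torch.optim.Adam(func.parameters(), lr=1e-3)
dt = float(t[1] - t[0])

for epoch in range(200):
    pred = rollout(func, coeffs[0], torch.zeros(r), dt, n_time - 1)
    loss = torch.mean((pred - coeffs) ** 2)
    opt.zero_grad(); loss.backward(); opt.step()

# Reconstruct approximate full-order states: x(t) ~ Phi a(t)
X_rom = Phi @ pred.detach().numpy().T
```

In practice the snapshots come from solving the full-order model, and an adaptive higher-order ODE solver would replace the explicit Euler rollout.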
Related papers
- Parametric Taylor series based latent dynamics identification neural networks [0.3139093405260182]
A new latent identification of nonlinear dynamics framework, P-TLDINets, is introduced.
It relies on a novel neural network structure based on Taylor series expansion and ResNets.
arXiv Detail & Related papers (2024-10-05T15:10:32Z)
- KAN-ODEs: Kolmogorov-Arnold Network Ordinary Differential Equations for Learning Dynamical Systems and Hidden Physics [0.0]
Kolmogorov-Arnold networks (KANs) are an alternative to multi-layer perceptrons (MLPs).
This work applies KANs as the backbone of a neural ordinary differential equation (ODE) framework.
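As a rough illustration of the KAN-ODE idea summarized above, the sketch below uses a simplified KAN-style layer — a fixed Gaussian basis standing in for the learnable splines of a full KAN — as the right-hand side of a neural ODE. All names and sizes are assumptions, not the paper's code.

```python
# Minimal sketch: a KAN-style layer (per-edge 1-D functions, here built from a
# fixed Gaussian basis) driving an explicit-Euler neural ODE rollout.
import torch
import torch.nn as nn

class SimpleKANLayer(nn.Module):
    """Output o = sum_j phi_{oj}(x_j): each edge gets its own learnable 1-D function."""
    def __init__(self, in_dim, out_dim, n_basis=8):
        super().__init__()
        self.centers = torch.linspace(-2.0, 2.0, n_basis)          # fixed RBF centers
        self.coef = nn.Parameter(0.1 * torch.randn(out_dim, in_dim, n_basis))

    def forward(self, x):                                          # x: (batch, in_dim)
        # Gaussian basis evaluated per input coordinate: (batch, in_dim, n_basis)
        B = torch.exp(-(x.unsqueeze(-1) - self.centers) ** 2)
        # Sum the learnable edge functions over inputs and basis terms.
        return torch.einsum("bik,oik->bo", B, self.coef)

class KANODE(nn.Module):
    def __init__(self, dim):
        super().__init__()
        self.rhs = SimpleKANLayer(dim, dim)

    def forward(self, x0, dt=0.05, steps=40):                      # explicit Euler rollout
        xs = [x0]
        for _ in range(steps):
            xs.append(xs[-1] + dt * self.rhs(xs[-1]))
        return torch.stack(xs, dim=1)                              # (batch, steps+1, dim)

model = KANODE(dim=2)
traj = model(torch.randn(4, 2))                                    # toy initial states
```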
arXiv Detail & Related papers (2024-07-05T00:38:49Z)
- Semi-Supervised Learning of Dynamical Systems with Neural Ordinary Differential Equations: A Teacher-Student Model Approach [10.20098335268973]
TS-NODE is the first semi-supervised approach to modeling dynamical systems with NODE.
We show significant performance improvements over a baseline Neural ODE model on multiple dynamical system modeling tasks.
arXiv Detail & Related papers (2023-10-19T19:17:12Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and the 2D Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- Gradient-Based Trajectory Optimization With Learned Dynamics [80.41791191022139]
We use machine learning techniques to learn a differentiable dynamics model of the system from data.
We show that a neural network can model highly nonlinear behaviors accurately for large time horizons.
In our hardware experiments, we demonstrate that our learned model can represent complex dynamics for both the Spot robot and a radio-controlled (RC) car.
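A minimal sketch of the gradient-based trajectory optimization loop implied above, assuming a learned one-step dynamics model (an untrained MLP stands in for it here) and a simple quadratic goal cost; everything is illustrative, not the paper's implementation.

```python
# Minimal sketch (assumed setup): optimize a control sequence by
# differentiating through a learned one-step dynamics model.
import torch
import torch.nn as nn

state_dim, ctrl_dim, horizon = 4, 2, 30
dyn = nn.Sequential(nn.Linear(state_dim + ctrl_dim, 64), nn.Tanh(),
                    nn.Linear(64, state_dim))        # stands in for a trained model

x0 = torch.zeros(state_dim)
goal = torch.ones(state_dim)
controls = nn.Parameter(torch.zeros(horizon, ctrl_dim))
opt = torch.optim.Adam([controls], lr=0.05)

for it in range(200):
    x = x0
    cost = torch.tensor(0.0)
    for u in controls:
        x = x + dyn(torch.cat([x, u]))               # learned residual dynamics
        cost = cost + 1e-3 * (u ** 2).sum()          # control effort penalty
    cost = cost + ((x - goal) ** 2).sum()            # terminal goal cost
    opt.zero_grad(); cost.backward(); opt.step()
```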
arXiv Detail & Related papers (2022-04-09T22:07:34Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z)
- Accelerating Neural ODEs Using Model Order Reduction [0.0]
We show that mathematical model order reduction methods can be used for compressing and accelerating Neural ODEs.
We implement our novel compression method by developing Neural ODEs that integrate the necessary subspace-projection operations as layers of the neural network.
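A minimal sketch of the compression idea, under the assumption that the reduced model integrates dz/dt = V^T f(V z) with a POD-style basis V built from hidden-state snapshots; names and sizes are illustrative, not the paper's code.

```python
# Minimal sketch: compress a neural ODE by projecting its hidden state onto a
# low-dimensional subspace V and integrating in the reduced coordinates z.
import torch
import torch.nn as nn

dim, r = 64, 8
f = nn.Sequential(nn.Linear(dim, dim), nn.Tanh(), nn.Linear(dim, dim))  # full RHS (assumed trained)

def euler(rhs, h0, dt=0.05, steps=50):
    """Explicit Euler rollout of dh/dt = rhs(h)."""
    hs = [h0]
    for _ in range(steps):
        hs.append(hs[-1] + dt * rhs(hs[-1]))
    return torch.stack(hs)

# Collect hidden-state snapshots from a few rollouts of the full model.
with torch.no_grad():
    snaps = torch.cat([euler(f, torch.randn(dim)) for _ in range(10)])   # (N, dim)
    U, S, Vt = torch.linalg.svd(snaps.T, full_matrices=False)
    V = U[:, :r]                                                         # (dim, r) basis

reduced_rhs = lambda z: V.T @ f(V @ z)           # reduced-order right-hand side
z_traj = euler(reduced_rhs, V.T @ torch.randn(dim))
h_approx = z_traj @ V.T                          # lift back to the full space
```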
arXiv Detail & Related papers (2021-05-28T19:27:09Z)
- DyNODE: Neural Ordinary Differential Equations for Dynamics Modeling in Continuous Control [0.0]
We present a novel approach that captures the underlying dynamics of a system by incorporating control in a neural ordinary differential equation framework.
Results indicate that a simple DyNODE architecture when combined with an actor-critic reinforcement learning algorithm outperforms canonical neural networks.
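The core DyNODE idea — folding the control signal into the ODE right-hand side, dx/dt = f_theta(x, u) — might look roughly like the sketch below, with assumed toy transition data and an RK4 step for training; it is not the paper's code.

```python
# Minimal sketch: a control-conditioned ODE right-hand side fit to one-step
# transitions with an RK4 integration step.
import torch
import torch.nn as nn

class ControlledRHS(nn.Module):
    def __init__(self, state_dim, ctrl_dim, hidden=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(state_dim + ctrl_dim, hidden), nn.Tanh(),
                                 nn.Linear(hidden, state_dim))

    def forward(self, x, u):
        return self.net(torch.cat([x, u], dim=-1))

def rk4_step(f, x, u, dt):
    k1 = f(x, u)
    k2 = f(x + 0.5 * dt * k1, u)
    k3 = f(x + 0.5 * dt * k2, u)
    k4 = f(x + dt * k3, u)
    return x + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# Toy transition data (x, u, x_next); in practice these come from the environment.
x = torch.randn(256, 3); u = torch.randn(256, 1)
x_next = x + 0.1 * torch.tanh(x) + 0.05 * u
f = ControlledRHS(3, 1)
opt = torch.optim.Adam(f.parameters(), lr=1e-3)
for _ in range(500):
    loss = ((rk4_step(f, x, u, dt=0.1) - x_next) ** 2).mean()
    opt.zero_grad(); loss.backward(); opt.step()
```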
arXiv Detail & Related papers (2020-09-09T12:56:58Z)
- An Ode to an ODE [78.97367880223254]
We present a new paradigm for Neural ODE algorithms, called ODEtoODE, where time-dependent parameters of the main flow evolve according to a matrix flow on the group O(d).
This nested system of two flows provides stability and effectiveness of training and provably solves the gradient vanishing-explosion problem.
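A minimal, simplified sketch of the nested-flow idea: the weight matrix W(t) is kept on the orthogonal group by exponentiating a skew-symmetric generator, while the main hidden-state flow uses W(t). The generator, step size, and nonlinearity are assumptions for illustration, not the paper's formulation.

```python
# Minimal sketch: hidden state follows dh/dt = tanh(W(t) h), while W(t) flows
# on O(d) via a skew-symmetric generator (exp of skew-symmetric is orthogonal).
import torch
import torch.nn as nn

d, dt, steps = 16, 0.1, 20
B = nn.Parameter(0.1 * torch.randn(d, d))            # learned generator (untrained here)
skew = lambda: B - B.T                                # skew-symmetric matrix

h = torch.randn(d)
W = torch.eye(d)                                      # start on O(d)
for _ in range(steps):
    W = W @ torch.matrix_exp(dt * skew())             # exact orthogonal update of W(t)
    h = h + dt * torch.tanh(W @ h)                    # Euler step for the main flow

print(torch.dist(W.T @ W, torch.eye(d)))              # W stays numerically orthogonal
```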
arXiv Detail & Related papers (2020-06-19T22:05:19Z)
- Learning Stable Deep Dynamics Models [91.90131512825504]
We propose an approach for learning dynamical systems that are guaranteed to be stable over the entire state space.
We show that such learning systems are able to model simple dynamical systems and can be combined with additional deep generative models to learn complex dynamics.
arXiv Detail & Related papers (2020-01-17T00:04:45Z)
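For the last entry above (Learning Stable Deep Dynamics Models), the stability construction can be sketched by projecting a nominal learned vector field so that a Lyapunov function decreases along trajectories; the paper learns V with an input-convex network, whereas the sketch below substitutes a fixed quadratic V(x) = 0.5 ||x||^2 as an assumption.

```python
# Minimal sketch: project a nominal learned vector field f_hat so that
# d/dt V(x) <= -alpha V(x) for the fixed quadratic V(x) = 0.5 * ||x||^2.
import torch
import torch.nn as nn

f_hat = nn.Sequential(nn.Linear(2, 32), nn.Tanh(), nn.Linear(32, 2))   # nominal dynamics
alpha = 0.1

def stable_f(x):                      # x: (batch, 2)
    gradV = x                         # gradient of 0.5 * ||x||^2
    V = 0.5 * (x ** 2).sum(-1, keepdim=True)
    fx = f_hat(x)
    viol = torch.relu((gradV * fx).sum(-1, keepdim=True) + alpha * V)
    # Subtract just enough of the gradient direction to remove the violation.
    return fx - viol * gradV / ((gradV ** 2).sum(-1, keepdim=True) + 1e-8)

x = torch.randn(5, 2)
print(stable_f(x))                    # stabilized vector field at sample states
```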
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.