Parameterized Neural Ordinary Differential Equations: Applications to
Computational Physics Problems
- URL: http://arxiv.org/abs/2010.14685v1
- Date: Wed, 28 Oct 2020 00:41:28 GMT
- Title: Parameterized Neural Ordinary Differential Equations: Applications to
Computational Physics Problems
- Authors: Kookjin Lee and Eric J. Parish
- Abstract summary: We propose an extension of neural ordinary differential equations (NODEs) by introducing an additional set of ODE input parameters to NODEs.
This extension allows NODEs to learn multiple dynamics specified by the input parameter instances.
We demonstrate the effectiveness of PNODEs with important benchmark problems from computational physics.
- Score: 5.885020100736158
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work proposes an extension of neural ordinary differential equations
(NODEs) by introducing an additional set of ODE input parameters to NODEs. This
extension allows NODEs to learn multiple dynamics specified by the input
parameter instances. Our extension is inspired by the concept of parameterized
ordinary differential equations, which are widely investigated in computational
science and engineering contexts, where characteristics of the governing
equations vary over the input parameters. We apply the proposed parameterized
NODEs (PNODEs) for learning latent dynamics of complex dynamical processes that
arise in computational physics, which is an essential component for enabling
rapid numerical simulations for time-critical physics applications. For this,
we propose an encoder-decoder-type framework, which models latent dynamics as
PNODEs. We demonstrate the effectiveness of PNODEs with important benchmark
problems from computational physics.
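The abstract's core mechanism is to condition the NODE vector field on a set of ODE input parameters, so that a single learned network represents a family of dynamics. The PyTorch sketch below illustrates only that idea; the layer sizes, the fixed-step RK4 integrator, and the omission of the encoder-decoder framework are simplifying assumptions, not the authors' implementation.

```python
# Minimal sketch of a parameterized neural ODE (PNODE): the learned vector
# field takes both the latent state z and a problem parameter vector mu.
# Layer sizes, the fixed-step RK4 integrator, and all names are illustrative
# assumptions; the paper's encoder-decoder framework is not reproduced here.
import torch
import torch.nn as nn


class PNODEFunc(nn.Module):
    """Right-hand side dz/dt = f_theta(z, mu)."""

    def __init__(self, latent_dim: int, param_dim: int, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(latent_dim + param_dim, hidden),
            nn.Tanh(),
            nn.Linear(hidden, latent_dim),
        )

    def forward(self, z: torch.Tensor, mu: torch.Tensor) -> torch.Tensor:
        # Conditioning on mu lets one network represent multiple dynamics.
        return self.net(torch.cat([z, mu], dim=-1))


def rk4_rollout(func: PNODEFunc, z0: torch.Tensor, mu: torch.Tensor,
                dt: float, n_steps: int) -> torch.Tensor:
    """Integrate dz/dt = f(z, mu) with fixed-step RK4 and return the trajectory."""
    z, traj = z0, [z0]
    for _ in range(n_steps):
        k1 = func(z, mu)
        k2 = func(z + 0.5 * dt * k1, mu)
        k3 = func(z + 0.5 * dt * k2, mu)
        k4 = func(z + dt * k3, mu)
        z = z + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
        traj.append(z)
    return torch.stack(traj, dim=1)  # (batch, n_steps + 1, latent_dim)


# Usage: one model, different parameter instances give different latent dynamics.
func = PNODEFunc(latent_dim=4, param_dim=2)
z0 = torch.randn(8, 4)          # latent initial conditions (e.g. from an encoder)
mu = torch.rand(8, 2)           # ODE parameters, one instance per sample
traj = rk4_rollout(func, z0, mu, dt=0.01, n_steps=100)
print(traj.shape)               # torch.Size([8, 101, 4])
```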
Related papers
- Stiff Transfer Learning for Physics-Informed Neural Networks [1.5361702135159845]
We propose a novel approach, stiff transfer learning for physics-informed neural networks (STL-PINNs), to tackle stiff ordinary differential equations (ODEs) and partial differential equations (PDEs).
Our methodology involves training a Multi-Head-PINN in a low-stiffness regime and obtaining the final solution in a high-stiffness regime by transfer learning.
This addresses the failure modes related to stiffness in PINNs while maintaining computational efficiency by computing "one-shot" solutions.
arXiv Detail & Related papers (2025-01-28T20:27:38Z)
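One plausible reading of the STL-PINN entry above is a shared trunk network with separate heads, where transfer to a stiffer regime refits only a new head while the trunk stays frozen. The sketch below applies that reading to the toy stiff ODE y'(t) = -k y(t), y(0) = 1; the architecture, the toy problem, and the two-stage training loop are illustrative assumptions, not the authors' method.

```python
# Hedged sketch of the multi-head / transfer-learning idea from the STL-PINN
# entry above, on the toy stiff ODE y'(t) = -k * y(t), y(0) = 1. The shared
# trunk is trained at low stiffness; transfer refits only a small head at
# higher stiffness. This is one possible reading, not the authors' code.
import torch
import torch.nn as nn

trunk = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh())


def pinn_loss(trunk: nn.Module, head: nn.Module, k: float) -> torch.Tensor:
    t = torch.linspace(0.0, 1.0, 200).unsqueeze(-1).requires_grad_(True)
    y = head(trunk(t))
    dy_dt, = torch.autograd.grad(y, t, torch.ones_like(y), create_graph=True)
    residual = dy_dt + k * y                     # ODE residual y' + k*y = 0
    y0 = head(trunk(torch.zeros(1, 1)))          # initial condition y(0) = 1
    return (residual ** 2).mean() + (y0 - 1.0).pow(2).mean()


def train(params, loss_fn, steps=2000, lr=1e-3):
    opt = torch.optim.Adam(params, lr=lr)
    for _ in range(steps):
        opt.zero_grad()
        loss = loss_fn()
        loss.backward()
        opt.step()


# Stage 1: train trunk + head in a low-stiffness regime (k = 1).
head_low = nn.Linear(64, 1)
train(list(trunk.parameters()) + list(head_low.parameters()),
      lambda: pinn_loss(trunk, head_low, k=1.0))

# Stage 2: freeze the trunk and fit only a fresh head at higher stiffness (k = 50).
for p in trunk.parameters():
    p.requires_grad_(False)
head_high = nn.Linear(64, 1)
train(head_high.parameters(), lambda: pinn_loss(trunk, head_high, k=50.0))
```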
- Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs).
We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations.
We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z)
- Solving Differential Equations using Physics-Informed Deep Equilibrium Models [4.237218036051422]
This paper introduces Physics-Informed Deep Equilibrium Models (PIDEQs) for solving initial value problems (IVPs) of ordinary differential equations (ODEs).
By bridging deep learning and physics-based modeling, this work advances computational techniques for solving IVPs, with implications for scientific computing and engineering applications.
arXiv Detail & Related papers (2024-06-05T17:25:29Z)
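The PIDEQ entry above treats the network output as the equilibrium of a learned fixed-point map and trains it with a physics-informed loss. The sketch below shows that combination on the toy IVP y' = -y, y(0) = 1, using plain forward iteration with backpropagation through the unrolled iterations; the paper's solver, implicit differentiation, and architecture are not reproduced.

```python
# Hedged sketch of a physics-informed deep equilibrium model: the output is an
# equilibrium z* = f_theta(z*, t), decoded to y(t) and trained with an ODE
# residual loss for the toy IVP y' = -y, y(0) = 1. Plain forward iteration is
# used for the fixed point; this is a simplification for illustration only.
import torch
import torch.nn as nn


class DEQ(nn.Module):
    def __init__(self, hidden: int = 32, n_iter: int = 30):
        super().__init__()
        self.W = nn.Linear(hidden, hidden)
        self.U = nn.Linear(1, hidden)
        self.out = nn.Linear(hidden, 1)
        self.n_iter = n_iter

    def forward(self, t: torch.Tensor) -> torch.Tensor:
        z = torch.zeros(t.shape[0], self.W.in_features, device=t.device)
        for _ in range(self.n_iter):             # iterate z <- tanh(Wz + Ut)
            z = torch.tanh(self.W(z) + self.U(t))
        return self.out(z)                       # decode the equilibrium to y(t)


model = DEQ()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(2000):
    t = torch.rand(128, 1, requires_grad=True)   # collocation points in [0, 1]
    y = model(t)
    dy_dt, = torch.autograd.grad(y, t, torch.ones_like(y), create_graph=True)
    loss = ((dy_dt + y) ** 2).mean() + (model(torch.zeros(1, 1)) - 1.0).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```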
- Physics-informed Discretization-independent Deep Compositional Operator Network [1.2430809884830318]
We introduce a novel physics-informed model architecture which can generalize to various discrete representations of PDE parameters and irregular domain shapes.
Inspired by deep operator neural networks, our model repeatedly applies a discretization-independent learning of the parameter embedding.
Numerical results demonstrate the accuracy and efficiency of the proposed method.
arXiv Detail & Related papers (2024-04-21T12:41:30Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which go beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
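The benchmark entry above evaluates PINNs on coupled ODEs whose difficulty can be dialed up. A minimal example of that kind of setup is sketched below for the coupled system x' = v, v' = -omega^2 x, where increasing omega typically makes training harder; the specific system, network, and loss weighting are assumptions for illustration.

```python
# Hedged sketch of a PINN on a coupled ODE system of the kind discussed in the
# benchmark entry above: x' = v, v' = -omega^2 * x (a harmonic oscillator),
# with omega acting as a crude "complexity" knob. All details are illustrative.
import torch
import torch.nn as nn

omega = 10.0
net = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(),
                    nn.Linear(64, 2))            # outputs (x(t), v(t))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for _ in range(5000):
    t = torch.rand(256, 1, requires_grad=True)   # collocation points in [0, 1]
    xv = net(t)
    x, v = xv[:, :1], xv[:, 1:]
    dx_dt, = torch.autograd.grad(x, t, torch.ones_like(x), create_graph=True)
    dv_dt, = torch.autograd.grad(v, t, torch.ones_like(v), create_graph=True)
    residual = ((dx_dt - v) ** 2).mean() + ((dv_dt + omega ** 2 * x) ** 2).mean()
    xv0 = net(torch.zeros(1, 1))                 # initial condition x(0)=1, v(0)=0
    ic = (xv0[:, 0] - 1.0).pow(2).mean() + xv0[:, 1].pow(2).mean()
    loss = residual + ic
    opt.zero_grad()
    loss.backward()
    opt.step()
```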
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas, including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on various SPDEs, including the dynamic Phi^4_1 model and the 2D Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- PI-VAE: Physics-Informed Variational Auto-Encoder for stochastic differential equations [2.741266294612776]
We propose a new class of physics-informed neural networks, called the physics-informed variational autoencoder (PI-VAE).
PI-VAE consists of a variational autoencoder (VAE) that generates samples of system variables and parameters.
The satisfactory accuracy and efficiency of the proposed method are numerically demonstrated in comparison with the physics-informed Wasserstein generative adversarial network (PI-WGAN).
arXiv Detail & Related papers (2022-03-21T21:51:19Z)
- Physics Informed RNN-DCT Networks for Time-Dependent Partial Differential Equations [62.81701992551728]
We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
arXiv Detail & Related papers (2022-02-24T20:46:52Z)
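The RNN-DCT entry above pairs a discrete cosine transform for the spatial directions with a recurrent network for the time direction. The sketch below shows one way to wire that up, with a plain data-driven one-step-ahead loss standing in for the paper's physics-informed terms; the data shapes, the number of retained DCT modes, and the GRU size are assumptions.

```python
# Hedged sketch of the RNN-DCT idea: spatial snapshots are encoded with a 2D
# discrete cosine transform, the low-frequency coefficients are evolved in time
# by a GRU, and predictions are decoded with the inverse DCT. Illustration only.
import numpy as np
import torch
import torch.nn as nn
from scipy.fft import dctn, idctn

T, N, K = 50, 64, 8                               # time steps, grid size, kept modes
field = np.random.randn(T, N, N)                  # stand-in for a flow snapshot sequence

# Encode: 2D DCT per snapshot, keep the K x K low-frequency block.
coeffs = np.stack([dctn(f, norm="ortho")[:K, :K] for f in field])
inputs = torch.tensor(coeffs.reshape(T, K * K), dtype=torch.float32).unsqueeze(0)


class CoeffRNN(nn.Module):
    """GRU that predicts the next DCT coefficient vector from the past ones."""

    def __init__(self, dim: int, hidden: int = 128):
        super().__init__()
        self.gru = nn.GRU(dim, hidden, batch_first=True)
        self.head = nn.Linear(hidden, dim)

    def forward(self, x):
        h, _ = self.gru(x)
        return self.head(h)


model = CoeffRNN(K * K)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
for _ in range(200):
    pred = model(inputs[:, :-1])                  # one-step-ahead prediction
    loss = ((pred - inputs[:, 1:]) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# Decode a predicted snapshot back to physical space with the inverse DCT.
next_coeffs = model(inputs)[0, -1].detach().numpy().reshape(K, K)
padded = np.zeros((N, N))
padded[:K, :K] = next_coeffs
next_field = idctn(padded, norm="ortho")
```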
- HyperPINN: Learning parameterized differential equations with physics-informed hypernetworks [32.095262903584725]
We propose the HyperPINN, which uses hypernetworks to learn to generate neural networks that can solve a differential equation from a given parameterization.
We demonstrate with experiments on both a PDE and an ODE that this type of model can lead to neural network solutions to differential equations that maintain a small size.
arXiv Detail & Related papers (2021-10-28T17:50:09Z)
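The HyperPINN entry above uses a hypernetwork to emit the weights of a small solver network from a given equation parameterization. The sketch below applies that idea to the toy parameterized ODE y'(t) = -lambda y(t), y(0) = 1; the weight layout, network sizes, and toy equation are illustrative assumptions only.

```python
# Hedged sketch of the hypernetwork idea: a hypernetwork maps an equation
# parameter lambda to the weights of a small "main" network representing the
# solution of y'(t) = -lambda * y(t), y(0) = 1. Sizes and layout are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

HID = 32
N_W = HID * 1 + HID + 1 * HID + 1                 # W1, b1, W2, b2 of the main net
hyper = nn.Sequential(nn.Linear(1, 64), nn.Tanh(), nn.Linear(64, N_W))


def main_net(t: torch.Tensor, w: torch.Tensor) -> torch.Tensor:
    """Apply the generated flat weight vector w to the input times t."""
    W1, b1, W2, b2 = torch.split(w, [HID, HID, HID, 1])
    h = torch.tanh(F.linear(t, W1.view(HID, 1), b1))
    return F.linear(h, W2.view(1, HID), b2)


opt = torch.optim.Adam(hyper.parameters(), lr=1e-3)
for _ in range(3000):
    lam = torch.rand(1) * 5.0                     # sample an equation parameter
    w = hyper(lam.unsqueeze(0)).squeeze(0)        # generate main-net weights
    t = torch.rand(128, 1, requires_grad=True)
    y = main_net(t, w)
    dy_dt, = torch.autograd.grad(y, t, torch.ones_like(y), create_graph=True)
    residual = ((dy_dt + lam * y) ** 2).mean()
    y0 = main_net(torch.zeros(1, 1), w)           # initial condition y(0) = 1
    loss = residual + (y0 - 1.0).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```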
- Incorporating NODE with Pre-trained Neural Differential Operator for Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO).
The NDO is pre-trained on a class of symbolic functions, learning the mapping from trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can closely approximate the ground-truth derivatives by properly tuning the complexity of the library.
arXiv Detail & Related papers (2021-06-08T08:04:47Z)
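The NDO entry above pre-trains a network on a library of symbolic functions so that it maps trajectory samples to derivative estimates. The sketch below uses random sinusoids as a stand-in library and a fixed sample window; the library, window size, and architecture are assumptions rather than the authors' setup.

```python
# Hedged sketch of NDO pre-training: a network learns to map a window of
# trajectory samples to the derivative at the window's centre, using exact
# derivatives of closed-form functions (random sinusoids) as supervision.
import torch
import torch.nn as nn

WINDOW, DT = 9, 0.01
ndo = nn.Sequential(nn.Linear(WINDOW, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.Adam(ndo.parameters(), lr=1e-3)


def sample_batch(batch: int = 256):
    """Windows of f(t) = a*sin(w*t + p) and the exact derivative at the centre."""
    a = torch.rand(batch, 1) * 2.0
    w = torch.rand(batch, 1) * 5.0
    p = torch.rand(batch, 1) * 6.28
    t0 = torch.rand(batch, 1)
    offsets = (torch.arange(WINDOW) - WINDOW // 2) * DT
    t = t0 + offsets                               # (batch, WINDOW) sample times
    values = a * torch.sin(w * t + p)
    deriv = a * w * torch.cos(w * t0 + p)          # d/dt at the centre t0
    return values, deriv


for _ in range(3000):
    x, target = sample_batch()
    loss = ((ndo(x) - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()

# After pre-training, ndo(window_of_samples) yields a derivative estimate that
# can serve as an extra supervised signal when fitting a NODE to trajectory data.
```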
This list is automatically generated from the titles and abstracts of the papers on this site.