Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations
- URL: http://arxiv.org/abs/2202.12358v1
- Date: Thu, 24 Feb 2022 20:46:52 GMT
- Title: Physics Informed RNN-DCT Networks for Time-Dependent Partial
Differential Equations
- Authors: Benjamin Wu, Oliver Hennigh, Jan Kautz, Sanjay Choudhry, Wonmin Byeon
- Abstract summary: We present a physics-informed framework for solving time-dependent partial differential equations.
Our model utilizes discrete cosine transforms to encode spatial frequencies and recurrent neural networks to process the time evolution.
We show experimental results on the Taylor-Green vortex solution to the Navier-Stokes equations.
- Score: 62.81701992551728
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks allow models to be trained by physical laws
described by general nonlinear partial differential equations. However,
traditional architectures struggle to solve more challenging time-dependent
problems due to their architectural nature. In this work, we present a novel
physics-informed framework for solving time-dependent partial differential
equations. Using only the governing differential equations and problem initial
and boundary conditions, we generate a latent representation of the problem's
spatio-temporal dynamics. Our model utilizes discrete cosine transforms to
encode spatial frequencies and recurrent neural networks to process the time
evolution. This efficiently and flexibly produces a compressed representation
which is used for additional conditioning of physics-informed models. We show
experimental results on the Taylor-Green vortex solution to the Navier-Stokes
equations. Our proposed model achieves state-of-the-art performance on the
Taylor-Green vortex relative to other physics-informed baseline models.
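As a rough illustration of the architecture described in the abstract, the sketch below encodes spatial frames with a truncated 2D DCT, evolves the coefficients with a GRU, and feeds the resulting latent code to a PINN-style MLP as extra conditioning. The module names, truncation size k, and hidden width are assumptions for illustration, not details taken from the paper.

```python
# Hedged sketch (not the authors' code): encode spatial frames with a 2D DCT,
# keep only the low-frequency coefficients, evolve them with a GRU, and use the
# resulting latent vector as extra conditioning input to a PINN-style MLP.
import numpy as np
import torch
import torch.nn as nn
from scipy.fft import dctn

def dct_encode(frame: np.ndarray, k: int = 8) -> np.ndarray:
    """Type-II DCT of a 2D field, truncated to the k x k lowest frequencies."""
    coeffs = dctn(frame, type=2, norm="ortho")
    return coeffs[:k, :k].reshape(-1)  # compressed spatial representation

class LatentDynamics(nn.Module):
    """GRU over DCT coefficients; its final hidden state is the spatio-temporal latent."""
    def __init__(self, k: int = 8, hidden: int = 64):
        super().__init__()
        self.rnn = nn.GRU(input_size=k * k, hidden_size=hidden, batch_first=True)

    def forward(self, coeff_seq: torch.Tensor) -> torch.Tensor:
        _, h = self.rnn(coeff_seq)          # coeff_seq: (batch, time, k*k)
        return h[-1]                        # (batch, hidden) latent code

class ConditionedPINN(nn.Module):
    """MLP mapping (x, y, t, latent) -> field value; its PDE-residual training loss is omitted here."""
    def __init__(self, hidden: int = 64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + hidden, 128), nn.Tanh(),
            nn.Linear(128, 128), nn.Tanh(),
            nn.Linear(128, 1),
        )

    def forward(self, xyt: torch.Tensor, latent: torch.Tensor) -> torch.Tensor:
        return self.net(torch.cat([xyt, latent.expand(xyt.shape[0], -1)], dim=-1))

# Example wiring on random data (stands in for initial/boundary-condition frames):
frames = np.random.rand(16, 64, 64)                            # (time, H, W)
coeffs = torch.tensor(np.stack([dct_encode(f) for f in frames]),
                      dtype=torch.float32).unsqueeze(0)        # (1, time, 64)
latent = LatentDynamics()(coeffs)                              # (1, 64)
u = ConditionedPINN()(torch.rand(128, 3), latent)              # predicted field values
```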
Related papers
- Discovery of Quasi-Integrable Equations from traveling-wave data using the Physics-Informed Neural Networks [0.0]
PINNs are used to study vortex solutions in 2+1 dimensional nonlinear partial differential equations.
We consider PINNs with conservation laws (referred to as cPINNs), deformations of the initial profiles, and a friction approach to improve the resolution of the identification.
arXiv Detail & Related papers (2024-10-23T08:29:13Z)
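A minimal sketch of the conservation-law idea mentioned in the entry above, assuming a toy 1D advection equation and a conserved spatial integral of u; the network shape, penalty weight, and PDE are hypothetical and not from that paper.

```python
# Illustrative sketch (not the cPINN authors' code): augment a standard PINN
# loss with a conservation-law penalty that keeps the spatial integral of u
# constant over time. Network shape, weights, and the toy PDE are assumptions.
import torch
import torch.nn as nn

net = nn.Sequential(nn.Linear(2, 64), nn.Tanh(), nn.Linear(64, 64), nn.Tanh(), nn.Linear(64, 1))

def pde_residual(xt: torch.Tensor) -> torch.Tensor:
    """Residual of a toy advection equation u_t + u_x = 0 at collocation points (x, t)."""
    xt = xt.requires_grad_(True)
    u = net(xt)
    grads = torch.autograd.grad(u, xt, torch.ones_like(u), create_graph=True)[0]
    u_x, u_t = grads[:, 0:1], grads[:, 1:2]
    return u_t + u_x

def conservation_penalty(x: torch.Tensor, t0: float, t1: float) -> torch.Tensor:
    """Penalize drift of the conserved quantity Q(t) = integral of u over x."""
    q0 = torch.trapezoid(net(torch.cat([x, torch.full_like(x, t0)], 1)).squeeze(), x.squeeze())
    q1 = torch.trapezoid(net(torch.cat([x, torch.full_like(x, t1)], 1)).squeeze(), x.squeeze())
    return (q1 - q0) ** 2

x = torch.linspace(0.0, 1.0, 128).unsqueeze(1)
xt = torch.rand(256, 2)                       # random (x, t) collocation points
loss = pde_residual(xt).pow(2).mean() + 1.0 * conservation_penalty(x, 0.0, 0.5)
loss.backward()                               # gradients for an optimizer step
```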
- Transport-Embedded Neural Architecture: Redefining the Landscape of physics aware neural models in fluid mechanics [0.0]
A physical problem, the Taylor-Green vortex, defined on a bi-periodic domain, is used as a benchmark to evaluate the performance of both the standard physics-informed neural network and our model.
Results exhibit that while the standard physics-informed neural network fails to predict the solution accurately and merely returns the initial condition for the entire time span, our model successfully captures the temporal changes in the physics.
arXiv Detail & Related papers (2024-10-05T10:32:51Z)
- An efficient wavelet-based physics-informed neural networks for singularly perturbed problems [0.0]
Physics-informed neural networks (PINNs) are a class of deep learning models that incorporate physical laws expressed as differential equations.
We present an efficient wavelet-based PINNs model to solve singularly perturbed differential equations.
The architecture allows the training process to search for a solution within wavelet space, making the process faster and more accurate.
arXiv Detail & Related papers (2024-09-18T10:01:37Z)
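A hedged sketch of searching for a solution in wavelet space, as mentioned in the entry above, assuming a trainable expansion over Ricker (Mexican-hat) wavelets; the basis size and parameterization are illustrative guesses, not that paper's architecture.

```python
# Rough sketch (assumptions, not the paper's model): parameterize the trial
# solution as a trainable combination of Ricker ("Mexican hat") wavelets, so
# optimization searches for the solution directly in wavelet space.
import torch
import torch.nn as nn

class WaveletExpansion(nn.Module):
    def __init__(self, n_basis: int = 32):
        super().__init__()
        self.coeff = nn.Parameter(torch.randn(n_basis) * 0.1)   # expansion coefficients
        self.scale = nn.Parameter(torch.rand(n_basis) + 0.1)    # dilation of each wavelet
        self.shift = nn.Parameter(torch.rand(n_basis))          # translation of each wavelet

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = (x - self.shift) / self.scale                        # (N, n_basis) after broadcasting
        ricker = (1.0 - z ** 2) * torch.exp(-0.5 * z ** 2)       # Mexican-hat mother wavelet
        return ricker @ self.coeff                               # trial solution u(x), shape (N,)

u = WaveletExpansion()
x = torch.linspace(0.0, 1.0, 200, requires_grad=True).unsqueeze(1)  # collocation points
print(u(x).shape)   # torch.Size([200]); derivatives for a PDE residual come from autograd
```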
- Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories instead of just next-step predictions.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z)
- NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z)
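A simplified reading of the coarser-resolution decomposition described in the entry above: a fine 2D field is split into staggered coarse sub-fields by strided slicing and recovered by interleaving. This illustrates the general idea only, not the NeuralStagger implementation.

```python
# Simplified sketch of spatial decomposition into staggered coarse subtasks
# (an interpretation of the summary, not the NeuralStagger code).
import numpy as np

def stagger_split(field: np.ndarray, s: int = 2) -> list[np.ndarray]:
    """Return s*s coarse sub-fields, each holding every s-th point with a different offset."""
    return [field[i::s, j::s] for i in range(s) for j in range(s)]

def stagger_merge(subfields: list[np.ndarray], s: int = 2) -> np.ndarray:
    """Interleave coarse sub-fields back onto the original fine grid."""
    h, w = subfields[0].shape
    out = np.empty((h * s, w * s), dtype=subfields[0].dtype)
    for k, sub in enumerate(subfields):
        i, j = divmod(k, s)
        out[i::s, j::s] = sub
    return out

fine = np.random.rand(64, 64)
coarse_parts = stagger_split(fine)                      # four 32x32 subtasks for s = 2
assert np.allclose(stagger_merge(coarse_parts), fine)   # the decomposition is lossless
```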
- Neural Laplace: Learning diverse classes of differential equations in the Laplace domain [86.52703093858631]
We propose a unified framework for learning diverse classes of differential equations (DEs) including all the aforementioned ones.
Instead of modelling the dynamics in the time domain, we model it in the Laplace domain, where the history-dependencies and discontinuities in time can be represented as summations of complex exponentials.
In the experiments, Neural Laplace shows superior performance in modelling and extrapolating the trajectories of diverse classes of DEs.
arXiv Detail & Related papers (2022-06-10T02:14:59Z)
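A toy illustration of the Laplace-domain representation mentioned in the entry above: a trajectory is approximated as a finite sum of complex exponentials with fixed candidate poles and least-squares coefficients. Neural Laplace itself learns such representations; the poles and signal below are arbitrary choices for demonstration.

```python
# Toy illustration only (not the Neural Laplace model): represent x(t) as a
# finite sum of complex exponentials sum_k c_k * exp(s_k * t), with fixed
# candidate poles s_k and coefficients c_k fitted by least squares.
import numpy as np

t = np.linspace(0.0, 10.0, 200)
x = np.exp(-0.3 * t) * np.cos(2.0 * t)                  # example damped oscillation

# Candidate poles s_k = sigma + i*omega covering a few decay rates and frequencies.
poles = np.array([complex(sig, om) for sig in (-0.1, -0.3, -1.0)
                                   for om in (-4.0, -2.0, -1.0, 0.0, 1.0, 2.0, 4.0)])
basis = np.exp(np.outer(t, poles))                      # (len(t), len(poles)) complex exponentials
coeffs, *_ = np.linalg.lstsq(basis, x.astype(complex), rcond=None)

x_hat = (basis @ coeffs).real                           # reconstruction from the Laplace-style basis
print(float(np.max(np.abs(x_hat - x))))                 # reconstruction error of the fit
```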
- Learning to Solve PDE-constrained Inverse Problems with Graph Networks [51.89325993156204]
In many application domains across science and engineering, we are interested in solving inverse problems with constraints defined by a partial differential equation (PDE).
Here we explore GNNs to solve such PDE-constrained inverse problems.
We demonstrate computational speedups of up to 90x using GNNs compared to principled solvers.
arXiv Detail & Related papers (2022-06-01T18:48:01Z)
- Stiff Neural Ordinary Differential Equations [0.0]
We first show the challenges of learning neural ODE in the classical stiff ODE systems of Robertson's problem.
We then present successful demonstrations in stiff systems of Robertson's problem and an air pollution problem.
arXiv Detail & Related papers (2021-03-29T05:24:56Z)
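For context on why Robertson's problem (mentioned in the entry above) is a standard stiff benchmark, the sketch below integrates it with an implicit (BDF) solver using the textbook rate constants; it is a reference illustration, not code from that paper.

```python
# Robertson's classic stiff kinetics system with its standard rate constants.
# An implicit solver such as BDF handles the stiffness that makes this problem
# hard for explicit integrators and, per the paper, for neural ODE training.
import numpy as np
from scipy.integrate import solve_ivp

def robertson(t, y):
    y1, y2, y3 = y
    return [-0.04 * y1 + 1.0e4 * y2 * y3,
             0.04 * y1 - 1.0e4 * y2 * y3 - 3.0e7 * y2 ** 2,
             3.0e7 * y2 ** 2]

sol = solve_ivp(robertson, (0.0, 1.0e5), [1.0, 0.0, 0.0], method="BDF",
                rtol=1e-6, atol=1e-10, dense_output=True)
print(sol.y[:, -1])   # concentrations drift toward the equilibrium (0, 0, 1) at large t
```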
- Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
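A simplified sketch of a liquid time-constant style cell based on the summary above: each hidden unit is a linear first-order system whose effective time constant is modulated by a learned gate, integrated here with a plain Euler step rather than the authors' solver.

```python
# Simplified liquid time-constant style cell (my reading of the summary, not
# the authors' exact formulation): linear first-order dynamics per unit, with
# an input- and state-dependent gate, advanced by an explicit Euler step.
import torch
import torch.nn as nn

class LTCCell(nn.Module):
    def __init__(self, in_dim: int, hidden: int, tau: float = 1.0, dt: float = 0.1):
        super().__init__()
        self.gate = nn.Linear(in_dim + hidden, hidden)        # nonlinear modulation f(x, I)
        self.bias_state = nn.Parameter(torch.zeros(hidden))   # target state the gate pulls toward
        self.tau, self.dt = tau, dt

    def forward(self, x: torch.Tensor, inp: torch.Tensor) -> torch.Tensor:
        f = torch.sigmoid(self.gate(torch.cat([x, inp], dim=-1)))
        dxdt = -(1.0 / self.tau + f) * x + f * self.bias_state   # linear first-order dynamics
        return x + self.dt * dxdt                                # explicit Euler update

cell = LTCCell(in_dim=4, hidden=8)
x = torch.zeros(1, 8)
for _ in range(20):                      # unroll over a short random input sequence
    x = cell(x, torch.randn(1, 4))
print(x.shape)                           # torch.Size([1, 8])
```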
- Physics Informed Deep Learning for Transport in Porous Media. Buckley Leverett Problem [0.0]
We present a new hybrid physics-based machine-learning approach to reservoir modeling.
The methodology relies on a series of deep adversarial neural network architectures with physics-based regularization.
The proposed methodology is a simple and elegant way to instill physical knowledge to machine-learning algorithms.
arXiv Detail & Related papers (2020-01-15T08:20:11Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.