Reduced operator inference for nonlinear partial differential equations
- URL: http://arxiv.org/abs/2102.00083v1
- Date: Fri, 29 Jan 2021 21:50:20 GMT
- Title: Reduced operator inference for nonlinear partial differential equations
- Authors: Elizabeth Qian, Ionut-Gabriel Farcas, and Karen Willcox
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present a new scientific machine learning method that learns from data a
computationally inexpensive surrogate model for predicting the evolution of a
system governed by a time-dependent nonlinear partial differential equation
(PDE), an enabling technology for many computational algorithms used in
engineering settings. Our formulation generalizes to the PDE setting the
Operator Inference method previously developed in [B. Peherstorfer and K.
Willcox, Data-driven operator inference for non-intrusive projection-based
model reduction, Computer Methods in Applied Mechanics and Engineering, 306
(2016)] for systems governed by ordinary differential equations. The method
brings together two main elements. First, ideas from projection-based model
reduction are used to explicitly parametrize the learned model by
low-dimensional polynomial operators which reflect the known form of the
governing PDE. Second, supervised machine learning tools are used to infer from
data the reduced operators of this physics-informed parametrization. For
systems whose governing PDEs contain more general (non-polynomial)
nonlinearities, the learned model performance can be improved through the use
of lifting variable transformations, which expose polynomial structure in the
PDE. The proposed method is demonstrated on a three-dimensional combustion
simulation with over 18 million degrees of freedom, for which the learned
reduced models achieve accurate predictions with a dimension reduction of six
orders of magnitude and model runtime reduction of 5-6 orders of magnitude.
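The two elements described above (a physics-informed polynomial parametrization plus supervised inference of its reduced operators) can be sketched in a few lines of NumPy. This is a minimal illustration of the general Operator Inference recipe for a quadratic reduced model dq/dt ≈ A q + H (q ⊗ q), not the paper's implementation; the synthetic snapshot data, dimensions, and variable names are all assumptions.

```python
import numpy as np

# Minimal Operator Inference sketch (illustrative, not the paper's code).
rng = np.random.default_rng(0)
n, K, r = 200, 500, 5           # full dim, number of snapshots, reduced dim

# Synthetic snapshot matrix X (n x K) and time derivatives Xdot (n x K);
# in practice these come from a high-fidelity PDE simulation.
X = rng.standard_normal((n, K))
Xdot = rng.standard_normal((n, K))

# 1) Low-dimensional basis from the leading left singular vectors (POD).
U, _, _ = np.linalg.svd(X, full_matrices=False)
V = U[:, :r]                    # n x r basis

# 2) Project snapshots and derivatives onto the reduced space.
Q = V.T @ X                     # r x K reduced states
Qdot = V.T @ Xdot               # r x K reduced time derivatives

# 3) Supervised learning step: solve the linear least-squares problem
#    Qdot ≈ [A H] [Q; Q ⊗ Q] for the reduced operators A (r x r)
#    and H (r x r^2).
Q2 = np.einsum('ik,jk->ijk', Q, Q).reshape(r * r, K)  # columns are q ⊗ q
D = np.vstack([Q, Q2])          # (r + r^2) x K data matrix
O, *_ = np.linalg.lstsq(D.T, Qdot.T, rcond=None)
A, H = O.T[:, :r], O.T[:, r:]
```

Because the model form mirrors the governing PDE, the learning step reduces to ordinary linear least squares over the operator entries, which is what makes the approach cheap relative to generic black-box surrogates.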
Related papers
- DimOL: Dimensional Awareness as A New 'Dimension' in Operator Learning [63.5925701087252]
We introduce DimOL (Dimension-aware Operator Learning), drawing insights from dimensional analysis.
To implement DimOL, we propose the ProdLayer, which can be seamlessly integrated into FNO-based and Transformer-based PDE solvers.
Empirically, DimOL models achieve up to 48% performance gain within the PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z)
- Approximation of Solution Operators for High-dimensional PDEs [2.3076986663832044]
We propose a finite-dimensional control-based method to approximate solution operators for evolutional partial differential equations.
Results are presented for several high-dimensional PDEs, including real-world applications to solving Hamilton-Jacobi-Bellman equations.
arXiv Detail & Related papers (2024-01-18T21:45:09Z)
- Energy-Preserving Reduced Operator Inference for Efficient Design and Control [0.0]
This work presents a physics-preserving reduced model learning approach that targets partial differential equations.
EP-OpInf learns efficient and accurate reduced models that retain this energy-preserving structure.
arXiv Detail & Related papers (2024-01-05T16:39:48Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Neural Partial Differential Equations with Functional Convolution [30.35306295442881]
We present a lightweight neural PDE representation to discover the hidden structure and predict the solution of different nonlinear PDEs.
We leverage the prior of "translational similarity" of numerical PDE differential operators to drastically reduce the scale of the learning model and training data.
arXiv Detail & Related papers (2023-03-10T04:25:38Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
- Koopman neural operator as a mesh-free solver of non-linear partial differential equations [15.410070455154138]
We propose the Koopman neural operator (KNO), a new neural operator, to overcome these challenges.
By approximating the Koopman operator, an infinite-dimensional operator governing all possible observations of the dynamic system, we can equivalently learn the solution of a non-linear PDE family.
The KNO exhibits notable advantages compared with previous state-of-the-art models.
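The core idea, approximating the infinite-dimensional Koopman operator by a finite matrix that advances observations one time step, can be illustrated with a plain least-squares fit in the style of dynamic mode decomposition. This is a toy sketch of the underlying principle, not the KNO architecture; the dynamics, data, and names are all assumptions.

```python
import numpy as np

# Toy Koopman-style sketch: fit a finite matrix K that advances
# observations one step, via least squares (DMD-style), not the KNO.
rng = np.random.default_rng(1)

# Toy linear dynamics x_{t+1} = A_true x_t; observation pairs (x, A_true x).
A_true = np.array([[0.9, 0.1],
                   [0.0, 0.8]])
X0 = rng.standard_normal((2, 200))   # states at time t
X1 = A_true @ X0                     # states at time t + 1

# Least-squares fit of the one-step operator: K = argmin ||X1 - K X0||_F.
K = X1 @ np.linalg.pinv(X0)

# For a genuinely nonlinear PDE one would first lift the state into a
# dictionary of observables (as in extended DMD) and fit K in that space.
```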
arXiv Detail & Related papers (2023-01-24T14:10:15Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
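The Stein's-identity trick described above can be checked with a small Monte Carlo experiment: for the Gaussian-smoothed model f_sigma(x) = E[f(x + sigma * eps)] with eps ~ N(0, 1), the second derivative equals E[(eps^2 - 1) f(x + sigma * eps)] / sigma^2, so it needs only forward evaluations. The toy function, sample size, and constants below are illustrative assumptions, not the paper's setup.

```python
import numpy as np

# Monte Carlo estimate of a second derivative via Stein's identity
# (illustrative toy setup): no back-propagation through f is needed.
rng = np.random.default_rng(42)

def f(x):
    return x ** 2       # toy "network"; f'' = 2 everywhere, so f_sigma'' = 2

sigma, x, n_samples = 0.5, 1.5, 2_000_000
eps = rng.standard_normal(n_samples)
second_deriv = np.mean((eps**2 - 1.0) * f(x + sigma * eps)) / sigma**2
# second_deriv ≈ 2 up to Monte Carlo error
```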
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
- Non-intrusive Nonlinear Model Reduction via Machine Learning Approximations to Low-dimensional Operators [0.0]
We propose a method that enables traditionally intrusive reduced-order models to be accurately approximated in a non-intrusive manner.
The approach approximates the low-dimensional operators associated with projection-based reduced-order models (ROMs) using modern machine-learning regression techniques.
In addition to enabling non-intrusiveness, we demonstrate that the approach also leads to very low computational complexity, achieving up to a $1000\times$ reduction in run time.
arXiv Detail & Related papers (2021-06-17T17:04:42Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)