STENCIL-NET: Data-driven solution-adaptive discretization of partial
differential equations
- URL: http://arxiv.org/abs/2101.06182v2
- Date: Mon, 18 Jan 2021 10:31:17 GMT
- Title: STENCIL-NET: Data-driven solution-adaptive discretization of partial
differential equations
- Authors: Suryanarayana Maddu, Dominik Sturm, Bevan L. Cheeseman, Christian L.
Müller, Ivo F. Sbalzarini
- Abstract summary: We present STENCIL-NET, an artificial neural network architecture for data-driven learning of problem- and resolution-specific local discretizations of nonlinear PDEs.
Knowing the actual PDE is not necessary, as solution data is sufficient to train the network to learn the discrete operators.
A once-trained STENCIL-NET model can be used to predict solutions of the PDE on larger domains and for longer times than it was trained for.
- Score: 2.362412515574206
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Numerical methods for approximately solving partial differential equations
(PDE) are at the core of scientific computing. Often, this requires
high-resolution or adaptive discretization grids to capture relevant
spatio-temporal features in the PDE solution, e.g., in applications like
turbulence, combustion, and shock propagation. Numerical approximation also
requires knowing the PDE in order to construct problem-specific
discretizations. Systematically deriving such solution-adaptive discrete
operators, however, is a current challenge. Here we present STENCIL-NET, an
artificial neural network architecture for data-driven learning of problem- and
resolution-specific local discretizations of nonlinear PDEs. STENCIL-NET
achieves numerically stable discretization of the operators in an unknown
nonlinear PDE by spatially and temporally adaptive parametric pooling on
regular Cartesian grids, and by incorporating knowledge about discrete time
integration. Knowing the actual PDE is not necessary, as solution data is
sufficient to train the network to learn the discrete operators. A once-trained
STENCIL-NET model can be used to predict solutions of the PDE on larger spatial
domains and for longer times than it was trained for, hence addressing the
problem of PDE-constrained extrapolation from data. To support this claim, we
present numerical experiments on long-term forecasting of chaotic PDE solutions
on coarse spatio-temporal grids. We also quantify the speed-up achieved by
substituting base-line numerical methods with equation-free STENCIL-NET
predictions on coarser grids with little compromise on accuracy.
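The core idea described in the abstract, a learned local stencil applied on a regular Cartesian grid and advanced with a known explicit time integrator, can be illustrated with a minimal sketch. This is hypothetical code, not the authors' STENCIL-NET implementation: the network's adaptive parametric pooling is replaced here by a single fixed stencil, set to the classical centered second-derivative coefficients so the sketch integrates the 1D heat equation u_t = u_xx.

```python
import numpy as np

# Toy sketch (NOT the authors' architecture): a "stencil" is a small set of
# weights w applied to grid neighborhoods to approximate the spatial operator
# F(u); time integration uses explicit RK4, mirroring the paper's idea of
# incorporating knowledge about discrete time integration.

def stencil_rhs(u, w):
    """Apply a symmetric stencil w (odd length) to u with periodic boundaries."""
    r = len(w) // 2
    return sum(wk * np.roll(u, k - r) for k, wk in enumerate(w))

def rk4_step(u, w, dt):
    """One explicit 4th-order Runge-Kutta step with the stencil as the RHS."""
    k1 = stencil_rhs(u, w)
    k2 = stencil_rhs(u + 0.5 * dt * k1, w)
    k3 = stencil_rhs(u + 0.5 * dt * k2, w)
    k4 = stencil_rhs(u + dt * k3, w)
    return u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

# The centered 2nd-derivative stencil [1, -2, 1]/dx^2 (heat equation u_t = u_xx)
# is one classical discretization a trained model could in principle recover.
n, dt = 64, 1e-5
dx = 1.0 / n
x = np.arange(n) * dx
u = np.sin(2 * np.pi * x)
w = np.array([1.0, -2.0, 1.0]) / dx**2
for _ in range(100):
    u = rk4_step(u, w, dt)
```

In a trained model the stencil weights would be the output of a network conditioned on the local solution values, which is what makes the discretization solution-adaptive; the fixed weights above only show the stencil-plus-integrator structure.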
Related papers
- A Deep Learning approach for parametrized and time dependent Partial Differential Equations using Dimensionality Reduction and Neural ODEs [46.685771141109306]
We propose an autoregressive and data-driven method using the analogy with classical numerical solvers for time-dependent, parametric and (typically) nonlinear PDEs.
We show that by leveraging dimensionality reduction (DR) we can deliver not only more accurate predictions, but also a considerably lighter and faster Deep Learning model.
arXiv Detail & Related papers (2025-02-12T11:16:15Z)
- MultiPDENet: PDE-embedded Learning with Multi-time-stepping for Accelerated Flow Simulation [48.41289705783405]
We propose a PDE-embedded network with multiscale time stepping (MultiPDENet).
In particular, we design a convolutional filter based on the structure of finite differences, with only a small number of parameters to optimize.
A Physics Block with a 4th-order Runge-Kutta integrator at the fine time scale embeds the structure of the PDE to guide the prediction.
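The notion of a convolutional filter with finite-difference structure and few free parameters can be sketched as follows. This is an illustrative construction, not the paper's code: enforcing the moment (consistency) conditions of a first-derivative stencil exactly leaves only three free parameters in a 5-point kernel.

```python
import numpy as np

# Hedged sketch (details not from the paper): a general 5-point kernel has 5
# weights; enforcing the first-derivative moment conditions
#   sum_k w_k = 0   and   sum_k k * w_k = 1/dx
# leaves 3 free parameters, the kind of reduction a finite-difference-based
# filter design provides.

def fd_first_derivative_kernel(free, dx):
    """Build a 5-point kernel [w_-2, ..., w_2] from 3 free parameters,
    enforcing sum(w) = 0 and sum(k * w) = 1/dx exactly."""
    a, b, c = free  # free parameters: w_-2, w_-1, w_2
    # Solve the two linear constraints for w_0 and w_1:
    #   a + b + w0 + w1 + c = 0
    #   -2a - b + w1 + 2c = 1/dx
    w1 = 1.0 / dx + 2 * a + b - 2 * c
    w0 = -(a + b + w1 + c)
    return np.array([a, b, w0, w1, c])

dx = 0.1
w = fd_first_derivative_kernel((0.0, -0.5 / dx, 0.0), dx)
# With this choice of free parameters the kernel reduces to the centered
# difference [0, -1, 0, 1, 0] / (2 * dx).
```

Any setting of the free parameters yields a kernel that is a consistent first-derivative approximation by construction, so learning only adjusts higher-order behavior rather than basic consistency.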
arXiv Detail & Related papers (2025-01-27T12:15:51Z)
- Adaptation of uncertainty-penalized Bayesian information criterion for parametric partial differential equation discovery [1.1049608786515839]
We introduce an extension of the uncertainty-penalized Bayesian information criterion (UBIC) to solve parametric PDE discovery problems efficiently.
UBIC uses quantified PDE uncertainty over different temporal or spatial points to prevent overfitting in model selection.
We show that our extended UBIC can identify the true number of terms and their varying coefficients accurately, even in the presence of noise.
arXiv Detail & Related papers (2024-08-15T12:10:50Z)
- Solving partial differential equations with sampled neural networks [1.8590821261905535]
Approximation of solutions to partial differential equations (PDE) is an important problem in computational science and engineering.
We discuss how sampling the hidden weights and biases of the ansatz network from data-agnostic and data-dependent probability distributions allows us to make progress on both challenges.
arXiv Detail & Related papers (2024-05-31T14:24:39Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE-based IVP solver that prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Meta-PDE: Learning to Solve PDEs Quickly Without a Mesh [24.572840023107574]
Partial differential equations (PDEs) are often computationally challenging to solve.
We present a meta-learning based method which learns to rapidly solve problems from a distribution of related PDEs.
arXiv Detail & Related papers (2022-11-03T06:17:52Z)
- Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method that can partially alleviate this problem by improving neural PDE solver sample complexity.
In the context of PDEs, it turns out that an exhaustive list of valid data transformations can be derived quantitatively.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
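The augmentation idea can be illustrated with the simplest Lie point symmetry, spatial translation: for translation-invariant PDEs on a periodic domain, shifting a stored solution trajectory in space yields another valid solution. This is a toy sketch, not the paper's implementation, and the solution data here is random placeholder values.

```python
import numpy as np

# Illustrative sketch (not the paper's code): space translation
# u(x, t) -> u(x - a, t) maps solutions of translation-invariant PDEs to
# solutions, so shifting a snapshot on a periodic grid yields a new valid
# training sample "for free".

def translate_augment(u_grid, shift):
    """u_grid: array of shape (n_t, n_x), periodic in x; returns the
    spatially shifted trajectory."""
    return np.roll(u_grid, shift, axis=1)

rng = np.random.default_rng(0)
u = rng.standard_normal((10, 32))      # placeholder solution trajectory
u_aug = translate_augment(u, shift=5)  # an equally valid trajectory
```

The paper's contribution is to derive the full list of such symmetry transformations for a given PDE; translation is just the transformation that is easiest to apply on a discrete grid.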
arXiv Detail & Related papers (2022-02-15T18:43:17Z)
- PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving Spatiotemporal PDEs [8.220908558735884]
Partial differential equations (PDEs) play a fundamental role in modeling and simulating problems across a wide range of disciplines.
Recent advances in deep learning have shown the great potential of physics-informed neural networks (NNs) to solve PDEs as a basis for data-driven inverse analysis.
We propose novel physics-informed convolutional-recurrent learning architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled data.
arXiv Detail & Related papers (2021-06-26T22:22:19Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which uses dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions, as well as state-of-the-art numerical solvers such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.