Semi-Implicit Neural Solver for Time-dependent Partial Differential
Equations
- URL: http://arxiv.org/abs/2109.01467v1
- Date: Fri, 3 Sep 2021 12:03:10 GMT
- Title: Semi-Implicit Neural Solver for Time-dependent Partial Differential
Equations
- Authors: Suprosanna Shit, Ivan Ezhov, Leon Mächler, Abinav R., Jana Lipkova,
Johannes C. Paetzold, Florian Kofler, Marie Piraud, Bjoern H. Menze
- Abstract summary: We propose a neural solver to learn an optimal iterative scheme in a data-driven fashion for any class of PDEs.
We provide theoretical guarantees for the correctness and convergence of neural solvers analogous to conventional iterative solvers.
- Score: 4.246966726709308
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Fast and accurate solutions of time-dependent partial differential equations
(PDEs) are of pivotal interest to many research fields, including physics,
engineering, and biology. Generally, implicit/semi-implicit schemes are
preferred over explicit ones to improve stability and correctness. However,
existing semi-implicit methods are usually iterative and employ a
general-purpose solver, which may be sub-optimal for a specific class of PDEs.
In this paper, we propose a neural solver to learn an optimal iterative scheme
in a data-driven fashion for any class of PDEs. Specifically, we modify a
single iteration of a semi-implicit solver using a deep neural network. We
provide theoretical guarantees for the correctness and convergence of neural
solvers analogous to conventional iterative solvers. In addition to the
commonly used Dirichlet boundary condition, we adopt a diffuse domain approach
to incorporate diverse types of boundary conditions, e.g., Neumann. We show
that the proposed neural solver can go beyond linear PDEs and apply to a
class of non-linear PDEs, where the non-linear component is non-stiff. We
demonstrate the efficacy of our method on 2D and 3D scenarios. To this end, we
show how our model generalizes to parameter settings different from those seen
during training and achieves faster convergence than semi-implicit schemes.
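
To make the core idea concrete, below is a minimal, hypothetical PyTorch sketch of the kind of scheme the abstract describes: a backward-Euler (semi-implicit) time step for the 2D heat equation whose inner Jacobi iteration is refined by a small CNN. The equation, discretization, network, and all names (NeuralCorrection, jacobi_step, neural_semi_implicit_step) are assumptions chosen for illustration, not the paper's actual architecture or code.

```python
# Hypothetical sketch (not the authors' code): one semi-implicit time step for the
# 2D heat equation u_t = nu * Laplacian(u), where each classical Jacobi sweep is
# refined by a small CNN, in the spirit of "modifying a single iteration of a
# semi-implicit solver using a deep neural network".
import torch
import torch.nn as nn


class NeuralCorrection(nn.Module):
    """Small CNN mapping the classical update's residual to a learned correction."""

    def __init__(self, channels: int = 16):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(channels, 1, kernel_size=3, padding=1),
        )

    def forward(self, residual: torch.Tensor) -> torch.Tensor:
        return self.net(residual)


def jacobi_step(u, b, alpha):
    """One Jacobi sweep for (I - alpha*L) u = b, with L a periodic 5-point Laplacian."""
    nbr = (torch.roll(u, 1, -1) + torch.roll(u, -1, -1)
           + torch.roll(u, 1, -2) + torch.roll(u, -1, -2))
    return (b + alpha * nbr) / (1.0 + 4.0 * alpha)


def neural_semi_implicit_step(u_n, model, nu=0.1, dt=1e-3, dx=1e-2, iters=10):
    """Advance u^n -> u^{n+1}: classical Jacobi sweeps, each refined by the network."""
    alpha = nu * dt / dx ** 2
    b, u = u_n, u_n.clone()
    for _ in range(iters):
        u_classical = jacobi_step(u, b, alpha)   # standard semi-implicit iterate
        residual = u_classical - u               # change proposed by the sweep
        u = u_classical + model(residual)        # learned refinement of the iterate
    return u


# Usage on a random initial condition (batch, channel, H, W):
model = NeuralCorrection()
u0 = torch.randn(1, 1, 64, 64)
u1 = neural_semi_implicit_step(u0, model)
print(u1.shape)  # torch.Size([1, 1, 64, 64])
```

In the paper's framing, such a learned iteration would be trained so that, like a conventional iterative solver, repeated application converges to the solution of the implicit system; the theoretical guarantees mentioned in the abstract concern exactly this correctness and convergence behaviour.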
Related papers
- Vectorized Conditional Neural Fields: A Framework for Solving Time-dependent Parametric Partial Differential Equations [14.052158194490715]
We propose Vectorized Conditional Neural Fields (VCNeFs) to represent the solution of time-dependent PDEs as neural fields.
VCNeFs compute, for a set of multiple spatio-temporal query points, their solutions in parallel and model their dependencies.
An extensive set of experiments demonstrates that VCNeFs are competitive with and often outperform existing ML-based surrogate models.
arXiv Detail & Related papers (2024-06-06T10:02:06Z) - Constrained or Unconstrained? Neural-Network-Based Equation Discovery from Data [0.0]
We represent the PDE as a neural network and use an intermediate state representation similar to a Physics-Informed Neural Network (PINN).
We present a penalty method and a widely used trust-region barrier method to solve this constrained optimization problem.
Our results on the Burgers' and the Korteweg-de Vries equations demonstrate that the latter constrained method outperforms the penalty method.
arXiv Detail & Related papers (2024-05-30T01:55:44Z) - RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039]
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs).
This paper proposes and theoretically studies a new training paradigm as region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
arXiv Detail & Related papers (2024-05-23T09:45:57Z) - JAX-DIPS: Neural bootstrapping of finite discretization methods and
application to elliptic problems with discontinuities [0.0]
This strategy can be used to efficiently train neural network surrogate models of partial differential equations.
The presented neural bootstrapping method (hereby dubbed NBM) is based on evaluation of the finite discretization residuals of the PDE system.
We show NBM is competitive in terms of memory and training speed with other PINN-type frameworks.
arXiv Detail & Related papers (2022-10-25T20:13:26Z) - Learning differentiable solvers for systems with hard constraints [48.54197776363251]
We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs).
We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture.
Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
arXiv Detail & Related papers (2022-07-18T15:11:43Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Solving and Learning Nonlinear PDEs with Gaussian Processes [11.09729362243947]
We introduce a simple, rigorous, and unified framework for solving nonlinear partial differential equations.
The proposed approach provides a natural generalization of collocation kernel methods to nonlinear PDEs and IPs.
For IPs, while the traditional approach has been to iterate between the identification of parameters in the PDE and the numerical approximation of its solution, our algorithm tackles both simultaneously.
arXiv Detail & Related papers (2021-03-24T03:16:08Z) - dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual Neural Networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z) - Meta-Solver for Neural Ordinary Differential Equations [77.8918415523446]
We investigate how variability in the space of solvers can improve the performance of neural ODEs.
We show that the right choice of solver parameterization can significantly affect neural ODE models in terms of robustness to adversarial attacks.
arXiv Detail & Related papers (2021-03-15T17:26:34Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in the desired structure for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.