Lie Point Symmetry Data Augmentation for Neural PDE Solvers
- URL: http://arxiv.org/abs/2202.07643v1
- Date: Tue, 15 Feb 2022 18:43:17 GMT
- Title: Lie Point Symmetry Data Augmentation for Neural PDE Solvers
- Authors: Johannes Brandstetter, Max Welling, Daniel E. Worrall
- Abstract summary: We present a method that partially alleviates this problem by improving neural PDE solver sample complexity.
In the context of PDEs, we can quantitatively derive an exhaustive list of data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
- Score: 69.72427135610106
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural networks are increasingly being used to solve partial differential
equations (PDEs), replacing slower numerical solvers. However, a critical issue
is that neural PDE solvers require high-quality ground truth data, which
usually must come from the very solvers they are designed to replace. Thus, we
are presented with a proverbial chicken-and-egg problem. In this paper, we
present a method that partially alleviates this problem by improving
neural PDE solver sample complexity -- Lie point symmetry data augmentation
(LPSDA). In the context of PDEs, we can quantitatively derive an exhaustive
list of data transformations, based on the
Lie point symmetry group of the PDEs in question, something not possible in
other application areas. We present this framework and demonstrate how it can
easily be deployed to improve neural PDE solver sample complexity by an order
of magnitude.
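As an illustration of what LPSDA can look like in practice, below is a minimal sketch (not the authors' released code) of two Lie point symmetries of the KdV equation, u_t + u u_x + u_xxx = 0, applied as data augmentations: a periodic space translation (x, t, u) -> (x + eps, t, u) and a Galilean boost (x, t, u) -> (x + eps*t, t, u + eps). It assumes solutions are stored as arrays u[t, x] on a uniform periodic spatial grid; the function names, array layout, and usage example are illustrative assumptions.

```python
import numpy as np

def space_translate(u, shift_frac):
    """Periodic space translation x -> x + eps.
    u: solution snapshots of shape (nt, nx) on a uniform periodic grid.
    shift_frac: shift as a fraction of the domain length (integer grid shift only)."""
    nx = u.shape[1]
    return np.roll(u, int(round(shift_frac * nx)), axis=1)

def galilean_boost(u, x, t, eps):
    """Galilean boost for the KdV equation u_t + u*u_x + u_xxx = 0:
    if u(x, t) is a solution, so is u(x - eps*t, t) + eps.
    x: periodic spatial grid, t: time grid, both 1-D arrays."""
    dx = x[1] - x[0]
    L = x[-1] - x[0] + dx                    # period of the spatial domain
    u_new = np.empty_like(u)
    for i, ti in enumerate(t):
        # Evaluate the old solution at the shifted points x - eps*t_i,
        # using periodic linear interpolation.
        u_new[i] = np.interp(x - eps * ti, x, u[i], period=L)
    return u_new + eps

# Hypothetical usage: augment one stored trajectory with random symmetry parameters.
if __name__ == "__main__":
    nt, nx = 50, 256
    x = np.linspace(0.0, 2 * np.pi, nx, endpoint=False)
    t = np.linspace(0.0, 1.0, nt)
    u = np.sin(x)[None, :] * np.ones((nt, 1))   # placeholder data, not a real KdV solution
    rng = np.random.default_rng(0)
    u_aug = galilean_boost(space_translate(u, rng.uniform()), x, t, rng.normal(scale=0.5))
    print(u_aug.shape)                          # (50, 256)
```

The sketch shows only the transformation step; in the paper, symmetry transformations of this kind are applied on the fly during training as augmentations of the ground-truth trajectories, with the transformation parameters drawn at random for each sample.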
Related papers
- Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver) capable of solving a wide scope of PDEs.
Our key finding is that a PDE solution is fundamentally under the control of a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z)
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE.
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
- Lie Point Symmetry and Physics Informed Networks [59.56218517113066]
We propose a loss function that informs the network about Lie point symmetries in the same way that PINN models try to enforce the underlying PDE through a loss function.
Our symmetry loss ensures that the infinitesimal generators of the Lie group conserve the PDE solutions.
Empirical evaluations indicate that the inductive bias introduced by the Lie point symmetries of the PDEs greatly boosts the sample efficiency of PINNs.
arXiv Detail & Related papers (2023-11-07T19:07:16Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues.
First, following the ODE produces an uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors.
We develop an ODE based IVP solver which prevents the network from getting ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Meta-PDE: Learning to Solve PDEs Quickly Without a Mesh [24.572840023107574]
Partial differential equations (PDEs) are often computationally challenging to solve.
We present a meta-learning based method which learns to rapidly solve problems from a distribution of related PDEs.
arXiv Detail & Related papers (2022-11-03T06:17:52Z)
- Physics-Aware Neural Networks for Boundary Layer Linear Problems [0.0]
Physics-Informed Neural Networks (PINNs) approximate the solution of general partial differential equations (PDEs) by adding them in some form as terms of the loss/cost of a Neural Network.
This paper explores PINNs for linear PDEs whose solutions may present one or more boundary layers.
arXiv Detail & Related papers (2022-07-15T21:15:06Z)
- Learning time-dependent PDE solver using Message Passing Graph Neural Networks [0.0]
We introduce a graph neural network approach to finding efficient PDE solvers through learning using message-passing models.
We use graphs to represent PDE-data on an unstructured mesh and show that message passing graph neural networks (MPGNN) can parameterize governing equations.
We show that a recurrent graph neural network approach can find a temporal sequence of solutions to a PDE.
arXiv Detail & Related papers (2022-04-15T21:10:32Z)
- Solving PDEs on Unknown Manifolds with Machine Learning [8.220217498103315]
This paper presents a mesh-free computational framework and machine learning theory for solving elliptic PDEs on unknown manifolds.
We show that the proposed NN solver can robustly generalize the PDE solution to new data points, with errors almost identical to those on the training data.
arXiv Detail & Related papers (2021-06-12T03:55:15Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual Neural Networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
- PDE-based Group Equivariant Convolutional Neural Networks [1.949912057689623]
We present a PDE-based framework that generalizes Group equivariant Convolutional Neural Networks (G-CNNs).
In this framework, a network layer is seen as a set of PDE-solvers where geometrically meaningful PDE-coefficients become the layer's trainable weights.
We present experiments to demonstrate the strength of the proposed PDE-G-CNNs in increasing the performance of deep learning based imaging applications.
arXiv Detail & Related papers (2020-01-24T15:00:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.