Separable Physics-Informed Neural Networks
- URL: http://arxiv.org/abs/2306.15969v4
- Date: Tue, 31 Oct 2023 08:23:53 GMT
- Title: Separable Physics-Informed Neural Networks
- Authors: Junwoo Cho, Seungtae Nam, Hyunmo Yang, Seok-Bae Yun, Youngjoon Hong,
Eunbyung Park
- Abstract summary: We propose a network architecture and training algorithm for PINNs.
SPINN operates on a per-axis basis to significantly reduce the number of network propagations in multi-dimensional PDEs.
We show that SPINN can solve a chaotic (2+1)-d Navier-Stokes equation significantly faster than the best-performing prior method.
- Score: 6.439575695132489
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Physics-informed neural networks (PINNs) have recently emerged as promising
data-driven PDE solvers showing encouraging results on various PDEs. However,
there is a fundamental limitation of training PINNs to solve multi-dimensional
PDEs and approximate highly complex solution functions. The number of training
points (collocation points) required on these challenging PDEs grows
substantially, but it is severely limited due to the expensive computational
costs and heavy memory overhead. To overcome this issue, we propose a network
architecture and training algorithm for PINNs. The proposed method, separable
PINN (SPINN), operates on a per-axis basis to significantly reduce the number
of network propagations in multi-dimensional PDEs, unlike the point-wise processing
in conventional PINNs. We also propose using forward-mode automatic
differentiation to reduce the computational cost of computing PDE residuals,
enabling a large number of collocation points (>10^7) on a single commodity
GPU. The experimental results show drastically reduced computational costs (62x
in wall-clock time, 1,394x in FLOPs given the same number of collocation
points) in multi-dimensional PDEs while achieving better accuracy. Furthermore,
we show that SPINN can solve a chaotic (2+1)-d Navier-Stokes equation
significantly faster than the best-performing prior method (9 minutes vs. 10
hours on a single GPU) while maintaining accuracy. Finally, we demonstrate that
SPINN can accurately obtain the solution of a highly nonlinear and
multi-dimensional PDE, a (3+1)-d Navier-Stokes equation. For visualized results
and code, please see https://jwcho5576.github.io/spinn.github.io/.
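The per-axis factorization described in the abstract can be sketched in a few lines of NumPy. This is an illustrative toy, not the paper's implementation: the rank, hidden width, and all names below are assumptions made for the sketch. The point it shows is the propagation count: with per-axis networks, N samples per axis produce an N x N collocation grid from only 2N one-dimensional network inputs, instead of N^2 point-wise evaluations.

```python
import numpy as np

# Sketch: approximate u(x, y) as a low-rank sum of per-axis functions,
#   u(x, y) ~= sum_r f_r(x) * g_r(y),
# where f and g are small per-axis MLPs (hypothetical sizes chosen here).

rng = np.random.default_rng(0)

def init_mlp(hidden=16, rank=4):
    # One tiny MLP per axis: R^1 -> R^rank
    return (rng.normal(size=(1, hidden)) * 0.5, np.zeros(hidden),
            rng.normal(size=(hidden, rank)) * 0.5, np.zeros(rank))

def mlp(params, t):
    # t: (n, 1) coordinates along one axis -> (n, rank) per-axis features
    w1, b1, w2, b2 = params
    return np.tanh(t @ w1 + b1) @ w2 + b2

fx_params, gy_params = init_mlp(), init_mlp()
x = np.linspace(0.0, 1.0, 100)[:, None]  # 100 collocation points on the x-axis
y = np.linspace(0.0, 1.0, 100)[:, None]  # 100 collocation points on the y-axis

fx = mlp(fx_params, x)  # one batched propagation for the x-axis
gy = mlp(gy_params, y)  # one batched propagation for the y-axis

# Outer product over the rank dimension assembles the full 100 x 100 grid:
u = np.einsum('ir,jr->ij', fx, gy)
print(u.shape)  # (100, 100): 10,000 grid values from 200 axis inputs
```

Because each factor network takes a single scalar coordinate, forward-mode automatic differentiation (e.g. `jax.jvp`) computes per-axis derivatives cheaply, which is the second ingredient the abstract describes.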
Related papers
- Tackling the Curse of Dimensionality with Physics-Informed Neural Networks [24.86574584293979]
We develop a new method of scaling up physics-informed neural networks (PINNs) to solve arbitrary high-dimensional PDEs.
We demonstrate in various tests that the proposed method can solve many notoriously hard high-dimensional PDEs.
arXiv Detail & Related papers (2023-07-23T12:18:12Z) - iPINNs: Incremental learning for Physics-informed neural networks [66.4795381419701]
Physics-informed neural networks (PINNs) have recently become a powerful tool for solving partial differential equations (PDEs)
We propose incremental PINNs that can learn multiple tasks sequentially without additional parameters for new tasks and improve performance for every equation in the sequence.
Our approach learns multiple PDEs starting from the simplest one, creating a subnetwork for each PDE and allowing each subnetwork to overlap with previously learned subnetworks.
arXiv Detail & Related papers (2023-04-10T20:19:20Z) - A physics-informed neural network framework for modeling obstacle-related equations [3.687313790402688]
Physics-informed neural networks (PINNs) are an attractive tool for solving partial differential equations based on sparse and noisy data.
Here we extend PINNs to solve obstacle-related PDEs which present a great computational challenge.
The performance of the proposed PINNs is demonstrated in multiple scenarios for linear and nonlinear PDEs subject to regular and irregular obstacles.
arXiv Detail & Related papers (2023-04-07T09:22:28Z) - Separable PINN: Mitigating the Curse of Dimensionality in
Physics-Informed Neural Networks [6.439575695132489]
Physics-informed neural networks (PINNs) have emerged as new data-driven PDE solvers for both forward and inverse problems.
We demonstrate that the computations in automatic differentiation (AD) can be significantly reduced by leveraging forward-mode AD when training PINNs.
We propose a network architecture, called separable PINN (SPINN), which can facilitate forward-mode AD for more efficient computation.
arXiv Detail & Related papers (2022-11-16T08:46:52Z) - MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine spatio-temporal resolution to resolve all turbulent scales in the fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
arXiv Detail & Related papers (2022-10-11T14:52:20Z) - Enforcing Continuous Physical Symmetries in Deep Learning Network for
Solving Partial Differential Equations [3.6317085868198467]
We introduce a new method, the symmetry-enhanced physics-informed neural network (SPINN), in which the invariant surface conditions induced by the Lie symmetries of PDEs are embedded into the loss function of the PINN.
We show that SPINN performs better than PINN with fewer training points and a simpler neural network architecture.
arXiv Detail & Related papers (2022-06-19T00:44:22Z) - Revisiting PINNs: Generative Adversarial Physics-informed Neural
Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs)
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z) - Learning time-dependent PDE solver using Message Passing Graph Neural
Networks [0.0]
We introduce a graph neural network approach to learning efficient PDE solvers with message-passing models.
We use graphs to represent PDE-data on an unstructured mesh and show that message passing graph neural networks (MPGNN) can parameterize governing equations.
We show that a recurrent graph neural network approach can find a temporal sequence of solutions to a PDE.
arXiv Detail & Related papers (2022-04-15T21:10:32Z) - Improved Training of Physics-Informed Neural Networks with Model
Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the found solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z) - Lie Point Symmetry Data Augmentation for Neural PDE Solvers [69.72427135610106]
We present a method that can partially alleviate this problem by improving neural PDE solver sample complexity.
In the context of PDEs, it turns out that we are able to quantitatively derive an exhaustive list of data transformations.
We show how it can easily be deployed to improve neural PDE solver sample complexity by an order of magnitude.
arXiv Detail & Related papers (2022-02-15T18:43:17Z) - dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.