PIXEL: Physics-Informed Cell Representations for Fast and Accurate PDE
Solvers
- URL: http://arxiv.org/abs/2207.12800v1
- Date: Tue, 26 Jul 2022 10:46:56 GMT
- Title: PIXEL: Physics-Informed Cell Representations for Fast and Accurate PDE
Solvers
- Authors: Namgyu Kang, Byeonghyeon Lee, Youngjoon Hong, Seok-Bae Yun, Eunbyung
Park
- Abstract summary: We propose a new kind of data-driven PDE solver, physics-informed cell representations (PIXEL).
PIXEL elegantly combines classical numerical methods and learning-based approaches.
We show that PIXEL achieves fast convergence and high accuracy.
- Score: 4.1173475271436155
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: With the increases in computational power and advances in machine learning,
data-driven learning-based methods have gained significant attention in solving
PDEs. Physics-informed neural networks (PINNs) have recently emerged and
succeeded in various forward and inverse PDE problems thanks to their
excellent properties, such as flexibility, mesh-free solutions, and
unsupervised training. However, their slower convergence speed and relatively
inaccurate solutions often limit their broader applicability in many science
and engineering domains. This paper proposes a new kind of data-driven PDE
solver, physics-informed cell representations (PIXEL), elegantly combining
classical numerical methods and learning-based approaches. We adopt a grid
structure from the numerical methods to improve accuracy and convergence speed
and overcome the spectral bias present in PINNs. Moreover, the proposed
method enjoys the same benefits as PINNs, e.g., using the same optimization
frameworks to solve both forward and inverse PDE problems and readily enforcing
PDE constraints with modern automatic differentiation techniques. We provide
experimental results on various challenging PDEs with which the original PINNs
have struggled, and show that PIXEL achieves fast convergence and high
accuracy.
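The abstract describes PIXEL only at a high level: the trainable parameters live on a grid of cells borrowed from classical numerical methods, the field is queried at arbitrary collocation points, and the PDE is enforced with a PINN-style residual loss via automatic differentiation. As a rough, non-authoritative illustration of that combination, the PyTorch sketch below fits a toy 1D Poisson problem; the grid size, the cosine interpolation kernel, the test equation, and every name in it are assumptions made for this sketch, not the authors' implementation.

```python
# Minimal sketch (assumptions, not the authors' code): a learnable 1D grid of
# cell values, a smooth interpolation to query the field anywhere, and a
# PINN-style PDE residual loss enforced with automatic differentiation.
import math
import torch

N = 64                                        # number of grid nodes (assumed)
h = 1.0 / (N - 1)                             # uniform cell width on [0, 1]
cells = torch.zeros(N, requires_grad=True)    # learnable cell values

def interpolate(x):
    """Cosine-blended interpolation of cell values at query points x."""
    s = torch.clamp(x / h, 0.0, N - 1 - 1e-6)
    i = s.floor().long()                      # left node of each query point
    t = s - i.float()                         # fractional position in the cell
    w = 0.5 * (1.0 - torch.cos(math.pi * t))  # smooth weight, so u'' is nonzero
    return (1.0 - w) * cells[i] + w * cells[i + 1]

def pde_residual(x):
    """Residual of the toy problem -u''(x) = pi^2 sin(pi x) on [0, 1]."""
    u = interpolate(x)
    du = torch.autograd.grad(u.sum(), x, create_graph=True)[0]
    d2u = torch.autograd.grad(du.sum(), x, create_graph=True)[0]
    return -d2u - math.pi ** 2 * torch.sin(math.pi * x)

opt = torch.optim.Adam([cells], lr=1e-2)
for step in range(2000):
    x = torch.rand(256, requires_grad=True)   # interior collocation points
    xb = torch.tensor([0.0, 1.0])             # boundary points, u(0) = u(1) = 0
    loss = pde_residual(x).pow(2).mean() + interpolate(xb).pow(2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The point the sketch tries to convey is that the trainable parameters sit on a grid rather than in MLP weights, while the objective is the same residual-plus-boundary loss used by PINNs, so the same optimizer and autodiff machinery apply to forward and inverse problems.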
Related papers
- Learning a Neural Solver for Parametric PDE to Enhance Physics-Informed Methods [14.791541465418263]
We propose learning a solver, i.e., solving partial differential equations (PDEs) using a physics-informed iterative algorithm trained on data.
Our method learns to condition a gradient descent algorithm that automatically adapts to each PDE instance.
We demonstrate the effectiveness of our method through empirical experiments on multiple datasets.
arXiv Detail & Related papers (2024-10-09T12:28:32Z)
- VS-PINN: A fast and efficient training of physics-informed neural networks using variable-scaling methods for solving PDEs with stiff behavior [0.0]
We propose a new method for training PINNs using variable-scaling techniques.
We demonstrate the effectiveness of the proposed method on these stiff problems and confirm that it can significantly improve the training efficiency and performance of PINNs.
arXiv Detail & Related papers (2024-06-10T14:11:15Z)
- Unisolver: PDE-Conditional Transformers Are Universal PDE Solvers [55.0876373185983]
We present the Universal PDE solver (Unisolver), capable of solving a wide range of PDEs.
Our key finding is that a PDE solution is fundamentally under the control of a series of PDE components.
Unisolver achieves consistent state-of-the-art results on three challenging large-scale benchmarks.
arXiv Detail & Related papers (2024-05-27T15:34:35Z)
- A Gaussian Process Framework for Solving Forward and Inverse Problems Involving Nonlinear Partial Differential Equations [0.0]
We introduce kernel-weighted Corrective Residuals (CoRes) to integrate the strengths of kernel methods and deep NNs for solving nonlinear PDE systems.
CoRes consistently outperforms competing methods in solving a broad range of benchmark problems.
We believe our findings have the potential to spark a renewed interest in leveraging kernel methods for solving PDEs.
arXiv Detail & Related papers (2024-01-07T14:09:42Z)
- A unified scalable framework for causal sweeping strategies for Physics-Informed Neural Networks (PINNs) and their temporal decompositions [22.514769448363754]
Training challenges in PINNs and XPINNs for time-dependent PDEs are discussed.
We propose a new stacked-decomposition method that bridges the gap between PINNs and XPINNs.
We also formulate a new time-sweeping collocation point algorithm inspired by prior work on causality in PINNs.
arXiv Detail & Related papers (2023-02-28T01:19:21Z)
- Auto-PINN: Understanding and Optimizing Physics-Informed Neural Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs (an illustrative sketch of one such weighting scheme appears after this list).
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- Improved Training of Physics-Informed Neural Networks with Model Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the found solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian smoothed model and show that, via Stein's identity, the second-order derivatives can be efficiently calculated without back-propagation (a minimal sketch of this estimator appears after this list).
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- PhyCRNet: Physics-informed Convolutional-Recurrent Network for Solving Spatiotemporal PDEs [8.220908558735884]
Partial differential equations (PDEs) play a fundamental role in modeling and simulating problems across a wide range of disciplines.
Recent advances in deep learning have shown the great potential of physics-informed neural networks (PINNs) to solve PDEs as a basis for data-driven inverse analysis.
We propose the novel physics-informed convolutional-recurrent learning architectures (PhyCRNet and PhyCRNet-s) for solving PDEs without any labeled data.
arXiv Detail & Related papers (2021-06-26T22:22:19Z)
- dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z)
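The "Revisiting PINNs" entry above mentions an AdaBoost-inspired point-weighting (PW) scheme but does not spell it out; the sketch below is only an illustrative guess at what such a scheme could look like (up-weighting collocation points with large PDE residuals), under assumed names and update rules, and is not the paper's algorithm.

```python
# Illustrative guess (not the GA-PINN paper's algorithm): AdaBoost-style
# point weighting for PINN collocation points, where points that are fit
# poorly (large PDE residual) receive larger weights in the next step.
import torch

def reweight(residuals, weights, eta=0.5):
    """Multiplicatively up-weight large-residual points, then renormalize."""
    scale = residuals.abs() / (residuals.abs().max() + 1e-12)
    weights = weights * torch.exp(eta * scale)
    return weights / weights.sum()

def weighted_pinn_loss(residuals, weights):
    """Weighted mean-squared PDE residual (weights sum to one)."""
    return (weights * residuals.pow(2)).sum()

# Hypothetical use inside a PINN training loop:
n = 1_000
weights = torch.full((n,), 1.0 / n)                       # start uniform
# residuals = pde_residual(model, collocation_points)     # provided by the PINN
# loss = weighted_pinn_loss(residuals, weights) + boundary_loss
# loss.backward(); optimizer.step()
# weights = reweight(residuals.detach(), weights)         # reweight for next step
```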
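The "Learning Physics-Informed Neural Networks without Stacked Back-propagation" entry above refers to a Gaussian-smoothed surrogate whose derivatives are obtained through Stein's identity rather than nested back-propagation: for u(x) = E[f(x + delta)] with delta ~ N(0, sigma^2), one has u'(x) = E[delta f(x + delta)] / sigma^2 and u''(x) = E[(delta^2 - sigma^2) f(x + delta)] / sigma^4. The 1D Monte Carlo sketch below illustrates that estimator; the smoothing scale, sample count, control variate, and names are assumptions for this sketch, not the paper's implementation.

```python
# Hedged sketch (not the paper's code): Monte Carlo derivative estimates for a
# Gaussian-smoothed surrogate u(x) = E[f(x + delta)], delta ~ N(0, sigma^2),
# via Stein's identity, so no back-propagation through f is required.
import torch

def smoothed_derivatives(f, x, sigma=0.1, n_samples=100_000):
    """Estimate u, u' and u'' at 1D points x (all names are illustrative)."""
    delta = sigma * torch.randn(n_samples, *x.shape)   # Gaussian perturbations
    f0 = f(x)          # control variate: E[delta] = 0 and E[delta^2 - sigma^2] = 0,
    fp = f(x + delta)  # so subtracting f(x) keeps the estimates unbiased
    u = fp.mean(dim=0)
    du = (delta * (fp - f0)).mean(dim=0) / sigma**2                    # E[delta f] / sigma^2
    d2u = ((delta**2 - sigma**2) * (fp - f0)).mean(dim=0) / sigma**4   # E[(delta^2 - sigma^2) f] / sigma^4
    return u, du, d2u

# Toy check with f = sin: the estimates should track cos(x) and -sin(x)
# up to smoothing bias and Monte Carlo noise.
x = torch.linspace(0.0, 3.0, 5)
u, du, d2u = smoothed_derivatives(torch.sin, x, sigma=0.05)
```

In a PINN-style training loop, these estimates would stand in for the nested autograd calls normally used to form second-order PDE residuals.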
This list is automatically generated from the titles and abstracts of the papers on this site.