ASPINN: An asymptotic strategy for solving singularly perturbed differential equations
- URL: http://arxiv.org/abs/2409.13185v1
- Date: Fri, 20 Sep 2024 03:25:17 GMT
- Title: ASPINN: An asymptotic strategy for solving singularly perturbed differential equations
- Authors: Sen Wang, Peizhi Zhao, Tao Song
- Abstract summary: We propose Asymptotic Physics-Informed Neural Networks (ASPINN), a generalization of the Physics-Informed Neural Network (PINN) and General-Kindred Physics-Informed Neural Network (GKPINN) approaches.
ASPINN has a strong ability to fit SPDE solutions because it places exponential layers at the boundary layer.
We demonstrate the effectiveness of ASPINN by solving diverse classes of SPDEs, which clearly shows that the method is promising for boundary layer problems.
- Score: 12.14934707131722
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Solving Singularly Perturbed Differential Equations (SPDEs) is challenging because their solutions change rapidly at the boundary layer. In this manuscript, we propose Asymptotic Physics-Informed Neural Networks (ASPINN), a generalization of the Physics-Informed Neural Network (PINN) and General-Kindred Physics-Informed Neural Network (GKPINN) approaches. ASPINN is a decomposition method based on the idea of asymptotic analysis. Compared to PINN, ASPINN has a stronger ability to fit SPDE solutions, owing to the exponential layers placed at the boundary layer. Unlike GKPINN, ASPINN reduces the number of fully connected layers, thereby lowering the training cost more effectively; it also theoretically approximates the solution at the boundary layer more accurately, and its accuracy likewise improves on GKPINN. We demonstrate the effectiveness of ASPINN by solving diverse classes of SPDEs, which clearly shows that the method is promising for boundary layer problems. Furthermore, we replace the MLP with Chebyshev Kolmogorov-Arnold Networks (Chebyshev-KAN), achieving better performance in various experiments.
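The abstract describes the decomposition idea but not its implementation. Below is a minimal sketch of an exponential-layer ansatz for a toy 1D problem -eps*u'' + u' = 1 with a boundary layer at x = 1; the class names, network shapes, and toy equation are illustrative assumptions, not the authors' code.

```python
import torch
import torch.nn as nn

eps = 1e-3  # singular perturbation parameter; the layer width is O(eps)

def mlp(width=32):
    return nn.Sequential(
        nn.Linear(1, width), nn.Tanh(),
        nn.Linear(width, width), nn.Tanh(),
        nn.Linear(width, 1))

class BoundaryLayerAnsatz(nn.Module):
    """u(x) = smooth(x) + exp((x - 1)/eps) * layer(x).

    The fixed exponential factor carries the sharp boundary-layer
    profile, so both sub-networks only have to represent slowly
    varying functions, which plain PINNs handle well.
    """
    def __init__(self):
        super().__init__()
        self.smooth = mlp()
        self.layer = mlp()

    def forward(self, x):
        return self.smooth(x) + torch.exp((x - 1.0) / eps) * self.layer(x)

model = BoundaryLayerAnsatz()
x = torch.rand(256, 1, requires_grad=True)     # interior collocation points
u = model(x)
ones = torch.ones_like(u)
u_x = torch.autograd.grad(u, x, ones, create_graph=True)[0]
u_xx = torch.autograd.grad(u_x, x, ones, create_graph=True)[0]
residual = -eps * u_xx + u_x - 1.0             # toy SPDE: -eps u'' + u' = 1
loss = residual.pow(2).mean()                  # boundary terms omitted here
loss.backward()
```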
Related papers
- General-Kindred Physics-Informed Neural Network to the Solutions of Singularly Perturbed Differential Equations [11.121415128908566]
We propose the General-Kindred Physics-Informed Neural Network (GKPINN) for solving Singular Perturbation Differential Equations (SPDEs).
This approach utilizes prior knowledge of the boundary layer from the equation and establishes a novel network to assist PINN in approximating the boundary layer.
The research findings underscore the exceptional performance of GKPINN, which reduces the $L$ error by two to four orders of magnitude compared to the established PINN methodology.
arXiv Detail & Related papers (2024-08-27T02:03:22Z)
- Residual resampling-based physics-informed neural network for neutron diffusion equations [7.105073499157097]
The neutron diffusion equation plays a pivotal role in the analysis of nuclear reactors.
Traditional PINN approaches often utilize a fully connected network (FCN) architecture.
The proposed R2-PINN effectively overcomes the limitations inherent in current methods, providing more accurate and robust solutions for neutron diffusion equations.
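The summary names residual resampling without spelling out the selection rule. Here is a hedged sketch of one common instance, greedy selection of high-residual candidates; `resample_collocation` and the `pde_residual` callable are hypothetical names, and the actual R2-PINN scheme may differ.

```python
import torch

def resample_collocation(model, pde_residual, n_keep, pool_size=4096):
    # Draw a large candidate pool and keep the points where the PDE
    # residual of the current network is largest, so subsequent epochs
    # concentrate on the poorly resolved regions of the domain.
    x_pool = torch.rand(pool_size, 1, requires_grad=True)
    r = pde_residual(model, x_pool).abs().squeeze(-1)
    keep = torch.topk(r, n_keep).indices
    return x_pool[keep].detach().requires_grad_(True)
```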
arXiv Detail & Related papers (2024-06-23T13:49:31Z)
- Error Analysis and Numerical Algorithm for PDE Approximation with Hidden-Layer Concatenated Physics Informed Neural Networks [0.9693477883827689]
We present the hidden-layer concatenated physics-informed neural network (HLConcPINN) method.
It combines hidden-layer concatenated feed-forward neural networks, a modified block time marching strategy, and a physics-informed approach for approximating partial differential equations (PDEs).
We show that its approximation error of the solution can be effectively controlled by the training loss for dynamic simulations with long time horizons.
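As a reading aid, here is a minimal sketch of what hidden-layer concatenation means architecturally: the output layer is fed the concatenation of every hidden state rather than only the last one. Widths, depth, and names are illustrative assumptions; the paper's exact architecture and its block time marching are not reproduced.

```python
import torch
import torch.nn as nn

class HLConcNet(nn.Module):
    """Feed-forward net whose output layer sees all hidden layers.

    The hidden states h1..hL are concatenated before the final linear
    map, instead of only hL feeding the output as in a plain MLP.
    """
    def __init__(self, in_dim=2, width=32, depth=3, out_dim=1):
        super().__init__()
        dims = [in_dim] + [width] * depth
        self.hidden = nn.ModuleList(
            nn.Linear(dims[i], dims[i + 1]) for i in range(depth))
        self.out = nn.Linear(width * depth, out_dim)

    def forward(self, x):
        states = []
        h = x
        for lin in self.hidden:
            h = torch.tanh(lin(h))
            states.append(h)
        return self.out(torch.cat(states, dim=-1))
```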
arXiv Detail & Related papers (2024-06-10T15:12:53Z)
- Globally Optimal Training of Neural Networks with Threshold Activation Functions [63.03759813952481]
We study weight decay regularized training problems of deep neural networks with threshold activations.
We derive a simplified convex optimization formulation when the dataset can be shattered at a certain layer of the network.
arXiv Detail & Related papers (2023-03-06T18:59:13Z)
- Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been demonstrated to be effective in solving forward and inverse differential equation problems.
However, PINNs run into training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
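To make the implicit update concrete, here is a small sketch on a stiff quadratic, where the implicit step has a closed form; for an actual PINN loss the update must be solved iteratively. The toy curvature matrix and step size are assumptions for illustration.

```python
import torch

# One implicit gradient step solves theta_new = theta - lr * grad L(theta_new),
# i.e. backward Euler on the gradient flow. For the quadratic
# L(theta) = 0.5 * theta^T H theta it has the closed form
# theta_new = (I + lr*H)^{-1} theta and is stable for any lr > 0;
# for a real PINN loss it is solved iteratively (fixed point / Newton).
H = torch.diag(torch.tensor([1.0, 1.0e4]))  # stiff, multi-scale curvature
lr = 0.05                                   # explicit GD diverges here (lr > 2/1e4)
I = torch.eye(2)
theta = torch.tensor([1.0, 1.0])
for _ in range(10):
    theta = torch.linalg.solve(I + lr * H, theta)  # implicit update
```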
arXiv Detail & Related papers (2023-03-03T08:17:47Z)
- Ensemble learning for Physics Informed Neural Networks: a Gradient Boosting approach [10.250994619846416]
We present a new training paradigm referred to as "gradient boosting" (GB).
Instead of learning the solution of a given PDE using a single neural network directly, our algorithm employs a sequence of neural networks to achieve a superior outcome.
This work also unlocks the door to employing ensemble learning techniques in PINNs.
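The stagewise pattern behind such gradient-boosting training can be sketched as follows; `make_net` and `pde_loss` are hypothetical callables, and the paper's exact boosting rule may differ.

```python
import torch

def train_gb_pinn(make_net, pde_loss, n_stages=3, steps=2000):
    # Additive, stagewise training: each stage fits a fresh network
    # while all earlier members stay frozen, so the new member learns
    # to correct the residual the current ensemble still makes.
    members = []
    for _ in range(n_stages):
        net = make_net()
        opt = torch.optim.Adam(net.parameters(), lr=1e-3)
        ensemble = lambda x, net=net: net(x) + sum(m(x) for m in members)
        for _ in range(steps):
            opt.zero_grad()
            loss = pde_loss(ensemble)   # PDE + boundary loss of the sum
            loss.backward()
            opt.step()
        for p in net.parameters():      # freeze the finished stage
            p.requires_grad_(False)
        members.append(net)
    return lambda x: sum(m(x) for m in members)
```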
arXiv Detail & Related papers (2023-02-25T19:11:44Z)
- Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly complex coupled ordinary differential equations (ODEs).
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z)
- Anisotropic, Sparse and Interpretable Physics-Informed Neural Networks for PDEs [0.0]
We present ASPINN, an anisotropic extension of our earlier work SPINN (Sparse, Physics-informed, and Interpretable Neural Networks), to solve PDEs.
ASPINNs generalize radial basis function networks.
We also streamline the training of ASPINNs into a form that is closer to that of supervised learning algorithms.
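A minimal sketch of an anisotropic Gaussian RBF layer, the kind of generalization named above: replacing the scalar width of each node with a learnable matrix lets its receptive field stretch and rotate. Shapes and names are illustrative assumptions, not the paper's implementation.

```python
import torch
import torch.nn as nn

class AnisotropicRBF(nn.Module):
    """RBF layer with a learnable full linear map per node.

    An isotropic Gaussian uses exp(-||x - c||^2 / w^2); making the
    scaling a full matrix A_i per node gives an anisotropic kernel
    exp(-||A_i (x - c_i)||^2).
    """
    def __init__(self, n_nodes=16, dim=2):
        super().__init__()
        self.centers = nn.Parameter(torch.rand(n_nodes, dim))
        self.A = nn.Parameter(torch.eye(dim).repeat(n_nodes, 1, 1))
        self.weights = nn.Parameter(torch.zeros(n_nodes))

    def forward(self, x):                        # x: (batch, dim)
        d = x[:, None, :] - self.centers[None]   # (batch, nodes, dim)
        z = torch.einsum('nij,bnj->bni', self.A, d)
        phi = torch.exp(-(z ** 2).sum(-1))       # anisotropic Gaussian
        return phi @ self.weights                # (batch,)
```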
arXiv Detail & Related papers (2022-07-01T12:24:43Z)
- Revisiting PINNs: Generative Adversarial Physics-informed Neural Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs).
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
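The point-weighting idea admits a very small sketch: multiplicatively boost the weights of collocation points that still have large residuals, then renormalize. The update rule below is a generic AdaBoost-flavored assumption, not necessarily the paper's exact formula.

```python
import torch

def update_point_weights(weights, residuals, eta=0.5):
    # Multiplicatively boost the weight of collocation points whose PDE
    # residual is still large, then renormalize to a distribution, so
    # the weighted loss  (weights * residuals**2).sum()  focuses the
    # next epochs on the hardest points; eta sets the aggressiveness.
    r = residuals.abs()
    w = weights * torch.exp(eta * r / r.max())
    return w / w.sum()
```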
arXiv Detail & Related papers (2022-05-18T06:50:44Z)
- Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by simply filtering out "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z)
- Comparative Analysis of Interval Reachability for Robust Implicit and Feedforward Neural Networks [64.23331120621118]
We use interval reachability analysis to obtain robustness guarantees for implicit neural networks (INNs).
INNs are a class of implicit learning models that use implicit equations as layers.
We show that our approach performs at least as well as, and generally better than, applying state-of-the-art interval bound propagation methods to INNs.
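For orientation, here is the basic interval-arithmetic step that such reachability analyses build on, propagating a box exactly through an affine map. The paper's actual contribution, bounding the fixed point of the implicit layer, is not reproduced here.

```python
import torch

def affine_interval(W, b, lo, hi):
    # Exact interval propagation through x -> W x + b: split W into its
    # positive and negative parts so each output bound pairs with the
    # matching input bound.
    Wp, Wn = W.clamp(min=0), W.clamp(max=0)
    lower = Wp @ lo + Wn @ hi + b
    upper = Wp @ hi + Wn @ lo + b
    return lower, upper
```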
arXiv Detail & Related papers (2022-04-01T03:31:27Z)