Learning Traveling Solitary Waves Using Separable Gaussian Neural
Networks
- URL: http://arxiv.org/abs/2403.04883v1
- Date: Thu, 7 Mar 2024 20:16:18 GMT
- Title: Learning Traveling Solitary Waves Using Separable Gaussian Neural
Networks
- Authors: Siyuan Xing and Efstathios G. Charalampidis
- Abstract summary: We apply a machine-learning approach to learn traveling solitary waves across various families of partial differential equations (PDEs)
Our approach integrates a novel interpretable neural network (NN) architecture into the framework of Physics-Informed Neural Networks (PINNs)
- Score: 0.9065034043031668
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we apply a machine-learning approach to learn traveling
solitary waves across various families of partial differential equations
(PDEs). Our approach integrates a novel interpretable neural network (NN)
architecture, called Separable Gaussian Neural Networks (SGNN), into the
framework of Physics-Informed Neural Networks (PINNs). Unlike the traditional
PINNs that treat spatial and temporal data as independent inputs, the present
method leverages wave characteristics to transform data into the so-called
co-traveling wave frame. This adaptation effectively addresses the issue of
propagation failure in PINNs when applied to large computational domains. Here,
the SGNN architecture demonstrates robust approximation capabilities for
single-peakon, multi-peakon, and stationary solutions within the
(1+1)-dimensional $b$-family of PDEs. In addition, we expand our
investigations and explore not only peakon solutions in the $ab$-family but
also compacton solutions in the (2+1)-dimensional Rosenau-Hyman family of
PDEs. A comparative analysis with multilayer perceptrons (MLPs) reveals that
SGNN achieves comparable accuracy with fewer than one-tenth as many neurons,
underscoring its efficiency and
potential for broader application in solving complex nonlinear PDEs.
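To make the abstract's two key ingredients concrete, the sketch below evaluates a separable-Gaussian ansatz in the co-traveling wave frame xi = x - c*t. It is a minimal illustration under my own assumptions, not the authors' code: the specific parameterization u(xi) = sum_k w_k exp(-(xi - mu_k)^2 / (2 sigma_k^2)), the neuron count, and all variable names are hypothetical.

    import numpy as np

    rng = np.random.default_rng(0)

    def sgnn_1d(xi, w, mu, sigma):
        # Separable-Gaussian ansatz in 1D:
        #   u(xi) = sum_k w_k * exp(-(xi - mu_k)^2 / (2 * sigma_k^2))
        # xi: (N,) points in the co-traveling frame; w, mu, sigma: (K,) trainable parameters.
        r = (xi[:, None] - mu[None, :]) / sigma[None, :]
        return np.exp(-0.5 * r**2) @ w

    # Co-traveling wave frame: for a wave of speed c, xi = x - c*t renders the
    # solitary wave (nearly) stationary, so a small network can represent it
    # even on a large spatio-temporal domain.
    c, t = 1.0, 3.0                          # illustrative wave speed and time
    x = np.linspace(-20.0, 20.0, 401)        # spatial grid
    xi = x - c * t                           # transformed coordinate

    K = 16                                   # number of Gaussian neurons (illustrative)
    w = rng.normal(size=K)
    mu = np.linspace(xi.min(), xi.max(), K)
    sigma = np.full(K, 1.0)
    u = sgnn_1d(xi, w, mu, sigma)            # candidate solution on the spatial grid

In a PINN-style setting, w, mu, and sigma (and possibly the wave speed c) would be trained by penalizing the PDE residual expressed in the xi variable; in higher dimensions the Gaussian factors multiply across coordinates, which is what makes the architecture separable.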
Related papers
- General-Kindred Physics-Informed Neural Network to the Solutions of Singularly Perturbed Differential Equations [11.121415128908566]
We propose the General-Kindred Physics-Informed Neural Network (GKPINN) for solving Singularly Perturbed Differential Equations (SPDEs)
This approach utilizes prior knowledge of the boundary layer from the equation and establishes a novel network to assist the PINN in approximating the boundary layer.
The research findings underscore the exceptional performance of GKPINN, which reduces the $L_2$ error by two to four orders of magnitude compared to the established PINN methodology.
arXiv Detail & Related papers (2024-08-27T02:03:22Z) - Functional Tensor Decompositions for Physics-Informed Neural Networks [8.66932181641177]
Physics-Informed Neural Networks (PINNs) have shown continuous and increasing promise in approximating partial differential equations (PDEs)
We propose a generalized PINN version of the classical variable separable method.
Our methodology significantly enhances the performance of PINNs, as evidenced by improved results on complex high-dimensional PDEs.
arXiv Detail & Related papers (2024-08-23T14:24:43Z) - Solving PDEs on Spheres with Physics-Informed Convolutional Neural Networks [17.69666422395703]
Physics-informed neural networks (PINNs) have been demonstrated to be efficient in solving partial differential equations (PDEs)
In this paper, we establish rigorous analysis of the physics-informed convolutional neural network (PICNN) for solving PDEs on the sphere.
arXiv Detail & Related papers (2023-08-18T14:58:23Z) - PINNsFormer: A Transformer-Based Framework For Physics-Informed Neural Networks [22.39904196850583]
Physics-Informed Neural Networks (PINNs) have emerged as a promising deep learning framework for approximating numerical solutions to partial differential equations (PDEs)
We introduce a novel Transformer-based framework, termed PINNsFormer, designed to address this limitation.
PINNsFormer achieves superior generalization ability and accuracy across various scenarios, including PINNs failure modes and high-dimensional PDEs.
arXiv Detail & Related papers (2023-07-21T18:06:27Z) - Enforcing Continuous Physical Symmetries in Deep Learning Network for
Solving Partial Differential Equations [3.6317085868198467]
We introduce a new method, the symmetry-enhanced physics-informed neural network (SPINN), in which the invariant surface conditions induced by the Lie symmetries of PDEs are embedded into the loss function of the PINN.
We show that SPINN performs better than PINN with fewer training points and a simpler neural network architecture.
arXiv Detail & Related papers (2022-06-19T00:44:22Z) - Auto-PINN: Understanding and Optimizing Physics-Informed Neural
Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z) - Revisiting PINNs: Generative Adversarial Physics-informed Neural
Networks and Point-weighting Method [70.19159220248805]
Physics-informed neural networks (PINNs) provide a deep learning framework for numerically solving partial differential equations (PDEs)
We propose the generative adversarial physics-informed neural network (GA-PINN), which integrates the generative adversarial (GA) mechanism with the structure of PINNs.
Inspired by the weighting strategy of the AdaBoost method, we then introduce a point-weighting (PW) method to improve the training efficiency of PINNs.
arXiv Detail & Related papers (2022-05-18T06:50:44Z) - Improved Training of Physics-Informed Neural Networks with Model
Ensembles [81.38804205212425]
We propose to expand the solution interval gradually to make the PINN converge to the correct solution.
All ensemble members converge to the same solution in the vicinity of observed data.
We show experimentally that the proposed method can improve the accuracy of the found solution.
arXiv Detail & Related papers (2022-04-11T14:05:34Z) - Learning Physics-Informed Neural Networks without Stacked
Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by a Gaussian-smoothed model and show that, via Stein's identity, the second-order derivatives can be calculated efficiently without back-propagation (see the sketch after this list).
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - dNNsolve: an efficient NN-based PDE solver [62.997667081978825]
We introduce dNNsolve, which makes use of dual neural networks to solve ODEs/PDEs.
We show that dNNsolve is capable of solving a broad range of ODEs/PDEs in 1, 2 and 3 spacetime dimensions.
arXiv Detail & Related papers (2021-03-15T19:14:41Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
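As referenced in the summary of "Learning Physics-Informed Neural Networks without Stacked Back-propagation" above, the following sketch shows the standard Stein's-identity estimators for derivatives of a Gaussian-smoothed surrogate, which need only forward evaluations of the model. This is my own minimal reading of that summary, with illustrative names, constants, and test function, not the authors' implementation.

    import numpy as np

    def stein_grad_and_hess(u, x, s=0.3, n_samples=20000, seed=0):
        # Monte Carlo estimates of the gradient and Hessian of the Gaussian-smoothed
        # surrogate u_s(x) = E_{d ~ N(0, s^2 I)}[u(x + d)], via Stein's identity:
        #   grad u_s(x) = E[d * u(x + d)] / s^2
        #   hess u_s(x) = E[(d d^T - s^2 I) * u(x + d)] / s^4
        # Only forward evaluations of u are needed -- no back-propagation.
        rng = np.random.default_rng(seed)
        dim = x.shape[0]
        d = rng.normal(scale=s, size=(n_samples, dim))
        vals = np.array([u(x + di) for di in d])
        grad = (d * vals[:, None]).mean(axis=0) / s**2
        outer = d[:, :, None] * d[:, None, :] - (s**2) * np.eye(dim)
        hess = (outer * vals[:, None, None]).mean(axis=0) / s**4
        return grad, hess

    # Sanity check on a known function: u(x) = sin(x0) + x1^2 has Laplacian -sin(x0) + 2.
    u = lambda x: np.sin(x[0]) + x[1] ** 2
    g, H = stein_grad_and_hess(u, np.array([0.3, -0.5]))
    print(g, np.trace(H))  # roughly [cos(0.3), -1.0] and -sin(0.3) + 2, up to smoothing bias and MC noise

Such estimators trade back-propagation for sampling noise and a smoothing bias controlled by s, which is the usual consideration when second-order PDE residual terms are needed during training.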
This list is automatically generated from the titles and abstracts of the papers on this site.