Data-driven rogue waves and parameter discovery in the defocusing NLS
equation with a potential using the PINN deep learning
- URL: http://arxiv.org/abs/2012.09984v1
- Date: Fri, 18 Dec 2020 00:09:21 GMT
- Title: Data-driven rogue waves and parameter discovery in the defocusing NLS
equation with a potential using the PINN deep learning
- Authors: Li Wang, Zhenya Yan
- Abstract summary: We use the multi-layer PINN deep learning method to study the data-driven rogue wave solutions of the defocusing nonlinear Schrödinger (NLS) equation with the time-dependent potential.
Results will be useful to further discuss the rogue wave solutions of the defocusing NLS equation with a potential in the study of deep learning neural networks.
- Score: 7.400475825464313
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Physics-informed neural networks (PINNs) can be used to learn
nonlinear partial differential equations and other types of physical models. In
this paper, we use the multi-layer PINN deep learning method to study the
data-driven rogue wave solutions of the defocusing nonlinear Schr\"odinger
(NLS) equation with the time-dependent potential by considering several initial
conditions such as the rogue wave, Jacobi elliptic cosine function,
two-Gaussian function, or three-hyperbolic-secant function, and periodic
boundary conditions. Moreover, the multi-layer PINN algorithm can also be used
to learn the parameter of the defocusing NLS equation with the time-dependent
potential in the sense of the rogue wave solution. These results will be
useful to further discuss the rogue wave solutions of the defocusing NLS
equation with a potential in the study of deep learning neural networks.
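The PDE residual at the heart of this PINN setup can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the normalization of the defocusing NLS equation i q_t + 0.5 q_xx - |q|^2 q - V(x, t) q = 0, the network size, and the placeholder potential V(x, t) are all assumptions made for the sketch.

```python
import torch

# Illustrative sketch only (not the authors' code): form the PINN residual of
# a defocusing NLS equation with a time-dependent potential,
#   i q_t + 0.5 q_xx - |q|^2 q - V(x, t) q = 0,
# where the normalization, the network size, and the placeholder potential V
# are assumptions made for this sketch.
torch.manual_seed(0)

net = torch.nn.Sequential(            # maps (x, t) -> (Re q, Im q)
    torch.nn.Linear(2, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 32), torch.nn.Tanh(),
    torch.nn.Linear(32, 2),
)

def potential(x, t):
    # hypothetical time-dependent potential, for illustration only
    return 0.1 * x**2 * torch.exp(-t)

def nls_residual(x, t):
    x = x.requires_grad_(True)
    t = t.requires_grad_(True)
    out = net(torch.stack([x, t], dim=-1))
    u, v = out[..., 0], out[..., 1]   # q = u + i v

    def d(f, wrt):                    # derivative via automatic differentiation
        return torch.autograd.grad(f, wrt, torch.ones_like(f), create_graph=True)[0]

    u_t, v_t = d(u, t), d(v, t)
    u_xx, v_xx = d(d(u, x), x), d(d(v, x), x)
    mod2 = u**2 + v**2                # |q|^2
    V = potential(x, t)
    # real and imaginary parts of i q_t + 0.5 q_xx - |q|^2 q - V q
    f_re = -v_t + 0.5 * u_xx - mod2 * u - V * u
    f_im = u_t + 0.5 * v_xx - mod2 * v - V * v
    return f_re, f_im

x = torch.linspace(-1.0, 1.0, 16)
t = torch.full((16,), 0.5)
f_re, f_im = nls_residual(x, t)
loss = (f_re**2 + f_im**2).mean()     # PDE-residual term of the PINN loss
```

In a full PINN, this residual term would be combined with losses on the chosen initial condition (e.g. the rogue wave) and on the periodic boundary conditions, and the total loss minimized jointly.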
Related papers
- Solving Poisson Equations using Neural Walk-on-Spheres [80.1675792181381]
We propose Neural Walk-on-Spheres (NWoS), a novel neural PDE solver for the efficient solution of high-dimensional Poisson equations.
We demonstrate the superiority of NWoS in accuracy, speed, and computational costs.
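For context, the classical walk-on-spheres estimator that NWoS builds on can be sketched in a few lines. This is an illustration with an assumed test problem, not the paper's method: it solves the Laplace equation on the unit disk with Dirichlet data g(x, y) = x, whose exact harmonic extension is u(x, y) = x.

```python
import numpy as np

# Classical walk-on-spheres (WoS): repeatedly jump to a uniform point on the
# largest circle centered at the current point that fits inside the domain,
# until within eps of the boundary, then read off the boundary data.
rng = np.random.default_rng(0)

def boundary_g(p):
    return p[0]  # Dirichlet data g(x, y) = x on the unit circle

def walk_on_spheres(p0, eps=1e-3, max_steps=1000):
    p = np.array(p0, dtype=float)
    for _ in range(max_steps):
        r = 1.0 - np.linalg.norm(p)   # distance from p to the unit circle
        if r < eps:
            break
        theta = rng.uniform(0.0, 2.0 * np.pi)
        p = p + r * np.array([np.cos(theta), np.sin(theta)])
    return boundary_g(p / np.linalg.norm(p))  # project onto the boundary

# Monte Carlo estimate of u(0.3, 0); the exact harmonic solution gives 0.3.
estimate = np.mean([walk_on_spheres((0.3, 0.0)) for _ in range(2000)])
```

Each walk terminates in O(log(1/eps)) steps, which is what makes the scheme attractive in high dimensions.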
arXiv Detail & Related papers (2024-06-05T17:59:22Z) - Learning solutions of parametric Navier-Stokes with physics-informed
neural networks [0.3989223013441816]
We leverage Physics-Informed Neural Networks (PINNs) to learn solution functions of parametric Navier-Stokes equations (NSE).
We treat the parameter(s) of interest as inputs to the PINNs along with the coordinates, and train the PINNs on numerical solutions of the parametric PDEs for instances of the parameters.
We show that our proposed approach yields PINN models that learn the solution functions while ensuring that flow predictions are consistent with the conservation laws of mass and momentum.
arXiv Detail & Related papers (2024-02-05T16:19:53Z) - Deep learning soliton dynamics and complex potentials recognition for 1D
and 2D PT-symmetric saturable nonlinear Schr\"odinger equations [0.43512163406552]
We extend the physics-informed neural networks (PINNs) to learn data-driven stationary and non-stationary solitons of 1D and 2D saturable nonlinear Schrödinger equations.
arXiv Detail & Related papers (2023-09-29T14:49:24Z) - Data-driven localized waves and parameter discovery in the massive
Thirring model via extended physics-informed neural networks with interface
zones [3.522950356329991]
We study data-driven localized wave solutions and parameter discovery in the massive Thirring (MT) model via the deep learning.
For higher-order localized wave solutions, we employ the extended PINNs (XPINNs) with domain decomposition.
Experimental results show that this improved version of XPINNs reduces the computational complexity and achieves a faster convergence rate.
arXiv Detail & Related papers (2023-09-29T13:50:32Z) - Error Analysis of Physics-Informed Neural Networks for Approximating
Dynamic PDEs of Second Order in Time [1.123111111659464]
We consider the approximation of a class of dynamic partial differential equations (PDE) of second order in time by the physics-informed neural network (PINN) approach.
Our analyses show that, with feed-forward neural networks having two hidden layers and the $\tanh$ activation function, the PINN approximation errors for the solution field can be effectively bounded by the training loss and the number of training data points.
We present ample numerical experiments with the new PINN algorithm for the wave equation, the Sine-Gordon equation, and the linear elastodynamic equation, which demonstrate the effectiveness of the method.
arXiv Detail & Related papers (2023-03-22T00:51:11Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
However, PINNs can become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
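The stability benefit of an implicit update can be seen on a one-dimensional quadratic toy loss. This illustrates the general idea behind ISGD, not the paper's algorithm; the values of lam and lr are arbitrary. Explicit gradient descent diverges once the step size exceeds 2 / lam, while the implicit step, solved in closed form, is stable for any positive step size.

```python
# Toy loss L(w) = 0.5 * lam * w**2. Explicit gradient descent,
#   w <- w - lr * lam * w,
# diverges when lr > 2 / lam. The implicit update solves
#   w_new = w - lr * grad L(w_new),
# which for this quadratic has the closed form w <- w / (1 + lr * lam).
lam, lr = 100.0, 0.1          # stiff curvature with a "too large" step size
w_exp = w_imp = 1.0
for _ in range(50):
    w_exp = w_exp - lr * lam * w_exp      # explicit: multiplies by (1 - 10) = -9
    w_imp = w_imp / (1.0 + lr * lam)      # implicit: multiplies by 1 / 11
```

After 50 steps the explicit iterate has blown up while the implicit iterate has converged to the minimizer w = 0.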
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Neural Basis Functions for Accelerating Solutions to High Mach Euler
Equations [63.8376359764052]
We propose an approach to solving partial differential equations (PDEs) using a set of neural networks.
We regress a set of neural networks onto a reduced order Proper Orthogonal Decomposition (POD) basis.
These networks are then used in combination with a branch network that ingests the parameters of the prescribed PDE to compute a reduced order approximation to the PDE.
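A minimal sketch of the POD step follows, with illustrative synthetic snapshot data rather than the paper's Euler-flow setup: the basis comes from the SVD of a snapshot matrix, and the reduced coordinates are exactly what the regressed networks would predict.

```python
import numpy as np

# Build a snapshot matrix whose columns are solution fields at different
# parameter values (here: random combinations of three sine modes, so the
# data are exactly rank 3), then extract the POD basis via the SVD.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 200)
modes = np.stack([np.sin(k * np.pi * x) for k in (1, 2, 3)], axis=1)  # (200, 3)
amps = rng.standard_normal((3, 40))
snapshots = modes @ amps               # (200, 40): 40 snapshots, rank 3

U, s, _ = np.linalg.svd(snapshots, full_matrices=False)
basis = U[:, :3]                       # leading POD modes
coeffs = basis.T @ snapshots           # reduced coordinates (network targets)
recon = basis @ coeffs                 # reduced-order reconstruction
rel_err = np.linalg.norm(recon - snapshots) / np.linalg.norm(snapshots)
```

Because the synthetic data are exactly rank 3, three POD modes reconstruct the snapshots to machine precision; for real flow fields one keeps as many modes as the singular-value decay warrants.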
arXiv Detail & Related papers (2022-08-02T18:27:13Z) - Auto-PINN: Understanding and Optimizing Physics-Informed Neural
Architecture [77.59766598165551]
Physics-informed neural networks (PINNs) are revolutionizing science and engineering practice by bringing the power of deep learning to bear on scientific computation.
Here, we propose Auto-PINN, which applies Neural Architecture Search (NAS) techniques to PINN design.
A comprehensive set of pre-experiments using standard PDE benchmarks allows us to probe the structure-performance relationship in PINNs.
arXiv Detail & Related papers (2022-05-27T03:24:31Z) - Learning Physics-Informed Neural Networks without Stacked
Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
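The backprop-free second derivative can be illustrated with a small Monte Carlo estimator based on Stein's identity for the Gaussian-smoothed function f_s(x) = E[f(x + s*eps)]. This is an illustration of the idea, not the paper's exact estimator; the antithetic form below reduces variance.

```python
import numpy as np

# Stein's identity for Gaussian smoothing gives, in antithetic form,
#   f_s''(x) = E[(f(x + s*eps) + f(x - s*eps) - 2 f(x)) * (eps**2 - 1)] / (2 s**2),
# which estimates a second derivative from function evaluations alone,
# with no back-propagation.
rng = np.random.default_rng(0)

def second_derivative_estimate(f, x, s=0.05, n=200_000):
    eps = rng.standard_normal(n)
    vals = (f(x + s * eps) + f(x - s * eps) - 2.0 * f(x)) * (eps**2 - 1.0)
    return vals.mean() / (2.0 * s**2)

est = second_derivative_estimate(np.sin, 1.0)
# the exact second derivative of sin at x = 1 is -sin(1) ~ -0.841;
# the estimate carries an O(s**2) smoothing bias plus Monte Carlo noise
```

The smoothing scale s trades bias against variance: smaller s tracks the true second derivative more closely but requires more samples.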
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - On the eigenvector bias of Fourier feature networks: From regression to
solving multi-scale PDEs with physics-informed neural networks [0.0]
We show that physics-informed neural networks (PINNs) struggle in cases where the target functions to be approximated exhibit high-frequency or multi-scale features.
We construct novel architectures that employ multi-scale random Fourier features and justify how such coordinate embedding layers can lead to robust and accurate PINN models.
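A multi-scale random Fourier feature embedding of this kind can be sketched as follows; the scales and the feature count m are hypothetical hyperparameters, not the paper's values.

```python
import numpy as np

# Map input coordinates through gamma(x) = [sin(2*pi*x@B), cos(2*pi*x@B)]
# with frequency matrices B drawn at several scales, so a downstream network
# can represent both low- and high-frequency components of the target.
rng = np.random.default_rng(0)

def multiscale_fourier_features(x, scales=(1.0, 10.0, 50.0), m=64):
    # x: (n, d) coordinates -> (n, 2 * m * len(scales)) features
    feats = []
    for sigma in scales:
        B = rng.standard_normal((x.shape[1], m)) * sigma
        proj = 2.0 * np.pi * x @ B
        feats.append(np.sin(proj))
        feats.append(np.cos(proj))
    return np.concatenate(feats, axis=1)

x = np.linspace(0.0, 1.0, 100)[:, None]   # 1D coordinates as a column
phi = multiscale_fourier_features(x)       # (100, 384) embedded features
```

The embedded coordinates phi, rather than the raw inputs, are then fed to the PINN's first dense layer.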
arXiv Detail & Related papers (2020-12-18T04:19:30Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.