Deep learning soliton dynamics and complex potentials recognition for 1D
and 2D PT-symmetric saturable nonlinear Schr\"odinger equations
- URL: http://arxiv.org/abs/2310.02276v1
- Date: Fri, 29 Sep 2023 14:49:24 GMT
- Title: Deep learning soliton dynamics and complex potentials recognition for 1D
and 2D PT-symmetric saturable nonlinear Schr\"odinger equations
- Authors: Jin Song, Zhenya Yan
- Abstract summary: We extend the physics-informed neural networks (PINNs) to learn data-driven stationary and non-stationary solitons of 1D and 2D saturable nonlinear Schr\"odinger equations.
- Score: 0.43512163406552
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we first extend physics-informed neural networks (PINNs)
to learn data-driven stationary and non-stationary solitons of the 1D and 2D
saturable nonlinear Schr\"odinger equations (SNLSEs) with two fundamental
PT-symmetric potentials (Scarf-II and periodic) in optical fibers. Second, we
study the data-driven inverse problems of discovering the PT-symmetric
potential functions themselves, rather than just potential parameters, in the
1D and 2D SNLSEs. In particular, we propose a modified PINNs (mPINNs) scheme to
identify the PT potential functions of the 1D and 2D SNLSEs directly from
solution data, and we also use the mPINNs method to investigate inverse
problems for 1D and 2D PT-symmetric potentials that depend on the propagation
distance z. Alternatively, the potential functions can be identified by
applying PINNs to the stationary equation of the SNLSE. Furthermore, we compare
the two network structures under different parameter conditions and show that
the predicted PT potentials achieve similarly high accuracy. These results
illustrate that the established deep neural networks can be successfully
applied to 1D and 2D SNLSEs with high accuracy. Moreover, we discuss the main
factors affecting network performance for the 1D and 2D PT-symmetric Scarf-II
and periodic potentials, including activation functions, network architectures,
and training-data sizes. In particular, we analyze in detail twelve nonlinear
activation functions, both periodic and non-periodic, and conclude that
choosing the activation function according to the form of the solution and
equation usually achieves better results.
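The forward problem above trains a network for the field psi(x, z) to satisfy the SNLSE residual at collocation points. As a minimal sketch, the residual can be written down for an assumed normalization i*psi_z + 0.5*psi_xx + V(x)*psi + |psi|^2*psi/(1 + S*|psi|^2) = 0 with a PT-symmetric Scarf-II potential; the coefficients, depths V0 and W0, and the use of finite differences in place of a PINN's automatic differentiation are all illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def scarf2_potential(x, V0=1.0, W0=0.5):
    """PT-symmetric Scarf-II potential: even real part, odd imaginary part,
    so that V(-x) = conj(V(x)). V0 and W0 are illustrative depths."""
    return V0 / np.cosh(x)**2 + 1j * W0 * np.sinh(x) / np.cosh(x)**2

def snlse_residual(psi, x, z, potential, S=1.0):
    """Physics residual of the assumed SNLSE normalization
        i*psi_z + 0.5*psi_xx + V(x)*psi + |psi|^2*psi / (1 + S*|psi|^2)
    on a (nz, nx) grid, using finite differences in place of the automatic
    differentiation a PINN would use (periodic boundary in x for simplicity)."""
    dx, dz = x[1] - x[0], z[1] - z[0]
    psi_z = np.gradient(psi, dz, axis=0)
    psi_xx = (np.roll(psi, -1, axis=1) - 2 * psi + np.roll(psi, 1, axis=1)) / dx**2
    saturable = np.abs(psi)**2 * psi / (1.0 + S * np.abs(psi)**2)
    return 1j * psi_z + 0.5 * psi_xx + potential(x)[None, :] * psi + saturable
```

In a PINN, the training loss would combine the mean squared magnitude of this residual at collocation points with a data-misfit term at the measurement points.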
Related papers
- Fourier Neural Operators for Learning Dynamics in Quantum Spin Systems [77.88054335119074]
We use FNOs to model the evolution of random quantum spin systems.
We apply FNOs to a compact set of Hamiltonian observables instead of the entire $2n$ quantum wavefunction.
arXiv Detail & Related papers (2024-09-05T07:18:09Z)
- Data-driven 2D stationary quantum droplets and wave propagations in the amended GP equation with two potentials via deep neural networks learning [0.3683202928838613]
We develop a systematic deep learning approach to solve two-dimensional (2D) stationary quantum droplets (QDs).
We investigate their wave propagation in the 2D amended Gross-Pitaevskii equation with the Lee-Huang-Yang correction and two kinds of potentials.
The learned stationary QDs are used as initial conditions for physics-informed neural networks (PINNs) to explore their evolution in a given space-time region.
arXiv Detail & Related papers (2024-09-04T00:01:15Z)
- Two-stage initial-value iterative physics-informed neural networks for simulating solitary waves of nonlinear wave equations [12.702685828829201]
We propose a new two-stage initial-value iterative neural network (IINN) algorithm for solitary wave computations.
The proposed IINN method is efficiently applied to learn some types of solutions in different nonlinear wave equations.
arXiv Detail & Related papers (2024-09-02T10:00:02Z)
- Exploring Linear Feature Disentanglement For Neural Networks [63.20827189693117]
Non-linear activation functions, e.g., Sigmoid, ReLU, and Tanh, have achieved great success in neural networks (NNs).
Because samples have complex non-linear characteristics, the objective of these activation functions is to project samples from their original feature space into a linearly separable feature space.
This phenomenon ignites our interest in exploring whether all features need to be transformed by all non-linear functions in current typical NNs.
arXiv Detail & Related papers (2022-03-22T13:09:17Z)
- Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
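The Gaussian-smoothing trick can be illustrated in one dimension: for the smoothed model f_s(x) = E[f(x + sigma*eps)], Stein's identity gives the second derivative as an expectation of function values only, so no back-propagation is needed. The scalar Monte Carlo estimator below is only a sketch of that identity; the antithetic sampling and centering are variance-reduction choices of mine, not necessarily the paper's estimator.

```python
import numpy as np

def second_derivative_stein(f, x, sigma=0.1, n=200_000, rng=None):
    """Estimate f''(x) without back-propagation via Stein's identity for the
    Gaussian-smoothed model f_s(x) = E[f(x + sigma*eps)], eps ~ N(0, 1):
        f_s''(x) = E[(eps**2 - 1) * f(x + sigma*eps)] / sigma**2.
    Antithetic pairs (+eps, -eps) and centering by f(x) reduce variance."""
    rng = rng or np.random.default_rng(0)
    eps = rng.standard_normal(n)
    even_part = 0.5 * (f(x + sigma * eps) + f(x - sigma * eps)) - f(x)
    return np.mean((eps**2 - 1.0) * even_part) / sigma**2
```

For a quadratic such as f(x) = x**2, Gaussian smoothing leaves the curvature unchanged, so the estimate should be close to the exact value 2.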
arXiv Detail & Related papers (2022-02-18T18:07:54Z)
- Characterizing possible failure modes in physics-informed neural networks [55.83255669840384]
Recent work in scientific machine learning has developed so-called physics-informed neural network (PINN) models.
We demonstrate that, while existing PINN methodologies can learn good models for relatively trivial problems, they can easily fail to learn relevant physical phenomena even for simple PDEs.
We show that these possible failure modes are not due to the lack of expressivity in the NN architecture, but that the PINN's setup makes the loss landscape very hard to optimize.
arXiv Detail & Related papers (2021-09-02T16:06:45Z)
- Incorporating NODE with Pre-trained Neural Differential Operator for Learning Dynamics [73.77459272878025]
We propose to enhance the supervised signal in learning dynamics by pre-training a neural differential operator (NDO).
The NDO is pre-trained on a class of symbolic functions and learns the mapping from the trajectory samples of these functions to their derivatives.
We provide a theoretical guarantee that the output of the NDO can approximate the ground-truth derivatives well by properly tuning the complexity of the function library.
arXiv Detail & Related papers (2021-06-08T08:04:47Z)
- Symmetry-via-Duality: Invariant Neural Network Densities from Parameter-Space Correlators [0.0]
Symmetries of network densities may be determined via dual computations of network correlation functions.
We demonstrate that the amount of symmetry in the initial density affects the accuracy of networks trained on Fashion-MNIST.
arXiv Detail & Related papers (2021-06-01T18:00:06Z)
- On the eigenvector bias of Fourier feature networks: From regression to solving multi-scale PDEs with physics-informed neural networks [0.0]
We show that physics-informed neural networks (PINNs) struggle in cases where the target functions to be approximated exhibit high-frequency or multi-scale features.
We construct novel architectures that employ multi-scale random Fourier features and justify how such coordinate embedding layers can lead to robust and accurate PINN models.
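The coordinate embedding idea can be sketched as a random Fourier feature layer applied to the PDE coordinates before the network trunk; the frequency scales used below (1.0 and 10.0) and the feature count are illustrative assumptions, since the scale of the random frequency matrix controls which frequencies the downstream PINN fits first.

```python
import numpy as np

def fourier_features(x, b):
    """Embed coordinates x of shape (n, d) as [cos(2*pi*x @ B.T), sin(2*pi*x @ B.T)].
    Rows of B are random frequencies; their scale sets the embedded bandwidth."""
    proj = 2.0 * np.pi * x @ b.T
    return np.concatenate([np.cos(proj), np.sin(proj)], axis=-1)

rng = np.random.default_rng(0)
b_low = 1.0 * rng.standard_normal((64, 1))    # biased toward low frequencies
b_high = 10.0 * rng.standard_normal((64, 1))  # biased toward high frequencies
```

A multi-scale architecture would feed several such embeddings with different scales into the network (or into parallel branches) so that both slow and fast components of the solution can be represented.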
arXiv Detail & Related papers (2020-12-18T04:19:30Z)
- Data-driven rogue waves and parameter discovery in the defocusing NLS equation with a potential using the PINN deep learning [7.400475825464313]
We use the multi-layer PINN deep learning method to study the data-driven rogue wave solutions of the defocusing nonlinear Schr\"odinger (NLS) equation with the time-dependent potential.
Results will be useful to further discuss the rogue wave solutions of the defocusing NLS equation with a potential in the study of deep learning neural networks.
arXiv Detail & Related papers (2020-12-18T00:09:21Z)
- Provably Efficient Neural Estimation of Structural Equation Model: An Adversarial Approach [144.21892195917758]
We study estimation in a class of generalized structural equation models (SEMs).
We formulate the linear operator equation as a min-max game in which both players are parameterized by neural networks (NNs), and we learn the parameters of these networks using gradient descent.
For the first time, we provide a tractable estimation procedure for SEMs based on NNs with provable convergence and without the need for sample splitting.
arXiv Detail & Related papers (2020-07-02T17:55:47Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.