Data-driven soliton mappings for integrable fractional nonlinear wave
equations via deep learning with Fourier neural operator
- URL: http://arxiv.org/abs/2209.14291v1
- Date: Mon, 29 Aug 2022 06:48:26 GMT
- Title: Data-driven soliton mappings for integrable fractional nonlinear wave
equations via deep learning with Fourier neural operator
- Authors: Ming Zhong and Zhenya Yan
- Abstract summary: We extend the Fourier neural operator (FNO) to discover the soliton mapping between two function spaces.
To be specific, the recently proposed fractional nonlinear Schrödinger (fNLS), fractional Korteweg-de Vries (fKdV), fractional modified Korteweg-de Vries (fmKdV), and fractional sine-Gordon (fsineG) equations are studied.
- Score: 7.485410656333205
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we first extend the Fourier neural operator (FNO) to
discover the soliton mapping between two function spaces, where one is the
fractional-order index space $\{\epsilon|\epsilon\in (0, 1)\}$ appearing in the
fractional integrable nonlinear wave equations, while the other denotes the
solitonic solution function space. To be specific, the recently proposed
fractional nonlinear Schr\"{o}dinger (fNLS), fractional Korteweg-de Vries
(fKdV), fractional modified Korteweg-de Vries (fmKdV), and fractional
sine-Gordon (fsineG) equations are studied in this paper. We present the
training and evaluation progress by recording the training and test losses. To
illustrate the accuracy, the data-driven solitons are also compared to the
exact solutions. Moreover, we consider the influence of several critical
factors (e.g., activation functions, including ReLU$(x)$, Sigmoid$(x)$,
Swish$(x)$, and $x\tanh(x)$, and the depth of the fully connected layers) on
the performance of the FNO algorithm. We also employ a new activation function,
namely $x\tanh(x)$, which has not previously been used in deep learning. The
results obtained in this paper may help to further understand neural networks
in fractional integrable nonlinear wave systems and the mappings between two
function spaces.
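As a quick numerical illustration, the four activation functions compared in the abstract can be written in a few lines of NumPy. This is a standalone sketch for reference only; the paper applies them inside the FNO architecture, which is not reproduced here:

```python
import numpy as np

def relu(x):
    # ReLU(x): zero for negative inputs, identity for positive inputs
    return np.maximum(0.0, x)

def sigmoid(x):
    # Sigmoid(x): smooth squashing of the real line into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))

def swish(x):
    # Swish(x) = x * Sigmoid(x): smooth, non-monotone near the origin
    return x * sigmoid(x)

def xtanh(x):
    # x * tanh(x), the activation highlighted in the paper:
    # even, nonnegative, ~x^2 near 0, ~|x| for large |x|
    return x * np.tanh(x)
```

Note that $x\tanh(x)$ is an even function, so unlike ReLU or Swish it treats positive and negative pre-activations symmetrically.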
Related papers
- On the estimation rate of Bayesian PINN for inverse problems [10.100602879566782]
Solving partial differential equations (PDEs) and their inverse problems using Physics-informed neural networks (PINNs) is a rapidly growing approach in the physics and machine learning community.
We study the behavior of a Bayesian PINN estimator of the solution of a PDE from $n$ independent noisy measurements of the solution.
arXiv Detail & Related papers (2024-06-21T01:13:18Z) - Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - Neural Operators with Localized Integral and Differential Kernels [77.76991758980003]
We present a principled approach to operator learning that can capture local features under two frameworks.
We prove that we obtain differential operators under an appropriate scaling of the kernel values of CNNs.
To obtain local integral operators, we utilize suitable basis representations for the kernels based on discrete-continuous convolutions.
arXiv Detail & Related papers (2024-02-26T18:59:31Z) - Efficient uniform approximation using Random Vector Functional Link
networks [0.0]
A Random Vector Functional Link (RVFL) network is a depth-2 neural network with random inner nodes and biases.
We show that an RVFL with ReLU activation can approximate Lipschitz target functions.
Our method of proof is rooted in theory and harmonic analysis.
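The RVFL construction described above (random, frozen inner weights and biases plus a trained linear readout) can be sketched in NumPy as follows. The width, weight distributions, and function names here are illustrative assumptions, not the paper's exact setup:

```python
import numpy as np

rng = np.random.default_rng(0)

def rvfl_fit(X, y, width=200):
    # Draw the depth-2 network's inner weights and biases once and
    # freeze them; only the outer (readout) layer is trained,
    # which reduces to a linear least-squares solve.
    d = X.shape[1]
    W = rng.normal(size=(d, width))
    b = rng.normal(size=width)
    H = np.maximum(0.0, X @ W + b)  # random ReLU features
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return W, b, beta

def rvfl_predict(X, W, b, beta):
    # Evaluate the trained readout on the same random features
    return np.maximum(0.0, X @ W + b) @ beta
```

For example, fitting the Lipschitz target $y = |x|$ on $[-1, 1]$ with a few hundred random ReLU features yields a small training error, consistent with the approximation result summarized above.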
arXiv Detail & Related papers (2023-06-30T09:25:03Z) - Pseudo-Differential Neural Operator: Generalized Fourier Neural Operator
for Learning Solution Operators of Partial Differential Equations [14.43135909469058]
We propose a novel pseudo-differential integral operator (PDIO) to analyze and generalize the Fourier integral operator in the FNO.
We experimentally validate the effectiveness of the proposed model by utilizing Darcy flow and the Navier-Stokes equation.
arXiv Detail & Related papers (2022-01-28T07:22:32Z) - Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z) - Multiwavelet-based Operator Learning for Differential Equations [3.0824316066680484]
We introduce a multiwavelet-based neural operator learning scheme that compresses the associated operator's kernel.
By explicitly embedding the inverse multiwavelet filters, we learn the projection of the kernel onto fixed multiwavelet bases.
Compared with existing neural operator approaches, our model shows significantly higher accuracy and achieves state-of-the-art performance on a range of datasets.
arXiv Detail & Related papers (2021-09-28T03:21:47Z) - Learning Frequency Domain Approximation for Binary Neural Networks [68.79904499480025]
We propose to estimate the gradient of the sign function in the Fourier frequency domain using a combination of sine functions for training BNNs.
The experiments on several benchmark datasets and neural architectures illustrate that the binary network learned using our method achieves the state-of-the-art accuracy.
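The frequency-domain idea above rests on the classical Fourier (sine-series) decomposition of the sign/square-wave function; the truncated series is smooth, so it admits a usable gradient. A minimal sketch of that decomposition follows (the paper's actual estimator combines the sine terms adaptively during training, which is not reproduced here):

```python
import numpy as np

def sign_fourier(x, n_terms, period=2.0):
    # Truncated Fourier sine series of the square wave with the
    # given period: (4/pi) * sum over odd m of sin(2*pi*m*x/period)/m.
    # For large n_terms this approaches sign(x) on (-period/2, period/2).
    s = np.zeros_like(x, dtype=float)
    for k in range(n_terms):
        m = 2 * k + 1
        s += np.sin(2.0 * np.pi * m * x / period) / m
    return (4.0 / np.pi) * s
```

Because each sine term is differentiable, differentiating this surrogate term by term gives a smooth stand-in for the (zero almost everywhere) derivative of sign, which is the crux of the gradient-estimation trick.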
arXiv Detail & Related papers (2021-03-01T08:25:26Z) - Fourier Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
We formulate a new neural operator by parameterizing the integral kernel directly in Fourier space.
We perform experiments on Burgers' equation, Darcy flow, and Navier-Stokes equation.
It is up to three orders of magnitude faster compared to traditional PDE solvers.
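The core FNO operation summarized above, multiplying a truncated set of Fourier modes by a learned kernel, can be sketched in NumPy as the linear part of a single 1-D Fourier layer. The 1-D setting and names are illustrative; a real FNO stacks several such layers with pointwise nonlinearities and channel mixing:

```python
import numpy as np

def spectral_conv_1d(u, weights, n_modes):
    # u       : real signal on a uniform periodic grid, shape (n,)
    # weights : complex multipliers for the lowest n_modes frequencies
    #           (the kernel parameterized directly in Fourier space)
    # n_modes : number of retained Fourier modes (the FNO truncation)
    u_hat = np.fft.rfft(u)                         # to Fourier space
    out_hat = np.zeros_like(u_hat)
    out_hat[:n_modes] = weights * u_hat[:n_modes]  # learned kernel acts here
    return np.fft.irfft(out_hat, n=len(u))         # back to physical space
```

With all-ones weights and no truncation this layer is the identity; truncating to a few modes makes it a learned low-pass filter, which is why the method is both resolution-invariant and fast.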
arXiv Detail & Related papers (2020-10-18T00:34:21Z) - nPINNs: nonlocal Physics-Informed Neural Networks for a parametrized
nonlocal universal Laplacian operator. Algorithms and Applications [0.0]
Physics-informed neural networks (PINNs) are effective in solving inverse problems based on differential and integral equations with sparse, unstructured and multi-fidelity data.
In this paper, we extend PINNs to parameter and function inference for integral equations, including the nonlocal Poisson equation and a nonlocal turbulence model.
Our results show that nPINNs can jointly infer this function as well as $\delta$.
arXiv Detail & Related papers (2020-04-08T21:48:30Z) - Complexity of Finding Stationary Points of Nonsmooth Nonconvex Functions [84.49087114959872]
We provide the first non-asymptotic analysis for finding stationary points of nonsmooth, nonconvex functions.
In particular, we study Hadamard semi-differentiable functions, perhaps the largest class of nonsmooth functions.
arXiv Detail & Related papers (2020-02-10T23:23:04Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.