Operator learning for hyperbolic partial differential equations
- URL: http://arxiv.org/abs/2312.17489v1
- Date: Fri, 29 Dec 2023 06:41:50 GMT
- Title: Operator learning for hyperbolic partial differential equations
- Authors: Christopher Wang and Alex Townsend
- Abstract summary: We construct the first rigorously justified probabilistic algorithm for recovering the solution operator of a hyperbolic partial differential equation (PDE).
The primary challenge of recovering the solution operator of hyperbolic PDEs is the presence of characteristics, along which the associated Green's function is discontinuous.
Our assumptions on the regularity of the coefficients of the hyperbolic PDE are relatively weak given that hyperbolic PDEs do not have the ``instantaneous smoothing effect'' of elliptic and parabolic PDEs.
- Score: 9.434110429069385
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We construct the first rigorously justified probabilistic algorithm for
recovering the solution operator of a hyperbolic partial differential equation
(PDE) in two variables from input-output training pairs. The primary challenge
of recovering the solution operator of hyperbolic PDEs is the presence of
characteristics, along which the associated Green's function is discontinuous.
Therefore, a central component of our algorithm is a rank detection scheme that
identifies the approximate location of the characteristics. By combining the
randomized singular value decomposition with an adaptive hierarchical partition
of the domain, we construct an approximant to the solution operator using
$O(\Psi_\epsilon^{-1}\epsilon^{-7}\log(\Xi_\epsilon^{-1}\epsilon^{-1}))$
input-output pairs with relative error $O(\Xi_\epsilon^{-1}\epsilon)$ in the
operator norm as $\epsilon\to0$, with high probability. Here, $\Psi_\epsilon$
represents the existence of degenerate singular values of the solution
operator, and $\Xi_\epsilon$ measures the quality of the training data. Our
assumptions on the regularity of the coefficients of the hyperbolic PDE are
relatively weak given that hyperbolic PDEs do not have the ``instantaneous
smoothing effect'' of elliptic and parabolic PDEs, and our recovery rate
improves as the regularity of the coefficients increases.
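The randomized singular value decomposition at the core of the algorithm can be illustrated in a few lines. The sketch below is the standard randomized SVD of Halko, Martinsson, and Tropp applied to a discretized low-rank operator, together with a simple numerical-rank check of the kind a rank-detection scheme relies on. The matrix `A`, the tolerance, and all names are illustrative; the paper's actual algorithm works with input-output pairs of the continuous solution operator on an adaptive hierarchical partition of the domain.

```python
import numpy as np

def randomized_svd(A, k, p=5, rng=None):
    """Rank-k randomized SVD with oversampling p (Halko-Martinsson-Tropp).

    Each column of Omega plays the role of a random input function and
    A @ Omega the corresponding outputs, so only k + p "queries" of the
    operator are needed to capture its range.
    """
    rng = np.random.default_rng(rng)
    m, n = A.shape
    Omega = rng.standard_normal((n, k + p))   # random test inputs
    Y = A @ Omega                             # operator responses
    Q, _ = np.linalg.qr(Y)                    # orthonormal basis for the range
    B = Q.T @ A                               # small (k+p) x n projected matrix
    U_hat, s, Vt = np.linalg.svd(B, full_matrices=False)
    return Q @ U_hat[:, :k], s[:k], Vt[:k]

# Illustrative example: an exactly rank-5 "solution operator".
rng = np.random.default_rng(0)
A = rng.standard_normal((60, 5)) @ rng.standard_normal((5, 40))

U, s, Vt = randomized_svd(A, k=8, rng=1)
err = np.linalg.norm(A - (U * s) @ Vt) / np.linalg.norm(A)
numerical_rank = int(np.sum(s > 1e-10 * s[0]))  # simple rank detection
```

Roughly speaking, on a subdomain where the Green's function is smooth the detected numerical rank is small and one such sketch suffices, while near a characteristic the numerical rank stays large, which is the signal that drives further subdivision in the paper's adaptive hierarchical scheme.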
Related papers
- Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - A Mean-Field Analysis of Neural Stochastic Gradient Descent-Ascent for Functional Minimax Optimization [90.87444114491116]
This paper studies minimax optimization problems defined over infinite-dimensional function classes of overparameterized two-layer neural networks.
We address (i) the convergence of the gradient descent-ascent algorithm and (ii) the representation learning of the neural networks.
Results show that the feature representation induced by the neural networks is allowed to deviate from the initial one by the magnitude of $O(\alpha^{-1})$, measured in terms of the Wasserstein distance.
arXiv Detail & Related papers (2024-04-18T16:46:08Z) - Deep Learning of Delay-Compensated Backstepping for Reaction-Diffusion
PDEs [2.2869182375774613]
Multiple operators arise in the control of PDE systems from distinct PDE classes.
The DeepONet-approximated nonlinear operator is a cascade/composition of the operators defined by one hyperbolic PDE of the Goursat form and one parabolic PDE on a rectangle.
For the delay-compensated PDE backstepping controller, we guarantee exponential stability in the $L^2$ norm of the plant state and the $H^1$ norm of the input delay state.
arXiv Detail & Related papers (2023-08-21T06:42:33Z) - On the Identification and Optimization of Nonsmooth Superposition
Operators in Semilinear Elliptic PDEs [3.045851438458641]
We study an infinite-dimensional optimization problem that aims to identify the Nemytskii operator in the nonlinear part of a prototypical semilinear elliptic partial differential equation (PDE).
In contrast to previous works, we consider this identification problem in a low-regularity regime in which the function inducing the Nemytskii operator is a priori only known to be an element of $H^1_{\mathrm{loc}}(\mathbb{R})$.
arXiv Detail & Related papers (2023-06-08T13:33:20Z) - Efficient Sampling of Stochastic Differential Equations with Positive
Semi-Definite Models [91.22420505636006]
This paper deals with the problem of efficient sampling from a differential equation, given the drift function and the diffusion matrix.
It is possible to obtain independent and identically distributed (i.i.d.) samples at precision $\varepsilon$ with a cost that is $m^2 d \log(1/\varepsilon)$.
Our results suggest that as the true solution gets smoother, we can circumvent the curse of dimensionality without requiring any sort of convexity.
arXiv Detail & Related papers (2023-03-30T02:50:49Z) - A Newton-CG based barrier-augmented Lagrangian method for general
nonconvex conic optimization [77.8485863487028]
In this paper, we consider finding an approximate second-order stationary point (SOSP) of a general nonconvex conic optimization problem that minimizes a twice differentiable function.
In particular, we propose a Newton-CG based barrier-augmented Lagrangian method for finding an approximate SOSP.
arXiv Detail & Related papers (2023-01-10T20:43:29Z) - Primal-dual extrapolation methods for monotone inclusions under local
Lipschitz continuity with applications to variational inequality, conic
constrained saddle point, and convex conic optimization problems [0.0]
We consider a class of structured monotone inclusion (MI) problems that consist of finding a zero in the sum of two monotone operators.
We first propose a primal-dual extrapolation (PDE) method for solving a structured strongly MI problem by modifying the classical forward-backward splitting method.
We then propose another PDE method for solving a structured non-strongly MI problem by applying the above PDE method to approximately solve a sequence of structured strongly MI problems.
arXiv Detail & Related papers (2022-06-02T10:31:45Z) - Improved Convergence Rate of Stochastic Gradient Langevin Dynamics with
Variance Reduction and its Application to Optimization [50.83356836818667]
Stochastic Gradient Langevin Dynamics is one of the most fundamental algorithms to solve nonconvex optimization problems.
In this paper, we show two variants of this kind, namely the Variance Reduced Langevin Dynamics and the Recursive Gradient Langevin Dynamics.
arXiv Detail & Related papers (2022-03-30T11:39:00Z) - Exponential Convergence of Deep Operator Networks for Elliptic Partial
Differential Equations [0.0]
We construct deep operator networks (ONets) between infinite-dimensional spaces that emulate with an exponential rate of convergence the coefficient-to-solution map of elliptic second-order PDEs.
In particular, we consider problems set in $d$-dimensional periodic domains, $d=1,2,\dots$, and with analytic right-hand sides and coefficients.
We prove that the neural networks in the ONet have size $\mathcal{O}(\left|\log(\varepsilon)\right|^{\kappa})$ for some $\kappa$.
arXiv Detail & Related papers (2021-12-15T13:56:28Z) - Learning elliptic partial differential equations with randomized linear
algebra [2.538209532048867]
We show that one can construct an approximant to the Green's function $G$ that converges almost surely.
The quantity $0<\Gamma_\epsilon\leq 1$ characterizes the quality of the training dataset.
arXiv Detail & Related papers (2021-01-31T16:57:59Z) - Complexity of Finding Stationary Points of Nonsmooth Nonconvex Functions [84.49087114959872]
We provide the first non-asymptotic analysis for finding stationary points of nonsmooth, nonconvex functions.
In particular, we study Hadamard semi-differentiable functions, perhaps the largest class of nonsmooth functions.
arXiv Detail & Related papers (2020-02-10T23:23:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.