Regularity of Second-Order Elliptic PDEs in Spectral Barron Spaces
- URL: http://arxiv.org/abs/2602.19381v1
- Date: Sun, 22 Feb 2026 23:29:57 GMT
- Title: Regularity of Second-Order Elliptic PDEs in Spectral Barron Spaces
- Authors: Ziang Chen, Liqiang Huang, Mengxuan Yang, Shengxuan Zhou
- Abstract summary: We establish a regularity theorem for second-order elliptic PDEs on $\mathbb{R}^{d}$ in spectral Barron spaces. We identify a class of PDEs whose solutions can be approximated by two-layer neural networks with cosine activation functions.
- Score: 8.73881274952982
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We establish a regularity theorem for second-order elliptic PDEs on $\mathbb{R}^{d}$ in spectral Barron spaces. Under mild ellipticity and smallness assumptions, the solution gains two additional orders of Barron regularity. As a corollary, we identify a class of PDEs whose solutions can be approximated by two-layer neural networks with cosine activation functions, where the width of the neural network is independent of the spatial dimension.
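To make the corollary concrete: the spectral Barron norm of order $s$ is commonly defined as $\|u\|_{\mathcal{B}^s} = \int_{\mathbb{R}^d} (1+|\xi|)^s |\hat{u}(\xi)|\,d\xi$, and functions with finite Barron norm admit approximation by two-layer networks at rates independent of the dimension $d$. The sketch below shows only the *form* of such a network with cosine activations; the parameter values are made up for illustration and are not the paper's construction:

```python
import math

def two_layer_cosine_net(x, a, w, b):
    # f(x) = sum_k a[k] * cos(<w[k], x> + b[k]):
    # a width-n two-layer network with cosine activation,
    # the architecture referenced in the abstract's corollary.
    return sum(
        ak * math.cos(sum(wi * xi for wi, xi in zip(wk, x)) + bk)
        for ak, wk, bk in zip(a, w, b)
    )

# Width n = 2 in dimension d = 2, with illustrative (made-up) parameters:
a = [1.0, -0.5]                      # outer weights
w = [[1.0, 0.0], [0.0, 2.0]]         # inner weights (frequencies)
b = [0.0, math.pi / 2]               # phase shifts
print(two_layer_cosine_net([0.0, 0.0], a, w, b))  # ≈ 1.0
```

The regularity theorem is what guarantees that, for the identified class of PDEs, the solution lies in a Barron space where the required width `n` does not grow with `d`.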
Related papers
- Barron Space Representations for Elliptic PDEs with Homogeneous Boundary Conditions [10.72933755002183]
We study the approximation complexity of high-dimensional second-order elliptic PDEs with homogeneous boundary conditions on the unit hypercube. Under the assumption that the coefficients belong to suitably defined Barron spaces, we prove that the solution can be efficiently approximated by two-layer neural networks.
arXiv Detail & Related papers (2025-08-11T02:36:40Z) - High precision PINNs in unbounded domains: application to singularity formulation in PDEs [83.50980325611066]
We study the choices of neural network ansatz, sampling strategy, and optimization algorithm. For the 1D Burgers equation, our framework can lead to a solution with very high precision. For the 2D Boussinesq equation, we obtain a solution whose loss is $4$ digits smaller than that obtained in \cite{wang2023asymptotic}, with fewer training steps.
arXiv Detail & Related papers (2025-06-24T02:01:44Z) - Mechanistic PDE Networks for Discovery of Governing Equations [52.492158106791365]
We present Mechanistic PDE Networks, a model for discovery of partial differential equations from data. The represented PDEs are then solved and decoded for specific tasks. We develop a native, GPU-capable, parallel, sparse, and differentiable multigrid solver specialized for linear partial differential equations.
arXiv Detail & Related papers (2025-02-25T17:21:44Z) - Learning with Norm Constrained, Over-parameterized, Two-layer Neural Networks [54.177130905659155]
Recent studies show that a reproducing kernel Hilbert space (RKHS) is not a suitable space to model functions by neural networks.
In this paper, we study a suitable function space for over-parameterized two-layer neural networks with bounded norms.
arXiv Detail & Related papers (2024-04-29T15:04:07Z) - Space-Time Approximation with Shallow Neural Networks in Fourier
Lebesgue spaces [1.74048653626208]
We study the inclusion of anisotropic weighted Fourier-Lebesgue spaces in the Bochner-Sobolev spaces.
We establish a bound on the approximation rate for functions from the anisotropic weighted Fourier-Lebesgue spaces and approximation via SNNs in the Bochner-Sobolev norm.
arXiv Detail & Related papers (2023-12-13T19:02:27Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM consistently achieves state-of-the-art performance and yields a relative gain of 11.5% averaged over seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Neural Network Approximations of PDEs Beyond Linearity: A
Representational Perspective [40.964402478629495]
We take a step towards studying the representational power of neural networks for approximating solutions to nonlinear PDEs.
Treating a class of PDEs known as nonlinear elliptic variational PDEs, our results show neural networks can evade the curse of dimensionality.
arXiv Detail & Related papers (2022-10-21T16:53:18Z) - On the Representation of Solutions to Elliptic PDEs in Barron Spaces [9.875204185976777]
This paper derives complexity estimates of the solutions of $d$-dimensional second-order elliptic PDEs in the Barron space.
As a direct consequence of the complexity estimates, the solution of the PDE can be approximated on any bounded domain by a two-layer neural network.
arXiv Detail & Related papers (2021-06-14T16:05:07Z) - A Priori Generalization Analysis of the Deep Ritz Method for Solving
High Dimensional Elliptic Equations [11.974322921837384]
We derive the generalization error bounds of two-layer neural networks in the framework of the Deep Ritz Method (DRM).
We prove that the convergence rates of generalization errors are independent of the dimension $d$.
We develop a new solution theory for the PDEs on the spectral Barron space.
arXiv Detail & Related papers (2021-01-05T18:50:59Z) - Two-Layer Neural Networks for Partial Differential Equations:
Optimization and Generalization Theory [4.243322291023028]
We show that the gradient descent method can identify a global minimizer of the least-squares optimization for solving second-order linear PDEs.
We also analyze the generalization error of the least-squares optimization for second-order linear PDEs and two-layer neural networks.
arXiv Detail & Related papers (2020-06-28T22:24:51Z) - Convex Geometry and Duality of Over-parameterized Neural Networks [70.15611146583068]
We develop a convex analytic approach to analyze finite width two-layer ReLU networks.
We show that an optimal solution to the regularized training problem can be characterized as extreme points of a convex set.
In higher dimensions, we show that the training problem can be cast as a finite dimensional convex problem with infinitely many constraints.
arXiv Detail & Related papers (2020-02-25T23:05:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.