Geometric Laplace Neural Operator
- URL: http://arxiv.org/abs/2512.16409v1
- Date: Thu, 18 Dec 2025 11:07:41 GMT
- Title: Geometric Laplace Neural Operator
- Authors: Hao Tang, Jiongyu Zhu, Zimeng Feng, Hao Li, Chao Li
- Abstract summary: We propose a generalized operator learning framework based on a pole-residue decomposition enriched with exponential basis functions. We introduce the Geometric Laplace Neural Operator (GLNO), which embeds the Laplace spectral representation into the eigen-basis of the Laplace-Beltrami operator. We further design a grid-invariant network architecture (GLNONet) that realizes GLNO in practice.
- Score: 12.869633759181417
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural operators have emerged as powerful tools for learning mappings between function spaces, enabling efficient solutions to partial differential equations across varying inputs and domains. Despite the success, existing methods often struggle with non-periodic excitations, transient responses, and signals defined on irregular or non-Euclidean geometries. To address this, we propose a generalized operator learning framework based on a pole-residue decomposition enriched with exponential basis functions, enabling expressive modeling of aperiodic and decaying dynamics. Building on this formulation, we introduce the Geometric Laplace Neural Operator (GLNO), which embeds the Laplace spectral representation into the eigen-basis of the Laplace-Beltrami operator, extending operator learning to arbitrary Riemannian manifolds without requiring periodicity or uniform grids. We further design a grid-invariant network architecture (GLNONet) that realizes GLNO in practice. Extensive experiments on PDEs/ODEs and real-world datasets demonstrate our robust performance over other state-of-the-art models.
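The abstract's core construction (a pole-residue spectral filter applied in the eigenbasis of a Laplace-Beltrami operator) can be sketched numerically. The toy below is illustrative only: the chain-graph Laplacian, the pole/residue values, and the layer function are hypothetical stand-ins, not the authors' GLNO.

```python
import numpy as np

def chain_laplacian(n):
    """Combinatorial Laplacian of a path graph: a crude stand-in for a
    discretized Laplace-Beltrami operator on a 1-D manifold."""
    L = 2.0 * np.eye(n)
    L -= np.eye(n, k=1) + np.eye(n, k=-1)
    L[0, 0] = L[-1, -1] = 1.0  # Neumann-like endpoints
    return L

def laplace_spectral_layer(u, k, poles, residues):
    """Project u onto the first k Laplacian eigenvectors, scale each
    coefficient by a pole-residue transfer function, project back."""
    n = u.shape[0]
    evals, evecs = np.linalg.eigh(chain_laplacian(n))
    phi = evecs[:, :k]                      # truncated eigenbasis
    coeff = phi.T @ u                       # spectral coefficients
    # Pole-residue filter H(lam) = sum_j r_j / (lam - p_j); poles in the
    # left half-plane model decaying, aperiodic dynamics.
    lam = evals[:k]
    H = sum(r / (lam - p) for p, r in zip(poles, residues))
    return phi @ (H * coeff)                # back to physical space

u = np.sin(np.linspace(0, np.pi, 64))
out = laplace_spectral_layer(u, k=16, poles=[-1.0, -2.0], residues=[0.5, 0.25])
```

Because the filter acts on eigenvalues rather than grid indices, the same learned parameters apply to any discretization of the manifold, which is the grid-invariance property the abstract claims for GLNONet.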
Related papers
- Discrete Solution Operator Learning for Geometry-Dependent PDEs [5.010936781094744]
We introduce DiSOL, a paradigm that learns discrete solution procedures rather than continuous function-space operators. DiSOL factorizes the solver into learnable stages that mirror classical discretizations. These results highlight the need for discrete operator representations in geometry-dominated problems.
arXiv Detail & Related papers (2026-01-14T04:34:49Z) - Expanding the Chaos: Neural Operator for Stochastic (Partial) Differential Equations [65.80144621950981]
We build on Wiener chaos expansions (WCE) to design neural operator (NO) architectures for SPDEs and SDEs. We show that WCE-based neural operators provide a practical and scalable way to learn SDE/SPDE solution operators.
arXiv Detail & Related papers (2026-01-03T00:59:25Z) - Rates and architectures for learning geometrically non-trivial operators [29.400357098551495]
Deep learning methods have proven capable of recovering operators between high-dimensional spaces from very few training samples. We extend the learning theory to include double fibration transforms, geometric integral operators that include generalized Radon and geodesic ray transforms. Our results contribute to a rapidly-growing line of theoretical work on learning operators for scientific machine learning.
arXiv Detail & Related papers (2025-12-10T07:15:07Z) - Fourier Neural Operators Explained: A Practical Perspective [75.12291469255794]
The Fourier Neural Operator (FNO) has become the most influential and widely adopted due to its elegant spectral formulation. This guide aims to establish a clear and reliable framework for applying FNOs effectively across diverse scientific and engineering fields.
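The spectral formulation mentioned above reduces, in one dimension, to a short recipe: FFT the input, keep the lowest modes, multiply by learned complex weights, inverse FFT. A minimal sketch (random weights stand in for trained parameters; this is not a reference implementation):

```python
import numpy as np

def fourier_layer(u, weights):
    """One 1-D Fourier layer: truncate to `len(weights)` modes and mix them."""
    modes = weights.shape[0]
    u_hat = np.fft.rfft(u)                      # spectral representation
    out_hat = np.zeros_like(u_hat)
    out_hat[:modes] = weights * u_hat[:modes]   # keep + scale low modes
    return np.fft.irfft(out_hat, n=u.shape[0])  # back to the grid

rng = np.random.default_rng(0)
n, modes = 128, 12
w = rng.standard_normal(modes) + 1j * rng.standard_normal(modes)
u = np.cos(2 * np.pi * np.arange(n) / n)
v = fourier_layer(u, w)
```

The mode truncation is what makes the layer resolution-independent: the same 12 weights apply whether the input is sampled at 128 or 1024 points.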
arXiv Detail & Related papers (2025-12-01T08:56:21Z) - Graph Neural Regularizers for PDE Inverse Problems [62.49743146797144]
We present a framework for solving a broad class of ill-posed inverse problems governed by partial differential equations (PDEs). The forward problem is numerically solved using the finite element method (FEM). We employ physics-inspired graph neural networks as learned regularizers, providing a robust, interpretable, and generalizable alternative to standard approaches.
arXiv Detail & Related papers (2025-10-23T21:43:25Z) - Operator Learning with Domain Decomposition for Geometry Generalization in PDE Solving [3.011852751337123]
We propose operator learning with domain decomposition, a local-to-global framework to solve PDEs on arbitrary geometries. Under this framework, we devise an iterative scheme, Schwarz Neural Inference (SNI). This scheme allows for partitioning of the problem domain into smaller geometries, on which local problems can be solved with neural operators. We conduct extensive experiments on several representative PDEs with diverse boundary conditions and achieve remarkable geometry generalization.
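The Schwarz iteration underlying such schemes alternates local solves on overlapping subdomains, exchanging boundary traces until the pieces agree. A toy version for -u'' = f on [0,1] with two overlapping subdomains (here each local solve is a direct tridiagonal solve; in the paper's scheme a neural operator would play that role):

```python
import numpy as np

def local_solve(f_loc, h, left, right):
    """Direct solve of -u'' = f on a subinterval with Dirichlet data."""
    m = f_loc.size
    A = (2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)) / h**2
    b = f_loc.copy()
    b[0] += left / h**2       # fold known boundary values into the RHS
    b[-1] += right / h**2
    return np.linalg.solve(A, b)

n = 101                       # grid points including boundaries
h = 1.0 / (n - 1)
x = np.linspace(0, 1, n)
f = np.ones(n)                # exact solution: u(x) = x(1-x)/2
u = np.zeros(n)               # zero Dirichlet boundary conditions
i1, i2 = 60, 40               # subdomains [0,0.6] and [0.4,1] overlap
for _ in range(30):           # alternate local solves, exchanging traces
    u[1:i1] = local_solve(f[1:i1], h, u[0], u[i1])
    u[i2 + 1:n - 1] = local_solve(f[i2 + 1:n - 1], h, u[i2], u[-1])

exact = x * (1 - x) / 2
```

The overlap width controls the convergence rate: with the 0.2-wide overlap above, the iteration contracts geometrically and 30 sweeps recover the discrete solution essentially exactly.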
arXiv Detail & Related papers (2025-04-01T08:00:43Z) - Diffeomorphic Latent Neural Operators for Data-Efficient Learning of Solutions to Partial Differential Equations [5.308435208832696]
A computed approximation of the solution operator to a system of partial differential equations (PDEs) is needed in various areas of science and engineering. We propose that, to learn a PDE solution operator that generalizes across multiple domains without requiring large amounts of expressive training data, one can instead train a latent neural operator on just a few ground-truth solution fields.
arXiv Detail & Related papers (2024-11-27T03:16:00Z) - A Mathematical Analysis of Neural Operator Behaviors [0.0]
This paper presents a rigorous framework for analyzing the behaviors of neural operators.
We focus on their stability, convergence, clustering dynamics, universality, and generalization error.
We aim to offer clear and unified guidance in a single setting for the future design of neural operator-based methods.
arXiv Detail & Related papers (2024-10-28T19:38:53Z) - DimINO: Dimension-Informed Neural Operator Learning [41.37905663176428]
DimINO is a framework inspired by dimensional analysis. It can be seamlessly integrated into existing neural operator architectures. It achieves up to 76.3% performance gain on PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Designing Universal Causal Deep Learning Models: The Case of Infinite-Dimensional Dynamical Systems from Stochastic Analysis [7.373617024876726]
Several non-linear operators in analysis depend on a temporal structure which is not leveraged by contemporary neural operators. This paper introduces a deep learning model-design framework that takes suitable infinite-dimensional linear metric spaces as inputs. We show that our framework can uniformly approximate, on compact sets and across arbitrarily finite-time horizons, Hölder or smooth trace class operators.
arXiv Detail & Related papers (2022-10-24T14:43:03Z) - Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z) - Multipole Graph Neural Operator for Parametric Partial Differential Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.