SFO: Learning PDE Operators via Spectral Filtering
- URL: http://arxiv.org/abs/2601.17090v1
- Date: Fri, 23 Jan 2026 10:45:52 GMT
- Title: SFO: Learning PDE Operators via Spectral Filtering
- Authors: Noam Koren, Rafael Moschopoulos, Kira Radinsky, Elad Hazan
- Abstract summary: We introduce a neural operator that parameterizes integral kernels using the Universal Spectral Basis (USB). By learning only the spectral coefficients of rapidly decaying eigenvalues, SFO achieves a highly efficient representation.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Partial differential equations (PDEs) govern complex systems, yet neural operators often struggle to efficiently capture the long-range, nonlocal interactions inherent in their solution maps. We introduce Spectral Filtering Operator (SFO), a neural operator that parameterizes integral kernels using the Universal Spectral Basis (USB), a fixed, global orthonormal basis derived from the eigenmodes of the Hilbert matrix in spectral filtering theory. Motivated by our theoretical finding that the discrete Green's functions of shift-invariant PDE discretizations exhibit spatial Linear Dynamical System (LDS) structure, we prove that these kernels admit compact approximations in the USB. By learning only the spectral coefficients of rapidly decaying eigenvalues, SFO achieves a highly efficient representation. Across six benchmarks, including reaction-diffusion, fluid dynamics, and 3D electromagnetics, SFO achieves state-of-the-art accuracy, reducing error by up to 40% relative to strong baselines while using substantially fewer parameters.
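The core idea can be illustrated with a short sketch (this is my illustrative reading of the abstract, not the authors' code; the function name, sizes, and toy kernel are assumptions): take the top eigenvectors of the Hilbert matrix as a fixed orthonormal basis, and represent a smooth Green's-function-like kernel by only a handful of coefficients in that basis.

```python
import numpy as np

def usb_basis(n, k):
    # Hilbert matrix H[i, j] = 1 / (i + j + 1) (0-indexed); its
    # eigenvalues decay rapidly, so a few top eigenvectors suffice
    # for smooth kernels. "usb_basis" is a hypothetical name.
    i = np.arange(n)
    H = 1.0 / (i[:, None] + i[None, :] + 1.0)
    vals, vecs = np.linalg.eigh(H)       # eigenvalues in ascending order
    top = np.argsort(vals)[::-1][:k]     # indices of the k largest modes
    return vecs[:, top]                  # (n, k) matrix, orthonormal columns

n, k = 64, 8
Phi = usb_basis(n, k)

# Toy decaying kernel; in the actual model the k spectral coefficients
# would be the learned parameters rather than a projection.
kernel = np.exp(-0.1 * np.arange(n))
coeffs = Phi.T @ kernel                  # k numbers summarize the kernel
approx = Phi @ coeffs                    # rank-k reconstruction
rel_err = np.linalg.norm(approx - kernel) / np.linalg.norm(kernel)
```

Learning `coeffs` directly, instead of a full `n x n` kernel, is what gives the compact parameterization the abstract describes.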
Related papers
- Parallel Complex Diffusion for Scalable Time Series Generation [50.01609741902786]
PaCoDi is a spectral-native architecture that decouples generative modeling in the frequency domain. We show that PaCoDi outperforms existing baselines in both generation quality and inference speed.
arXiv Detail & Related papers (2026-02-10T14:31:53Z) - SpectraKAN: Conditioning Spectral Operators [21.190440188964452]
We introduce SpectraKAN, a neural operator that conditions the spectral operator on the input itself, turning it into an input-conditioned integral operator. This is achieved by extracting a compact global representation from the spatio-temporal history and using it to modulate a multi-scale trunk via single-query cross-attention. Across diverse PDE benchmarks, SpectraKAN achieves state-of-the-art performance, reducing RMSE by up to 49% over strong baselines, with particularly large gains on challenging temporal prediction tasks.
arXiv Detail & Related papers (2026-02-05T01:30:25Z) - NeuraLSP: An Efficient and Rigorous Neural Left Singular Subspace Preconditioner for Conjugate Gradient Methods [49.84495044725856]
NeuraLSP is a novel neural preconditioner paired with a novel loss metric. Our method exhibits both theoretical guarantees and empirical robustness to rank inflation, achieving up to a 53% speedup.
arXiv Detail & Related papers (2026-01-28T02:15:16Z) - The Adaptive Vekua Cascade: A Differentiable Spectral-Analytic Solver for Physics-Informed Representation [0.0]
Coordinate-based neural networks have emerged as a powerful tool for representing continuous physical fields. They face two fundamental pathologies: spectral bias and the curse of dimensionality. We propose a hybrid spectral-analytic architecture that decouples manifold learning from function approximation.
arXiv Detail & Related papers (2025-12-12T18:41:35Z) - The Vekua Layer: Exact Physical Priors for Implicit Neural Representations via Generalized Analytic Functions [0.0]
Implicit Neural Representations (INRs) have emerged as a powerful paradigm for parameterizing physical fields. We introduce a differentiable spectral method grounded in the theory of generalized analytic functions. We show that our method can effectively act as a physics-informed spectral filter.
arXiv Detail & Related papers (2025-12-11T21:57:21Z) - Separated-Variable Spectral Neural Networks: A Physics-Informed Learning Approach for High-Frequency PDEs [21.081644719506453]
Separated-Variable Spectral Neural Networks (SV-SNN) is a novel framework that addresses the spectral bias problem in neural PDE solving. We show that SV-SNN achieves a 1-3 order-of-magnitude improvement in accuracy while reducing parameter count by over 90%.
arXiv Detail & Related papers (2025-08-01T13:40:10Z) - LOGLO-FNO: Efficient Learning of Local and Global Features in Fourier Neural Operators [20.77877474840923]
Capturing high-frequency information is a critical challenge in machine learning. Deep neural nets exhibit the so-called spectral bias toward learning low-frequency components. We propose a novel frequency-sensitive loss term based on radially binned spectral errors.
arXiv Detail & Related papers (2025-04-05T19:35:04Z) - DimINO: Dimension-Informed Neural Operator Learning [41.37905663176428]
DimINO is a framework inspired by dimensional analysis. It can be seamlessly integrated into existing neural operator architectures. It achieves up to a 76.3% performance gain on PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Multi-Grid Tensorized Fourier Neural Operator for High-Resolution PDEs [93.82811501035569]
We introduce a new data-efficient and highly parallelizable operator-learning approach with reduced memory requirements and better generalization.
MG-TFNO scales to large resolutions by leveraging local and global structures of full-scale, real-world phenomena.
We demonstrate superior performance on the turbulent Navier-Stokes equations where we achieve less than half the error with over 150x compression.
arXiv Detail & Related papers (2023-09-29T20:18:52Z) - Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art results and yields a relative gain of 11.5% averaged over seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z) - Blending Neural Operators and Relaxation Methods in PDE Numerical Solvers [3.2712166248850685]
HINTS is a hybrid, iterative, numerical, and transferable solver for partial differential equations.
It balances the convergence behavior across the spectrum of eigenmodes by utilizing the spectral bias of DeepONet.
It is flexible with regards to discretizations, computational domain, and boundary conditions.
arXiv Detail & Related papers (2022-08-28T19:07:54Z) - Momentum Diminishes the Effect of Spectral Bias in Physics-Informed
Neural Networks [72.09574528342732]
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
They often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM).
arXiv Detail & Related papers (2022-06-29T19:03:10Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is formulating physics-based data in a structure suitable for neural networks.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.