SpectraKAN: Conditioning Spectral Operators
- URL: http://arxiv.org/abs/2602.05187v1
- Date: Thu, 05 Feb 2026 01:30:25 GMT
- Title: SpectraKAN: Conditioning Spectral Operators
- Authors: Chun-Wun Cheng, Carola-Bibiane Schönlieb, Angelica I. Aviles-Rivero
- Abstract summary: We introduce SpectraKAN, a neural operator that conditions the spectral operator on the input itself, turning static spectral convolution into an input-conditioned integral operator. This is achieved by extracting a compact global representation from spatio-temporal history and using it to modulate a multi-scale Fourier trunk via single-query cross-attention. Across diverse PDE benchmarks, SpectraKAN achieves state-of-the-art performance, reducing RMSE by up to 49% over strong baselines, with particularly large gains on challenging spatio-temporal prediction tasks.
- Score: 21.190440188964452
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Spectral neural operators, particularly Fourier Neural Operators (FNO), are a powerful framework for learning solution operators of partial differential equations (PDEs) due to their efficient global mixing in the frequency domain. However, existing spectral operators rely on static Fourier kernels applied uniformly across inputs, limiting their ability to capture multi-scale, regime-dependent, and anisotropic dynamics governed by the global state of the system. We introduce SpectraKAN, a neural operator that conditions the spectral operator on the input itself, turning static spectral convolution into an input-conditioned integral operator. This is achieved by extracting a compact global representation from spatio-temporal history and using it to modulate a multi-scale Fourier trunk via single-query cross-attention, enabling the operator to adapt its behaviour while retaining the efficiency of spectral mixing. We provide theoretical justification showing that this modulation converges to a resolution-independent continuous operator under mesh refinement and KAN gives smooth, Lipschitz-controlled global modulation. Across diverse PDE benchmarks, SpectraKAN achieves state-of-the-art performance, reducing RMSE by up to 49% over strong baselines, with particularly large gains on challenging spatio-temporal prediction tasks.
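The abstract's core idea, replacing a static Fourier kernel with one modulated by a global summary of the input, can be illustrated with a minimal sketch. This is not the paper's implementation: the `global_summary_gains` conditioner below is a hypothetical stand-in for the paper's KAN-based single-query cross-attention, using simple global statistics to produce smooth, bounded per-mode gains.

```python
import numpy as np

def input_conditioned_spectral_conv(u, base_kernel, modulation):
    """Spectral convolution whose per-mode Fourier weights are rescaled
    by an input-dependent modulation vector (the 'conditioning' step)."""
    u_hat = np.fft.rfft(u)            # global mixing in the frequency domain
    k = base_kernel[:u_hat.size]      # static complex spectral weights
    g = modulation[:u_hat.size]       # input-conditioned per-mode gains
    return np.fft.irfft(u_hat * k * g, n=u.size)

def global_summary_gains(u, n_modes):
    """Toy stand-in for the paper's cross-attention conditioner: derive
    smooth, bounded per-mode gains from global statistics of the input."""
    mean, std = u.mean(), u.std() + 1e-8
    freqs = np.arange(n_modes)
    # rougher fields (larger std) get a stronger low-frequency boost
    return 1.0 + np.tanh(std) * np.exp(-freqs / (1.0 + abs(mean) + std))

n = 64
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
u = np.sin(x) + 0.3 * np.sin(5 * x)               # toy input field
kernel = np.exp(-0.1 * np.arange(n // 2 + 1)).astype(complex)
gains = global_summary_gains(u, n // 2 + 1)
out = input_conditioned_spectral_conv(u, kernel, gains)
```

With a constant `gains` vector this reduces to an ordinary (static) spectral convolution; the input-dependence of `gains` is what makes the operator adapt per sample while keeping the O(n log n) cost of spectral mixing.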
Related papers
- RKHS Representation of Algebraic Convolutional Filters with Integral Operators [111.57971404925486]
In this paper, we develop a theory showing that the range of integral operators naturally induces RKHS convolutional signal models. We show that filtering with integral operators corresponds to iterated box products, giving rise to a unital kernel algebra. Our results establish precise connections between eigendecompositions and RKHS representations in graphon signal processing, extend naturally to directed graphons, and enable novel spatial-spectral localization results.
arXiv Detail & Related papers (2026-02-22T08:28:34Z) - FUTON: Fourier Tensor Network for Implicit Neural Representations [56.48739018255443]
Implicit neural representations (INRs) have emerged as powerful tools for encoding signals, yet dominant designs often suffer from slow convergence, overfitting to noise, and poor extrapolation. We introduce FUTON, which models signals as generalized Fourier series whose coefficients are parameterized by a low-rank tensor decomposition.
arXiv Detail & Related papers (2026-02-13T19:31:44Z) - Parallel Complex Diffusion for Scalable Time Series Generation [50.01609741902786]
PaCoDi is a spectral-native architecture that decouples generative modeling in the frequency domain. We show that PaCoDi outperforms existing baselines in both generation quality and inference speed.
arXiv Detail & Related papers (2026-02-10T14:31:53Z) - SFO: Learning PDE Operators via Spectral Filtering [25.390393080966422]
We introduce a neural operator that parameterizes integral kernels using the Universal Spectral Basis (USB). By learning only the spectral coefficients of rapidly decaying eigenvalues, SFO achieves a highly efficient representation.
arXiv Detail & Related papers (2026-01-23T10:45:52Z) - The Vekua Layer: Exact Physical Priors for Implicit Neural Representations via Generalized Analytic Functions [0.0]
Implicit Neural Representations (INRs) have emerged as a powerful paradigm for parameterizing physical fields. We introduce a differentiable spectral method grounded in the theory of generalized analytic functions. We show that our method can effectively act as a physics-informed spectral filter.
arXiv Detail & Related papers (2025-12-11T21:57:21Z) - SAOT: An Enhanced Locality-Aware Spectral Transformer for Solving PDEs [8.678387342998613]
We investigate incorporating the spatial-frequency localization property of Wavelet transforms into the Transformer architecture. We propose a novel Wavelet Attention (WA) module with linear computational complexity to efficiently learn locality-aware features. We further develop the Spectral Attention Operator Transformer (SAOT), a hybrid spectral Transformer framework that integrates WA's localized focus with the global receptive field of Fourier-based Attention.
arXiv Detail & Related papers (2025-11-24T05:22:28Z) - STNet: Spectral Transformation Network for Solving Operator Eigenvalue Problem [10.27238431947351]
Operator eigenvalue problems play a critical role in various scientific fields and engineering applications. Recent deep learning methods provide an efficient approach to address this challenge by iteratively updating neural networks. We propose the Spectral Transformation Network (STNet), which consistently outperforms existing learning-based methods.
arXiv Detail & Related papers (2025-10-28T01:43:54Z) - SpectrumFM: A Foundation Model for Intelligent Spectrum Management [99.08036558911242]
Existing intelligent spectrum management methods, typically based on small-scale models, suffer from notable limitations in recognition accuracy, convergence speed, and generalization. This paper proposes a novel spectrum foundation model, termed SpectrumFM, establishing a new paradigm for spectrum management. Experiments demonstrate that SpectrumFM achieves superior performance in terms of accuracy, robustness, adaptability, few-shot learning efficiency, and convergence speed.
arXiv Detail & Related papers (2025-05-02T04:06:39Z) - Holistic Physics Solver: Learning PDEs in a Unified Spectral-Physical Space [54.13671100638092]
Holistic Physics Mixer (HPM) is a framework for integrating spectral and physical information in a unified space. We show that HPM consistently outperforms state-of-the-art methods in both accuracy and computational efficiency.
arXiv Detail & Related papers (2024-10-15T08:19:39Z) - Incremental Spatial and Spectral Learning of Neural Operators for Solving Large-Scale PDEs [86.35471039808023]
We introduce the Incremental Fourier Neural Operator (iFNO), which progressively increases the number of frequency modes used by the model.
We show that iFNO reduces total training time while maintaining or improving generalization performance across various datasets.
Our method demonstrates a 10% lower testing error, using 20% fewer frequency modes compared to the existing Fourier Neural Operator, while also achieving a 30% faster training.
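The mechanism iFNO is built on, truncating a spectral representation to a growing budget of low-frequency modes, can be sketched in a few lines. This is an illustrative toy, not the iFNO training loop; the schedule `(4, 8, 32)` is an arbitrary example of progressively increasing the mode count.

```python
import numpy as np

def truncated_spectral_filter(u, n_modes):
    """Keep only the lowest n_modes Fourier modes of a 1-D signal,
    mimicking the frequency-mode budget that iFNO grows during training."""
    u_hat = np.fft.rfft(u)
    mask = np.zeros(u_hat.size)
    mask[:n_modes] = 1.0              # zero out all modes above the budget
    return np.fft.irfft(u_hat * mask, n=u.size)

n = 128
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
# signal with energy at frequencies 1, 7, and 30
u = np.sin(x) + 0.5 * np.sin(7 * x) + 0.1 * np.sin(30 * x)

# as the mode budget grows, the reconstruction error shrinks
errors = [np.abs(truncated_spectral_filter(u, m) - u).max()
          for m in (4, 8, 32)]
```

The point of the incremental schedule is that early training only needs the coarse (low-frequency) structure, so starting with a small budget saves compute, and raising it later recovers the fine scales.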
arXiv Detail & Related papers (2022-11-28T09:57:15Z) - Blending Neural Operators and Relaxation Methods in PDE Numerical Solvers [3.2712166248850685]
HINTS is a hybrid, iterative, numerical, and transferable solver for partial differential equations.
It balances the convergence behavior across the spectrum of eigenmodes by utilizing the spectral bias of DeepONet.
It is flexible with regards to discretizations, computational domain, and boundary conditions.
arXiv Detail & Related papers (2022-08-28T19:07:54Z) - Hyperspectral Image Denoising Using Non-convex Local Low-rank and Sparse Separation with Spatial-Spectral Total Variation Regularization [49.55649406434796]
We propose a novel non-convex approach to robust principal component analysis for HSI denoising.
We develop accurate approximations to both rank and sparse components.
Experiments on both simulated and real HSIs demonstrate the effectiveness of the proposed method.
arXiv Detail & Related papers (2022-01-08T11:48:46Z) - Factorized Fourier Neural Operators [77.47313102926017]
The Factorized Fourier Neural Operator (F-FNO) is a learning-based method for simulating partial differential equations.
We show that our model maintains an error rate of 2% while still running an order of magnitude faster than a numerical solver.
arXiv Detail & Related papers (2021-11-27T03:34:13Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences.