Deep Eigenspace Network and Its Application to Parametric Non-selfadjoint Eigenvalue Problems
- URL: http://arxiv.org/abs/2512.20058v1
- Date: Tue, 23 Dec 2025 05:20:22 GMT
- Title: Deep Eigenspace Network and Its Application to Parametric Non-selfadjoint Eigenvalue Problems
- Authors: H. Li, J. Sun, Z. Zhang
- Abstract summary: We consider operator learning for efficiently solving parametric non-selfadjoint eigenvalue problems. We introduce a hybrid framework that learns the stable invariant eigensubspace mapping rather than individual eigenfunctions.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We consider operator learning for efficiently solving parametric non-selfadjoint eigenvalue problems. To overcome the spectral instability and mode switching inherent in non-selfadjoint operators, we introduce a hybrid framework that learns the stable invariant eigensubspace mapping rather than individual eigenfunctions. We propose a Deep Eigenspace Network (DEN) architecture integrating Fourier Neural Operators, geometry-adaptive POD bases, and explicit banded cross-mode mixing mechanisms to capture complex spectral dependencies on unstructured meshes. We apply DEN to the parametric non-selfadjoint Steklov eigenvalue problem and prove Lipschitz continuity of the eigensubspace with respect to the parameters. In addition, we derive error bounds for the reconstruction of the eigenspace. Numerical experiments validate DEN's high accuracy and zero-shot generalization capabilities across different discretizations.
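The abstract's central observation, that the span of a cluster of eigenvectors is a stable target even when individual eigenvectors of a non-selfadjoint operator switch or rotate, can be illustrated numerically. The sketch below is not the paper's DEN architecture; it is a minimal numpy illustration (all matrix and variable names are ours) showing that the leading invariant subspace of a non-normal matrix moves only slightly under a small parameter perturbation, as measured by the Frobenius distance between orthogonal projectors.

```python
import numpy as np

def leading_subspace(M, k):
    """Orthonormal basis for the span of the k largest-magnitude eigenvectors."""
    w, V = np.linalg.eig(M)
    idx = np.argsort(-np.abs(w))[:k]      # pick the k dominant modes
    Q, _ = np.linalg.qr(V[:, idx])        # orthonormalize; only the span matters
    return Q

def subspace_distance(U, V):
    """Frobenius distance between orthogonal projectors onto span(U) and span(V)."""
    return np.linalg.norm(U @ U.conj().T - V @ V.conj().T)

# Toy non-selfadjoint operator with a spectral gap after the first k modes.
rng = np.random.default_rng(0)
n, k = 20, 3
P = rng.standard_normal((n, n))                       # non-orthogonal eigenbasis -> non-normal A
lam = np.concatenate([[5.0, 4.5, 4.0], rng.uniform(0.1, 1.0, n - k)])
A = P @ np.diag(lam) @ np.linalg.inv(P)
E = 1e-3 * rng.standard_normal((n, n))                # small "parameter" perturbation

Q1 = leading_subspace(A, k)
Q2 = leading_subspace(A + E, k)
d = subspace_distance(Q1, Q2)   # small: the invariant subspace varies continuously
```

The distance `d` stays on the order of the perturbation divided by the spectral gap, whereas a mode-by-mode comparison of eigenvectors would additionally suffer from arbitrary ordering, sign, and phase changes; this is the stability that motivates learning the subspace mapping.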
Related papers
- Stability and Generalization of Push-Sum Based Decentralized Optimization over Directed Graphs [55.77845440440496]
Push-based decentralized communication enables optimization over communication networks where information exchange may be asymmetric. We develop a unified uniform-stability framework for the Stochastic Gradient Push (SGP) algorithm. A key technical ingredient is an imbalance-aware generalization bound expressed through two quantities.
arXiv Detail & Related papers (2026-02-24T05:32:03Z) - Stability and Concentration in Nonlinear Inverse Problems with Block-Structured Parameters: Lipschitz Geometry, Identifiability, and an Application to Gaussian Splatting [0.552480439325792]
We develop an operator-theoretic framework for stability and statistical concentration in nonlinear inverse problems with block-structured parameters. Overall, the analysis characterizes operator-level limits for a broad class of high-dimensional nonlinear inverse problems arising in modern imaging and differentiable rendering.
arXiv Detail & Related papers (2026-02-10T05:11:06Z) - Deep Delta Learning [91.75868893250662]
We introduce Deep Delta Learning (DDL), a novel architecture that generalizes the standard residual connection. We provide a spectral analysis of this operator, demonstrating that the gate (a function of $\mathbf{X}$) enables dynamic interpolation between identity mapping, projection, and geometric reflection. This unification empowers the network to explicitly control the spectrum of its layer-wise transition operator, enabling the modeling of complex, non-monotonic dynamics.
arXiv Detail & Related papers (2026-01-01T18:11:38Z) - The Procrustean Bed of Time Series: The Optimization Bias of Point-wise Loss [53.542743390809356]
This paper provides a first-principles analysis of the Expectation of Optimization Bias (EOB). Our analysis reveals a fundamental paradox: the more deterministic and structured the time series, the more severe the bias induced by a point-wise loss function. We present a concrete solution that simultaneously achieves both principles via the DFT or DWT.
arXiv Detail & Related papers (2025-12-21T06:08:22Z) - Learning Eigenstructures of Unstructured Data Manifolds [47.81117132002129]
We introduce a novel framework that learns a spectral basis for shape and manifold analysis from unstructured data. By replacing the traditional operator selection, construction, and eigendecomposition with a learning-based approach, our framework offers a principled, data-driven alternative to conventional pipelines.
arXiv Detail & Related papers (2025-11-30T22:06:49Z) - Limitation of Stoquastic Quantum Annealing: A Structural Perspective [0.0]
We provide a structural explanation for the anti-crossing arising from the competition between the energies associated with a set of degenerate local minima. This paper serves as a supplementary companion to our main work on the DIC-DAC-DOA algorithm.
arXiv Detail & Related papers (2025-09-18T04:39:48Z) - Architecture independent generalization bounds for overparametrized deep ReLU networks [0.9687141267566189]
We prove that overparametrized neural networks are able to generalize with a test error independent of the level of overparametrization. For overparametrized deep ReLU networks with a training sample size bounded by the input space dimension, we explicitly construct zero-loss minimizers without use of gradient descent.
arXiv Detail & Related papers (2025-04-08T05:37:38Z) - On the Convergence of Hermitian Dynamic Mode Decomposition [4.028503203417233]
We study the convergence of Hermitian Dynamic Mode Decomposition to the spectral properties of self-adjoint Koopman operators.
We numerically demonstrate our results by applying them to two-dimensional Schrödinger equations.
arXiv Detail & Related papers (2024-01-06T11:13:16Z) - Stable Nonconvex-Nonconcave Training via Linear Interpolation [51.668052890249726]
This paper presents a theoretical analysis of linear interpolation as a principled method for stabilizing (large-scale) neural network training.
We argue that instabilities in the optimization process are often caused by the nonmonotonicity of the loss landscape and show how linear interpolation can help by leveraging the theory of nonexpansive operators.
arXiv Detail & Related papers (2023-10-20T12:45:12Z) - Convolutional Filtering and Neural Networks with Non Commutative Algebras [153.20329791008095]
We study the generalization of non-commutative convolutional neural networks.
We show that non-commutative convolutional architectures can be stable to deformations on the space of operators.
arXiv Detail & Related papers (2021-08-23T04:22:58Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.