The Adaptive Vekua Cascade: A Differentiable Spectral-Analytic Solver for Physics-Informed Representation
- URL: http://arxiv.org/abs/2512.11776v1
- Date: Fri, 12 Dec 2025 18:41:35 GMT
- Title: The Adaptive Vekua Cascade: A Differentiable Spectral-Analytic Solver for Physics-Informed Representation
- Authors: Vladimer Khasia
- Abstract summary: Coordinate-based neural networks have emerged as a powerful tool for representing continuous physical fields. They face two fundamental pathologies: spectral bias and the curse of dimensionality. We propose a hybrid architecture that bridges deep learning and classical approximation theory and decouples manifold learning from function approximation.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Coordinate-based neural networks have emerged as a powerful tool for representing continuous physical fields, yet they face two fundamental pathologies: spectral bias, which hinders the learning of high-frequency dynamics, and the curse of dimensionality, which causes parameter explosion in discrete feature grids. We propose the Adaptive Vekua Cascade (AVC), a hybrid architecture that bridges deep learning and classical approximation theory. AVC decouples manifold learning from function approximation by using a deep network to learn a diffeomorphic warping of the physical domain, projecting complex spatiotemporal dynamics onto a latent manifold where the solution is represented by a basis of generalized analytic functions. Crucially, we replace the standard gradient-descent output layer with a differentiable linear solver, allowing the network to optimally resolve spectral coefficients in a closed form during the forward pass. We evaluate AVC on a suite of five rigorous physics benchmarks, including high-frequency Helmholtz wave propagation, sparse medical reconstruction, and unsteady 3D Navier-Stokes turbulence. Our results demonstrate that AVC achieves state-of-the-art accuracy while reducing parameter counts by orders of magnitude (e.g., 840 parameters vs. 4.2 million for 3D grids) and converging 2-3x faster than implicit neural representations. This work establishes a new paradigm for memory-efficient, spectrally accurate scientific machine learning. The code is available at https://github.com/VladimerKhasia/vecua.
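The closed-form output layer the abstract describes can be sketched as a differentiable least-squares solve: instead of learning the final spectral coefficients by gradient descent, they are resolved exactly in the forward pass. The sketch below is a minimal illustration under assumed names (`warp`, `basis`) that stand in for the learned diffeomorphic warping and the analytic basis; it is not taken from the paper's code.

```python
import numpy as np

def warp(x):
    # Stand-in for the learned diffeomorphic warping of the physical domain.
    return np.tanh(x)

def basis(z, n_modes=8):
    # Fourier-style analytic basis evaluated on the latent manifold.
    k = np.arange(1, n_modes + 1)
    return np.concatenate([np.sin(np.outer(z, k)), np.cos(np.outer(z, k))], axis=1)

x = np.linspace(-1.0, 1.0, 200)
# Synthetic target that lies in the span of the warped basis.
target = np.sin(3 * warp(x)) + 0.5 * np.cos(5 * warp(x))

Phi = basis(warp(x))                                  # (N, 2 * n_modes) design matrix
coef, *_ = np.linalg.lstsq(Phi, target, rcond=None)   # closed-form coefficient solve
pred = Phi @ coef

rel_err = np.linalg.norm(pred - target) / np.linalg.norm(target)
print(f"relative L2 error: {rel_err:.3e}")
```

In a full implementation the solve would be expressed with a differentiable linear-algebra routine (e.g. `torch.linalg.lstsq`) so gradients flow through the solution back into the warping network's parameters.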
Related papers
- DInf-Grid: A Neural Differential Equation Solver with Differentiable Feature Grids [73.28614344779076]
We present a differentiable grid-based representation for efficiently solving differential equations (DEs). Our results demonstrate a 5-20x speed-up over coordinate-based methods, solving differential equations in seconds or minutes while maintaining comparable accuracy and compactness.
arXiv Detail & Related papers (2026-01-15T18:59:57Z) - EddyFormer: Accelerated Neural Simulations of Three-Dimensional Turbulence at Scale [15.20548942455541]
EddyFormer is a Transformer-based spectral-element architecture for large-scale turbulence simulation. It achieves DNS-level accuracy at 256^3 resolution, providing a 30x speedup over DNS. It preserves accuracy on physics-invariant metrics (energy spectra, correlation functions, and structure functions), showing domain generalization.
arXiv Detail & Related papers (2025-10-28T08:27:37Z) - Wave-PDE Nets: Trainable Wave-Equation Layers as an Alternative to Attention [0.0]
Wave-PDE Nets is a neural architecture whose elementary operation is a differentiable simulation of the second-order wave equation. A symplectic spectral solver based on FFTs realises this propagation in O(n log n) time. On language and vision benchmarks, Wave-PDE Nets match or exceed Transformer performance.
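A single spectral propagation step of the kind this summary describes can be sketched as follows: each Fourier mode of the wave equation u_tt = c^2 u_xx is advanced exactly, giving an energy-preserving update in O(n log n) via the FFT. This is a generic textbook construction offered for illustration, not the paper's implementation.

```python
import numpy as np

def wave_step(u, v, dt, c=1.0, length=2 * np.pi):
    # Advance (u, u_t) by dt using the exact per-mode solution of u_tt = c^2 u_xx.
    n = u.size
    k = 2 * np.pi * np.fft.fftfreq(n, d=length / n)   # wavenumbers
    w = c * np.abs(k)                                  # per-mode angular frequencies
    u_hat, v_hat = np.fft.fft(u), np.fft.fft(v)
    cos_wt = np.cos(w * dt)
    # sin(w*dt)/w, with the w=0 mode handled by its limit dt.
    sinc = np.where(w > 0, np.sin(w * dt) / np.where(w > 0, w, 1.0), dt)
    u_new = cos_wt * u_hat + sinc * v_hat
    v_new = -w * np.sin(w * dt) * u_hat + cos_wt * v_hat
    return np.fft.ifft(u_new).real, np.fft.ifft(v_new).real

n = 128
x = np.linspace(0, 2 * np.pi, n, endpoint=False)
u, v = np.sin(x), np.zeros(n)          # standing-wave initial condition
for _ in range(100):
    u, v = wave_step(u, v, dt=0.01)    # integrate to t = 1.0
# Analytic solution at t = 1.0 is sin(x) * cos(1.0).
err = np.max(np.abs(u - np.sin(x) * np.cos(1.0)))
print(f"max error vs analytic solution: {err:.2e}")
```

Because each mode is rotated exactly in phase space, the update conserves the discrete wave energy, which is what makes this family of spectral steps attractive as a trainable layer.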
arXiv Detail & Related papers (2025-10-05T17:52:52Z) - Convergence of physics-informed neural networks modeling time-harmonic wave fields [0.0]
We study 3D room acoustic cases at low frequency, varying the source definition and the number of boundary condition sets. We assess the convergence behavior by looking at the loss landscape of the PINN architecture. The developments are part of an initiative aiming to model the low-frequency behavior of room acoustics, including absorbers.
arXiv Detail & Related papers (2025-05-18T19:12:14Z) - Point Cloud Denoising With Fine-Granularity Dynamic Graph Convolutional Networks [58.050130177241186]
Noise perturbations often corrupt 3-D point clouds, hindering downstream tasks such as surface reconstruction, rendering, and further processing.
This paper introduces fine-granularity dynamic graph convolutional networks, called GDGCN, a novel approach to denoising 3-D point clouds.
arXiv Detail & Related papers (2024-11-21T14:19:32Z) - Geometric Algebra Planes: Convex Implicit Neural Volumes [70.12234371845445]
We show that GA-Planes is equivalent to a sparse low-rank factor plus low-resolution matrix.
We also show that GA-Planes can be adapted for many existing representations.
arXiv Detail & Related papers (2024-11-20T18:21:58Z) - Equivariant Graph Neural Operator for Modeling 3D Dynamics [148.98826858078556]
We propose the Equivariant Graph Neural Operator (EGNO) to directly model dynamics as trajectories instead of just next-step prediction.
EGNO explicitly learns the temporal evolution of 3D dynamics where we formulate the dynamics as a function over time and learn neural operators to approximate it.
Comprehensive experiments in multiple domains, including particle simulations, human motion capture, and molecular dynamics, demonstrate the significantly superior performance of EGNO against existing methods.
arXiv Detail & Related papers (2024-01-19T21:50:32Z) - Geometry-Informed Neural Operator for Large-Scale 3D PDEs [76.06115572844882]
We propose the geometry-informed neural operator (GINO) to learn the solution operator of large-scale partial differential equations.
We successfully trained GINO to predict the pressure on car surfaces using only five hundred data points.
arXiv Detail & Related papers (2023-09-01T16:59:21Z) - Differentiable Turbulence: Closure as a partial differential equation constrained optimization [1.8749305679160366]
We leverage the concept of differentiable turbulence, whereby an end-to-end differentiable solver is used in combination with physics-inspired choices of deep learning architectures.
We show that the differentiable physics paradigm is more successful than offline, a priori learning, and that hybrid solver-in-the-loop approaches to deep learning offer an ideal balance between computational efficiency, accuracy, and generalization.
arXiv Detail & Related papers (2023-07-07T15:51:55Z) - Neural Astrophysical Wind Models [0.0]
We show that deep neural networks embedded as individual terms in the governing coupled ordinary differential equations (ODEs) can robustly discover both of these physics.
We optimize a loss function based on the Mach number, rather than the three explicitly solved-for conserved variables, and apply a penalty term to near-diverging solutions.
This work further highlights the feasibility of neural ODEs as a promising discovery tool with mechanistic interpretability for non-linear inverse problems.
arXiv Detail & Related papers (2023-06-20T16:37:57Z) - Learning Deformable Tetrahedral Meshes for 3D Reconstruction [78.0514377738632]
3D shape representations that accommodate learning-based 3D reconstruction are an open problem in machine learning and computer graphics.
Previous work on neural 3D reconstruction demonstrated benefits, but also limitations, of point cloud, voxel, surface mesh, and implicit function representations.
We introduce Deformable Tetrahedral Meshes (DefTet) as a particular parameterization that utilizes volumetric tetrahedral meshes for the reconstruction problem.
arXiv Detail & Related papers (2020-11-03T02:57:01Z)
This list is automatically generated from the titles and abstracts of the papers in this site.