Hybrid Iterative Solvers with Geometry-Aware Neural Preconditioners for Parametric PDEs
- URL: http://arxiv.org/abs/2512.14596v1
- Date: Tue, 16 Dec 2025 17:06:10 GMT
- Title: Hybrid Iterative Solvers with Geometry-Aware Neural Preconditioners for Parametric PDEs
- Authors: Youngkyu Lee, Francesc Levrero Florencio, Jay Pathak, George Em Karniadakis
- Abstract summary: We introduce Geo-DeepONet, a geometry-aware deep operator network that incorporates domain information extracted from finite element discretizations. We develop a class of geometry-aware hybrid preconditioned iterative solvers by coupling Geo-DeepONet with traditional methods such as relaxation schemes and Krylov subspace algorithms.
- Score: 5.532017361572708
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: The convergence behavior of classical iterative solvers for parametric partial differential equations (PDEs) is often highly sensitive to the domain and the specific discretization of the PDEs. Previously, we introduced hybrid solvers by combining classical solvers with neural operators for a specific geometry [1], but they tend to under-perform on geometries not encountered during training. To address this challenge, we introduce Geo-DeepONet, a geometry-aware deep operator network that incorporates domain information extracted from finite element discretizations. Geo-DeepONet enables accurate operator learning across arbitrary unstructured meshes without requiring retraining. Building on this, we develop a class of geometry-aware hybrid preconditioned iterative solvers by coupling Geo-DeepONet with traditional methods such as relaxation schemes and Krylov subspace algorithms. Through numerical experiments on parametric PDEs posed over diverse unstructured domains, we demonstrate the enhanced robustness and efficiency of the proposed hybrid solvers for multiple real-world applications.
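The hybrid strategy described in the abstract, coupling a learned operator with a classical relaxation scheme inside a Krylov method, can be illustrated with a minimal sketch. The code below is not Geo-DeepONet: the `neural_surrogate` is a low-rank spectral approximation of the inverse stiffness matrix that stands in for a trained operator network, and all function names are hypothetical. It combines one Jacobi-style sweep with the surrogate to form a symmetric positive definite preconditioner for conjugate gradients.

```python
import numpy as np

def laplacian_1d(n):
    # Tridiagonal stiffness matrix from a uniform 1D finite element discretization
    return 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)

def neural_surrogate(A, k=4):
    # Placeholder for a trained neural operator: a rank-k spectral
    # approximation of A^{-1} built from the k smallest eigenpairs.
    lam, V = np.linalg.eigh(A)          # eigenvalues in ascending order
    Vk = V[:, :k]
    return lambda r: Vk @ ((Vk.T @ r) / lam[:k])

def hybrid_pcg(A, b, apply_nn, tol=1e-10, maxiter=200):
    # Preconditioned CG; the preconditioner adds a Jacobi (diagonal)
    # sweep to the neural-surrogate correction, keeping it SPD.
    d = np.diag(A)
    precond = lambda r: r / d + apply_nn(r)
    x = np.zeros_like(b)
    r = b - A @ x
    z = precond(r)
    p = z.copy()
    rz = r @ z
    for it in range(maxiter):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol * np.linalg.norm(b):
            return x, it + 1
        z = precond(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, maxiter

n = 64
A = laplacian_1d(n)
b = np.ones(n)
x, iters = hybrid_pcg(A, b, neural_surrogate(A))
print(iters, np.linalg.norm(A @ x - b))
```

Because the surrogate damps exactly the low-frequency modes that dominate the Laplacian's condition number, the hybrid iteration converges in far fewer steps than unpreconditioned CG; in the paper's setting, the geometry-aware operator plays this role across unseen meshes.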
Related papers
- Structure-Preserving Learning Improves Geometry Generalization in Neural PDEs [7.60216127875876]
We introduce General-Geometry Neural Whitney Forms (Geo-NeW): a data-driven finite element method. We demonstrate state-of-the-art performance on several steady-state PDE benchmarks, and provide a significant improvement over conventional baselines on out-of-distribution geometries.
arXiv Detail & Related papers (2026-02-02T20:45:07Z)
- Expanding the Chaos: Neural Operator for Stochastic (Partial) Differential Equations [65.80144621950981]
We build on Wiener chaos expansions (WCE) to design neural operator (NO) architectures for SPDEs and SDEs. We show that WCE-based neural operators provide a practical and scalable way to learn SDE/SPDE solution operators.
arXiv Detail & Related papers (2026-01-03T00:59:25Z)
- Graph Neural Regularizers for PDE Inverse Problems [62.49743146797144]
We present a framework for solving a broad class of ill-posed inverse problems governed by partial differential equations (PDEs). The forward problem is numerically solved using the finite element method (FEM). We employ physics-inspired graph neural networks as learned regularizers, providing a robust, interpretable, and generalizable alternative to standard approaches.
arXiv Detail & Related papers (2025-10-23T21:43:25Z)
- Geometry aware inference of steady state PDEs using Equivariant Neural Fields representations [0.30786914102688595]
We introduce enf2enf, a neural field approach for predicting steady-state PDEs with geometric variability. Our method encodes geometries into latent features anchored at specific spatial locations, preserving locality throughout the network.
arXiv Detail & Related papers (2025-04-24T08:30:32Z)
- Bridging Geometric States via Geometric Diffusion Bridge [79.60212414973002]
We introduce the Geometric Diffusion Bridge (GDB), a novel generative modeling framework that accurately bridges initial and target geometric states.
GDB employs an equivariant diffusion bridge, derived from a modified version of Doob's $h$-transform, to connect geometric states.
We show that GDB surpasses existing state-of-the-art approaches, opening up a new pathway for accurately bridging geometric states.
arXiv Detail & Related papers (2024-10-31T17:59:53Z)
- A hybrid numerical methodology coupling Reduced Order Modeling and Graph Neural Networks for non-parametric geometries: applications to structural dynamics problems [0.0]
This work introduces a new approach for accelerating the numerical analysis of time-domain partial differential equations (PDEs) governing complex physical systems.
The methodology is based on a combination of a classical reduced-order modeling (ROM) framework and recently introduced Graph Neural Networks (GNNs).
arXiv Detail & Related papers (2024-06-03T08:51:25Z)
- Adaptive Surface Normal Constraint for Geometric Estimation from Monocular Images [56.86175251327466]
We introduce a novel approach to learn geometries such as depth and surface normal from images while incorporating geometric context.
Our approach extracts geometric context that encodes the geometric variations present in the input image and correlates depth estimation with geometric constraints.
Our method unifies depth and surface normal estimations within a cohesive framework, which enables the generation of high-quality 3D geometry from images.
arXiv Detail & Related papers (2024-02-08T17:57:59Z)
- Transolver: A Fast Transformer Solver for PDEs on General Geometries [66.82060415622871]
We present Transolver, which learns intrinsic physical states hidden behind discretized geometries.
By computing attention over physics-aware tokens encoded from slices, Transolver can effectively capture intricate physical correlations.
Transolver achieves consistent state-of-the-art with 22% relative gain across six standard benchmarks and also excels in large-scale industrial simulations.
arXiv Detail & Related papers (2024-02-04T06:37:38Z)
- Physics-informed neural networks for transformed geometries and manifolds [0.0]
We propose a novel method for integrating geometric transformations within PINNs to robustly accommodate geometric variations.
We demonstrate the enhanced flexibility over traditional PINNs, especially under geometric variations.
The proposed framework presents an outlook for training deep neural operators over parametrized geometries.
arXiv Detail & Related papers (2023-11-27T15:47:33Z)
- Operator Learning with Neural Fields: Tackling PDEs on General Geometries [15.65577053925333]
Machine learning approaches for solving partial differential equations require learning mappings between function spaces.
The new CORAL method leverages coordinate-based networks to solve PDEs on general geometries.
arXiv Detail & Related papers (2023-06-12T17:52:39Z)
- Solving High-Dimensional PDEs with Latent Spectral Models [74.1011309005488]
We present Latent Spectral Models (LSM) toward an efficient and precise solver for high-dimensional PDEs.
Inspired by classical spectral methods in numerical analysis, we design a neural spectral block to solve PDEs in the latent space.
LSM achieves consistent state-of-the-art and yields a relative gain of 11.5% averaged on seven benchmarks.
arXiv Detail & Related papers (2023-01-30T04:58:40Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.