Learning Singularity-Encoded Green's Functions with Application to Iterative Methods
- URL: http://arxiv.org/abs/2509.11580v1
- Date: Mon, 15 Sep 2025 04:53:22 GMT
- Title: Learning Singularity-Encoded Green's Functions with Application to Iterative Methods
- Authors: Qi Sun, Shengyan Li, Bowen Zheng, Lili Ju, Xuejun Xu
- Abstract summary: Green's function provides an inherent connection between theoretical analysis and numerical methods for elliptic partial differential equations. Numerical computation of Green's function remains challenging due to its doubled dimensionality and intrinsic singularity. We present a novel singularity-encoded learning approach to resolve these problems in an unsupervised fashion.
- Score: 10.746390638014956
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Green's function provides an inherent connection between theoretical analysis and numerical methods for elliptic partial differential equations, and the general absence of a closed-form expression necessitates surrogate modeling to guide the design of effective solvers. Unfortunately, numerical computation of Green's function remains challenging due to its doubled dimensionality and intrinsic singularity. In this paper, we present a novel singularity-encoded learning approach to resolve these problems in an unsupervised fashion. Our method embeds the Green's function within a one-order higher-dimensional space by encoding its prior estimate as an augmented variable, followed by a neural network parametrization to manage the increased dimensionality. By projecting the trained neural network solution back onto the original domain, our deep surrogate model exploits its spectral bias to accelerate conventional iterative schemes, serving either as a preconditioner or as part of a hybrid solver. The effectiveness of our proposed method is empirically verified through numerical experiments with two- and four-dimensional Green's functions, achieving satisfactory resolution of singularities and acceleration of iterative solvers.
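The augmented-variable idea in the abstract can be sketched concretely. The toy below (hypothetical names; an untrained random network standing in for the paper's parametrization, whose architecture and loss are not reproduced here) encodes a prior singularity estimate, taken here as the free-space Green's function of the 2D Laplacian, as an extra input coordinate, lifting the 4D kernel into a 5D learning problem:

```python
import numpy as np

# Free-space Green's function of the 2D Laplacian: used as the prior
# singularity estimate that becomes the augmented input variable.
def singular_prior(x, y):
    r = np.linalg.norm(x - y, axis=-1)
    return np.log(r) / (2.0 * np.pi)

# Hypothetical tiny MLP standing in for the paper's network
# (random weights only; no training is performed in this sketch).
rng = np.random.default_rng(0)
W1 = rng.standard_normal((5, 16)) * 0.3   # input: (x1, x2, y1, y2, E)
b1 = np.zeros(16)
W2 = rng.standard_normal((16, 1)) * 0.3
b2 = np.zeros(1)

def green_surrogate(x, y):
    # Encode the prior estimate E(x, y) as a fifth input coordinate.
    e = singular_prior(x, y)
    z = np.concatenate([x, y, e[..., None]], axis=-1)
    h = np.tanh(z @ W1 + b1)
    return (h @ W2 + b2).squeeze(-1)

x = np.array([[0.2, 0.3]])
y = np.array([[0.7, 0.6]])
val = green_surrogate(x, y)  # evaluation away from the diagonal x = y
```

Projecting back onto the original domain then amounts to evaluating the trained network at the prior's value, so the singular behavior is carried by the encoded coordinate rather than learned from scratch.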
Related papers
- Neural Green's Functions [26.725858777761506]
We introduce a neural solution operator for linear partial differential equations (PDEs) whose differential operators admit eigendecompositions. Inspired by Green's functions, we design Neural Green's Function to imitate their behavior, achieving superior generalization across diverse irregular geometries and source and boundary functions.
arXiv Detail & Related papers (2025-11-02T09:08:01Z)
- An explainable operator approximation framework under the guideline of Green's function [1.1174586184779578]
We introduce a novel framework, GreensONet, to learn embedded Green's functions and solve PDEs via Green's integral formulation. The framework's accuracy and generalization ability surpass those of existing methods.
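The Green's integral formulation this framework relies on can be illustrated with a classical closed-form case (not GreensONet's learned kernel): for −u″ = f on (0, 1) with u(0) = u(1) = 0, the exact Green's function is G(x, y) = x(1−y) for x ≤ y and y(1−x) otherwise, and the solution is recovered by quadrature of u(x) = ∫ G(x, y) f(y) dy:

```python
import numpy as np

# Exact Green's function of -u'' = f on (0,1) with u(0) = u(1) = 0;
# a learned surrogate would approximate this kernel.
def G(x, y):
    return np.where(x <= y, x * (1.0 - y), y * (1.0 - x))

# Green's integral formulation u(x) = ∫_0^1 G(x, y) f(y) dy,
# evaluated by midpoint quadrature for f ≡ 1 (exact: u = x(1-x)/2).
N = 2000
y = (np.arange(N) + 0.5) / N          # midpoint quadrature nodes
f = np.ones(N)
x = np.linspace(0.0, 1.0, 101)
u = G(x[:, None], y[None, :]) @ f / N # quadrature over y for each x
u_exact = x * (1.0 - x) / 2.0
err = np.max(np.abs(u - u_exact))
```

Once the kernel is available, any right-hand side f is handled by the same integral, which is why learning G once amortizes over many source terms.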
arXiv Detail & Related papers (2024-12-21T14:31:03Z)
- Neural Control Variates with Automatic Integration [49.91408797261987]
This paper proposes a novel approach to construct learnable parametric control variates functions from arbitrary neural network architectures.
We use the network to approximate the anti-derivative of the integrand.
We apply our method to solve partial differential equations using the Walk-on-sphere algorithm.
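The antiderivative trick can be shown with a hand-built example (a fixed Taylor polynomial rather than a trained network, so names and choices here are illustrative): if F′ ≈ f, then ∫ f = F(b) − F(a) exactly for the model part, and Monte Carlo only has to estimate the low-variance residual f − F′:

```python
import numpy as np

# Integrand f = exp on [0, 1]; F is an approximate antiderivative
# (antiderivative of the cubic Taylor polynomial of exp), g = F'.
f = np.exp
F = lambda x: x + x**2 / 2 + x**3 / 6 + x**4 / 24
g = lambda x: 1 + x + x**2 / 2 + x**3 / 6

rng = np.random.default_rng(0)
u = rng.uniform(0.0, 1.0, 10_000)

plain = f(u).mean()                              # plain Monte Carlo
cv = (F(1.0) - F(0.0)) + (f(u) - g(u)).mean()    # control-variate estimate
true = np.e - 1.0

res_std = (f(u) - g(u)).std()   # residual spread drives the CV error
raw_std = f(u).std()            # integrand spread drives the plain error
```

The variance of the residual f − g is far smaller than that of f itself, which is the whole benefit of differentiating a learned antiderivative instead of integrating the integrand directly.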
arXiv Detail & Related papers (2024-09-23T06:04:28Z)
- Physics-Informed Generator-Encoder Adversarial Networks with Latent Space Matching for Stochastic Differential Equations [14.999611448900822]
We propose a new class of physics-informed neural networks to address the challenges posed by forward, inverse, and mixed problems in differential equations.
Our model consists of two key components: the generator and the encoder, both updated alternately by gradient descent.
In contrast to previous approaches, we employ an indirect matching that operates within the lower-dimensional latent feature space.
arXiv Detail & Related papers (2023-11-03T04:29:49Z)
- Differentiable DG with Neural Operator Source Term Correction [0.0]
We introduce an end-to-end differentiable framework for solving the compressible Navier-Stokes equations. This integrated approach combines a differentiable discontinuous Galerkin solver with a neural network source term. We demonstrate the performance of the proposed framework through two examples.
arXiv Detail & Related papers (2023-10-29T04:26:23Z)
- A Deep Unrolling Model with Hybrid Optimization Structure for Hyperspectral Image Deconvolution [50.13564338607482]
We propose a novel optimization framework for the hyperspectral deconvolution problem, called DeepMix. It consists of three distinct modules: a data consistency module, a module that enforces the effect of the handcrafted regularizers, and a denoising module. This work proposes a context-aware denoising module designed to sustain the advancements achieved by the cooperative efforts of the other modules.
arXiv Detail & Related papers (2023-06-10T08:25:16Z)
- A Stable and Scalable Method for Solving Initial Value PDEs with Neural Networks [52.5899851000193]
We show that current methods based on this approach suffer from two key issues. First, following the ODE produces uncontrolled growth in the conditioning of the problem, ultimately leading to unacceptably large numerical errors. We develop an ODE-based IVP solver that prevents the network from becoming ill-conditioned and runs in time linear in the number of parameters.
arXiv Detail & Related papers (2023-04-28T17:28:18Z)
- Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
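The claim that message passing representationally contains finite differences is easy to check on a 1D chain graph (a minimal sketch, not the paper's learned solver): messages m_ij = (u_j − u_i)/h², summed over the two neighbors of each node, reproduce the standard second-order Laplacian stencil exactly:

```python
import numpy as np

n, h = 64, 1.0 / 63
x = np.linspace(0.0, 1.0, n)
u = np.sin(np.pi * x)

def mp_laplacian(u, h):
    # One message-passing step: each interior node sums messages
    # m_ij = (u_j - u_i) / h**2 from its left and right neighbors.
    lap = np.zeros_like(u)
    for i in range(1, len(u) - 1):
        lap[i] = ((u[i - 1] - u[i]) + (u[i + 1] - u[i])) / h**2
    return lap

def fd_laplacian(u, h):
    # Classical second-order central finite-difference stencil.
    lap = np.zeros_like(u)
    lap[1:-1] = (u[:-2] - 2 * u[1:-1] + u[2:]) / h**2
    return lap

same = np.allclose(mp_laplacian(u, h), fd_laplacian(u, h))
```

A learned message function can thus fall back to this classical stencil, while retaining the freedom to represent richer, data-driven updates.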
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
- Learning Green's Functions of Linear Reaction-Diffusion Equations with Application to Fast Numerical Solver [9.58037674226622]
We propose a novel neural network, GF-Net, for learning the Green's functions of linear reaction-diffusion equations in an unsupervised fashion.
The proposed method overcomes the challenges for finding the Green's functions of the equations on arbitrary domains.
arXiv Detail & Related papers (2021-05-23T23:36:46Z)
- MetaSDF: Meta-learning Signed Distance Functions [85.81290552559817]
Generalizing across shapes with neural implicit representations amounts to learning priors over the respective function space.
We formalize learning of a shape space as a meta-learning problem and leverage gradient-based meta-learning algorithms to solve this task.
arXiv Detail & Related papers (2020-06-17T05:14:53Z)
- Cogradient Descent for Bilinear Optimization [124.45816011848096]
We introduce a Cogradient Descent algorithm (CoGD) to address the bilinear problem.
We solve one variable by considering its coupling relationship with the other, leading to a synchronous gradient descent.
Our algorithm is applied to solve problems in which one variable is subject to a sparsity constraint.
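The coupled-update idea can be sketched on a bilinear least-squares objective ‖M − u vᵀ‖²; the snippet below is a simplified stand-in for CoGD (it updates both factors synchronously from gradients evaluated at the other's current value, but does not reproduce the paper's exact projection or sparsity handling):

```python
import numpy as np

rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(5), rng.standard_normal(4))  # rank-1 target
u = rng.standard_normal(5) * 0.1
v = rng.standard_normal(4) * 0.1

lr = 0.05
loss0 = np.linalg.norm(M - np.outer(u, v)) ** 2
for _ in range(500):
    R = M - np.outer(u, v)            # shared residual couples the factors
    gu = -2.0 * R @ v                 # d loss / d u, given current v
    gv = -2.0 * R.T @ u               # d loss / d v, given current u
    u, v = u - lr * gu, v - lr * gv   # synchronous update of both factors
loss1 = np.linalg.norm(M - np.outer(u, v)) ** 2
```

Because the target is exactly rank-1, the synchronous updates drive the residual toward zero; alternating one factor at a time would ignore the coupling that the shared residual R makes explicit.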
arXiv Detail & Related papers (2020-06-16T13:41:54Z)
- Method of spectral Green functions in driven open quantum dynamics [77.34726150561087]
A novel method based on spectral Green functions is presented for the simulation of driven open quantum dynamics.
The formalism shows remarkable analogies to the use of Green functions in quantum field theory.
The method dramatically reduces computational cost compared with simulations based on solving the full master equation.
arXiv Detail & Related papers (2020-06-04T09:41:08Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.