Variational Green's Functions for Volumetric PDEs
- URL: http://arxiv.org/abs/2602.12349v1
- Date: Thu, 12 Feb 2026 19:12:44 GMT
- Title: Variational Green's Functions for Volumetric PDEs
- Authors: Joao Teixeira, Eitan Grinspun, Otman Benchekroun
- Abstract summary: We present a method that learns a smooth, differentiable representation of the Green's function for linear self-adjoint PDE operators. To resolve the sharp singularities characteristic of Green's functions, our method decomposes the Green's function into an analytic free-space component and a learned corrector component. The resulting Green's functions are fast to evaluate, differentiable with respect to source application, and can be conditioned on other signals parameterizing our geometry.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Green's functions characterize the fundamental solutions of partial differential equations; they are essential for tasks ranging from shape analysis to physical simulation, yet they remain computationally prohibitive to evaluate on arbitrary geometric discretizations. We present Variational Green's Function (VGF), a method that learns a smooth, differentiable representation of the Green's function for linear self-adjoint PDE operators, including the Poisson, the screened Poisson, and the biharmonic equations. To resolve the sharp singularities characteristic of Green's functions, our method decomposes the Green's function into an analytic free-space component and a learned corrector component. Our method leverages a variational foundation to impose Neumann boundary conditions naturally, and imposes Dirichlet boundary conditions via a projective layer on the output of the neural field. The resulting Green's functions are fast to evaluate, differentiable with respect to source application, and can be conditioned on other signals parameterizing our geometry.
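The singularity-handling decomposition described in the abstract can be sketched in a few lines. This is an illustrative stub, not the authors' implementation: the `corrector` function stands in for the learned neural field (here it returns zero, i.e., the pure free-space solution), and all names are hypothetical. The analytic part uses the standard free-space Green's function of the 3D Poisson equation, G0(x, y) = 1 / (4π|x − y|).

```python
import numpy as np

def free_space_greens_poisson_3d(x, y):
    """Analytic free-space Green's function of the 3D Poisson equation,
    G0(x, y) = 1 / (4*pi*|x - y|).  This is the singular component that
    the decomposition factors out in closed form."""
    r = np.linalg.norm(np.asarray(x, dtype=float) - np.asarray(y, dtype=float))
    return 1.0 / (4.0 * np.pi * r)

def corrector(x, y):
    """Placeholder for the learned smooth corrector.  In the paper this is
    a neural field trained variationally to account for the boundary; the
    zero stub below reduces the sum to the free-space solution."""
    return 0.0

def greens_function(x, y):
    # Decomposition: singular analytic part + smooth learned part.
    return free_space_greens_poisson_3d(x, y) + corrector(x, y)
```

Because the singular behavior lives entirely in the analytic term, the learned component only has to represent a smooth correction, which is a much easier target for a neural field.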
Related papers
- DInf-Grid: A Neural Differential Equation Solver with Differentiable Feature Grids [73.28614344779076]
We present a differentiable grid-based representation for efficiently solving differential equations (DEs). Our results demonstrate a 5-20x speed-up over coordinate-based methods, solving differential equations in seconds or minutes while maintaining comparable accuracy and compactness.
arXiv Detail & Related papers (2026-01-15T18:59:57Z) - Neural Green's Functions [26.725858777761506]
We introduce a neural solution operator for linear partial differential equations (PDEs) whose differential operators admit eigendecompositions. Inspired by Green's functions, we design Neural Green's Function to imitate their behavior, achieving superior generalization across diverse irregular geometries and source and boundary functions.
arXiv Detail & Related papers (2025-11-02T09:08:01Z) - Governing Equation Discovery from Data Based on Differential Invariants [52.2614860099811]
We propose a pipeline for governing equation discovery based on differential invariants. Specifically, we compute the set of differential invariants corresponding to the infinitesimal generators of the symmetry group. Taking DI-SINDy as an example, we demonstrate that its success rate and accuracy in PDE discovery surpass those of other symmetry-informed governing equation discovery methods.
arXiv Detail & Related papers (2025-05-24T17:19:02Z) - Covariant non-perturbative pointer variables for quantum fields [44.99833362998488]
We derive and renormalize the integro-differential equation that governs the detector pointer-variable dynamics. Our formal solution, expressed in terms of Green's functions, allows for the covariant and causal analysis of induced observables on the field.
arXiv Detail & Related papers (2025-02-03T11:53:31Z) - An explainable operator approximation framework under the guideline of Green's function [1.1174586184779578]
We introduce a novel framework, GreensONet, to learn embedded Green's functions and solve PDEs via Green's integral formulation. The framework's accuracy and generalization ability surpass those of existing methods.
arXiv Detail & Related papers (2024-12-21T14:31:03Z) - DimINO: Dimension-Informed Neural Operator Learning [41.37905663176428]
DimINO is a framework inspired by dimensional analysis. It can be seamlessly integrated into existing neural operator architectures. It achieves up to 76.3% performance gain on PDE datasets.
arXiv Detail & Related papers (2024-10-08T10:48:50Z) - Learning Domain-Independent Green's Function For Elliptic Partial Differential Equations [0.0]
Green's function characterizes a partial differential equation (PDE) and maps its solution in the entire domain as integrals.
We propose a novel boundary integral network to learn the domain-independent Green's function, referred to as BIN-G.
We demonstrate that our numerical scheme enables fast training and accurate evaluation of the Green's function for PDEs with variable coefficients.
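The Green's integral formulation that these works build on, u(x) = ∫ G(x, y) f(y) dy, can be illustrated numerically with an operator whose free-space Green's function is known in closed form. The sketch below (function names are illustrative, not from any of the listed papers) uses the 1D screened Poisson operator -u'' + k²u = f on the real line, whose Green's function is G(x, y) = exp(-k|x − y|) / (2k); for the constant source f = 1 the exact solution is u = 1/k².

```python
import numpy as np

K = 2.0  # screening parameter of the operator -u'' + k^2 u = f

def greens_1d_screened_poisson(x, y, k=K):
    """Free-space Green's function of the 1D screened Poisson operator:
    G(x, y) = exp(-k|x - y|) / (2k)."""
    return np.exp(-k * np.abs(x - y)) / (2.0 * k)

def solve_via_greens_integral(f, x, ys):
    """Approximate u(x) = integral of G(x, y) f(y) dy by trapezoidal
    quadrature on the nodes `ys` (assumed uniformly spaced)."""
    vals = greens_1d_screened_poisson(x, ys) * f(ys)
    dy = ys[1] - ys[0]
    return dy * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

# Constant source f = 1: the exact solution is u = 1/k^2 = 0.25.
ys = np.linspace(-20.0, 20.0, 4001)
u0 = solve_via_greens_integral(lambda y: np.ones_like(y), 0.0, ys)
```

Once G is available (analytically, numerically, or as a learned representation), solving the PDE for any new source f reduces to this quadrature, which is what makes fast-to-evaluate Green's functions attractive as solution operators.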
arXiv Detail & Related papers (2024-01-30T17:00:22Z) - Data-driven discovery of Green's functions [0.0]
This thesis introduces theoretical results and deep learning algorithms to learn Green's functions associated with linear partial differential equations.
The construction connects the fields of PDE learning and numerical linear algebra.
Rational neural networks (NNs) are introduced and consist of neural networks with trainable rational activation functions.
arXiv Detail & Related papers (2022-10-28T09:41:50Z) - Learning differentiable solvers for systems with hard constraints [48.54197776363251]
We introduce a practical method to enforce partial differential equation (PDE) constraints for functions defined by neural networks (NNs).
We develop a differentiable PDE-constrained layer that can be incorporated into any NN architecture.
Our results show that incorporating hard constraints directly into the NN architecture achieves much lower test error when compared to training on an unconstrained objective.
arXiv Detail & Related papers (2022-07-18T15:11:43Z) - Optimal Neural Network Approximation of Wasserstein Gradient Direction via Convex Optimization [43.6961980403682]
The computation of Wasserstein gradient direction is essential for posterior sampling problems and scientific computing.
We study the variational problem in the family of two-layer networks with squared-ReLU activations, towards which we derive a semi-definite programming (SDP) relaxation.
This SDP can be viewed as an approximation of the Wasserstein gradient in a broader function family including two-layer networks.
arXiv Detail & Related papers (2022-05-26T00:51:12Z) - BI-GreenNet: Learning Green's functions by boundary integral network [14.008606361378149]
Green's function plays a significant role in both theoretical analysis and numerical computing of partial differential equations.
We develop a new method for computing Green's function with high accuracy.
arXiv Detail & Related papers (2022-04-28T01:42:35Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.