An Operator-Consistent Graph Neural Network for Learning Diffusion Dynamics on Irregular Meshes
- URL: http://arxiv.org/abs/2512.11860v1
- Date: Fri, 05 Dec 2025 06:58:25 GMT
- Title: An Operator-Consistent Graph Neural Network for Learning Diffusion Dynamics on Irregular Meshes
- Authors: Yuelian Li, Andrew Rushing Hands
- Abstract summary: Multiphysics interactions such as diffusion, damage, and healing often take place on irregular meshes. We develop an operator-consistent graph neural network (OCGNN-PINN) that approximates PDE evolution under physics-informed constraints.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Classical numerical methods solve partial differential equations (PDEs) efficiently on regular meshes, but many of them become unstable on irregular domains. In practice, multiphysics interactions such as diffusion, damage, and healing often take place on irregular meshes. We develop an operator-consistent graph neural network (OCGNN-PINN) that approximates PDE evolution under physics-informed constraints. It couples node-edge message passing with a consistency loss enforcing the gradient-divergence relation through the graph incidence matrix, ensuring that discrete node and edge dynamics remain structurally coupled during temporal rollout. We evaluate the model on diffusion processes over physically driven evolving meshes and real-world scanned surfaces. The results show improved temporal stability and prediction accuracy compared with graph convolutional and multilayer perceptron baselines, approaching the performance of Crank-Nicolson solvers on unstructured domains.
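The abstract's central idea, coupling node and edge dynamics through the graph incidence matrix so that the discrete gradient-divergence relation holds, can be sketched on a tiny graph. This is a minimal illustration of the general construction, not the paper's implementation; the loss form, variable names, and the stand-in edge prediction are assumptions.

```python
import numpy as np

# Oriented incidence matrix B (num_nodes x num_edges) for a 3-node path graph:
# edge 0: node 0 -> node 1, edge 1: node 1 -> node 2.
B = np.array([[-1.0,  0.0],
              [ 1.0, -1.0],
              [ 0.0,  1.0]])

u = np.array([1.0, 4.0, 9.0])   # node field (e.g., a concentration)
grad_u = B.T @ u                 # discrete gradient lives on edges
div_grad_u = B @ grad_u          # discrete divergence maps back to nodes
L = B @ B.T                      # graph Laplacian, so div(grad u) = L u

# A consistency loss of the kind described would penalize predicted edge
# states e_hat that drift from the gradient of the predicted node states:
e_hat = grad_u + 0.1             # stand-in for a network's edge output
consistency_loss = np.mean((e_hat - B.T @ u) ** 2)

# Crank-Nicolson reference step for graph diffusion du/dt = -L u,
# the classical baseline the abstract compares against:
dt = 0.1
I = np.eye(3)
u_next = np.linalg.solve(I + 0.5 * dt * L, (I - 0.5 * dt * L) @ u)

print(grad_u)                           # [3. 5.]
print(np.allclose(div_grad_u, L @ u))   # True
print(round(consistency_loss, 4))       # 0.01
```

The identity `B @ B.T == L` is what makes the node and edge dynamics "structurally coupled": any edge field produced as a discrete gradient automatically has a divergence consistent with the Laplacian acting on the nodes.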
Related papers
- Latent Dynamics Graph Convolutional Networks for model order reduction of parameterized time-dependent PDEs [0.0]
We introduce the Latent Dynamics Graph Convolutional Network (LD-GCN), a purely data-driven, encoder-free architecture. LD-GCN learns a global, low-dimensional representation of dynamical systems conditioned on external inputs and parameters. Our framework enhances interpretability by enabling the analysis of the reduced dynamics and supporting zero-shot prediction.
arXiv Detail & Related papers (2026-01-16T13:10:00Z) - SPIKE: Sparse Koopman Regularization for Physics-Informed Neural Networks [0.0]
SPIKE is a framework that regularizes PINNs with continuous-time Koopman operators to learn parsimonious dynamics representations. Experiments across parabolic, hyperbolic, dispersive, and stiff PDEs, including fluid dynamics, demonstrate consistent improvements in temporal generalization, spatial extrapolation, and long-term prediction accuracy.
arXiv Detail & Related papers (2026-01-15T10:59:48Z) - Extended Physics Informed Neural Network for Hyperbolic Two-Phase Flow in Porous Media [0.7390960543869483]
This work employs the Extended Physics-Informed Neural Network (XPINN) framework to solve the nonlinear Buckley-Leverett equation. Coupling between subnetworks is achieved through the Rankine-Hugoniot jump condition, which enforces physically consistent flux continuity. Compared to standard PINNs, the XPINN framework achieves superior stability, faster convergence, and enhanced resolution of nonlinear wave dynamics.
arXiv Detail & Related papers (2025-11-05T14:16:28Z) - Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training [63.3991315762955]
Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. We propose the Fractional SPIKE Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics.
arXiv Detail & Related papers (2025-07-22T18:20:56Z) - PhyMPGN: Physics-encoded Message Passing Graph Network for spatiotemporal PDE systems [31.006807854698376]
We propose a new graph learning approach, namely, the Physics-encoded Message Passing Graph Network (PhyMPGN). We incorporate a GNN into a numerical integrator to approximate the temporal marching of spatiotemporal dynamics for a given PDE system. PhyMPGN is capable of accurately predicting various types of spatiotemporal dynamics on coarse unstructured meshes.
arXiv Detail & Related papers (2024-10-02T08:54:18Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have been effectively demonstrated in solving forward and inverse differential equation problems.
However, PINNs become trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - Momentum Diminishes the Effect of Spectral Bias in Physics-Informed
Neural Networks [72.09574528342732]
Physics-informed neural network (PINN) algorithms have shown promising results in solving a wide range of problems involving partial differential equations (PDEs).
They often fail to converge to desirable solutions when the target function contains high-frequency features, due to a phenomenon known as spectral bias.
In the present work, we exploit neural tangent kernels (NTKs) to investigate the training dynamics of PINNs evolving under stochastic gradient descent with momentum (SGDM).
arXiv Detail & Related papers (2022-06-29T19:03:10Z) - Multipole Graph Neural Operator for Parametric Partial Differential
Equations [57.90284928158383]
One of the main challenges in using deep learning-based methods for simulating physical systems is the formulation of physics-based data.
We propose a novel multi-level graph neural network framework that captures interaction at all ranges with only linear complexity.
Experiments confirm our multi-graph network learns discretization-invariant solution operators to PDEs and can be evaluated in linear time.
arXiv Detail & Related papers (2020-06-16T21:56:22Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
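The liquid time-constant idea above, a linear first-order dynamical system whose effective time constant is modulated by a nonlinear input gate, can be sketched as a single unit. This is an illustrative toy under assumed names and parameter values, not the paper's architecture.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ltc_step(x, inp, dt, tau=1.0, w=1.0, b=0.0, A=1.0):
    """One explicit-Euler step of an input-gated first-order system:
    dx/dt = -x / tau + gate(inp) * (A - x).
    The gate both drives the state toward A and shortens the effective
    time constant, which keeps the trajectory bounded between 0 and A."""
    gate = sigmoid(w * inp + b)       # nonlinear gate computed from the input
    dxdt = -x / tau + gate * (A - x)  # linear first-order dynamics in x
    return x + dt * dxdt

x = 0.0
for _ in range(100):
    x = ltc_step(x, inp=2.0, dt=0.05)
print(x)  # settles near gate / (1/tau + gate), strictly below A
```

With a constant input, the state converges to the fixed point `gate * A / (1/tau + gate)`, which illustrates the bounded, stable behavior the summary highlights.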
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.