TENG-BC: Unified Time-Evolving Natural Gradient for Neural PDE Solvers with General Boundary Conditions
- URL: http://arxiv.org/abs/2603.00397v1
- Date: Sat, 28 Feb 2026 01:03:22 GMT
- Title: TENG-BC: Unified Time-Evolving Natural Gradient for Neural PDE Solvers with General Boundary Conditions
- Authors: Hongjie Jiang, Di Luo
- Abstract summary: We introduce TENG-BC, a high-precision neural PDE solver based on the Time-Evolving Natural Gradient. At each time step, TENG-BC performs a boundary-aware optimization that jointly enforces interior dynamics and boundary conditions. This formulation admits a natural-gradient interpretation, enabling stable time evolution without delicate penalty tuning.
- Score: 6.258801939886893
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Accurately solving time-dependent partial differential equations (PDEs) with neural networks remains challenging due to long-time error accumulation and the difficulty of enforcing general boundary conditions. We introduce TENG-BC, a high-precision neural PDE solver based on the Time-Evolving Natural Gradient, designed to perform under general boundary constraints. At each time step, TENG-BC performs a boundary-aware optimization that jointly enforces interior dynamics and boundary conditions, accommodating Dirichlet, Neumann, Robin, and mixed types within a unified framework. This formulation admits a natural-gradient interpretation, enabling stable time evolution without delicate penalty tuning. Across benchmarks over diffusion, transport, and nonlinear PDEs with various boundary conditions, TENG-BC achieves solver-level accuracy under comparable sampling budgets, outperforming conventional solvers and physics-informed neural network (PINN) baselines.
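The abstract gives no implementation details, but the core idea (solve each time step as a single optimization that jointly matches interior dynamics and boundary data, where for a linear-in-parameters model the Gauss-Newton/natural-gradient step reduces to one least-squares solve) can be sketched on a toy 1D heat equation with Dirichlet conditions. Everything here (the RBF basis, collocation grid, boundary weight `w_bc`) is an illustrative assumption, not the authors' method:

```python
import numpy as np

def rbf_features(x, centers, s):
    """Gaussian RBF basis values and their second x-derivatives at points x."""
    d = x[:, None] - centers[None, :]
    phi = np.exp(-d**2 / (2 * s**2))
    phi_xx = ((d**2 - s**2) / s**4) * phi  # analytic d2/dx2 of each Gaussian
    return phi, phi_xx

def joint_step(theta, dt, x_int, x_bc, g_bc, centers, s, w_bc=10.0):
    """One explicit-Euler time step posed as a joint least-squares problem:
    interior rows push u_new toward u_old + dt * u_old_xx (heat equation),
    boundary rows push u_new toward the Dirichlet data g_bc.
    For this linear-in-parameters model the Gauss-Newton step is exact."""
    phi_i, phi_i_xx = rbf_features(x_int, centers, s)
    target = phi_i @ theta + dt * (phi_i_xx @ theta)
    phi_b, _ = rbf_features(x_bc, centers, s)
    A = np.vstack([phi_i, w_bc * phi_b])          # stacked design matrix
    b = np.concatenate([target, w_bc * g_bc])     # interior + boundary targets
    theta_new, *_ = np.linalg.lstsq(A, b, rcond=None)
    return theta_new

# Toy problem: u_t = u_xx on [0, 1], u(0) = u(1) = 0, u0(x) = sin(pi x);
# the exact solution decays by exp(-pi^2 t).
centers, s = np.linspace(0.0, 1.0, 15), 0.1
x_int = np.linspace(0.0, 1.0, 101)
x_bc, g_bc = np.array([0.0, 1.0]), np.zeros(2)

phi0, _ = rbf_features(x_int, centers, s)
theta, *_ = np.linalg.lstsq(phi0, np.sin(np.pi * x_int), rcond=None)

dt, n_steps = 1e-4, 100
for _ in range(n_steps):
    theta = joint_step(theta, dt, x_int, x_bc, g_bc, centers, s)

u_T = rbf_features(x_int, centers, s)[0] @ theta
u_exact = np.exp(-np.pi**2 * dt * n_steps) * np.sin(np.pi * x_int)
err = np.max(np.abs(u_T - u_exact))
```

Because the boundary rows sit inside the same least-squares system as the interior rows, the Dirichlet data is enforced by the step itself rather than by tuning a penalty schedule, which is the qualitative point of the abstract.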
Related papers
- TENG++: Time-Evolving Natural Gradient for Solving PDEs With Deep Neural Nets under General Boundary Conditions [0.5908471365011942]
Partial Differential Equations (PDEs) are central to modeling complex systems across physical, biological, and engineering domains. Traditional numerical methods often struggle with high-dimensional or complex problems. PINNs have emerged as an efficient alternative by embedding physics-based constraints into deep learning frameworks.
arXiv Detail & Related papers (2025-12-13T02:32:45Z) - BEKAN: Boundary condition-guaranteed evolutionary Kolmogorov-Arnold networks with radial basis functions for solving PDE problems [11.258825397319143]
We propose a boundary condition-guaranteed evolutionary Kolmogorov-Arnold Network (KAN) with radial basis functions (BEKAN). In BEKAN, we propose three distinct approaches for incorporating Dirichlet, periodic, and Neumann boundary conditions into the network. By virtue of the boundary-embedded RBFs, the periodic layer, and the evolutionary framework, we can perform accurate PDE simulations while rigorously enforcing boundary conditions.
arXiv Detail & Related papers (2025-10-03T23:57:23Z) - PINN-FEM: A Hybrid Approach for Enforcing Dirichlet Boundary Conditions in Physics-Informed Neural Networks [1.1060425537315088]
Physics-Informed Neural Networks (PINNs) solve partial differential equations (PDEs). We propose a hybrid approach, PINN-FEM, which combines PINNs with finite element methods (FEM) to impose strong Dirichlet boundary conditions via domain decomposition. This method incorporates FEM-based representations near the boundary, ensuring exact enforcement without compromising convergence.
arXiv Detail & Related papers (2025-01-14T00:47:15Z) - Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs). We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations. We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z) - RoPINN: Region Optimized Physics-Informed Neural Networks [66.38369833561039]
Physics-informed neural networks (PINNs) have been widely applied to solve partial differential equations (PDEs)
This paper proposes and theoretically studies a new training paradigm as region optimization.
A practical training algorithm, Region Optimized PINN (RoPINN), is seamlessly derived from this new paradigm.
arXiv Detail & Related papers (2024-05-23T09:45:57Z) - Neural Fields with Hard Constraints of Arbitrary Differential Order [61.49418682745144]
We develop a series of approaches for enforcing hard constraints on neural fields.
The constraints can be specified as a linear operator applied to the neural field and its derivatives.
Our approaches are demonstrated in a wide range of real-world applications.
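Several papers in this list enforce boundary data exactly rather than by penalty. One classic special case of hard enforcement (far simpler than the arbitrary-differential-order construction this paper describes) is the ansatz u(x) = g(x) + d(x) * N(x), where d vanishes on the boundary, so u matches the Dirichlet data for any network N. A minimal 1D sketch with made-up boundary values:

```python
import numpy as np

def hard_dirichlet(x, g, dist, raw):
    """Ansatz u(x) = g(x) + dist(x) * raw(x): since dist vanishes on the
    boundary, u matches the Dirichlet data g there exactly, for any raw."""
    return g(x) + dist(x) * raw(x)

# Hypothetical problem on [0, 1] with u(0) = 2 and u(1) = 5.
g = lambda x: 2.0 + 3.0 * x        # any smooth extension of the boundary data
dist = lambda x: x * (1.0 - x)     # vanishes exactly at both endpoints
raw = lambda x: np.sin(7.0 * x)    # stand-in for an unconstrained network

x = np.array([0.0, 0.5, 1.0])
u = hard_dirichlet(x, g, dist, raw)
# u[0] is exactly 2.0 and u[2] is exactly 5.0, regardless of raw
```

The interior value u(0.5) still depends on `raw`, so training only has to fit the PDE residual; the boundary condition holds by construction.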
arXiv Detail & Related papers (2023-06-15T08:33:52Z) - Convergence of mean-field Langevin dynamics: Time and space discretization, stochastic gradient, and variance reduction [49.66486092259376]
The mean-field Langevin dynamics (MFLD) is a nonlinear generalization of the Langevin dynamics that incorporates a distribution-dependent drift.
Recent works have shown that MFLD globally minimizes an entropy-regularized convex functional in the space of measures.
We provide a framework to prove a uniform-in-time propagation of chaos for MFLD that takes into account the errors due to finite-particle approximation, time-discretization, and gradient approximation.
arXiv Detail & Related papers (2023-06-12T16:28:11Z) - Deep NURBS -- Admissible Physics-informed Neural Networks [0.0]
We propose a new numerical scheme for physics-informed neural networks (PINNs) that enables precise and inexpensive solutions of partial differential equations (PDEs).
The proposed approach combines admissible NURBS parametrizations required to define the physical domain and the Dirichlet boundary conditions with a PINN solver.
arXiv Detail & Related papers (2022-10-25T10:35:45Z) - Tunable Complexity Benchmarks for Evaluating Physics-Informed Neural Networks on Coupled Ordinary Differential Equations [64.78260098263489]
In this work, we assess the ability of physics-informed neural networks (PINNs) to solve increasingly-complex coupled ordinary differential equations (ODEs)
We show that PINNs eventually fail to produce correct solutions to these benchmarks as their complexity increases.
We identify several reasons why this may be the case, including insufficient network capacity, poor conditioning of the ODEs, and high local curvature, as measured by the Laplacian of the PINN loss.
arXiv Detail & Related papers (2022-10-14T15:01:32Z) - Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
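The dimension-reduction claim above has a simple shape: encode the full state once, step a small latent state many times, decode once, so per-step cost scales with the latent size rather than the full state. A toy picture with all dimensions and operators invented for illustration (real LE-PDE learns the encoder, decoder, and latent dynamics):

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical sizes: full state of dimension 128, latent of dimension 8.
E = rng.normal(size=(8, 128)) / np.sqrt(128)  # stand-in "learned" encoder
D = np.linalg.pinv(E)                         # stand-in "learned" decoder
A = 0.9 * np.eye(8)                           # stand-in latent dynamics

def latent_rollout(u0, steps):
    """Encode once, evolve in the low-dimensional latent space, decode once;
    each step costs O(latent^2) instead of O(full^2)."""
    z = E @ u0
    for _ in range(steps):
        z = A @ z
    return D @ z

u0 = rng.normal(size=128)
u3 = latent_rollout(u0, 3)
```

With these linear stand-ins the rollout is just D @ (A^3 @ (E @ u0)); the learning objectives mentioned in the abstract exist precisely to make such a latent rollout track the true PDE dynamics over long horizons.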
arXiv Detail & Related papers (2022-06-15T17:31:24Z) - Physics-Embedded Neural Networks: Graph Neural PDE Solvers with Mixed Boundary Conditions [3.04585143845864]
Graph neural networks (GNNs) are a promising approach to learning and predicting physical phenomena.
We present a physics-embedded GNN that considers boundary conditions and predicts the state after a long time.
Our model can be a useful standard for realizing reliable, fast, and accurate GNN-based PDE solvers.
arXiv Detail & Related papers (2022-05-24T09:17:27Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z)
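The claim that message passing solvers representationally contain finite differences can be checked directly: a linear message function on a chain graph reproduces the standard three-point Laplacian stencil. A sketch with hand-picked weights (the paper's architecture learns these components instead):

```python
import numpy as np

def message_passing_step(u, edges, msg_w, upd_w):
    """One linear message-passing layer on a graph: each directed edge
    (s, d) sends msg_w * (u[s] - u[d]); nodes sum incoming messages and
    add the aggregate scaled by upd_w."""
    agg = np.zeros_like(u)
    for s, d in edges:
        agg[d] += msg_w * (u[s] - u[d])
    return u + upd_w * agg

# On a 1D chain with edges in both directions, msg_w = 1 and
# upd_w = dt / h^2 reduce to an explicit finite-difference heat step:
# u_i += (dt/h^2) * (u_{i-1} - 2 u_i + u_{i+1}) at interior nodes.
n, h, dt = 11, 0.1, 0.004
edges = [(i, i + 1) for i in range(n - 1)] + [(i + 1, i) for i in range(n - 1)]
x = np.linspace(0.0, 1.0, n)
u = np.sin(np.pi * x)

u_mp = message_passing_step(u, edges, 1.0, dt / h**2)
u_fd = u.copy()
u_fd[1:-1] += dt / h**2 * (u[:-2] - 2 * u[1:-1] + u[2:])
# u_mp and u_fd agree at every interior node
```

The two updates differ only at the chain's endpoints, where the graph version also receives one-sided messages while the stencil version leaves boundary nodes untouched.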
This list is automatically generated from the titles and abstracts of the papers in this site.