TENG: Time-Evolving Natural Gradient for Solving PDEs With Deep Neural Nets Toward Machine Precision
- URL: http://arxiv.org/abs/2404.10771v2
- Date: Tue, 4 Jun 2024 03:11:56 GMT
- Title: TENG: Time-Evolving Natural Gradient for Solving PDEs With Deep Neural Nets Toward Machine Precision
- Authors: Zhuo Chen, Jacob McCarran, Esteban Vizcaino, Marin Soljačić, Di Luo
- Abstract summary: Partial differential equations (PDEs) are instrumental for modeling dynamical systems in science and engineering.
In this paper, we introduce the $\textit{Time-Evolving Natural Gradient (TENG)}$, generalizing time-dependent variational principles and optimization-based time integration.
Our comprehensive development includes algorithms like TENG-Euler and its high-order variants, such as TENG-Heun, tailored for enhanced precision and efficiency.
- Score: 5.283885355422517
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Partial differential equations (PDEs) are instrumental for modeling dynamical systems in science and engineering. The advent of neural networks has initiated a significant shift in tackling these complexities though challenges in accuracy persist, especially for initial value problems. In this paper, we introduce the $\textit{Time-Evolving Natural Gradient (TENG)}$, generalizing time-dependent variational principles and optimization-based time integration, leveraging natural gradient optimization to obtain high accuracy in neural-network-based PDE solutions. Our comprehensive development includes algorithms like TENG-Euler and its high-order variants, such as TENG-Heun, tailored for enhanced precision and efficiency. TENG's effectiveness is further validated through its performance, surpassing current leading methods and achieving $\textit{machine precision}$ in step-by-step optimizations across a spectrum of PDEs, including the heat equation, Allen-Cahn equation, and Burgers' equation.
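The optimization-based time integration behind TENG-Euler can be sketched with a toy linear-in-parameters ansatz for the heat equation. This is a minimal sketch under illustrative assumptions (Fourier basis, collocation grid, and step size are all choices made here; the paper evolves neural-network parameters): each step forms the explicit-Euler target u + dt*u_xx at collocation points and projects it back onto the ansatz with a least-squares (Gauss-Newton / natural-gradient) solve in parameter space.

```python
import numpy as np

# Toy linear-in-parameters ansatz: u_theta(x) = sum_k theta_k sin(k*pi*x)
# on [0, 1] with Dirichlet boundary conditions.
K = 8
x = np.linspace(0.01, 0.99, 200)          # collocation points
k = np.arange(1, K + 1)
Phi = np.sin(np.outer(x, k * np.pi))      # Phi[i, j] = sin(k_j*pi*x_i); also the Jacobian du/dtheta
Phi_xx = -(k * np.pi) ** 2 * Phi          # second spatial derivative of each basis function

def teng_euler_step(theta, dt):
    """One optimization-based Euler step for the heat equation u_t = u_xx.

    The explicit-Euler target u + dt*u_xx is projected back onto the ansatz
    by a least-squares (Gauss-Newton / natural-gradient) solve in parameter space.
    """
    target = Phi @ theta + dt * (Phi_xx @ theta)
    theta_new, *_ = np.linalg.lstsq(Phi, target, rcond=None)
    return theta_new

theta = np.zeros(K)
theta[0] = 1.0                            # initial condition u(x, 0) = sin(pi*x)
dt = 1e-3
for _ in range(100):
    theta = teng_euler_step(theta, dt)
# For this linear ansatz each Fourier mode decays exactly as in explicit Euler:
# theta_0 = (1 - dt*pi**2)**100 after 100 steps.
```

Because the ansatz is linear in its parameters, the projection is exact here; with a neural network the same least-squares structure appears through the parameter Jacobian, which is what the natural-gradient solve exploits at every step.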
Related papers
- MAD-NG: Meta-Auto-Decoder Neural Galerkin Method for Solving Parametric Partial Differential Equations [5.767740428776141]
Parametric partial differential equations (PDEs) are fundamental for modeling a wide range of physical and engineering systems.
Traditional neural network-based solvers, such as Physics-Informed Neural Networks (PINNs) and Deep Galerkin Methods, often face challenges in generalization and long-time prediction efficiency.
We propose a novel and scalable framework that significantly enhances the Neural Galerkin Method (NGM) by incorporating the Meta-Auto-Decoder (MAD) paradigm.
arXiv Detail & Related papers (2025-12-25T11:27:40Z) - TENG++: Time-Evolving Natural Gradient for Solving PDEs With Deep Neural Nets under General Boundary Conditions [0.5908471365011942]
Partial Differential Equations (PDEs) are central to modeling complex systems across physical, biological, and engineering domains.
Traditional numerical methods often struggle with high-dimensional or complex problems.
PINNs have emerged as an efficient alternative by embedding physics-based constraints into deep learning frameworks.
arXiv Detail & Related papers (2025-12-13T02:32:45Z) - Bilevel optimization for learning hyperparameters: Application to solving PDEs and inverse problems with Gaussian processes [4.197402763771375]
Kernel- and neural network-based approaches for partial differential equations (PDEs), inverse problems, and supervised learning tasks depend crucially on the choice of hyperparameters.
We propose an efficient strategy for hyperparameter optimization within the bilevel framework by employing a Gauss-Newton linearization of the inner optimization step.
Our approach provides closed-form updates, eliminating the need for repeated costly PDE solves.
arXiv Detail & Related papers (2025-10-07T04:22:09Z) - Learning to Solve Optimization Problems Constrained with Partial Differential Equations [45.143085119200265]
Partial differential equation (PDE)-constrained optimization arises in many scientific and engineering domains.
This paper introduces a learning-based framework that integrates a dynamic predictor with an optimization surrogate.
arXiv Detail & Related papers (2025-09-29T10:28:14Z) - PhysicsCorrect: A Training-Free Approach for Stable Neural PDE Simulations [4.7903561901859355]
We present PhysicsCorrect, a training-free correction framework that enforces PDE consistency at each prediction step.
Our key innovation is an efficient caching strategy that precomputes the Jacobian and its pseudoinverse during an offline warm-up phase.
Across three representative PDE systems, PhysicsCorrect reduces prediction errors by up to 100x while adding negligible inference time.
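The correction idea summarized above can be sketched on a linear model problem. This is an illustrative sketch, not the paper's implementation: the grid size, implicit-Euler residual, and noise level below are assumptions chosen so the correction is easy to verify. The residual Jacobian's pseudoinverse is precomputed once, then each (hypothetically noisy) prediction is projected back to PDE consistency with a single Newton-style update.

```python
import numpy as np

# Model problem: 1D heat equation on a periodic grid; the next state v should
# satisfy the implicit-Euler consistency condition r(v) = (I - dt*A) v - u = 0.
n, dt = 64, 1e-3
A = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)
A[0, -1] = A[-1, 0] = 1.0                 # periodic boundary
A /= (1.0 / n) ** 2                       # Laplacian with dx = 1/n

J = np.eye(n) - dt * A                    # residual Jacobian (constant for this linear PDE)
J_pinv = np.linalg.pinv(J)                # precomputed once, during an offline warm-up

def correct(u, v_pred):
    """One Newton-style projection of a predicted next state back to PDE consistency."""
    return v_pred - J_pinv @ (J @ v_pred - u)

xgrid = np.linspace(0.0, 1.0, n, endpoint=False)
u = np.sin(2 * np.pi * xgrid)
exact = np.linalg.solve(J, u)             # stand-in for the true next state
v_pred = exact + 0.05 * np.random.default_rng(0).normal(size=n)  # prediction + error
res_before = np.linalg.norm(J @ v_pred - u)
v = correct(u, v_pred)
res_after = np.linalg.norm(J @ v - u)
```

Because the model problem is linear, one correction drives the residual to round-off; for a nonlinear PDE the same cached-pseudoinverse update would act as an approximate Newton step.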
arXiv Detail & Related papers (2025-07-03T01:22:57Z) - High precision PINNs in unbounded domains: application to singularity formulation in PDEs [83.50980325611066]
We study the choices of neural network ansatz, sampling strategy, and optimization algorithm.
For the 1D Burgers equation, our framework can lead to a solution with very high precision.
For the 2D Boussinesq equation, we obtain a solution whose loss is $4$ digits smaller than that obtained in \cite{wang2023asymptotic} with fewer training steps.
arXiv Detail & Related papers (2025-06-24T02:01:44Z) - Advancing Generalization in PINNs through Latent-Space Representations [71.86401914779019]
Physics-informed neural networks (PINNs) have made significant strides in modeling dynamical systems governed by partial differential equations (PDEs).
We propose PIDO, a novel physics-informed neural PDE solver designed to generalize effectively across diverse PDE configurations.
We validate PIDO on a range of benchmarks, including 1D combined equations and 2D Navier-Stokes equations.
arXiv Detail & Related papers (2024-11-28T13:16:20Z) - Gaussian Process Priors for Boundary Value Problems of Linear Partial Differential Equations [3.524869467682149]
Solving systems of partial differential equations (PDEs) is a fundamental task in computational science.
Recent advancements have introduced neural operators and physics-informed neural networks (PINNs) to tackle PDEs.
We propose a novel framework for constructing GP priors that satisfy both general systems of linear PDEs with constant coefficients and linear boundary conditions.
arXiv Detail & Related papers (2024-11-25T18:48:15Z) - GEPS: Boosting Generalization in Parametric PDE Neural Solvers through Adaptive Conditioning [14.939978372699084]
Data-driven approaches learn parametric PDEs by incorporating a very large variety of trajectories with varying PDE parameters.
GEPS is a simple adaptation mechanism to boost GEneralization in Pde solvers.
We demonstrate the versatility of our approach for both fully data-driven and for physics-aware neural solvers.
arXiv Detail & Related papers (2024-10-31T12:51:40Z) - Learning a Neural Solver for Parametric PDE to Enhance Physics-Informed Methods [14.791541465418263]
We propose learning a solver, i.e., solving partial differential equations (PDEs) using a physics-informed iterative algorithm trained on data.
Our method learns to condition a gradient descent algorithm that automatically adapts to each PDE instance.
We demonstrate the effectiveness of our method through empirical experiments on multiple datasets.
arXiv Detail & Related papers (2024-10-09T12:28:32Z) - Enhancing Low-Order Discontinuous Galerkin Methods with Neural Ordinary Differential Equations for Compressible Navier--Stokes Equations [0.1578515540930834]
We introduce an end-to-end differentiable framework for solving the compressible Navier-Stokes equations.
This integrated approach combines a differentiable discontinuous Galerkin solver with a neural network source term.
We demonstrate the performance of the proposed framework through two examples.
arXiv Detail & Related papers (2023-10-29T04:26:23Z) - TSONN: Time-stepping-oriented neural network for solving partial differential equations [1.9061608251056779]
This work integrates time-stepping method with deep learning to solve PDE problems.
The convergence of model training is significantly improved by following the trajectory of the pseudo time-stepping process.
Our results show that the proposed method achieves stable training and correct results in many problems that standard PINNs fail to solve.
arXiv Detail & Related papers (2023-10-25T09:19:40Z) - Implicit Stochastic Gradient Descent for Training Physics-informed Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
PINNs are trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ implicit gradient descent (ISGD) method to train PINNs for improving the stability of training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning tasks into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - PIXEL: Physics-Informed Cell Representations for Fast and Accurate PDE Solvers [4.1173475271436155]
We propose a new kind of data-driven PDE solver, physics-informed cell representations (PIXEL).
PIXEL elegantly combines classical numerical methods and learning-based approaches.
We show that PIXEL achieves fast convergence speed and high accuracy.
arXiv Detail & Related papers (2022-07-26T10:46:56Z) - Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to 128x reduction in the dimensions to update, and up to 15x improvement in speed, while achieving competitive accuracy.
arXiv Detail & Related papers (2022-06-15T17:31:24Z) - Learning Physics-Informed Neural Networks without Stacked Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
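The Stein-identity trick summarized above can be sketched as follows. This is a minimal sketch under stated assumptions: the smoothing width, sample count, control variate, and quadratic test function are illustrative choices, not the paper's setup. The point it demonstrates is that second derivatives of a Gaussian-smoothed objective need only forward evaluations, with no back-propagation.

```python
import numpy as np

def stein_hessian(f, x, sigma=0.5, n_samples=100_000, seed=0):
    """Monte Carlo Hessian of the Gaussian-smoothed f at x via Stein's identity:

        Hess f_sigma(x) = E[(f(x + sigma*eps) - f(x)) * (eps eps^T - I)] / sigma**2

    Only forward evaluations of f are needed; subtracting f(x) is a
    control variate that reduces the estimator's variance.
    """
    d = x.size
    eps = np.random.default_rng(seed).standard_normal((n_samples, d))
    df = np.array([f(x + sigma * e) for e in eps]) - f(x)
    outer = eps[:, :, None] * eps[:, None, :] - np.eye(d)  # eps eps^T - I per sample
    return np.mean(df[:, None, None] * outer, axis=0) / sigma**2

B = np.array([[1.0, 0.3], [0.3, 2.0]])
f = lambda z: z @ B @ z        # quadratic test function: Hessian is 2B everywhere
H = stein_hessian(f, np.array([1.0, 0.5]))
```

For a quadratic, Gaussian smoothing leaves the Hessian unchanged, so the estimate should approach 2B as the sample count grows; that makes the sketch easy to sanity-check.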
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Message Passing Neural PDE Solvers [60.77761603258397]
We build a neural message passing solver, replacing all heuristically designed components in the computation graph with backprop-optimized neural function approximators.
We show that neural message passing solvers representationally contain some classical methods, such as finite differences, finite volumes, and WENO schemes.
We validate our method on various fluid-like flow problems, demonstrating fast, stable, and accurate performance across different domain topologies, equation parameters, discretizations, etc., in 1D and 2D.
arXiv Detail & Related papers (2022-02-07T17:47:46Z) - Speeding up Computational Morphogenesis with Online Neural Synthetic Gradients [51.42959998304931]
A wide range of modern science and engineering applications are formulated as optimization problems with a system of partial differential equations (PDEs) as constraints.
These PDE-constrained optimization problems are typically solved in a standard discretize-then-optimize approach.
We propose a general framework to speed up PDE-constrained optimization using online neural synthetic gradients (ONSG) with a novel two-scale optimization scheme.
arXiv Detail & Related papers (2021-04-25T22:43:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.