NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition
- URL: http://arxiv.org/abs/2302.10255v2
- Date: Sat, 27 May 2023 23:19:43 GMT
- Title: NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition
- Authors: Xinquan Huang, Wenlei Shi, Qi Meng, Yue Wang, Xiaotian Gao, Jia Zhang,
Tie-Yan Liu
- Abstract summary: This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
- Score: 67.46012350241969
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neural networks have shown great potential in accelerating the solution of
partial differential equations (PDEs). Recently, there has been a growing
interest in introducing physics constraints into training neural PDE solvers to
reduce the use of costly data and improve the generalization ability. However,
these physics constraints, based on certain finite dimensional approximations
over the function space, must resolve the smallest scaled physics to ensure the
accuracy and stability of the simulation, resulting in high computational costs
from large input, output, and neural networks. This paper proposes a general
acceleration methodology called NeuralStagger by spatially and temporally
decomposing the original learning tasks into several coarser-resolution
subtasks. We define a coarse-resolution neural solver for each subtask, which
requires fewer computational resources, and jointly train them with the vanilla
physics-constrained loss by simply arranging their outputs to reconstruct the
original solution. Due to the perfect parallelism between them, the solution is
achieved as fast as a coarse-resolution neural solver. In addition, the trained
solvers bring the flexibility of simulating with multiple levels of resolution.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid
dynamics simulations, which leads to an additional $10\sim100\times$ speed-up.
Moreover, the experiments show that the learned model can also be applied to optimal control.
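The spatial part of the decomposition can be illustrated with a minimal NumPy sketch (the helper names are hypothetical, not from the paper): a fine-resolution field is split into $s \times s$ coarse sub-fields by strided sampling, each of which would be handled by its own coarse-resolution solver, and the outputs are simply interleaved to reconstruct the full-resolution solution.

```python
import numpy as np

def spatial_decompose(u, s):
    """Split a fine 2D field into s*s coarse sub-fields by strided sampling."""
    return [u[i::s, j::s] for i in range(s) for j in range(s)]

def spatial_reconstruct(subs, s):
    """Interleave the s*s coarse sub-fields back into the original fine grid."""
    h, w = subs[0].shape
    u = np.empty((h * s, w * s), dtype=subs[0].dtype)
    for k, sub in enumerate(subs):
        i, j = divmod(k, s)
        u[i::s, j::s] = sub
    return u
```

Because each sub-field is independent, the coarse solvers can run in perfect parallelism, which is where the claimed speed-up comes from.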
Related papers
- Metamizer: a versatile neural optimizer for fast and accurate physics simulations [4.717325308876749]
We introduce Metamizer, a novel neural network that iteratively solves a wide range of physical systems with high accuracy.
We demonstrate that Metamizer achieves unprecedented accuracy for deep learning based approaches.
Our results suggest that Metamizer could have a profound impact on future numerical solvers.
arXiv Detail & Related papers (2024-10-10T11:54:31Z) - Text2PDE: Latent Diffusion Models for Accessible Physics Simulation [7.16525545814044]
We introduce several methods to apply latent diffusion models to physics simulation.
We show that the proposed approach is competitive with current neural PDE solvers in both accuracy and efficiency.
By introducing a scalable, accurate, and usable physics simulator, we hope to bring neural PDE solvers closer to practical use.
arXiv Detail & Related papers (2024-10-02T01:09:47Z) - Scaling physics-informed hard constraints with mixture-of-experts [0.0]
We develop a scalable approach to enforce hard physical constraints using Mixture-of-Experts (MoE).
MoE imposes the constraint over smaller domains, each of which is solved by an "expert" through differentiable optimization.
Compared to standard differentiable optimization, our scalable approach achieves greater accuracy in the neural PDE solver setting.
arXiv Detail & Related papers (2024-02-20T22:45:00Z) - Learning Generic Solutions for Multiphase Transport in Porous Media via
the Flux Functions Operator [0.0]
DeepONet has emerged as a powerful tool for accelerating the solution of PDEs.
We use Physics-Informed DeepONets (PI-DeepONets) to achieve this mapping without any paired input-output observations.
arXiv Detail & Related papers (2023-07-03T21:10:30Z) - Super-resolving sparse observations in partial differential equations: A
physics-constrained convolutional neural network approach [6.85316573653194]
We propose a physics-constrained convolutional neural network (CNN) to infer the high-resolution solution from sparse observations of nonlinear partial differential equations.
We show that, by constraining prior physical knowledge in the dataset, we can infer the unresolved physical dynamics without using the high-resolution training.
arXiv Detail & Related papers (2023-06-19T15:00:04Z) - Learning Controllable Adaptive Simulation for Multi-resolution Physics [86.8993558124143]
We introduce Learning controllable Adaptive simulation for Multi-resolution Physics (LAMP) as the first full deep learning-based surrogate model.
LAMP consists of a Graph Neural Network (GNN) for learning the forward evolution, and a GNN-based actor-critic for learning the policy of spatial refinement and coarsening.
We demonstrate that our LAMP outperforms state-of-the-art deep learning surrogate models, and can adaptively trade-off computation to improve long-term prediction error.
arXiv Detail & Related papers (2023-05-01T23:20:27Z) - Implicit Stochastic Gradient Descent for Training Physics-informed
Neural Networks [51.92362217307946]
Physics-informed neural networks (PINNs) have effectively been demonstrated in solving forward and inverse differential equation problems.
PINNs can be trapped in training failures when the target functions to be approximated exhibit high-frequency or multi-scale features.
In this paper, we propose to employ the implicit stochastic gradient descent (ISGD) method to train PINNs, improving the stability of the training process.
arXiv Detail & Related papers (2023-03-03T08:17:47Z) - On Fast Simulation of Dynamical System with Neural Vector Enhanced
Numerical Solver [59.13397937903832]
We introduce a deep learning-based corrector called Neural Vector (NeurVec).
NeurVec can compensate for integration errors and enable larger time step sizes in simulations.
Our experiments on a variety of complex dynamical system benchmarks demonstrate that NeurVec exhibits remarkable generalization capability.
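The corrector idea can be illustrated with a toy sketch (function names are assumed for illustration): a single large explicit-Euler step plus a correction term reproduces the trajectory of many small steps. In NeurVec the correction is a trained network; here we pass in the exact residual it would ideally learn.

```python
import numpy as np

def fine_steps(u, f, h, k):
    """k small explicit-Euler steps of size h/k: the accurate reference trajectory."""
    for _ in range(k):
        u = u + (h / k) * f(u)
    return u

def coarse_step_with_correction(u, f, h, correction):
    """One large Euler step of size h plus a correction term compensating its error."""
    return u + h * f(u) + correction(u)
```

With the exact residual as the correction, the large step matches the fine trajectory; a learned corrector approximates this residual so simulations can take larger time steps.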
arXiv Detail & Related papers (2022-08-07T09:02:18Z) - Learning Physics-Informed Neural Networks without Stacked
Back-propagation [82.26566759276105]
We develop a novel approach that can significantly accelerate the training of Physics-Informed Neural Networks.
In particular, we parameterize the PDE solution by the Gaussian smoothed model and show that, derived from Stein's Identity, the second-order derivatives can be efficiently calculated without back-propagation.
Experimental results show that our proposed method can achieve competitive error compared to standard PINN training but is two orders of magnitude faster.
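The derivative trick described above can be sketched with a Monte-Carlo estimator (an illustration, not the authors' implementation): for the Gaussian-smoothed model $f_\sigma(x) = \mathbb{E}_\varepsilon[f(x + \sigma\varepsilon)]$, Stein's identity expresses the second derivative as an expectation over forward evaluations only, so no back-propagation through the model is needed.

```python
import numpy as np

def smoothed_second_derivative(f, x, sigma=0.5, n=200_000, seed=0):
    """Monte-Carlo estimate of f_sigma''(x) via Stein's identity:
    f_sigma''(x) = E[ ((eps**2 - 1) / sigma**2) * f(x + sigma * eps) ],
    using only forward evaluations of f (no back-propagation)."""
    eps = np.random.default_rng(seed).standard_normal(n)
    return np.mean((eps**2 - 1) / sigma**2 * f(x + sigma * eps))
```

For a quadratic $f(x) = x^2$ the smoothed second derivative is exactly 2 everywhere, which the estimator recovers up to Monte-Carlo noise.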
arXiv Detail & Related papers (2022-02-18T18:07:54Z) - Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDE) is an indispensable part of many branches of science as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.