MultiPDENet: PDE-embedded Learning with Multi-time-stepping for Accelerated Flow Simulation
- URL: http://arxiv.org/abs/2501.15987v1
- Date: Mon, 27 Jan 2025 12:15:51 GMT
- Title: MultiPDENet: PDE-embedded Learning with Multi-time-stepping for Accelerated Flow Simulation
- Authors: Qi Wang, Yuan Mi, Haoyun Wang, Yi Zhang, Ruizhi Chengze, Hongsheng Liu, Ji-Rong Wen, Hao Sun
- Abstract summary: We propose a PDE-embedded network with multiscale time stepping (MultiPDENet).
In particular, we design a convolutional filter based on the structure of finite difference stencils with a small number of parameters to optimize.
A Physics Block with a 4th-order Runge-Kutta integrator at the fine time scale embeds the structure of the PDEs to guide the prediction.
- Score: 48.41289705783405
- Abstract: Solving partial differential equations (PDEs) with numerical methods is computationally costly when accurate solutions are needed, since fine grids and small time steps are required. Machine learning can accelerate this process, but such models often struggle with weak generalizability, interpretability, and data dependency, and suffer in long-term prediction. To this end, we propose a PDE-embedded network with multiscale time stepping (MultiPDENet), which fuses numerical schemes and machine learning for accelerated simulation of flows. In particular, we design a convolutional filter based on the structure of finite difference stencils, with only a small number of parameters to optimize, which estimates the equivalent form of the spatial derivatives on a coarse grid so as to minimize the equation's residual. A Physics Block with a 4th-order Runge-Kutta integrator at the fine time scale embeds the structure of the PDEs to guide the prediction. To alleviate the curse of temporal error accumulation in long-term prediction, we introduce a multiscale time integration approach, in which a neural network corrects the prediction error at the coarse time scale. Experiments across various PDE systems, including the Navier-Stokes equations, demonstrate that MultiPDENet can accurately predict long-term spatiotemporal dynamics, even given small and incomplete training data, e.g., spatiotemporally down-sampled datasets. MultiPDENet achieves state-of-the-art performance compared with neural baseline models and offers a clear speedup over classical numerical methods.
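To make the three ingredients above concrete (a finite-difference-style convolutional filter for coarse-grid derivatives, a Physics Block that advances the equation with a 4th-order Runge-Kutta integrator at a fine time step, and a learned correction applied at the coarse time scale), here is a minimal NumPy sketch for a toy 2D diffusion equation. The Laplacian stencil, the `learned_correction` stub, the grid size, and the step sizes are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def conv_stencil(u, kernel):
    """Apply a small convolutional stencil to a 2D field with periodic boundaries."""
    k = kernel.shape[0] // 2
    out = np.zeros_like(u)
    for i in range(-k, k + 1):
        for j in range(-k, k + 1):
            out += kernel[i + k, j + k] * np.roll(u, shift=(-i, -j), axis=(0, 1))
    return out

# 5-point Laplacian weights; in MultiPDENet such stencil weights are the small
# set of trainable parameters of the finite-difference-style convolutional filter.
LAPLACIAN = np.array([[0.,  1., 0.],
                      [1., -4., 1.],
                      [0.,  1., 0.]])

def rhs(u, nu, dx):
    """Right-hand side of a toy diffusion equation u_t = nu * Laplacian(u),
    with the Laplacian estimated on the coarse grid by the stencil filter."""
    return nu * conv_stencil(u, LAPLACIAN) / dx**2

def rk4_step(u, dt, nu, dx):
    """Classical 4th-order Runge-Kutta step, standing in for the Physics Block."""
    k1 = rhs(u, nu, dx)
    k2 = rhs(u + 0.5 * dt * k1, nu, dx)
    k3 = rhs(u + 0.5 * dt * k2, nu, dx)
    k4 = rhs(u + dt * k3, nu, dx)
    return u + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)

def learned_correction(u):
    """Placeholder for the neural corrector applied at the coarse time scale;
    it returns a zero correction here, whereas in the paper it is trained."""
    return np.zeros_like(u)

def coarse_step(u, n_fine, dt_fine, nu, dx):
    """One coarse time step = several fine RK4 steps plus a learned correction."""
    for _ in range(n_fine):
        u = rk4_step(u, dt_fine, nu, dx)
    return u + learned_correction(u)

# Example rollout on a 64x64 periodic coarse grid (all values are illustrative).
u = np.random.rand(64, 64)
for _ in range(10):
    u = coarse_step(u, n_fine=4, dt_fine=5e-4, nu=0.01, dx=1.0 / 64)
print(u.mean(), u.std())
```

In the actual model, the stencil weights and the coarse-scale corrector would be trained so that the accelerated coarse-grid rollout matches reference trajectories.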
Related papers
- Neural Dynamical Operator: Continuous Spatial-Temporal Model with Gradient-Based and Derivative-Free Optimization Methods [0.0]
We present a data-driven modeling framework called neural dynamical operator that is continuous in both space and time.
A key feature of the neural dynamical operator is its resolution invariance with respect to both spatial and temporal discretizations.
We show that the proposed model can better predict long-term statistics via the hybrid optimization scheme.
arXiv Detail & Related papers (2023-11-20T14:31:18Z)
- A predictive physics-aware hybrid reduced order model for reacting flows [65.73506571113623]
A new hybrid predictive Reduced Order Model (ROM) is proposed to solve reacting flow problems.
The number of degrees of freedom is reduced from thousands of temporal points to a few POD modes with their corresponding temporal coefficients.
Two different deep learning architectures have been tested to predict the temporal coefficients.
arXiv Detail & Related papers (2023-01-24T08:39:20Z) - MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
- MAgNet: Mesh Agnostic Neural PDE Solver [68.8204255655161]
Climate predictions require fine spatio-temporal resolutions to resolve all turbulent scales in fluid simulations.
Current numerical models solve PDEs on grids that are too coarse (3 km to 200 km on each side).
We design a novel architecture that predicts the spatially continuous solution of a PDE given a spatial position query.
arXiv Detail & Related papers (2022-10-11T14:52:20Z)
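The "spatial position query" idea in the MAgNet entry above can be read, loosely, as a coordinate-conditioned network that evaluates the solution at arbitrary continuous locations rather than on a fixed grid. Below is a minimal NumPy sketch of such a query network with random, untrained weights; the layer sizes and interface are assumptions for illustration and do not reproduce the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny coordinate-query MLP: an (x, y) position is mapped to a predicted
# field value. Weights are random here; in a mesh-agnostic solver they would
# be trained so the output matches the PDE solution at any queried location.
W1, b1 = rng.normal(size=(2, 64)), np.zeros(64)
W2, b2 = rng.normal(size=(64, 64)), np.zeros(64)
W3, b3 = rng.normal(size=(64, 1)), np.zeros(1)

def query_solution(xy):
    """Evaluate the (untrained) continuous surrogate at positions xy, shape (N, 2)."""
    h = np.tanh(xy @ W1 + b1)
    h = np.tanh(h @ W2 + b2)
    return h @ W3 + b3

# Query off-grid, e.g. at 1000 random points in the unit square.
points = rng.uniform(0.0, 1.0, size=(1000, 2))
values = query_solution(points)
print(values.shape)  # (1000, 1)
```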
- Deep Convolutional Architectures for Extrapolative Forecast in Time-dependent Flow Problems [0.0]
Deep learning techniques are employed to model the system dynamics for advection dominated problems.
These models take as input a sequence of high-fidelity vector solutions for consecutive time-steps obtained from the PDEs.
Non-intrusive reduced-order modelling techniques such as deep auto-encoder networks are utilized to compress the high-fidelity snapshots.
arXiv Detail & Related papers (2022-09-18T03:45:56Z)
- Semi-supervised Learning of Partial Differential Operators and Dynamical Flows [68.77595310155365]
We present a novel method that combines a hyper-network solver with a Fourier Neural Operator architecture.
We test our method on various time evolution PDEs, including nonlinear fluid flows in one, two, and three spatial dimensions.
The results show that the new method improves the learning accuracy at the supervised time points and is able to interpolate the solutions to any intermediate time.
arXiv Detail & Related papers (2022-07-28T19:59:14Z)
- Long-time integration of parametric evolution equations with physics-informed DeepONets [0.0]
We introduce an effective framework for learning infinite-dimensional operators that map random initial conditions to associated PDE solutions within a short time interval.
Global long-time predictions across a range of initial conditions can then be obtained by iteratively evaluating the trained model.
This introduces a new approach to temporal domain decomposition that is shown to be effective in performing accurate long-time simulations.
arXiv Detail & Related papers (2021-06-09T20:46:17Z)
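The iterative-evaluation strategy in the entry above (learn a short-horizon solution operator, then compose it with itself to reach long times) can be sketched as a simple autoregressive rollout. The `short_horizon_operator` below is a hypothetical stand-in, not a physics-informed DeepONet.

```python
import numpy as np

def short_horizon_operator(u0):
    """Hypothetical stand-in for a trained operator mapping an initial state to
    the state a short time interval later (here: a mild smoothing step)."""
    return 0.5 * (u0 + np.roll(u0, 1))

def long_time_rollout(u0, n_intervals):
    """Compose the short-horizon operator with itself: the prediction at the end
    of one interval becomes the initial condition of the next."""
    trajectory = [u0]
    u = u0
    for _ in range(n_intervals):
        u = short_horizon_operator(u)
        trajectory.append(u)
    return np.stack(trajectory)

u0 = np.sin(np.linspace(0.0, 2.0 * np.pi, 128, endpoint=False))
traj = long_time_rollout(u0, n_intervals=50)
print(traj.shape)  # (51, 128)
```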
- DiffPD: Differentiable Projective Dynamics with Contact [65.88720481593118]
We present DiffPD, an efficient differentiable soft-body simulator with implicit time integration.
We evaluate the performance of DiffPD and observe a speedup of 4-19 times compared to the standard Newton's method in various applications.
arXiv Detail & Related papers (2021-01-15T00:13:33Z)
- Large-scale Neural Solvers for Partial Differential Equations [48.7576911714538]
Solving partial differential equations (PDEs) is an indispensable part of many branches of science, as many processes can be modelled in terms of PDEs.
Recent numerical solvers require manual discretization of the underlying equation as well as sophisticated, tailored code for distributed computing.
We examine the applicability of continuous, mesh-free neural solvers for partial differential equations, namely physics-informed neural networks (PINNs).
We discuss the accuracy of GatedPINN with respect to analytical solutions -- as well as state-of-the-art numerical solvers, such as spectral solvers.
arXiv Detail & Related papers (2020-09-08T13:26:51Z)
- Hierarchical Deep Learning of Multiscale Differential Equation Time-Steppers [5.6385744392820465]
We develop a hierarchy of deep neural network time-steppers to approximate the flow map of the dynamical system over a disparate range of time-scales.
The resulting model is purely data-driven and leverages features of the multiscale dynamics.
We benchmark our algorithm against state-of-the-art methods, such as LSTM, reservoir computing, and clockwork RNN.
arXiv Detail & Related papers (2020-08-22T07:16:53Z)
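The hierarchy of time-steppers in the last entry connects directly to MultiPDENet's multiscale time stepping: flow maps learned at different step sizes are composed to reach a target horizon. The sketch below illustrates that composition on a toy linear-decay system; the per-scale steppers are exact closed-form stand-ins rather than trained networks.

```python
import numpy as np

# Stand-ins for neural time-steppers, each responsible for a fixed step size
# (powers of two here). The toy dynamics are a linear decay, so the "trained"
# flow maps are written in closed form purely for illustration.
def make_stepper(dt, decay=0.05):
    return lambda x: x * np.exp(-decay * dt)

steppers = {dt: make_stepper(dt) for dt in (8, 4, 2, 1)}

def advance(x, horizon):
    """Advance by `horizon` time units by greedily composing the largest
    available steppers, mimicking a hierarchy over disparate time scales."""
    remaining = horizon
    for dt in sorted(steppers, reverse=True):
        while remaining >= dt:
            x = steppers[dt](x)
            remaining -= dt
    return x

x0 = np.array([1.0, 2.0, 3.0])
print(advance(x0, horizon=13))   # composes the 8-, 4-, and 1-unit steppers
print(x0 * np.exp(-0.05 * 13))   # exact reference for this toy system
```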