PDE-Refiner: Achieving Accurate Long Rollouts with Neural PDE Solvers
- URL: http://arxiv.org/abs/2308.05732v2
- Date: Sat, 21 Oct 2023 15:41:47 GMT
- Title: PDE-Refiner: Achieving Accurate Long Rollouts with Neural PDE Solvers
- Authors: Phillip Lippe, Bastiaan S. Veeling, Paris Perdikaris, Richard E.
Turner, Johannes Brandstetter
- Abstract summary: Time-dependent partial differential equations (PDEs) are ubiquitous in science and engineering.
Deep neural network based surrogates have gained increased interest.
- Score: 40.097474800631
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Time-dependent partial differential equations (PDEs) are ubiquitous in
science and engineering. Recently, mostly due to the high computational cost of
traditional solution techniques, deep neural network based surrogates have
gained increased interest. The practical utility of such neural PDE solvers
relies on their ability to provide accurate, stable predictions over long time
horizons, which is a notoriously hard problem. In this work, we present a
large-scale analysis of common temporal rollout strategies, identifying the
neglect of non-dominant spatial frequency information, often associated with
high frequencies in PDE solutions, as the primary pitfall limiting stable,
accurate rollout performance. Based on these insights, we draw inspiration from
recent advances in diffusion models to introduce PDE-Refiner: a novel model
class that enables more accurate modeling of all frequency components via a
multistep refinement process. We validate PDE-Refiner on challenging benchmarks
of complex fluid dynamics, demonstrating stable and accurate rollouts that
consistently outperform state-of-the-art models, including neural, numerical,
and hybrid neural-numerical architectures. We further demonstrate that
PDE-Refiner greatly enhances data efficiency, since the denoising objective
implicitly induces a novel form of spectral data augmentation. Finally,
PDE-Refiner's connection to diffusion models enables an accurate and efficient
assessment of the model's predictive uncertainty, allowing us to estimate when
the surrogate becomes inaccurate.
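To make the multistep refinement process concrete, the sketch below shows how one autoregressive prediction step might look: an initial one-step prediction is repeatedly perturbed with Gaussian noise of decreasing amplitude and denoised by the same network, so that low-amplitude (often high-frequency) components are also modeled. This is a minimal, hedged sketch in PyTorch; the network signature refiner(estimate, u_t, k), the exponentially decaying noise schedule, and all constants are illustrative assumptions rather than the paper's exact implementation.

```python
import torch

def pde_refiner_step(refiner, u_t, num_refine=3, sigma_min=2e-7):
    """Predict u(t + dt) from u(t) with multistep refinement.

    refiner(estimate, u_t, k) is a neural operator conditioned on the current
    estimate, the previous solution, and the refinement index k (hypothetical
    signature; constants are illustrative).
    """
    # k = 0: direct one-step prediction of the next solution.
    u_next = refiner(torch.zeros_like(u_t), u_t, 0)

    # k >= 1: add noise of decreasing amplitude and let the network denoise.
    # Small sigma_k pushes the model to resolve low-amplitude frequency content.
    for k in range(1, num_refine + 1):
        sigma_k = sigma_min ** (k / num_refine)        # decaying noise std
        noisy = u_next + sigma_k * torch.randn_like(u_next)
        u_next = noisy - sigma_k * refiner(noisy, u_t, k)
    return u_next

def rollout(refiner, u0, steps):
    """Autoregressive long rollout: feed each refined prediction back in."""
    traj, u = [u0], u0
    for _ in range(steps):
        u = pde_refiner_step(refiner, u)
        traj.append(u)
    return torch.stack(traj)
```

Because each refinement step injects fresh noise, repeating the rollout with different random seeds yields an ensemble of trajectories; their spread provides the inexpensive uncertainty estimate mentioned above.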
Related papers
- Trajectory Flow Matching with Applications to Clinical Time Series Modeling [77.58277281319253]
Trajectory Flow Matching (TFM) trains a Neural SDE in a simulation-free manner, bypassing backpropagation through the dynamics.
We demonstrate improved performance on three clinical time series datasets in terms of absolute performance and uncertainty prediction.
arXiv Detail & Related papers (2024-10-28T15:54:50Z)
- Adversarial Learning for Neural PDE Solvers with Sparse Data [4.226449585713182]
This study introduces a universal learning strategy for neural PDE solvers, named Systematic Model Augmentation for Robust Training (SMART).
By focusing on challenging and improving the model's weaknesses, SMART reduces generalization error during training under data-scarce conditions.
arXiv Detail & Related papers (2024-09-04T04:18:25Z)
- On the Benefits of Memory for Modeling Time-Dependent PDEs [35.86010060677811]
We introduce the Memory Neural Operator (MemNO), a network based on recent state-space model (SSM) architectures and the Fourier Neural Operator (FNO).
MemNO significantly outperforms memoryless baselines, achieving more than 6 times lower error on unseen PDEs.
arXiv Detail & Related papers (2024-09-03T21:56:13Z)
- Deep Equilibrium Based Neural Operators for Steady-State PDEs [100.88355782126098]
We study the benefits of weight-tied neural network architectures for steady-state PDEs.
We propose FNO-DEQ, a deep equilibrium variant of the FNO architecture that directly solves for the solution of a steady-state PDE (see the generic fixed-point sketch after this list).
arXiv Detail & Related papers (2023-11-30T22:34:57Z)
- A Neural PDE Solver with Temporal Stencil Modeling [44.97241931708181]
Recent Machine Learning (ML) models have shown new promise in capturing important dynamics in high-resolution signals.
This study shows that significant information is often lost in low-resolution down-sampled features.
We propose a new approach that combines the strengths of advanced time-series sequence modeling with state-of-the-art neural PDE solvers.
arXiv Detail & Related papers (2023-02-16T06:13:01Z)
- Learning PDE Solution Operator for Continuous Modeling of Time-Series [1.39661494747879]
This work presents a partial differential equation (PDE) based framework that improves the capability of modeling time-series dynamics.
We propose a neural operator that can handle time continuously without requiring iterative operations or specific grids of temporal discretization.
Our framework opens up a new way for a continuous representation of neural networks that can be readily adopted for real-world applications.
arXiv Detail & Related papers (2023-02-02T03:47:52Z)
- EgPDE-Net: Building Continuous Neural Networks for Time Series Prediction with Exogenous Variables [22.145726318053526]
Inter-series correlation and time dependence among variables are rarely considered in existing continuous-time methods.
We propose a continuous-time model for arbitrary-step prediction to learn an unknown PDE system.
arXiv Detail & Related papers (2022-08-03T08:34:31Z)
- Learning to Accelerate Partial Differential Equations via Latent Global Evolution [64.72624347511498]
Latent Evolution of PDEs (LE-PDE) is a simple, fast and scalable method to accelerate the simulation and inverse optimization of PDEs.
We introduce new learning objectives to effectively learn such latent dynamics to ensure long-term stability.
We demonstrate up to a 128x reduction in the dimensions to update and up to a 15x improvement in speed, while achieving competitive accuracy.
arXiv Detail & Related papers (2022-06-15T17:31:24Z)
- Closed-form Continuous-Depth Models [99.40335716948101]
Continuous-depth neural models rely on advanced numerical differential equation solvers.
We present a new family of models, termed Closed-form Continuous-depth (CfC) networks, that are simple to describe and at least one order of magnitude faster.
arXiv Detail & Related papers (2021-06-25T22:08:51Z)
- Neural ODE Processes [64.10282200111983]
We introduce Neural ODE Processes (NDPs), a new class of processes determined by a distribution over Neural ODEs.
We show that our model can successfully capture the dynamics of low-dimensional systems from just a few data-points.
arXiv Detail & Related papers (2021-03-23T09:32:06Z)
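For the deep-equilibrium entry above, the weight-tied idea can be illustrated generically: instead of stacking many distinct layers, a single operator is iterated until it reaches a fixed point, which is taken as the steady-state solution. The sketch below is a minimal, hypothetical illustration using plain fixed-point (Picard) iteration in PyTorch; the callable operator, its arguments, and the tolerances are assumptions, not the FNO-DEQ implementation from the cited paper.

```python
import torch

def deq_solve(operator, x, max_iter=50, tol=1e-5):
    """Weight-tied fixed-point solve: find z* such that z* = operator(z*, x).

    For a steady-state PDE, x could encode forcing terms or coefficients and
    z the solution field (illustrative shapes and arguments only).
    """
    z = torch.zeros_like(x)  # simple initial guess
    for _ in range(max_iter):
        z_next = operator(z, x)
        # stop when the relative update is small enough
        if torch.norm(z_next - z) <= tol * (torch.norm(z) + 1e-8):
            return z_next
        z = z_next
    return z
```

In practice, deep-equilibrium models often replace this naive iteration with an accelerated root finder and train via implicit differentiation; the fixed-point view above is only meant to convey the weight-tied structure.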
This list is automatically generated from the titles and abstracts of the papers on this site.