PROSE-FD: A Multimodal PDE Foundation Model for Learning Multiple Operators for Forecasting Fluid Dynamics
- URL: http://arxiv.org/abs/2409.09811v1
- Date: Sun, 15 Sep 2024 18:20:15 GMT
- Title: PROSE-FD: A Multimodal PDE Foundation Model for Learning Multiple Operators for Forecasting Fluid Dynamics
- Authors: Yuxuan Liu, Jingmin Sun, Xinjie He, Griffin Pinney, Zecheng Zhang, Hayden Schaeffer
- Abstract summary: We propose a zero-shot multimodal PDE foundational model for simultaneous prediction of heterogeneous two-dimensional physical systems.
These systems include shallow water equations and the Navier-Stokes equations with incompressible and compressible flow.
- Score: 3.770825791788951
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose PROSE-FD, a zero-shot multimodal PDE foundational model for simultaneous prediction of heterogeneous two-dimensional physical systems related to distinct fluid dynamics settings. These systems include shallow water equations and the Navier-Stokes equations with incompressible and compressible flow, regular and complex geometries, and different buoyancy settings. This work presents a new transformer-based multi-operator learning approach that fuses symbolic information to perform operator-based data prediction, i.e. non-autoregressive. By incorporating multiple modalities in the inputs, the PDE foundation model builds in a pathway for including mathematical descriptions of the physical behavior. We pre-train our foundation model on 6 parametric families of equations collected from 13 datasets, including over 60K trajectories. Our model outperforms popular operator learning, computer vision, and multi-physics models, in benchmark forward prediction tasks. We test our architecture choices with ablation studies.
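The abstract describes fusing symbolic equation tokens with field data in one transformer, then decoding all future frames in a single (non-autoregressive) pass. The following is a minimal numpy sketch of that fusion pattern; every name and shape here is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

# Hypothetical sketch of a PROSE-FD-style multimodal fusion (assumed names
# and shapes, not the authors' code): data tokens from field snapshots are
# concatenated with symbolic tokens encoding the governing PDE, mixed by one
# attention layer, and decoded into all future frames at once.

rng = np.random.default_rng(0)
d_model = 32

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v):
    scores = q @ k.T / np.sqrt(q.shape[-1])
    return softmax(scores) @ v

# Data modality: 4 input snapshots of an 8x8 field, flattened to tokens.
snapshots = rng.standard_normal((4, 8 * 8))
W_data = rng.standard_normal((8 * 8, d_model)) * 0.1
data_tokens = snapshots @ W_data             # (4, d_model)

# Symbolic modality: integer-encoded equation symbols, embedded.
symbol_ids = np.array([3, 7, 1, 4])          # e.g. tokens of "u_t + u u_x = 0"
embed = rng.standard_normal((16, d_model)) * 0.1
sym_tokens = embed[symbol_ids]               # (4, d_model)

# Fuse both modalities into one token sequence; mix with self-attention.
tokens = np.concatenate([data_tokens, sym_tokens], axis=0)   # (8, d_model)
fused = attention(tokens, tokens, tokens)

# Non-autoregressive decode: 5 learned frame queries attend to the fused
# tokens once, producing all future frames simultaneously.
W_out = rng.standard_normal((d_model, 8 * 8)) * 0.1
queries = rng.standard_normal((5, d_model))
frames = attention(queries, fused, fused) @ W_out            # (5, 64)
print(frames.shape)
```

The key contrast with autoregressive rollouts is that the decoder never feeds its own predictions back in; all frames come from one forward pass over the fused data/symbol tokens.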
Related papers
- Towards a Foundation Model for Partial Differential Equations: Multi-Operator Learning and Extrapolation [4.286691905364396]
We introduce a multi-modal foundation model for scientific problems, named PROSE-PDE.
Our model is a multi-operator learning approach which can predict future states of systems while concurrently learning the underlying governing equations of the physical system.
arXiv Detail & Related papers (2024-04-18T17:34:20Z)
- Pretraining Codomain Attention Neural Operators for Solving Multiphysics PDEs [85.40198664108624]
We propose Codomain Attention Neural Operator (CoDA-NO) to solve multiphysics problems with PDEs.
CoDA-NO tokenizes functions along the codomain or channel space, enabling self-supervised learning or pretraining of multiple PDE systems.
We find CoDA-NO to outperform existing methods by over 36% on complex downstream tasks with limited data.
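The "tokenize functions along the codomain" idea above can be illustrated with a tiny numpy sketch; shapes and the mixing step are assumptions for illustration, not the CoDA-NO architecture.

```python
import numpy as np

# Illustrative sketch (assumed shapes, not the CoDA-NO code): "codomain
# tokenization" treats each physical channel of a multiphysics field as its
# own token, so systems with different channel counts can share one model.

rng = np.random.default_rng(1)

# A field with 3 channels (e.g. velocity-x, velocity-y, pressure) sampled
# on a 16x16 grid: shape (channels, H, W).
field = rng.standard_normal((3, 16, 16))

# Tokenize along the codomain: one token per channel, features = grid values.
tokens = field.reshape(3, -1)        # (3 tokens, 256 features each)

# A channel-count-agnostic mixing step (here a simple normalized Gram mix)
# now works for any number of tokens, i.e. any number of physical channels.
sim = tokens @ tokens.T / tokens.shape[1]
mixed = sim @ tokens
print(tokens.shape, mixed.shape)
```

Because the token axis is the channel axis, adding or removing physical variables changes only the number of tokens, not the model's weights.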
arXiv Detail & Related papers (2024-03-19T08:56:20Z)
- Discovering Interpretable Physical Models using Symbolic Regression and Discrete Exterior Calculus [55.2480439325792]
We propose a framework that combines Symbolic Regression (SR) and Discrete Exterior Calculus (DEC) for the automated discovery of physical models.
DEC provides building blocks for the discrete analogue of field theories, which are beyond the state-of-the-art applications of SR to physical problems.
We prove the effectiveness of our methodology by re-discovering three models of Continuum Physics from synthetic experimental data.
arXiv Detail & Related papers (2023-10-10T13:23:05Z)
- Training Deep Surrogate Models with Large Scale Online Learning [48.7576911714538]
Deep learning algorithms have emerged as a viable alternative for obtaining fast solutions for PDEs.
Models are usually trained on synthetic data generated by solvers, stored on disk and read back for training.
This work proposes an open-source online training framework for deep surrogate models.
arXiv Detail & Related papers (2023-06-28T12:02:27Z)
- Differentiable physics-enabled closure modeling for Burgers' turbulence [0.0]
We discuss an approach using the differentiable physics paradigm that combines known physics with machine learning to develop closure models for turbulence problems.
We train a series of models that incorporate varying degrees of physical assumptions on an a posteriori loss function to test the efficacy of models.
We find that constraining models with inductive biases in the form of partial differential equations that contain known physics or existing closure approaches produces highly data-efficient, accurate, and generalizable models.
arXiv Detail & Related papers (2022-09-23T14:38:01Z)
- Data-driven, multi-moment fluid modeling of Landau damping [6.456946924438425]
We apply a deep learning architecture to learn fluid partial differential equations (PDEs) of a plasma system.
The learned multi-moment fluid PDEs are demonstrated to incorporate kinetic effects such as Landau damping.
arXiv Detail & Related papers (2022-09-10T19:06:12Z)
- Data-driven Control of Agent-based Models: an Equation/Variable-free Machine Learning Approach [0.0]
We present an Equation/Variable free machine learning (EVFML) framework for the control of the collective dynamics of complex/multiscale systems.
The proposed implementation begins by applying machine learning, in particular non-linear manifold learning via Diffusion Maps (DMs), to high-dimensional agent-based simulations in order to identify a set of coarse-grained variables.
We exploit the Equation-free approach to perform numerical bifurcation analysis of the emergent dynamics.
We design data-driven embedded wash-out controllers that drive the agent-based simulators to their intrinsic, imprecisely known, emergent open-loop unstable steady-states.
arXiv Detail & Related papers (2022-07-12T18:16:22Z)
- Multi-scale Physical Representations for Approximating PDE Solutions with Graph Neural Operators [14.466945570499183]
We study three multi-resolution schemes with integral kernel operators approximated by Message Passing Graph Neural Networks (MPGNNs).
To validate our study, we run extensive MPGNN experiments with well-chosen metrics on both steady and unsteady PDEs.
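A single message-passing step of the kind MPGNNs use to approximate an integral kernel operator can be sketched as follows; the graph, weights, and update rule are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

# Minimal message-passing sketch (illustrative, not the paper's MPGNN):
# each node aggregates transformed neighbor features through an adjacency
# matrix, approximating an integral kernel operator on a discretized domain.

rng = np.random.default_rng(2)
n_nodes, d = 5, 4

# Ring-graph adjacency: each node is connected to its two neighbors.
A = np.zeros((n_nodes, n_nodes))
for i in range(n_nodes):
    A[i, (i - 1) % n_nodes] = A[i, (i + 1) % n_nodes] = 1.0

x = rng.standard_normal((n_nodes, d))    # node features (PDE field samples)
W = rng.standard_normal((d, d)) * 0.5    # "learnable" message weights

# One message-passing step: sum neighbor messages, normalize by degree,
# then apply a nonlinear update.
messages = A @ (x @ W)
x_new = np.tanh(x + messages / A.sum(axis=1, keepdims=True))
print(x_new.shape)
```

Stacking such steps at several graph resolutions is the multi-scale idea the entry refers to: coarser graphs carry long-range interactions cheaply while fine graphs keep local detail.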
arXiv Detail & Related papers (2022-06-29T14:42:03Z)
- Capturing Actionable Dynamics with Structured Latent Ordinary Differential Equations [68.62843292346813]
We propose a structured latent ODE model that captures system input variations within its latent representation.
Building on a static variable specification, our model learns factors of variation for each input to the system, thus separating the effects of the system inputs in the latent space.
arXiv Detail & Related papers (2022-02-25T20:00:56Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Based Lower Bounds for ME-NODE, and develop (efficient) training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- Equivariant vector field network for many-body system modeling [65.22203086172019]
Equivariant Vector Field Network (EVFN) is built on a novel equivariant basis and the associated scalarization and vectorization layers.
We evaluate our method on predicting trajectories of simulated Newton mechanics systems with both full and partially observed data.
arXiv Detail & Related papers (2021-10-26T14:26:25Z)
This list is automatically generated from the titles and abstracts of the papers in this site.