Universal Physics Transformers: A Framework For Efficiently Scaling Neural Operators
- URL: http://arxiv.org/abs/2402.12365v4
- Date: Thu, 10 Oct 2024 07:48:24 GMT
- Title: Universal Physics Transformers: A Framework For Efficiently Scaling Neural Operators
- Authors: Benedikt Alkin, Andreas Fürst, Simon Schmid, Lukas Gruber, Markus Holzleitner, Johannes Brandstetter
- Abstract summary: Universal Physics Transformers (UPTs) are an efficient and unified learning paradigm for a wide range of problems.
UPTs operate without grid- or particle-based latent structures, enabling flexibility and scalability across meshes and particles.
We demonstrate the diverse applicability and efficacy of UPTs in mesh-based fluid simulations, steady-state Reynolds-averaged Navier-Stokes simulations, and Lagrangian-based dynamics.
- Abstract: Neural operators, serving as physics surrogate models, have recently gained increased interest. With ever-increasing problem complexity, the natural question arises: what is an efficient way to scale neural operators to larger and more complex simulations, most importantly by taking into account different types of simulation datasets? This is of special interest since, akin to their numerical counterparts, different techniques are used across applications, even if the underlying dynamics of the systems are similar. Whereas the flexibility of transformers has enabled unified architectures across domains, neural operators mostly follow a problem-specific design, where GNNs are commonly used for Lagrangian simulations and grid-based models predominate in Eulerian simulations. We introduce Universal Physics Transformers (UPTs), an efficient and unified learning paradigm for a wide range of spatio-temporal problems. UPTs operate without grid- or particle-based latent structures, enabling flexibility and scalability across meshes and particles. UPTs efficiently propagate dynamics in the latent space, emphasized by inverse encoding and decoding techniques. Finally, UPTs allow for queries of the latent space representation at any point in space-time. We demonstrate the diverse applicability and efficacy of UPTs in mesh-based fluid simulations, steady-state Reynolds-averaged Navier-Stokes simulations, and Lagrangian-based dynamics.
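The abstract describes a three-stage pipeline: encode an arbitrary mesh or particle set into a fixed-size latent representation (no latent grid), propagate dynamics entirely in latent space, and decode by querying arbitrary space-time points. The following is a minimal NumPy sketch of that pipeline, with toy random weights standing in for the learned networks; all names and sizes are illustrative assumptions, not the paper's architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
LATENT = 16  # fixed latent size, independent of mesh/particle count

# Toy random weights standing in for learned encoder/propagator/decoder.
W_enc = rng.normal(size=(3, LATENT))                           # (x, y, value) -> latent features
W_prop = rng.normal(size=(LATENT, LATENT)) / np.sqrt(LATENT)   # latent time stepper
W_dec = rng.normal(size=(LATENT + 2, 1))                       # (latent, query x, query y) -> field

def encode(points, values):
    """Pool an arbitrary point cloud into a fixed-size latent (no latent grid)."""
    feats = np.tanh(np.hstack([points, values[:, None]]) @ W_enc)
    return feats.mean(axis=0)  # mean pooling: latent size is independent of #points

def propagate(z, steps=1):
    """Advance the dynamics entirely in latent space."""
    for _ in range(steps):
        z = np.tanh(z @ W_prop)
    return z

def query(z, q):
    """Evaluate the latent representation at arbitrary query coordinates."""
    zq = np.hstack([np.tile(z, (len(q), 1)), q])
    return (zq @ W_dec).ravel()

# The same pipeline handles a coarse and a fine discretization of one field.
pts_coarse = rng.uniform(size=(50, 2));   vals_coarse = np.sin(pts_coarse[:, 0])
pts_fine = rng.uniform(size=(5000, 2));   vals_fine = np.sin(pts_fine[:, 0])
z = propagate(encode(pts_fine, vals_fine), steps=3)
out = query(z, rng.uniform(size=(7, 2)))  # 7 arbitrary query locations
```

Note how the latent has the same shape regardless of whether 50 or 5000 points were encoded; this discretization independence is what lets a single model span meshes and particles.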
Related papers
- Liquid Fourier Latent Dynamics Networks for fast GPU-based numerical simulations in computational cardiology [0.0]
We propose an extension of Latent Dynamics Networks (LDNets) to create parameterized space-time surrogate models for multiscale and multiphysics sets of highly nonlinear differential equations on complex geometries.
LFLDNets employ a neurologically-inspired, sparse liquid neural network for temporal dynamics, relaxing the requirement of a numerical solver for time advancement and leading to superior performance in terms of parameters, accuracy, efficiency and learned trajectories.
arXiv Detail & Related papers (2024-08-19T09:14:25Z) - Physics-enhanced Neural Operator for Simulating Turbulent Transport [9.923888452768919]
This paper presents a physics-enhanced neural operator (PENO) that incorporates physical knowledge of partial differential equations (PDEs) to accurately model flow dynamics.
The proposed method is evaluated through its performance on two distinct sets of 3D turbulent flow data.
arXiv Detail & Related papers (2024-05-31T20:05:17Z) - Neural SPH: Improved Neural Modeling of Lagrangian Fluid Dynamics [10.420017109857765]
Smoothed particle hydrodynamics (SPH) is omnipresent in modern engineering and scientific disciplines.
Due to the particle-like nature of the simulation, graph neural networks (GNNs) have emerged as appealing and successful surrogates.
In this work, we identify particle clustering originating from tensile instabilities as one of the primary pitfalls.
arXiv Detail & Related papers (2024-02-09T09:40:12Z) - Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
Neural Operators, an AI framework, present a principled approach for learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z) - Spherical Fourier Neural Operators: Learning Stable Dynamics on the Sphere [53.63505583883769]
We introduce Spherical FNOs (SFNOs) for learning operators on spherical geometries.
SFNOs have important implications for machine learning-based simulation of climate dynamics.
arXiv Detail & Related papers (2023-06-06T16:27:17Z) - Towards Complex Dynamic Physics System Simulation with Graph Neural ODEs [75.7104463046767]
This paper proposes a novel learning-based simulation model (GNSTODE) that characterizes the varying spatial and temporal dependencies in particle systems.
We empirically evaluate GNSTODE's simulation performance on two real-world particle systems, Gravity and Coulomb.
arXiv Detail & Related papers (2023-05-21T03:51:03Z) - Transformer with Implicit Edges for Particle-based Physics Simulation [135.77656965678196]
Transformer with Implicit Edges (TIE) captures the rich semantics of particle interactions in an edge-free manner.
We evaluate our model on diverse domains of varying complexity and materials.
arXiv Detail & Related papers (2022-07-22T03:45:29Z) - REMuS-GNN: A Rotation-Equivariant Model for Simulating Continuum Dynamics [0.0]
We introduce REMuS-GNN, a rotation-equivariant multi-scale model for simulating continuum dynamical systems.
We demonstrate and evaluate this method on the incompressible flow around elliptical cylinders.
arXiv Detail & Related papers (2022-05-05T16:20:37Z) - Constructing Neural Network-Based Models for Simulating Dynamical Systems [59.0861954179401]
Data-driven modeling is an alternative paradigm that seeks to learn an approximation of the dynamics of a system using observations of the true system.
This paper provides a survey of the different ways to construct models of dynamical systems using neural networks.
In addition to the basic overview, we review the related literature and outline the most significant challenges from numerical simulations that this modeling paradigm must overcome.
arXiv Detail & Related papers (2021-11-02T10:51:42Z) - A Gradient-based Deep Neural Network Model for Simulating Multiphase Flow in Porous Media [1.5791732557395552]
We describe a gradient-based deep neural network (GDNN) constrained by the physics related to multiphase flow in porous media.
We demonstrate that GDNN can effectively predict the nonlinear patterns of subsurface responses.
arXiv Detail & Related papers (2021-04-30T02:14:00Z) - Machine learning for rapid discovery of laminar flow channel wall modifications that enhance heat transfer [56.34005280792013]
We present a combination of accurate numerical simulations of arbitrary, flat, and non-flat channels and machine learning models predicting drag coefficient and Stanton number.
We show that convolutional neural networks (CNN) can accurately predict the target properties at a fraction of the time of numerical simulations.
arXiv Detail & Related papers (2021-01-19T16:14:02Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.