Harnessing Equivariance: Modeling Turbulence with Graph Neural Networks
- URL: http://arxiv.org/abs/2504.07741v1
- Date: Thu, 10 Apr 2025 13:37:54 GMT
- Title: Harnessing Equivariance: Modeling Turbulence with Graph Neural Networks
- Authors: Marius Kurz, Andrea Beck, Benjamin Sanderse
- Abstract summary: This work proposes a novel methodology for turbulence modeling in Large Eddy Simulation (LES) based on Graph Neural Networks (GNNs). GNNs embed the discrete rotational, reflectional and translational symmetries of the Navier-Stokes equations into the model architecture. The suitability of the proposed approach is investigated for two canonical test cases: Homogeneous Isotropic Turbulence (HIT) and turbulent channel flow.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: This work proposes a novel methodology for turbulence modeling in Large Eddy Simulation (LES) based on Graph Neural Networks (GNNs), which embeds the discrete rotational, reflectional and translational symmetries of the Navier-Stokes equations into the model architecture. In addition, suitable invariant input and output spaces are derived that allow the GNN models to be embedded seamlessly into the LES framework to obtain a symmetry-preserving simulation setup. The suitability of the proposed approach is investigated for two canonical test cases: Homogeneous Isotropic Turbulence (HIT) and turbulent channel flow. For both cases, GNN models are trained successfully in actual simulations using Reinforcement Learning (RL) to ensure that the models are consistent with the underlying LES formulation and discretization. It is demonstrated for the HIT case that the resulting GNN-based LES scheme recovers rotational and reflectional equivariance up to machine precision in actual simulations. At the same time, the stability and accuracy remain on par with non-symmetry-preserving machine learning models that fail to obey these properties. The same modeling strategy translates well to turbulent channel flow, where the GNN model successfully learns the more complex flow physics and is able to recover the turbulent statistics and Reynolds stresses. It is shown that the GNN model learns a zonal modeling strategy with distinct behaviors in the near-wall and outer regions. The proposed approach thus demonstrates the potential of GNNs for turbulence modeling, especially in the context of LES and RL.
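To make the equivariance claim concrete, below is a minimal sketch, assuming a random point cloud, a hand-rolled message-passing step, and arbitrary toy weights; it is not the authors' GNN-LES model, only an illustration of the general idea of invariant inputs combined with vector-valued (equivariant) outputs that the abstract describes.

```python
# Minimal sketch (illustrative assumptions, not the paper's architecture):
# a message-passing closure built from rotation-invariant edge features
# whose messages are scaled relative-position vectors, so the output
# transforms equivariantly under rotations and reflections.
import numpy as np

rng = np.random.default_rng(0)

def edge_invariants(dpos, dvel):
    """Rotation- and reflection-invariant edge features: two norms and an inner product."""
    return np.stack([
        np.linalg.norm(dpos, axis=-1),
        np.linalg.norm(dvel, axis=-1),
        np.einsum("ij,ij->i", dpos, dvel),
    ], axis=-1)

def toy_closure(pos, vel, edges, W1, W2):
    """One message-passing step: a scalar gate computed from invariant edge
    features scales the relative-position vector; messages are summed per node."""
    src, dst = edges
    dpos, dvel = pos[dst] - pos[src], vel[dst] - vel[src]
    gate = np.tanh(edge_invariants(dpos, dvel) @ W1) @ W2   # (n_edges, 1)
    out = np.zeros_like(vel)
    np.add.at(out, dst, gate * dpos)                        # equivariant messages
    return out

# Random graph, states, and weights (hypothetical sizes).
n_nodes, n_edges = 32, 96
pos, vel = rng.normal(size=(n_nodes, 3)), rng.normal(size=(n_nodes, 3))
edges = (rng.integers(0, n_nodes, n_edges), rng.integers(0, n_nodes, n_edges))
W1, W2 = rng.normal(size=(3, 8)), rng.normal(size=(8, 1))

# One of the discrete symmetries: a 90-degree rotation about the z-axis.
R = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])

evaluate_then_rotate = toy_closure(pos, vel, edges, W1, W2) @ R.T
rotate_then_evaluate = toy_closure(pos @ R.T, vel @ R.T, edges, W1, W2)
print(np.max(np.abs(evaluate_then_rotate - rotate_then_evaluate)))  # ~1e-16
```

Running the script prints a discrepancy on the order of 1e-16 between "evaluate then rotate" and "rotate then evaluate", i.e. equivariance up to machine precision, in the same spirit as the HIT verification reported in the abstract.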
Related papers
- AutoTurb: Using Large Language Models for Automatic Algebraic Model Discovery of Turbulence Closure [15.905369652489505]
In this work, a novel framework using LLMs to automatically discover expressions for correcting the Reynolds stress model is proposed.
The proposed method is applied to separated flow over periodic hills at Re = 10,595.
It is demonstrated that the corrective RANS can improve the prediction for both the Reynolds stress and mean velocity fields.
arXiv Detail & Related papers (2024-10-14T16:06:35Z)
- Recurrent neural networks and transfer learning for elasto-plasticity in woven composites [0.0]
This article presents Recurrent Neural Network (RNN) models as a surrogate for computationally intensive meso-scale simulation of woven composites.
A mean-field model generates a comprehensive data set representing elasto-plastic behavior.
In simulations, arbitrary six-dimensional strain histories are used to predict stresses, with random-walk loading as the source task and cyclic loading conditions as the target task.
arXiv Detail & Related papers (2023-11-22T14:47:54Z)
- Towards Long-Term predictions of Turbulence using Neural Operators [68.8204255655161]
This work aims to develop reduced-order/surrogate models for turbulent flow simulations using Machine Learning.
Different model structures are analyzed, with U-NET structures performing better than the standard FNO in accuracy and stability.
arXiv Detail & Related papers (2023-07-25T14:09:53Z)
- Geometric Neural Diffusion Processes [55.891428654434634]
We extend the framework of diffusion models to incorporate a series of geometric priors in infinite-dimensional modelling.
We show that with these conditions, the generative functional model admits the same symmetry.
arXiv Detail & Related papers (2023-07-11T16:51:38Z)
- Neural Ideal Large Eddy Simulation: Modeling Turbulence with Neural Stochastic Differential Equations [22.707574194338132]
We introduce a data-driven learning framework that assimilates two powerful ideas: ideal large eddy simulation (LES) from turbulence closure modeling and neural stochastic differential equations (SDE) for stochastic modeling.
We show the effectiveness of our approach on a challenging chaotic dynamical system: Kolmogorov flow at a Reynolds number of 20,000.
arXiv Detail & Related papers (2023-06-01T22:16:28Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Distributed Bayesian Learning of Dynamic States [65.7870637855531]
The proposed algorithm addresses a distributed Bayesian filtering task for finite-state hidden Markov models.
It can be used for sequential state estimation, as well as for modeling opinion formation over social networks under dynamic environments.
arXiv Detail & Related papers (2022-12-05T19:40:17Z)
- REMuS-GNN: A Rotation-Equivariant Model for Simulating Continuum Dynamics [0.0]
We introduce REMuS-GNN, a rotation-equivariant multi-scale model for simulating continuum dynamical systems.
We demonstrate and evaluate this method on the incompressible flow around elliptical cylinders.
arXiv Detail & Related papers (2022-05-05T16:20:37Z)
- Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on a variety of SPDEs, including the dynamic Phi^4_1 model and the 2d Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z)
- Learning Stochastic Dynamics with Statistics-Informed Neural Network [0.4297070083645049]
We introduce a machine-learning framework named statistics-informed neural network (SINN) for learning dynamics from data.
We devise mechanisms for training the neural network model to reproduce the correct statistical behavior of a target process.
We show that the obtained reduced-order model can be trained on temporally coarse-grained data and hence is well suited for rare-event simulations.
arXiv Detail & Related papers (2022-02-24T18:21:01Z)
- Moser Flow: Divergence-based Generative Modeling on Manifolds [49.04974733536027]
Moser Flow (MF) is a new class of generative models within the family of continuous normalizing flows (CNF).
MF does not require invoking or backpropagating through an ODE solver during training.
We demonstrate for the first time the use of flow models for sampling from general curved surfaces.
arXiv Detail & Related papers (2021-08-18T09:00:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.