A Finite Element-Inspired Hypergraph Neural Network: Application to
Fluid Dynamics Simulations
- URL: http://arxiv.org/abs/2212.14545v2
- Date: Mon, 24 Apr 2023 17:56:17 GMT
- Title: A Finite Element-Inspired Hypergraph Neural Network: Application to
Fluid Dynamics Simulations
- Authors: Rui Gao, Indu Kant Deo, Rajeev K. Jaiman
- Abstract summary: An emerging trend in deep learning research focuses on the applications of graph neural networks (GNNs) for continuum mechanics simulations.
We present a method to construct a hypergraph by connecting the nodes by elements rather than edges.
We term this method a finite element-inspired hypergraph neural network, in short FEIH($\phi$)-GNN.
- Score: 4.984601297028257
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: An emerging trend in deep learning research focuses on the applications of
graph neural networks (GNNs) for mesh-based continuum mechanics simulations.
Most of these learning frameworks operate on graphs wherein each edge connects
two nodes. Inspired by the data connectivity in the finite element method, we
present a method to construct a hypergraph by connecting the nodes by elements
rather than edges. A hypergraph message-passing network is defined on such a
node-element hypergraph that mimics the calculation process of local stiffness
matrices. We term this method a finite element-inspired hypergraph neural
network, in short FEIH($\phi$)-GNN. We further equip the proposed network with
rotation equivariance, and explore its capability for modeling unsteady fluid
flow systems. The effectiveness of the network is demonstrated on two common
benchmark problems, namely the fluid flow around a circular cylinder and
airfoil configurations. Stabilized and accurate temporal roll-out predictions
can be obtained using the $\phi$-GNN framework within the interpolation
Reynolds number range. The network is also able to extrapolate moderately to
higher Reynolds numbers beyond the training range.
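As a rough illustration of the idea described in the abstract, the sketch below forms a message per finite element (here, triangles) from the features of that element's nodes and scatters it back to those nodes, loosely mirroring the assembly of local stiffness matrices. The module name, MLP sizes, and sum aggregation are illustrative assumptions, not the authors' FEIH($\phi$)-GNN implementation, and rotation equivariance is omitted.

```python
import torch
import torch.nn as nn

class ElementMessagePassing(nn.Module):
    """Minimal sketch of node-element hypergraph message passing.

    Each hyperedge is a finite element (here: a triangle with 3 nodes).
    A message is formed per element from its nodes, analogous to assembling
    a local stiffness matrix, and scattered back to the element's nodes.
    """

    def __init__(self, node_dim: int, hidden: int = 64):
        super().__init__()
        # element update: concatenated features of the 3 element nodes
        self.element_mlp = nn.Sequential(
            nn.Linear(3 * node_dim, hidden), nn.ReLU(), nn.Linear(hidden, hidden)
        )
        # node update: old node feature + aggregated element messages
        self.node_mlp = nn.Sequential(
            nn.Linear(node_dim + hidden, hidden), nn.ReLU(), nn.Linear(hidden, node_dim)
        )

    def forward(self, x: torch.Tensor, elements: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, node_dim), elements: (num_elements, 3) node indices
        elem_feats = x[elements]                       # (E, 3, node_dim)
        msg = self.element_mlp(elem_feats.flatten(1))  # (E, hidden) per-element message
        # scatter each element's message to its 3 nodes (sum aggregation)
        agg = x.new_zeros(x.size(0), msg.size(1))
        agg.index_add_(0, elements.reshape(-1), msg.repeat_interleave(3, dim=0))
        return self.node_mlp(torch.cat([x, agg], dim=-1))

# toy usage: 4 nodes, 2 triangular elements
x = torch.randn(4, 8)
elements = torch.tensor([[0, 1, 2], [1, 2, 3]])
out = ElementMessagePassing(node_dim=8)(x, elements)
print(out.shape)  # torch.Size([4, 8])
```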
Related papers
- Scalable and Consistent Graph Neural Networks for Distributed Mesh-based Data-driven Modeling [0.0]
This work develops a distributed graph neural network (GNN) methodology for mesh-based modeling applications.
Here, consistency refers to the fact that a GNN trained and evaluated on one rank (one large graph) is arithmetically equivalent to evaluation on multiple ranks (a partitioned graph).
It is shown how the NekRS mesh partitioning can be linked to the distributed GNN training and inference routines, resulting in a scalable mesh-based data-driven modeling workflow.
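The consistency property can be illustrated with a minimal sketch: if every owned node receives messages from halo (ghost) copies of its off-rank neighbours, a partitioned sum-aggregation step reproduces the single-graph result exactly. The example below simulates two ranks in one process and is only an assumption-laden toy, not the NekRS-coupled workflow described in the paper.

```python
import torch

def gather_scatter(x, edge_index):
    """One sum-aggregation message-passing step: for each edge (s, d), add x[s] to node d."""
    src, dst = edge_index
    out = torch.zeros_like(x)
    out.index_add_(0, dst, x[src])
    return out

# Full graph: 4 nodes on a ring 0-1-2-3-0, stored as directed edge pairs.
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3, 3, 0],
                           [1, 0, 2, 1, 3, 2, 0, 3]])
full = gather_scatter(x, edge_index)

# "Rank 0" owns nodes {0, 1} and keeps halo copies of nodes {2, 3}; global node
# ids are reused locally for readability. Only edges into owned nodes are kept.
owned = [0, 1]
local_edges = torch.tensor([[1, 0, 2, 3],
                            [0, 1, 1, 0]])  # edges whose destination is an owned node
rank0 = gather_scatter(x, local_edges)[owned]

# Consistency: the partitioned evaluation matches the full-graph evaluation on owned nodes.
print(torch.allclose(full[owned], rank0))  # True
```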
arXiv Detail & Related papers (2024-10-02T15:22:27Z) - Mesh-based Super-Resolution of Fluid Flows with Multiscale Graph Neural Networks [0.0]
A graph neural network (GNN) approach is introduced in this work which enables mesh-based three-dimensional super-resolution of fluid flows.
In this framework, the GNN is designed to operate not on the full mesh-based field at once, but on localized meshes of elements (or cells) directly.
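A minimal sketch of this element-local viewpoint, assuming each cell's coarse nodal values are mapped independently to values at a fixed set of fine points inside that cell; the per-cell network here is a plain MLP, and the class name, cell type, and subdivision factor are illustrative assumptions rather than the paper's multiscale GNN.

```python
import torch
import torch.nn as nn

class CellUpsampler(nn.Module):
    """Illustrative per-cell super-resolution: coarse values at a cell's nodes
    are mapped to values at a fixed number of fine points inside the cell."""

    def __init__(self, nodes_per_cell=8, fine_points=27, channels=3, hidden=64):
        super().__init__()
        self.fine_points = fine_points
        self.channels = channels
        self.net = nn.Sequential(
            nn.Linear(nodes_per_cell * channels, hidden), nn.ReLU(),
            nn.Linear(hidden, fine_points * channels),
        )

    def forward(self, coarse_field, cells):
        # coarse_field: (num_nodes, channels); cells: (num_cells, nodes_per_cell) node ids
        per_cell = coarse_field[cells].flatten(1)   # (num_cells, nodes_per_cell * channels)
        fine = self.net(per_cell)                   # processed cell by cell, not globally
        return fine.view(cells.size(0), self.fine_points, self.channels)

# toy usage: 12 coarse nodes, 2 hexahedral cells with 8 nodes each
coarse = torch.randn(12, 3)
cells = torch.tensor([[0, 1, 2, 3, 4, 5, 6, 7], [4, 5, 6, 7, 8, 9, 10, 11]])
print(CellUpsampler()(coarse, cells).shape)  # torch.Size([2, 27, 3])
```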
arXiv Detail & Related papers (2024-09-12T05:52:19Z) - Solving the Discretised Multiphase Flow Equations with Interface
Capturing on Structured Grids Using Machine Learning Libraries [0.6299766708197884]
This paper solves the discretised multiphase flow equations using tools and methods from machine-learning libraries.
For the first time, finite element discretisations of multiphase flows can be solved using an approach based on (untrained) convolutional neural networks.
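The core trick, expressing a discretised operator as a convolution with fixed, untrained weights, can be sketched on a much simpler problem than the paper's multiphase system: a finite-difference Poisson solve on a structured grid, where every Jacobi iteration is just a convolution. This is a generic illustration under those assumptions, not the paper's finite element or interface-capturing formulation.

```python
import torch
import torch.nn.functional as F

# The discretised operator is expressed as an (untrained) convolution: the 5-point
# finite-difference Laplacian stencil on a structured grid. The Poisson problem
# -lap(u) = f with zero Dirichlet boundaries is then solved by Jacobi iteration.
n, h = 31, 1.0 / 32
nbr_sum = torch.tensor([[0., 1., 0.],
                        [1., 0., 1.],
                        [0., 1., 0.]]).view(1, 1, 3, 3)   # fixed, untrained weights

f = torch.ones(1, 1, n, n)       # right-hand side at interior grid points
u = torch.zeros(1, 1, n, n)      # initial guess

for _ in range(3000):
    # Jacobi update: u_ij <- (sum of 4 neighbours + h^2 * f_ij) / 4
    s = F.conv2d(F.pad(u, (1, 1, 1, 1)), nbr_sum)  # zero padding enforces u = 0 on the boundary
    u = (s + h**2 * f) / 4.0

# residual of the 5-point Laplacian, also evaluated with a convolution
lap = torch.tensor([[0., 1., 0.],
                    [1., -4., 1.],
                    [0., 1., 0.]]).view(1, 1, 3, 3) / h**2
residual = -F.conv2d(F.pad(u, (1, 1, 1, 1)), lap) - f
print(float(residual.abs().max()))   # small after enough iterations
```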
arXiv Detail & Related papers (2024-01-12T18:42:42Z) - Identification of vortex in unstructured mesh with graph neural networks [0.0]
We present a Graph Neural Network (GNN) based model with U-Net architecture to identify the vortex in CFD results on unstructured meshes.
A vortex auto-labeling method is proposed to label vortex regions in 2D CFD meshes.
arXiv Detail & Related papers (2023-11-11T12:10:16Z) - From Hypergraph Energy Functions to Hypergraph Neural Networks [94.88564151540459]
We present an expressive family of parameterized, hypergraph-regularized energy functions.
We then demonstrate how minimizers of these energies effectively serve as node embeddings.
We draw parallels between the proposed bilevel hypergraph optimization, and existing GNN architectures in common use.
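A minimal sketch of the "minimizer as embedding" idea, assuming a simple quadratic energy with a data-fidelity term and a within-hyperedge smoothness penalty; the paper's parameterized energy families and bilevel training loop are not reproduced here, and all names are illustrative.

```python
import torch

def hypergraph_energy(Y, X_feat, hyperedges, lam=1.0):
    """Illustrative hypergraph-regularized energy: data fidelity plus a penalty
    on the spread of embeddings within each hyperedge."""
    fidelity = ((Y - X_feat) ** 2).sum()
    smoothness = 0.0
    for e in hyperedges:                     # each hyperedge is a list of node ids
        Ye = Y[e]
        smoothness = smoothness + ((Ye - Ye.mean(dim=0)) ** 2).sum()
    return fidelity + lam * smoothness

# toy inner problem: embeddings are the (approximate) minimizer of the energy
X_feat = torch.randn(5, 4)                   # node features (or their projection)
hyperedges = [[0, 1, 2], [2, 3, 4]]
Y = X_feat.clone().requires_grad_(True)

opt = torch.optim.Adam([Y], lr=0.05)
for _ in range(200):
    opt.zero_grad()
    loss = hypergraph_energy(Y, X_feat, hyperedges)
    loss.backward()
    opt.step()

print(Y.detach())  # node embeddings: close to the features but smoothed within hyperedges
```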
arXiv Detail & Related papers (2023-06-16T04:40:59Z) - A Point-Cloud Deep Learning Framework for Prediction of Fluid Flow
Fields on Irregular Geometries [62.28265459308354]
The network learns an end-to-end mapping between spatial positions and CFD quantities.
Incompressible laminar steady flow past a cylinder with various cross-sectional shapes is considered.
The network predicts the flow fields hundreds of times faster than a conventional CFD solver.
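A minimal sketch of a point-cloud network that maps spatial coordinates directly to flow quantities (u, v, p), using a shared per-point MLP and a max-pooled global geometry feature; the class name and layer sizes are assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class PointFlowNet(nn.Module):
    """Minimal sketch: shared per-point MLP, global max-pooled context vector,
    and a per-point head predicting (u, v, p) at each spatial position."""

    def __init__(self, hidden=128):
        super().__init__()
        self.point_mlp = nn.Sequential(nn.Linear(2, hidden), nn.ReLU(),
                                       nn.Linear(hidden, hidden), nn.ReLU())
        self.head = nn.Sequential(nn.Linear(2 * hidden, hidden), nn.ReLU(),
                                  nn.Linear(hidden, 3))   # (u, v, p) per point

    def forward(self, xy):
        # xy: (batch, num_points, 2) spatial coordinates of the point cloud
        local = self.point_mlp(xy)                           # per-point features
        global_feat = local.max(dim=1, keepdim=True).values  # geometry summary
        global_feat = global_feat.expand_as(local)
        return self.head(torch.cat([local, global_feat], dim=-1))

# toy usage: 2 geometries, 1024 cloud points each
xy = torch.rand(2, 1024, 2)
print(PointFlowNet()(xy).shape)  # torch.Size([2, 1024, 3])
```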
arXiv Detail & Related papers (2020-10-15T12:15:02Z) - Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid
Flow Prediction [79.81193813215872]
We develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.
We show that we can both generalize well to new situations and benefit from the substantial speedup of neural network CFD predictions.
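A rough sketch of the hybrid idea: a differentiable coarse step followed by a learned graph-based correction, with gradients flowing through both parts. The stand-in "solver" below is only a diffusion-like smoothing step, not an actual fluid simulator, and all names are illustrative assumptions.

```python
import torch
import torch.nn as nn

def coarse_solver_step(x, edge_index):
    """Stand-in for an embedded differentiable simulator: a single diffusion-like
    smoothing step on the graph (purely illustrative, not a fluid solver)."""
    src, dst = edge_index
    agg = torch.zeros_like(x).index_add_(0, dst, x[src])
    deg = torch.zeros(x.size(0), 1).index_add_(0, dst, torch.ones(src.size(0), 1))
    return 0.5 * x + 0.5 * agg / deg.clamp(min=1)

class HybridStep(nn.Module):
    """One hybrid prediction step: run the differentiable solver, then let a small
    graph network correct its output. Gradients flow through both parts."""

    def __init__(self, dim):
        super().__init__()
        self.correction = nn.Sequential(nn.Linear(2 * dim, 64), nn.ReLU(), nn.Linear(64, dim))

    def forward(self, x, edge_index):
        coarse = coarse_solver_step(x, edge_index)
        src, dst = edge_index
        nbr = torch.zeros_like(coarse).index_add_(0, dst, coarse[src])
        return coarse + self.correction(torch.cat([coarse, nbr], dim=-1))

# toy usage: 4 nodes on a ring, 3 state channels
x = torch.randn(4, 3)
edge_index = torch.tensor([[0, 1, 1, 2, 2, 3, 3, 0],
                           [1, 0, 2, 1, 3, 2, 0, 3]])
print(HybridStep(dim=3)(x, edge_index).shape)  # torch.Size([4, 3])
```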
arXiv Detail & Related papers (2020-07-08T21:23:19Z) - Structural Temporal Graph Neural Networks for Anomaly Detection in
Dynamic Graphs [54.13919050090926]
We propose an end-to-end structural temporal Graph Neural Network model for detecting anomalous edges in dynamic graphs.
In particular, we first extract the $h$-hop enclosing subgraph centered on the target edge and propose the node labeling function to identify the role of each node in the subgraph.
Based on the extracted features, we utilize Gated recurrent units (GRUs) to capture the temporal information for anomaly detection.
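A minimal sketch of this pipeline: extract the h-hop subgraph around a target edge, pool a per-snapshot feature, and score the edge with a GRU over time. The paper's node-labeling function and learned structural features are replaced here by simple mean pooling, and all names are assumptions.

```python
import torch
import torch.nn as nn

def khop_subgraph(edge_index, target, h=1):
    """Nodes within h hops of either endpoint of the target edge (u, v)."""
    nodes = set(target)
    for _ in range(h):
        frontier = set()
        for s, d in edge_index.t().tolist():
            if s in nodes:
                frontier.add(d)
            if d in nodes:
                frontier.add(s)
        nodes |= frontier
    return sorted(nodes)

class EdgeAnomalyScorer(nn.Module):
    """Sketch: per-snapshot subgraph features -> GRU over time -> anomaly score."""

    def __init__(self, feat_dim, hidden=32):
        super().__init__()
        self.gru = nn.GRU(feat_dim, hidden, batch_first=True)
        self.score = nn.Linear(hidden, 1)

    def forward(self, subgraph_feats):
        # subgraph_feats: (1, num_snapshots, feat_dim), one pooled feature per snapshot
        _, h_last = self.gru(subgraph_feats)
        return torch.sigmoid(self.score(h_last[-1]))

# toy usage: score the edge (0, 3) across 5 snapshots of a small dynamic graph
x = torch.randn(6, 8)                                    # node features
edges = torch.tensor([[0, 1, 1, 2, 3, 4], [1, 2, 3, 3, 4, 5]])
nodes = khop_subgraph(edges, target=(0, 3), h=1)
snap_feats = torch.stack([x[nodes].mean(0) for _ in range(5)]).unsqueeze(0)
print(EdgeAnomalyScorer(feat_dim=8)(snap_feats))         # anomaly probability in (0, 1)
```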
arXiv Detail & Related papers (2020-05-15T09:17:08Z) - Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
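A rough sketch of binarization in a graph layer: weights and aggregated node features are binarized to +/-1 with a straight-through estimator so gradients can still flow. The details of BGN itself differ and are not reproduced here; this only illustrates the general technique.

```python
import torch
import torch.nn as nn

class BinarizeSTE(torch.autograd.Function):
    """Sign binarization with a straight-through estimator for the backward pass."""

    @staticmethod
    def forward(ctx, x):
        ctx.save_for_backward(x)
        return torch.sign(x)

    @staticmethod
    def backward(ctx, grad_out):
        (x,) = ctx.saved_tensors
        return grad_out * (x.abs() <= 1).float()   # pass gradients only where |x| <= 1

class BinaryGraphLayer(nn.Module):
    """Sketch of one binarized graph layer: binarize weights and aggregated node
    features so the main computation can use cheap binary arithmetic."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(in_dim, out_dim) * 0.1)

    def forward(self, x, adj):
        agg = adj @ x                                  # neighbourhood aggregation
        return BinarizeSTE.apply(agg) @ BinarizeSTE.apply(self.weight)

# toy usage: 4 nodes, dense row-normalized adjacency with self-loops
x = torch.randn(4, 8)
adj = torch.eye(4) + torch.tensor([[0, 1, 0, 0], [1, 0, 1, 0],
                                   [0, 1, 0, 1], [0, 0, 1, 0]], dtype=torch.float)
adj = adj / adj.sum(1, keepdim=True)
print(BinaryGraphLayer(8, 16)(x, adj).shape)  # torch.Size([4, 16])
```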
arXiv Detail & Related papers (2020-04-19T09:43:14Z) - Graphs, Convolutions, and Neural Networks: From Graph Filters to Graph
Neural Networks [183.97265247061847]
We leverage graph signal processing to characterize the representation space of graph neural networks (GNNs).
We discuss the role of graph convolutional filters in GNNs and show that any architecture built with such filters has the fundamental properties of permutation equivariance and stability to changes in the topology.
We also study the use of GNNs in recommender systems and learning decentralized controllers for robot swarms.
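The permutation-equivariance property of graph convolutional filters can be checked directly on a small example: filtering a relabelled graph gives the relabelled output. The polynomial filter below is a generic sketch, not code from the paper.

```python
import torch

def graph_filter(S, x, h):
    """Polynomial graph convolutional filter: y = sum_k h[k] * S^k x."""
    y = torch.zeros_like(x)
    Sk_x = x.clone()
    for hk in h:
        y = y + hk * Sk_x
        Sk_x = S @ Sk_x
    return y

# random graph shift operator (symmetric adjacency) and graph signal
torch.manual_seed(0)
A = torch.rand(5, 5)
S = ((A + A.t()) > 1.0).float()
x = torch.randn(5, 2)
h = [0.5, 0.3, 0.1]

# permutation equivariance: filtering a permuted graph equals permuting the output
perm = torch.randperm(5)
P = torch.eye(5)[perm]
lhs = graph_filter(P @ S @ P.t(), P @ x, h)   # filter on the relabelled graph
rhs = P @ graph_filter(S, x, h)               # relabel the original output
print(torch.allclose(lhs, rhs, atol=1e-5))    # True
```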
arXiv Detail & Related papers (2020-03-08T13:02:15Z)
This list is automatically generated from the titles and abstracts of the papers on this site.