Predicting fluid-structure interaction with graph neural networks
- URL: http://arxiv.org/abs/2210.04193v2
- Date: Mon, 16 Oct 2023 12:46:06 GMT
- Title: Predicting fluid-structure interaction with graph neural networks
- Authors: Rui Gao, Rajeev K. Jaiman
- Abstract summary: We present a rotation equivariant, quasi-monolithic graph neural network framework for the reduced-order modeling of fluid-structure interaction systems.
A finite element-inspired hypergraph neural network is employed to predict the evolution of the fluid state based on the state of the whole system.
The proposed framework tracks the interface description and provides stable and accurate system state predictions during roll-out for at least 2000 time steps.
- Score: 13.567118450260178
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We present a rotation equivariant, quasi-monolithic graph neural network
framework for the reduced-order modeling of fluid-structure interaction
systems. With the aid of an arbitrary Lagrangian-Eulerian formulation, the
system states are evolved temporally with two sub-networks. The movement of the
mesh is reduced to the evolution of several coefficients via complex-valued
proper orthogonal decomposition, and the prediction of these coefficients over
time is handled by a single multi-layer perceptron. A finite element-inspired
hypergraph neural network is employed to predict the evolution of the fluid
state based on the state of the whole system. The structural state is
implicitly modeled by the movement of the mesh on the solid-fluid interface;
hence the proposed framework is quasi-monolithic. The effectiveness of
the proposed framework is assessed on two prototypical fluid-structure systems,
namely the flow around an elastically-mounted cylinder, and the flow around a
hyperelastic plate attached to a fixed cylinder. The proposed framework tracks
the interface description and provides stable and accurate system state
predictions during roll-out for at least 2000 time steps, and even demonstrates
some capability in self-correcting erroneous predictions. The proposed
framework also enables direct calculation of the lift and drag forces using the
predicted fluid and mesh states, in contrast to existing convolution-based
architectures. The proposed reduced-order model via graph neural network has
implications for the development of physics-based digital twins concerning
moving boundaries and fluid-structure interactions.
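To make the quasi-monolithic rollout concrete, the sketch below illustrates the two-sub-network idea described in the abstract: mesh motion is compressed with a complex-valued proper orthogonal decomposition, its coefficients are advanced by a small MLP, and the fluid state is advanced from the combined fluid and mesh information. This is a minimal illustration, not the authors' code: the snapshot data are random, the layer sizes and feature shapes are arbitrary assumptions, and a plain per-node MLP stands in for the paper's finite element-inspired hypergraph network; the names coeff_net, fluid_net, and rollout are hypothetical.

```python
# Minimal sketch (assumed shapes, toy data) of a quasi-monolithic rollout:
# complex-valued POD for mesh motion + MLP coefficient predictor + a per-node
# MLP as a stand-in for the fluid-state (hyper)graph network.
import numpy as np
import torch
import torch.nn as nn

n_nodes, n_modes, n_steps = 500, 4, 100

# --- complex-valued POD of mesh displacements (x + i*y per node) ------------
rng = np.random.default_rng(0)
snapshots = rng.standard_normal((n_steps, n_nodes)) + 1j * rng.standard_normal((n_steps, n_nodes))
_, _, Vh = np.linalg.svd(snapshots, full_matrices=False)
modes = Vh[:n_modes]                    # spatial POD modes, (n_modes, n_nodes)
coeffs = snapshots @ modes.conj().T     # temporal coefficients, (n_steps, n_modes)

# --- sub-network 1: MLP advancing the (real-stacked) POD coefficients -------
coeff_net = nn.Sequential(nn.Linear(2 * n_modes, 64), nn.Tanh(), nn.Linear(64, 2 * n_modes))

# --- sub-network 2: stand-in for the hypergraph network advancing the fluid
#     state from per-node fluid features plus the reconstructed mesh motion ---
fluid_net = nn.Sequential(nn.Linear(3 + 2, 64), nn.Tanh(), nn.Linear(64, 3))

@torch.no_grad()
def rollout(fluid0, coeff0, steps):
    """Autoregressive rollout: mesh coefficients and fluid state evolve jointly;
    the structure enters only through the mesh motion (quasi-monolithic)."""
    fluid, c = fluid0, coeff0
    for _ in range(steps):
        c = c + coeff_net(c)                                        # residual coefficient update
        mesh = torch.view_as_complex(c.view(n_modes, 2)).numpy() @ modes  # reconstruct mesh motion
        mesh_feat = torch.tensor(np.stack([mesh.real, mesh.imag], -1), dtype=torch.float32)
        fluid = fluid + fluid_net(torch.cat([fluid, mesh_feat], dim=-1))  # residual fluid update
    return fluid

fluid_state = torch.zeros(n_nodes, 3)   # e.g. (u, v, p) per mesh node
c0 = torch.tensor(np.stack([coeffs[0].real, coeffs[0].imag], -1).ravel(), dtype=torch.float32)
print(rollout(fluid_state, c0, steps=10).shape)   # torch.Size([500, 3])
```

The sketch omits what makes the actual framework work: rotation equivariance, finite element-inspired hypergraph message passing over an unstructured mesh, and the ALE-consistent recovery of lift and drag from the predicted fluid and mesh states.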
Related papers
- Equi-Euler GraphNet: An Equivariant, Temporal-Dynamics Informed Graph Neural Network for Dual Force and Trajectory Prediction in Multi-Body Systems [5.442686600296734]
We propose Equi-Euler GraphNet, a physics-informed graph neural network (GNN) that simultaneously predicts internal forces and global trajectories in multi-body systems.
Equi-Euler GraphNet generalizes beyond the training distribution, accurately predicting loads and trajectories under unseen speeds, loads, and configurations.
It outperforms state-of-the-art GNNs focused on trajectory prediction, delivering stable rollouts over thousands of time steps with minimal error accumulation.
arXiv Detail & Related papers (2025-04-18T16:09:57Z) - Data-driven modeling of fluid flow around rotating structures with graph neural networks [12.295701458215401]
We propose a graph neural network-based surrogate model for fluid flow, with the mesh co-rotating with the structure.
Unlike conventional data-driven approaches that rely on structured Cartesian meshes, our framework operates on unstructured co-rotating meshes.
Our results show that the model achieves stable and accurate rollouts for over 2000 time steps in periodic regimes.
arXiv Detail & Related papers (2025-03-28T09:06:19Z) - Allostatic Control of Persistent States in Spiking Neural Networks for perception and computation [79.16635054977068]
We introduce a novel model for updating perceptual beliefs about the environment by extending the concept of Allostasis to the control of internal representations.
In this paper, we focus on an application in numerical cognition, where a bump of activity in an attractor network is used as a spatial numerical representation.
arXiv Detail & Related papers (2025-03-20T12:28:08Z) - Learning Effective Dynamics across Spatio-Temporal Scales of Complex Flows [4.798951413107239]
We propose a novel framework, Graph-based Learning of Effective Dynamics (Graph-LED), that leverages graph neural networks (GNNs) and an attention-based autoregressive model.
We evaluate the proposed approach on a suite of fluid dynamics problems, including flow past a cylinder and flow over a backward-facing step over a range of Reynolds numbers.
arXiv Detail & Related papers (2025-02-11T22:14:30Z) - Dynamic Frame Interpolation in Wavelet Domain [57.25341639095404]
Video frame interpolation is an important low-level computer vision task, which can increase the frame rate for a more fluent visual experience.
Existing methods have achieved great success by employing advanced motion models and synthesis networks.
WaveletVFI can reduce computation by up to 40% while maintaining similar accuracy, making it more efficient than other state-of-the-art methods.
arXiv Detail & Related papers (2023-09-07T06:41:15Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural
Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting.
The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Piecewise-Velocity Model for Learning Continuous-time Dynamic Node
Representations [0.0]
We propose the Piecewise-Velocity Model (PiVeM) for the representation of continuous-time dynamic networks.
We show that PiVeM can successfully represent network structure and dynamics in ultra-low two-dimensional spaces.
It outperforms relevant state-of-the-art methods in downstream tasks such as link prediction.
arXiv Detail & Related papers (2022-12-23T13:57:56Z) - Learning the Evolutionary and Multi-scale Graph Structure for
Multivariate Time Series Forecasting [50.901984244738806]
We show how to model the evolutionary and multi-scale interactions of time series.
In particular, we first provide a hierarchical graph structure, combined with dilated convolution, to capture the scale-specific correlations.
A unified neural network is provided to integrate the components above to get the final prediction.
arXiv Detail & Related papers (2022-06-28T08:11:12Z) - Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate
Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach for learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z) - Graph-Coupled Oscillator Networks [23.597444325599835]
Graph-Coupled Oscillator Networks (GraphCON) is a novel framework for deep learning on graphs.
We show that our framework offers competitive performance with respect to the state-of-the-art on a variety of graph-based learning tasks.
arXiv Detail & Related papers (2022-02-04T18:29:49Z) - Deep Learning for Stability Analysis of a Freely Vibrating Sphere at
Moderate Reynolds Number [0.0]
We present a deep learning-based reduced-order model (DL-ROM) for the stability prediction of unsteady 3D fluid-structure interaction systems.
The proposed DL-ROM has the format of a nonlinear state-space model and employs a recurrent neural network with long short-term memory (LSTM).
By integrating the LSTM network with the eigensystem realization algorithm (ERA), we construct a data-driven state-space model for the reduced-order stability analysis.
arXiv Detail & Related papers (2021-12-18T06:41:02Z) - Constrained Block Nonlinear Neural Dynamical Models [1.3163098563588727]
Neural network modules conditioned by known priors can be effectively trained and combined to represent systems with nonlinear dynamics.
The proposed method consists of neural network blocks that represent input, state, and output dynamics with constraints placed on the network weights and system variables.
We evaluate the performance of the proposed architecture and training methods on system identification tasks for three nonlinear systems.
arXiv Detail & Related papers (2021-01-06T04:27:54Z) - Predicting Rigid Body Dynamics using Dual Quaternion Recurrent Neural
Networks with Quaternion Attention [0.0]
We propose a novel neural network architecture based on dual quaternions, which allow for a compact representation of information.
To cover the dynamic behavior inherent to rigid body movements, we propose recurrent architectures in the neural network.
To further model the interactions between individual rigid bodies as well as external inputs efficiently, we incorporate a novel attention mechanism employing dual quaternion algebra.
arXiv Detail & Related papers (2020-11-17T16:10:49Z) - Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid
Flow Prediction [79.81193813215872]
We develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.
We show that we can both generalize well to new situations and benefit from the substantial speedup of neural network CFD predictions.
arXiv Detail & Related papers (2020-07-08T21:23:19Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)