Floating-Body Hydrodynamic Neural Networks
- URL: http://arxiv.org/abs/2509.13783v1
- Date: Wed, 17 Sep 2025 07:51:35 GMT
- Title: Floating-Body Hydrodynamic Neural Networks
- Authors: Tianshuo Zhang, Wenzhe Zhai, Rui Yann, Jia Gao, He Cao, Xianglei Xing
- Abstract summary: We propose a physics-structured framework that predicts interpretable parameters such as directional added masses, drag coefficients, and a streamfunction-based flow, and couples them with analytic equations of motion. Compared with Hamiltonian and Lagrangian neural networks, FHNN more effectively handles dissipative dynamics while preserving interpretability, bridging the gap between black-box learning and transparent system identification.
- Score: 8.501171043928354
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Fluid-structure interaction is common in engineering and natural systems, where floating-body motion is governed by added mass, drag, and background flows. Modeling these dissipative dynamics is difficult: black-box neural models regress state derivatives with limited interpretability and unstable long-horizon predictions. We propose Floating-Body Hydrodynamic Neural Networks (FHNN), a physics-structured framework that predicts interpretable hydrodynamic parameters such as directional added masses, drag coefficients, and a streamfunction-based flow, and couples them with analytic equations of motion. This design constrains the hypothesis space, enhances interpretability, and stabilizes integration. On synthetic vortex datasets, FHNN achieves up to an order-of-magnitude lower error than Neural ODEs and recovers physically consistent flow fields. Compared with Hamiltonian and Lagrangian neural networks, FHNN more effectively handles dissipative dynamics while preserving interpretability, bridging the gap between black-box learning and transparent system identification.
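The abstract's core idea, a network predicting interpretable hydrodynamic parameters that are then plugged into analytic equations of motion, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the parameter values, function names, and the quadratic-drag form are assumptions made for the example, and the streamfunction velocity field u = ∂ψ/∂y, v = −∂ψ/∂x is evaluated by finite differences.

```python
import numpy as np

def background_flow(psi, x, y, h=1e-4):
    """Velocity from a streamfunction psi: u = dpsi/dy, v = -dpsi/dx."""
    u = (psi(x, y + h) - psi(x, y - h)) / (2 * h)
    v = -(psi(x + h, y) - psi(x - h, y)) / (2 * h)
    return np.array([u, v])

def body_acceleration(vel, pos, params, psi):
    """Analytic dissipative dynamics with directional added mass and drag.

    params = (m_ax, m_ay, c_d): directional added masses and a drag
    coefficient, standing in for the quantities the network would predict.
    """
    m_ax, m_ay, c_d = params
    rel = vel - background_flow(psi, pos[0], pos[1])  # velocity relative to flow
    drag = -c_d * np.linalg.norm(rel) * rel           # quadratic drag on relative motion
    # Effective inertia differs per direction because of the added masses
    # (unit body mass assumed for the sketch).
    return drag / np.array([1.0 + m_ax, 1.0 + m_ay])

# Example: one explicit Euler step in a toy flow psi = (x^2 + y^2) / 2.
psi = lambda x, y: 0.5 * (x**2 + y**2)
pos, vel = np.array([1.0, 0.0]), np.array([0.0, 0.0])
dt = 0.01
vel = vel + dt * body_acceleration(vel, pos, (0.3, 0.5, 1.2), psi)
pos = pos + dt * vel
```

Because the learned parameters enter only through this fixed analytic structure, the hypothesis space is constrained to physically plausible dynamics, which is what the abstract credits for the stable long-horizon integration.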
Related papers
- PHASE-Net: Physics-Grounded Harmonic Attention System for Efficient Remote Photoplethysmography Measurement [63.007237197267834]
Existing deep learning methods for physiological monitoring lack theoretical robustness. We propose a physics-informed rPPG paradigm derived from the Navier-Stokes equations of hemodynamics, showing that the pulse signal follows a second-order system. This provides a theoretical justification for using a Temporal Convolutional Network (TCN). PHASE-Net achieves state-of-the-art performance with strong efficiency, offering a theoretically grounded and deployment-ready rPPG solution.
arXiv Detail & Related papers (2025-09-29T14:36:45Z) - Graph-Based Learning of Free Surface Dynamics in Generalized Newtonian Fluids using Smoothed Particle Hydrodynamics [3.712898298472801]
We propose a graph neural network (GNN) model for efficiently predicting the flow behavior of non-Newtonian fluids. Traditional algorithms for Newtonian fluids with constant viscosity struggle to converge when applied to non-Newtonian cases. We introduce a novel GNN-based numerical model to enhance the computational efficiency of non-Newtonian power-law fluid flow simulations.
arXiv Detail & Related papers (2025-09-29T04:14:58Z) - Fractional Spike Differential Equations Neural Network with Efficient Adjoint Parameters Training [63.3991315762955]
Spiking Neural Networks (SNNs) draw inspiration from biological neurons to create realistic models for brain-like computation. Most existing SNNs assume a single time constant for neuronal membrane voltage dynamics, modeled by first-order ordinary differential equations (ODEs) with Markovian characteristics. We propose the Fractional Spike Differential Equation neural network (fspikeDE), which captures long-term dependencies in membrane voltage and spike trains through fractional-order dynamics.
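The non-Markovian property described here can be seen in a minimal fractional relaxation sketch, D^α V = −V/τ + I, discretized with the Grünwald–Letnikov scheme; unlike a first-order ODE, every past state contributes to the next update. All constants are illustrative assumptions, not fspikeDE's parameters or training scheme.

```python
def gl_weights(alpha, n):
    """Grunwald-Letnikov coefficients w_k = (-1)^k * binom(alpha, k),
    via the standard recurrence w_k = w_{k-1} * (1 - (alpha + 1) / k)."""
    w = [1.0]
    for k in range(1, n + 1):
        w.append(w[-1] * (1.0 - (alpha + 1.0) / k))
    return w

def simulate(alpha, tau=1.0, I=1.0, h=0.05, steps=200):
    """Explicit GL scheme for D^alpha V = -V/tau + I, starting from V = 0."""
    w = gl_weights(alpha, steps)
    V = [0.0]
    for n in range(1, steps + 1):
        # Memory term: the whole history enters, weighted by the GL coefficients.
        hist = sum(w[k] * V[n - k] for k in range(1, n + 1))
        V.append(h**alpha * (-V[-1] / tau + I) - hist)
    return V

v_frac = simulate(alpha=0.8)  # long-memory, algebraically slow relaxation
v_int = simulate(alpha=1.0)   # alpha = 1 reduces to the usual Euler scheme
```

For alpha = 1 the coefficients collapse to w = [1, -1, 0, ...] and the update becomes plain forward Euler, so the fractional model strictly generalizes the first-order case.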
arXiv Detail & Related papers (2025-07-22T18:20:56Z) - Langevin Flows for Modeling Neural Latent Dynamics [81.81271685018284]
We introduce LangevinFlow, a sequential Variational Auto-Encoder where the time evolution of latent variables is governed by the underdamped Langevin equation. Our approach incorporates physical priors -- such as inertia, damping, a learned potential function, and forces -- to represent both autonomous and non-autonomous processes in neural systems. Our method outperforms state-of-the-art baselines on synthetic neural populations generated by a Lorenz attractor.
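The underdamped Langevin equation invoked above, m dv = (−∇U(x) − γv) dt + σ dW, can be integrated with a simple Euler–Maruyama step. This is a toy sketch of the dynamics only: the harmonic potential and all constants are assumptions for illustration, not the paper's learned components.

```python
import numpy as np

def langevin_step(x, v, grad_U, rng, gamma=0.5, mass=1.0, sigma=0.3, dt=0.01):
    """One Euler-Maruyama step of m dv = (-grad U(x) - gamma * v) dt + sigma dW."""
    noise = sigma * np.sqrt(dt) * rng.standard_normal(x.shape)
    v_new = v + dt * (-grad_U(x) - gamma * v) / mass + noise / mass
    x_new = x + dt * v_new
    return x_new, v_new

# Toy harmonic potential U(x) = x^2 / 2, so grad U(x) = x. The damping term
# makes the dynamics dissipative -- the property a purely Hamiltonian latent
# model could not represent.
rng = np.random.default_rng(42)
x, v = np.array([2.0]), np.array([0.0])
for _ in range(2000):
    x, v = langevin_step(x, v, grad_U=lambda x: x, rng=rng)
```

After enough steps the initial amplitude is damped away and the state fluctuates around the potential minimum at a noise-set scale, which is the qualitative behavior the physical priors are meant to impose on the latents.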
arXiv Detail & Related papers (2025-07-15T17:57:48Z) - KITINet: Kinetics Theory Inspired Network Architectures with PDE Simulation Approaches [43.872190335490515]
This paper introduces KITINet, a novel architecture that reinterprets feature propagation through the lens of non-equilibrium particle dynamics. At its core, we propose a residual module that models feature updates as the evolution of a particle system. This formulation mimics particle collisions and energy exchange, enabling adaptive feature refinement via physics-informed interactions.
arXiv Detail & Related papers (2025-05-23T13:58:29Z) - Learning Effective Dynamics across Spatio-Temporal Scales of Complex Flows [4.798951413107239]
We propose a novel framework, Graph-based Learning of Effective Dynamics (Graph-LED), that leverages graph neural networks (GNNs) and an attention-based autoregressive model. We evaluate the proposed approach on a suite of fluid dynamics problems, including flow past a cylinder and flow over a backward-facing step over a range of Reynolds numbers.
arXiv Detail & Related papers (2025-02-11T22:14:30Z) - Neural SPH: Improved Neural Modeling of Lagrangian Fluid Dynamics [10.420017109857765]
Smoothed particle hydrodynamics (SPH) is omnipresent in modern engineering and scientific disciplines.
Due to the particle-like nature of the simulation, graph neural networks (GNNs) have emerged as appealing and successful surrogates.
In this work, we identify particle clustering originating from tensile instabilities as one of the primary pitfalls.
arXiv Detail & Related papers (2024-02-09T09:40:12Z) - SEGNO: Generalizing Equivariant Graph Neural Networks with Physical Inductive Biases [66.61789780666727]
We show how the second-order continuity can be incorporated into GNNs while maintaining the equivariant property.
We also offer theoretical insights into SEGNO, highlighting that it can learn a unique trajectory between adjacent states.
Our model yields a significant improvement over the state-of-the-art baselines.
arXiv Detail & Related papers (2023-08-25T07:15:58Z) - Dynamic Causal Explanation Based Diffusion-Variational Graph Neural Network for Spatio-temporal Forecasting [60.03169701753824]
We propose a novel Dynamic Diffusion-Variational Graph Neural Network (DVGNN) for spatio-temporal forecasting. The proposed DVGNN model outperforms state-of-the-art approaches and achieves outstanding Root Mean Squared Error results.
arXiv Detail & Related papers (2023-05-16T11:38:19Z) - Predicting fluid-structure interaction with graph neural networks [13.567118450260178]
We present a rotation equivariant, quasi-monolithic graph neural network framework for the reduced-order modeling of fluid-structure interaction systems.
A finite element-inspired hypergraph neural network is employed to predict the evolution of the fluid state based on the state of the whole system.
The proposed framework tracks the interface description and provides stable and accurate system state predictions during roll-out for at least 2000 time steps.
arXiv Detail & Related papers (2022-10-09T07:42:23Z) - Neural Operator with Regularity Structure for Modeling Dynamics Driven by SPDEs [70.51212431290611]
Stochastic partial differential equations (SPDEs) are significant tools for modeling dynamics in many areas including atmospheric sciences and physics.
We propose the Neural Operator with Regularity Structure (NORS) which incorporates the feature vectors for modeling dynamics driven by SPDEs.
We conduct experiments on a variety of SPDEs, including the dynamic Phi^4_1 model and the 2d Navier-Stokes equation.
arXiv Detail & Related papers (2022-04-13T08:53:41Z) - Neural Ordinary Differential Equations for Data-Driven Reduced Order Modeling of Environmental Hydrodynamics [4.547988283172179]
We explore the use of Neural Ordinary Differential Equations for fluid flow simulation.
Test problems we consider include incompressible flow around a cylinder and real-world applications of shallow water hydrodynamics in riverine and estuarine systems.
Our findings indicate that Neural ODEs provide an elegant framework for stable and accurate evolution of latent-space dynamics, with promising potential for extrapolatory predictions.
arXiv Detail & Related papers (2021-04-22T19:20:47Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
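A single unit of this kind is often written as dx/dt = −x/τ + f(x, I)(A − x), where f is a nonlinear input gate; the effective ("liquid") time constant τ/(1 + τ·f) then depends on the input, and the state stays bounded by A. The sketch below is a hedged illustration with assumed parameter values, not the paper's trained model.

```python
import math

def ltc_step(x, I, tau=1.0, A=1.0, dt=0.01, W=2.0, b=0.0):
    """One Euler step of dx/dt = -x/tau + f(x, I) * (A - x),
    with f a sigmoid gate on the input (W, b are illustrative weights)."""
    f = 1.0 / (1.0 + math.exp(-(W * I + b)))  # nonlinear input-dependent gate
    dxdt = -x / tau + f * (A - x)
    # Effective time constant tau / (1 + tau * f): stronger drive means
    # faster relaxation, which is the "liquid" behavior of the unit.
    return x + dt * dxdt

# A constant input drives the state toward a bounded fixed point
# x* = f * A / (1/tau + f), strictly below A.
x = 0.0
for _ in range(1000):
    x = ltc_step(x, I=1.0)
```

The saturating term f(x, I)(A − x) is what gives the stable, bounded trajectories claimed above: however strong the drive, the state cannot leave the interval set by A.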
arXiv Detail & Related papers (2020-06-08T09:53:35Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences of its use.