Predicting Energy Budgets in Droplet Dynamics: A Recurrent Neural Network Approach
- URL: http://arxiv.org/abs/2403.16144v1
- Date: Sun, 24 Mar 2024 13:32:42 GMT
- Title: Predicting Energy Budgets in Droplet Dynamics: A Recurrent Neural Network Approach
- Authors: Diego A. de Aguiar, Hugo L. França, Cassio M. Oishi
- Abstract summary: This study applies Long Short-Term Memory (LSTM) networks to predict transient and static outputs for fluid flows under surface tension effects.
Using only dimensionless numbers and geometric time series data from numerical simulations, the LSTM predicts the energy budget.
Fed with time series data derived from geometrical parameters, the recurrent neural network (RNN) architecture accurately predicts energy budgets.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Neural networks in fluid mechanics offer an efficient approach for exploring complex flows, including multiphase and free surface flows. The recurrent neural network, particularly the Long Short-Term Memory (LSTM) model, proves attractive for learning mappings from transient inputs to dynamic outputs. This study applies LSTM to predict transient and static outputs for fluid flows under surface tension effects. Specifically, we explore two distinct droplet dynamic scenarios: droplets with diverse initial shapes impacting solid surfaces, and the coalescence of two droplets following collision. Using only dimensionless numbers and geometric time series data from numerical simulations, LSTM predicts the energy budget. A front-tracking methodology combined with a marker-and-cell finite-difference strategy is adopted for simulating the droplet dynamics. Using a recurrent neural network (RNN) architecture fed with time series data derived from geometrical parameters, such as droplet diameter variation, our study demonstrates the accuracy of the approach in predicting energy budgets, such as the kinetic, dissipation, and surface energy trends, across a range of Reynolds and Weber numbers in droplet dynamic problems. Finally, a two-phase sequential neural network using only geometric data, which is readily available in experimental settings, is employed to predict the energies, which are then used to estimate static parameters, such as the Reynolds and Weber numbers. While our methodology has been primarily validated with simulation data, its adaptability to experimental datasets is a promising avenue for future exploration. We hope that our strategy can be useful for diverse applications, spanning from inkjet printing to combustion engines, where the prediction of energy budgets or dissipation energies is crucial.
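As a rough illustration of the two-phase strategy described in the abstract, the sketch below maps a geometric time series (e.g., droplet diameter variation) to kinetic, dissipation, and surface energy trajectories with an LSTM, and then feeds the predicted energies to a second network that estimates the static Reynolds and Weber numbers. The framework (PyTorch), layer sizes, feature choices, and data flow details are illustrative assumptions, not the authors' reported configuration.

```python
# Minimal sketch of the two-phase strategy, assuming a PyTorch implementation
# (the paper does not specify a framework). Phase 1 maps a geometric time
# series (e.g., droplet diameter variation) to energy trajectories; phase 2
# estimates static parameters (Reynolds and Weber numbers) from the predicted
# energies. All sizes below are illustrative guesses.
import torch
import torch.nn as nn


class EnergyLSTM(nn.Module):
    """Phase 1: geometric time series -> kinetic, dissipation, surface energies."""

    def __init__(self, n_geom: int = 1, n_energy: int = 3, hidden: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_geom, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_energy)

    def forward(self, geom_seq: torch.Tensor) -> torch.Tensor:
        # geom_seq: (batch, time, n_geom)
        out, _ = self.lstm(geom_seq)
        return self.head(out)  # (batch, time, n_energy)


class StaticParamNet(nn.Module):
    """Phase 2: predicted energy trajectories -> static parameters (Re, We)."""

    def __init__(self, n_energy: int = 3, hidden: int = 64, n_params: int = 2):
        super().__init__()
        self.lstm = nn.LSTM(input_size=n_energy, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, n_params)

    def forward(self, energy_seq: torch.Tensor) -> torch.Tensor:
        _, (h_n, _) = self.lstm(energy_seq)
        return self.head(h_n[-1])  # (batch, n_params)


if __name__ == "__main__":
    phase1, phase2 = EnergyLSTM(), StaticParamNet()
    diameter_series = torch.randn(8, 200, 1)  # 8 simulated droplets, 200 time steps
    energies = phase1(diameter_series)        # predicted energy budget over time
    re_we = phase2(energies)                  # estimated Reynolds and Weber numbers
    print(energies.shape, re_we.shape)        # (8, 200, 3) and (8, 2)
```

In practice the two phases would presumably be trained sequentially on the simulation data, with the second network consuming either the simulated or the predicted energies; the sketch only fixes the data flow implied by the abstract.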
Related papers
- Liquid Fourier Latent Dynamics Networks for fast GPU-based numerical simulations in computational cardiology [0.0]
We propose an extension of Latent Dynamics Networks (LDNets) to create parameterized space-time surrogate models for multiscale and multiphysics sets of highly nonlinear differential equations on complex geometries.
LFLDNets employ a neurologically-inspired, sparse liquid neural network for temporal dynamics, relaxing the requirement of a numerical solver for time advancement and leading to superior performance in terms of parameters, accuracy, efficiency and learned trajectories.
arXiv Detail & Related papers (2024-08-19T09:14:25Z) - Event-based Shape from Polarization with Spiking Neural Networks [5.200503222390179]
We introduce the Single-Timestep and Multi-Timestep Spiking UNets for effective and efficient surface normal estimation.
Our work contributes to the advancement of SNNs in event-based sensing.
arXiv Detail & Related papers (2023-12-26T14:43:26Z) - Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
arXiv Detail & Related papers (2023-04-08T07:55:36Z) - Statistical and machine learning approaches for prediction of long-time excitation energy transfer dynamics [0.0]
The objective here is to demonstrate whether models such as SARIMA, CatBoost, Prophet, convolutional and recurrent neural networks are able to bypass this requirement.
Our results suggest that the SARIMA model can serve as a computationally inexpensive yet accurate way to predict long-time dynamics.
arXiv Detail & Related papers (2022-10-25T16:50:26Z) - Human Trajectory Prediction via Neural Social Physics [63.62824628085961]
Trajectory prediction has been widely pursued in many fields, and many model-based and model-free methods have been explored.
We propose a new method combining both methodologies based on a new Neural Differential Equation model.
Our new model (Neural Social Physics or NSP) is a deep neural network within which we use an explicit physics model with learnable parameters.
arXiv Detail & Related papers (2022-07-21T12:11:18Z) - An advanced spatio-temporal convolutional recurrent neural network for storm surge predictions [73.4962254843935]
We study the capability of artificial neural network models to emulate storm surge based on the storm track/size/intensity history.
This study presents a neural network model that can predict storm surge, informed by a database of synthetic storm simulations.
arXiv Detail & Related papers (2022-04-18T23:42:18Z) - Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Based Lower Bounds for ME-NODE, and develop (efficient) training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z) - Rotation Invariant Graph Neural Networks using Spin Convolutions [28.4962005849904]
Machine learning approaches have the potential to approximate Density Functional Theory (DFT) in a computationally efficient manner.
We introduce a novel approach to modeling angular information between sets of neighboring atoms in a graph neural network.
Results are demonstrated on the large-scale Open Catalyst 2020 dataset.
arXiv Detail & Related papers (2021-06-17T14:59:34Z) - Echo State Network for two-dimensional turbulent moist Rayleigh-Bénard convection [0.0]
We apply an echo state network to approximate the evolution of moist Rayleigh-Bénard convection.
We conclude that our model is capable of learning complex dynamics.
arXiv Detail & Related papers (2021-01-27T11:27:16Z) - Machine learning for rapid discovery of laminar flow channel wall modifications that enhance heat transfer [56.34005280792013]
We present a combination of accurate numerical simulations of arbitrary, flat, and non-flat channels and machine learning models predicting drag coefficient and Stanton number.
We show that convolutional neural networks (CNN) can accurately predict the target properties at a fraction of the time of numerical simulations.
arXiv Detail & Related papers (2021-01-19T16:14:02Z) - Liquid Time-constant Networks [117.57116214802504]
We introduce a new class of time-continuous recurrent neural network models.
Instead of declaring a learning system's dynamics by implicit nonlinearities, we construct networks of linear first-order dynamical systems.
These neural networks exhibit stable and bounded behavior and yield superior expressivity within the family of neural ordinary differential equations.
arXiv Detail & Related papers (2020-06-08T09:53:35Z)