Fluid Simulation on Neural Flow Maps
- URL: http://arxiv.org/abs/2312.14635v1
- Date: Fri, 22 Dec 2023 12:13:19 GMT
- Title: Fluid Simulation on Neural Flow Maps
- Authors: Yitong Deng, Hong-Xing Yu, Diyang Zhang, Jiajun Wu, and Bo Zhu
- Abstract summary: We introduce Neural Flow Maps, a novel simulation method bridging the emerging paradigm of implicit neural representations with fluid simulation based on the theory of flow maps.
We demonstrate the efficacy of our neural fluid simulation in a variety of challenging simulation scenarios, including leapfrogging vortices, colliding vortices, vortex reconnections, as well as vortex generation from moving obstacles and density differences.
- Score: 23.5602305386658
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We introduce Neural Flow Maps, a novel simulation method bridging the
emerging paradigm of implicit neural representations with fluid simulation
based on the theory of flow maps, to achieve state-of-the-art simulation of
inviscid fluid phenomena. We devise a novel hybrid neural field representation,
Spatially Sparse Neural Fields (SSNF), which fuses small neural networks with a
pyramid of overlapping, multi-resolution, and spatially sparse grids, to
compactly represent long-term spatiotemporal velocity fields at high accuracy.
With this neural velocity buffer in hand, we compute long-term, bidirectional
flow maps and their Jacobians in a mechanistically symmetric manner, to
facilitate drastic accuracy improvement over existing solutions. These
long-range, bidirectional flow maps enable high advection accuracy with low
dissipation, which in turn facilitates high-fidelity incompressible flow
simulations that manifest intricate vortical structures. We demonstrate the
efficacy of our neural fluid simulation in a variety of challenging simulation
scenarios, including leapfrogging vortices, colliding vortices, vortex
reconnections, as well as vortex generation from moving obstacles and density
differences. Our examples show increased performance over existing methods in
terms of energy conservation, visual complexity, adherence to experimental
observations, and preservation of detailed vortical structures.
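To make the flow-map machinery concrete, here is a minimal sketch, not the authors' implementation: it traces a backward flow map by RK4 integration through a queried velocity buffer. The `velocity_at` callable stands in for a query into an SSNF-style neural velocity representation, and the analytic rotation field in the usage example is purely an assumption for illustration; the actual method also marches the flow-map Jacobian alongside the map itself.

```python
import numpy as np

def backtrack_flow_map(x, velocity_at, t_end, n_steps):
    """Trace the point that sits at `x` at time `t_end` backward to time 0
    by RK4 integration of dx/dt = u(x, t) in reverse time.
    `velocity_at(x, t)` stands in for a query into a stored velocity buffer
    (e.g. an SSNF-style neural field); it is an assumption of this sketch."""
    dt = t_end / n_steps
    psi, t = np.asarray(x, dtype=float), t_end
    for _ in range(n_steps):
        k1 = velocity_at(psi, t)
        k2 = velocity_at(psi - 0.5 * dt * k1, t - 0.5 * dt)
        k3 = velocity_at(psi - 0.5 * dt * k2, t - 0.5 * dt)
        k4 = velocity_at(psi - dt * k3, t - dt)
        psi = psi - (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        t -= dt
    return psi  # approximate backward flow map evaluated at x

# Usage with an illustrative (assumed) steady rigid-rotation field u = (-y, x).
def vortex(x, t):
    return np.array([-x[1], x[0]])

source = backtrack_flow_map(np.array([1.0, 0.0]), vortex, t_end=np.pi / 2, n_steps=64)
# A quarter turn backward from [1, 0] should land near [0, -1].
```

Advecting quantities by evaluating them at these backtracked source positions is what gives flow-map methods their low numerical dissipation; the SSNF buffer exists so that long spans of past velocities can be queried compactly for exactly this kind of long-range integration.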
Related papers
- Unfolding Time: Generative Modeling for Turbulent Flows in 4D [49.843505326598596]
This work introduces a 4D generative diffusion model and a physics-informed guidance technique that enables the generation of realistic sequences of flow states.
Our findings indicate that the proposed method can successfully sample entire subsequences from the turbulent manifold.
This advancement opens doors for the application of generative modeling in analyzing the temporal evolution of turbulent flows.
arXiv Detail & Related papers (2024-06-17T10:21:01Z)
- Vision-Informed Flow Image Super-Resolution with Quaternion Spatial Modeling and Dynamic Flow Convolution [49.45309818782329]
Flow image super-resolution (FISR) aims at recovering high-resolution turbulent velocity fields from low-resolution flow images.
Existing FISR methods mainly process flow images as if they were natural images.
We propose the first flow visual property-informed FISR algorithm.
arXiv Detail & Related papers (2024-01-29T06:48:16Z)
- Rethinking materials simulations: Blending direct numerical simulations with neural operators [1.6874375111244329]
We develop a new method that blends numerical solvers with neural operators to accelerate such materials simulations; a generic sketch of this blending pattern follows this entry.
We demonstrate the effectiveness of this framework on simulations of microstructure evolution during physical vapor deposition.
arXiv Detail & Related papers (2023-12-08T23:44:54Z)
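As a purely illustrative sketch of the blending pattern described in the entry above, and not that paper's actual scheme, the loop below alternates short bursts of a classical explicit finite-difference solver with a surrogate jump; `neural_operator_step` is a hypothetical stand-in for a trained model.

```python
import numpy as np

def fd_step(u, dt, dx, nu):
    """One explicit finite-difference step of the 1D heat equation u_t = nu * u_xx
    with periodic boundaries."""
    lap = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    return u + dt * nu * lap

def neural_operator_step(u, big_dt):
    """Hypothetical placeholder for a trained neural operator that advances the
    field by a (possibly much larger) time step in a single call."""
    return u  # a real surrogate would return its prediction here

def blended_rollout(u, n_outer, n_fd, dt, dx, nu, big_dt):
    """Alternate trusted fine solver steps with cheap surrogate leaps, mirroring
    the 'blend direct numerical simulation with a neural operator' idea."""
    for _ in range(n_outer):
        for _ in range(n_fd):
            u = fd_step(u, dt, dx, nu)       # expensive but accurate
        u = neural_operator_step(u, big_dt)  # fast learned jump
    return u

# Example call on a coarse periodic grid (values chosen to satisfy explicit stability).
u0 = np.sin(np.linspace(0.0, 2.0 * np.pi, 64, endpoint=False))
u_final = blended_rollout(u0, n_outer=5, n_fd=20, dt=1e-3, dx=2 * np.pi / 64, nu=0.1, big_dt=0.1)
```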
- Gaussian Interpolation Flows [11.340847429991525]
This work investigates the well-posedness of simulation-free continuous normalizing flows built on Gaussian denoising.
We establish the Lipschitz regularity of the flow velocity field, the existence and uniqueness of the flow, and the continuity of the flow map.
We also study the stability of these flows with respect to the source distribution and to perturbations of the velocity field, using the quadratic Wasserstein distance as a metric; a schematic form of such a flow is sketched below.
arXiv Detail & Related papers (2023-11-20T00:59:20Z)
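For orientation only, here is a schematic of the kind of flow the entry above refers to, written in the standard Gaussian-path form used for simulation-free continuous normalizing flows; the coefficients and the exact parameterization are assumptions of this sketch, not taken from that paper.

```latex
% Gaussian interpolation between data X_0 and noise Z ~ N(0, I_d)
% (schematic; the schedules \alpha_t, \sigma_t are assumed)
X_t = \alpha_t X_0 + \sigma_t Z, \qquad t \in [0, 1],
\qquad
v_t(x) = \mathbb{E}\!\left[\dot{\alpha}_t X_0 + \dot{\sigma}_t Z \,\middle|\, X_t = x\right],
\qquad
\frac{\mathrm{d}}{\mathrm{d}t}\,\phi_t(x) = v_t\!\left(\phi_t(x)\right), \quad \phi_0(x) = x.
```

Well-posedness in the sense of the entry then amounts to showing that the velocity field v_t is Lipschitz in x, which yields existence, uniqueness, and continuity of the flow map phi_t.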
- Neural Posterior Estimation with Differentiable Simulators [58.720142291102135]
We present a new method to perform Neural Posterior Estimation (NPE) with a differentiable simulator.
We demonstrate how gradient information helps constrain the shape of the posterior and improves sample-efficiency.
arXiv Detail & Related papers (2022-07-12T16:08:04Z)
- Towards Fast Simulation of Environmental Fluid Mechanics with Multi-Scale Graph Neural Networks [0.0]
We introduce MultiScaleGNN, a novel multi-scale graph neural network model for learning to infer unsteady continuum mechanics.
We demonstrate this method on advection problems and incompressible fluid dynamics, both fundamental phenomena in oceanic and atmospheric processes.
Simulations obtained with MultiScaleGNN are between two and four orders of magnitude faster than those on which it was trained.
arXiv Detail & Related papers (2022-05-05T13:33:03Z)
- Predicting the temporal dynamics of turbulent channels through deep learning [0.0]
We aim to assess the capability of neural networks to reproduce the temporal evolution of a minimal turbulent channel flow.
Long-short-term-memory (LSTM) networks and a Koopman-based framework (KNF) are trained to predict the temporal dynamics of the minimal-channel-flow modes.
arXiv Detail & Related papers (2022-03-02T09:31:03Z)
- Neural UpFlow: A Scene Flow Learning Approach to Increase the Apparent Resolution of Particle-Based Liquids [0.6882042556551611]
We present a novel up-resing technique for generating high-resolution liquids based on scene flow estimation using deep neural networks.
Our approach infers and synthesizes small- and large-scale details solely from a low-resolution particle-based liquid simulation.
arXiv Detail & Related papers (2021-06-09T15:36:23Z)
- Deep Bayesian Active Learning for Accelerating Stochastic Simulation [74.58219903138301]
Interactive Neural Process (INP) is a deep active learning framework for stochastic simulations.
For active learning, we propose a novel acquisition function, Latent Information Gain (LIG), calculated in the latent space of NP-based models.
The results demonstrate that STNP outperforms the baselines in the learning setting and that LIG achieves state-of-the-art performance for active learning.
arXiv Detail & Related papers (2021-06-05T01:31:51Z)
- Machine learning for rapid discovery of laminar flow channel wall modifications that enhance heat transfer [56.34005280792013]
We present a combination of accurate numerical simulations of arbitrary, flat, and non-flat channels and machine learning models predicting drag coefficient and Stanton number.
We show that convolutional neural networks (CNN) can accurately predict the target properties at a fraction of the time of numerical simulations.
arXiv Detail & Related papers (2021-01-19T16:14:02Z)
- Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid Flow Prediction [79.81193813215872]
We develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.
We show that we can both generalize well to new situations and benefit from the substantial speedup of neural network CFD predictions.
arXiv Detail & Related papers (2020-07-08T21:23:19Z)