OmniFluids: Physics Pre-trained Modeling of Fluid Dynamics
- URL: http://arxiv.org/abs/2506.10862v2
- Date: Sat, 09 Aug 2025 08:53:51 GMT
- Title: OmniFluids: Physics Pre-trained Modeling of Fluid Dynamics
- Authors: Rui Zhang, Qi Meng, Han Wan, Yang Liu, Zhi-Ming Ma, Hao Sun
- Abstract summary: We propose OmniFluids, a pure physics pre-trained model that captures fundamental fluid dynamics laws and adapts efficiently to diverse downstream tasks. We develop a training framework combining physics-only pre-training, coarse-grid operator distillation, and few-shot fine-tuning. Tests show that OmniFluids outperforms state-of-the-art AI-driven methods in flow field prediction and turbulence statistics.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Computational fluid dynamics (CFD) drives progress in numerous scientific and engineering fields, yet high-fidelity simulations remain computationally prohibitive. While machine learning approaches offer computational acceleration, they typically specialize in single physical systems or require extensive training data, hindering their applicability in highly nonlinear and 3D flow scenarios. To overcome these limitations, we propose OmniFluids, a pure physics pre-trained model that captures fundamental fluid dynamics laws and adapts efficiently to diverse downstream tasks with minimal data. We develop a training framework combining physics-only pre-training, coarse-grid operator distillation, and few-shot fine-tuning. This enables OmniFluids to retain broad physics knowledge while delivering fast and accurate predictions. Architecturally, OmniFluids integrates a mixture of operators, a multi-frame decoder, and factorized Fourier layers, seamlessly incorporating physics-based supervision while allowing efficient and scalable modeling of diverse tasks. Extensive tests on a broad range of 2D and 3D benchmarks show that OmniFluids outperforms state-of-the-art AI-driven methods in terms of flow field prediction and turbulence statistics. It delivers 10--100$\times$ speedups over traditional solvers while maintaining comparable accuracy, and accurately identifies unknown physical parameters from sparse, noisy data. This work demonstrates the potential of training a unified CFD solver exclusively from physics knowledge, offering a new approach for efficient and generalizable modeling across complex fluid systems.
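The OmniFluids architecture names factorized Fourier layers as a building block. The sketch below illustrates the underlying idea in minimal form: spectral convolution applied separately along each spatial axis rather than with one full multi-dimensional spectral weight. The function name, the single-channel setup, and the fixed weights are illustrative assumptions, not the paper's implementation (real layers mix channels and learn their weights).

```python
import numpy as np

def factorized_fourier_layer(u, wx, wy, modes):
    """Per-axis spectral convolution sketch (single channel, fixed weights).

    u: (nx, ny) field; wx, wy: complex weights for the retained
    low-frequency modes along each axis. Illustrative only.
    """
    # Axis-x branch: transform, scale the low modes, zero the rest.
    uhat = np.fft.rfft(u, axis=0)
    out_x = np.zeros_like(uhat)
    out_x[:modes] = uhat[:modes] * wx[:modes, None]
    vx = np.fft.irfft(out_x, n=u.shape[0], axis=0)

    # Axis-y branch: same treatment along the other axis.
    uhat = np.fft.rfft(u, axis=1)
    out_y = np.zeros_like(uhat)
    out_y[:, :modes] = uhat[:, :modes] * wy[None, :modes]
    vy = np.fft.irfft(out_y, n=u.shape[1], axis=1)

    # Sum the per-axis branches (a nonlinearity would follow in practice).
    return vx + vy
```

With unit weights and all modes retained, each branch is an identity, so the layer returns `2 * u`; that makes the factorization easy to sanity-check.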
Related papers
- AutoHood3D: A Multi-Modal Benchmark for Automotive Hood Design and Fluid-Structure Interaction [0.0]
The dataset is centered on a practical multiphysics problem: hood deformation from fluid entrapment and inertial loading during rotary-dip painting. Each hood is numerically modeled with a coupled Large-Eddy Simulation (LES)-Finite Element Analysis (FEA) pipeline using 1.2M cells in total. The dataset provides time-resolved physical fields, along with STL meshes and structured natural language prompts for text-to-geometry.
arXiv Detail & Related papers (2025-11-05T14:09:03Z) - Towards a Physics Foundation Model [2.109902626434734]
We present the General Physics Transformer (GPhyT), trained on 1.8 TB of diverse simulation data. GPhyT achieves superior performance across multiple physics domains, outperforming specialized architectures by up to 29x. By establishing that a single model can learn general physical principles from data alone, this work opens the path toward a universal Physics Foundation Model.
arXiv Detail & Related papers (2025-09-17T08:19:57Z) - PMNO: A novel physics guided multi-step neural operator predictor for partial differential equations [23.04840527974364]
We propose a novel physics-guided multi-step neural operator (PMNO) architecture to address challenges in long-horizon prediction of complex physical systems. The PMNO framework replaces the single-step input with multi-step historical data in the forward pass and introduces an implicit time-stepping scheme during backpropagation. We demonstrate the superior predictive performance of the PMNO predictor across a diverse range of physical systems.
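The multi-step-input idea above (each prediction conditioned on a window of past states rather than a single state) can be sketched as a generic autoregressive rollout. The `step_fn` stands in for a trained operator; the names and the toy step function below are illustrative assumptions, not PMNO's API or training scheme.

```python
import numpy as np

def rollout(step_fn, history, n_steps):
    """Autoregressive rollout where each prediction sees the last k states.

    step_fn: maps a (k, n) stack of past states to the next (n,) state.
    history: list of k initial states seeding the window.
    """
    states = list(history)
    k = len(history)
    for _ in range(n_steps):
        window = np.stack(states[-k:])  # most recent k states
        states.append(step_fn(window))
    return states[k:]                   # only the newly predicted states
```

For instance, a two-step history enables multistep integration rules (Adams-Bashforth-style) that a single-step input cannot express.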
arXiv Detail & Related papers (2025-06-02T12:33:50Z) - Hybrid Neural-MPM for Interactive Fluid Simulations in Real-Time [57.30651532625017]
We present a novel hybrid method that integrates numerical simulation, neural physics, and generative control. Our system demonstrates robust performance across diverse 2D/3D scenarios, material types, and obstacle interactions. We promise to release both models and data upon acceptance.
arXiv Detail & Related papers (2025-05-25T01:27:18Z) - High-fidelity Multiphysics Modelling for Rapid Predictions Using Physics-informed Parallel Neural Operator [17.85837423448985]
Modelling complex multiphysics systems governed by nonlinear and strongly coupled partial differential equations (PDEs) is a cornerstone in computational science and engineering. We propose a novel paradigm, physics-informed parallel neural operator (PIPNO), a scalable and unsupervised learning framework. PIPNO efficiently captures nonlinear operator mappings across diverse physics, including geotechnical engineering, material science, electromagnetism, quantum mechanics, and fluid dynamics.
arXiv Detail & Related papers (2025-02-26T20:29:41Z) - GausSim: Foreseeing Reality by Gaussian Simulator for Elastic Objects [55.02281855589641]
GausSim is a novel neural network-based simulator designed to capture the dynamic behaviors of real-world elastic objects represented through Gaussian kernels. We leverage continuum mechanics and treat each kernel as a Center of Mass System (CMS) that represents a continuous piece of matter. In addition, GausSim incorporates explicit physics constraints, such as mass and momentum conservation, ensuring interpretable results and robust, physically plausible simulations.
arXiv Detail & Related papers (2024-12-23T18:58:17Z) - Data-Efficient Inference of Neural Fluid Fields via SciML Foundation Model [49.06911227670408]
We show that a SciML foundation model can significantly improve the data efficiency of inferring real-world 3D fluid dynamics, with improved generalization. We equip neural fluid fields with a novel collaborative training approach that utilizes augmented views and fluid features extracted by our foundation model.
arXiv Detail & Related papers (2024-12-18T14:39:43Z) - Fine-Tuning Hybrid Physics-Informed Neural Networks for Vehicle Dynamics Model Estimation [2.432448600920501]
This paper introduces the Fine-Tuning Hybrid Dynamics (FTHD) method, which integrates supervised and unsupervised Physics-Informed Neural Networks (PINNs).
FTHD fine-tunes a pre-trained Deep Dynamics Model (DDM) using a smaller training dataset, delivering superior performance compared to state-of-the-art methods.
An Extended Kalman Filter (EKF) is embedded within FTHD to effectively manage noisy real-world data, ensuring accurate denoising.
Results demonstrate that the hybrid approach significantly improves parameter estimation accuracy, even with reduced data, and outperforms existing models.
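The denoising role the embedded EKF plays in FTHD can be illustrated with its linear special case, a minimal scalar Kalman filter on a random-walk state model. The function name and the noise settings below are illustrative assumptions, not the paper's vehicle-dynamics filter.

```python
def kalman_filter_1d(zs, q=1e-4, r=0.04, x0=0.0, p0=1.0):
    """Minimal scalar Kalman filter (linear special case of the EKF).

    zs: noisy measurements; q: process noise variance;
    r: measurement noise variance. Illustrative defaults.
    """
    x, p, out = x0, p0, []
    for z in zs:
        p = p + q                  # predict (random-walk state model)
        k = p / (p + r)            # Kalman gain
        x = x + k * (z - x)        # update with the measurement residual
        p = (1.0 - k) * p
        out.append(x)
    return out
```

The gain `k` automatically balances trust between the model prediction and each noisy measurement, which is what lets an EKF clean real-world data before it reaches the dynamics model.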
arXiv Detail & Related papers (2024-09-29T10:33:07Z) - Liquid Fourier Latent Dynamics Networks for fast GPU-based numerical simulations in computational cardiology [0.0]
We propose an extension of Latent Dynamics Networks (LDNets) to create parameterized space-time surrogate models for multiscale and multiphysics sets of highly nonlinear differential equations on complex geometries.
LFLDNets employ a neurologically-inspired, sparse liquid neural network for temporal dynamics, relaxing the requirement of a numerical solver for time advancement and leading to superior performance in terms of parameters, accuracy, efficiency and learned trajectories.
arXiv Detail & Related papers (2024-08-19T09:14:25Z) - Multi-fidelity physics constrained neural networks for dynamical systems [16.6396704642848]
We propose the Multi-Scale Physics-Constrained Neural Network (MSPCNN).
MSPCNN offers a novel methodology for incorporating data with different levels of fidelity into a unified latent space.
Unlike conventional methods, MSPCNN can also employ multi-fidelity data to train the predictive model.
arXiv Detail & Related papers (2024-02-03T05:05:26Z) - Neural Operators for Accelerating Scientific Simulations and Design [85.89660065887956]
Neural Operators are an AI framework that provides a principled approach to learning mappings between functions defined on continuous domains.
Neural Operators can augment or even replace existing simulators in many applications, such as computational fluid dynamics, weather forecasting, and material modeling.
arXiv Detail & Related papers (2023-09-27T00:12:07Z) - Hybrid quantum physics-informed neural networks for simulating computational fluid dynamics in complex shapes [37.69303106863453]
We present a hybrid quantum physics-informed neural network that simulates laminar fluid flows in 3D Y-shaped mixers.
Our approach combines the expressive power of a quantum model with the flexibility of a physics-informed neural network, resulting in a 21% higher accuracy compared to a purely classical neural network.
arXiv Detail & Related papers (2023-04-21T20:49:29Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with
Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
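One plausible reading of the spatial decomposition above is splitting a fine grid into staggered coarse subgrids, each of which becomes a cheaper subtask; the interleaved indexing scheme below is an assumption for illustration, not necessarily NeuralStagger's exact scheme.

```python
import numpy as np

def stagger_decompose(u, s):
    """Split a fine 2D field into s*s staggered coarse sub-fields."""
    return [u[i::s, j::s] for i in range(s) for j in range(s)]

def stagger_compose(subs, s):
    """Reassemble the fine field from its staggered sub-fields."""
    ny, nx = subs[0].shape
    u = np.empty((ny * s, nx * s), dtype=subs[0].dtype)
    for idx, sub in enumerate(subs):
        i, j = divmod(idx, s)      # recover this sub-field's offset
        u[i::s, j::s] = sub
    return u
```

The decomposition is lossless: composing the sub-fields reproduces the original grid exactly, so each coarse subtask can be solved independently and the fine-resolution answer reassembled.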
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - On Fast Simulation of Dynamical System with Neural Vector Enhanced
Numerical Solver [59.13397937903832]
We introduce a deep learning-based corrector called Neural Vector (NeurVec).
NeurVec can compensate for integration errors and enable larger time step sizes in simulations.
Our experiments on a variety of complex dynamical system benchmarks demonstrate that NeurVec exhibits remarkable generalization capability.
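The corrector idea above (a cheap integrator step plus a learned compensation term that absorbs the integration error of a large time step) can be demonstrated on a linear ODE, where the exact per-step error term is known in closed form. The analytic corrector below stands in for the trained NeurVec network; all names are illustrative.

```python
import math

def coarse_euler(f, u, h):
    """One explicit Euler step with a (deliberately large) step size h."""
    return u + h * f(u)

def corrected_step(f, u, h, corrector):
    """Cheap Euler update plus a corrector absorbing the step error."""
    return coarse_euler(f, u, h) + corrector(u)

# Demo on u' = lam * u with a large step h, where Euler alone is poor.
lam, h = -2.0, 0.5
f = lambda u: lam * u
# Exact residual of one Euler step for this linear system (stands in
# for a learned corrector network).
corr = lambda u: (math.exp(lam * h) - (1.0 + lam * h)) * u
```

With the corrector, each step multiplies the state by exactly `exp(lam * h)`, so large steps stay accurate; plain Euler at the same step size does not.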
arXiv Detail & Related papers (2022-08-07T09:02:18Z) - Physics-Inspired Temporal Learning of Quadrotor Dynamics for Accurate
Model Predictive Trajectory Tracking [76.27433308688592]
Accurately modeling a quadrotor's system dynamics is critical for guaranteeing agile, safe, and stable navigation.
We present a novel Physics-Inspired Temporal Convolutional Network (PI-TCN) approach to learning a quadrotor's system dynamics purely from robot experience.
Our approach combines the expressive power of sparse temporal convolutions and dense feed-forward connections to make accurate system predictions.
arXiv Detail & Related papers (2022-06-07T13:51:35Z) - PlasticineLab: A Soft-Body Manipulation Benchmark with Differentiable
Physics [89.81550748680245]
We introduce a new differentiable physics benchmark called PlasticineLab.
In each task, the agent uses manipulators to deform the plasticine into the desired configuration.
We evaluate several existing reinforcement learning (RL) methods and gradient-based methods on this benchmark.
arXiv Detail & Related papers (2021-04-07T17:59:23Z) - Physics-Integrated Variational Autoencoders for Robust and Interpretable
Generative Modeling [86.9726984929758]
We focus on the integration of incomplete physics models into deep generative models.
We propose a VAE architecture in which a part of the latent space is grounded by physics.
We demonstrate generative performance improvements over a set of synthetic and real-world datasets.
arXiv Detail & Related papers (2021-02-25T20:28:52Z) - Teaching the Incompressible Navier-Stokes Equations to Fast Neural
Surrogate Models in 3D [4.981834139548193]
In this work, we present significant extensions to a recently proposed deep learning framework that addresses these challenges in 2D.
We go from 2D to 3D and propose an efficient architecture to cope with the high demands of 3D grids in terms of memory and computational complexity.
Our method indicates strong improvements in terms of accuracy, speed and generalization capabilities over current 3D NN-based fluid models.
arXiv Detail & Related papers (2020-12-22T09:21:40Z) - Learning Incompressible Fluid Dynamics from Scratch -- Towards Fast,
Differentiable Fluid Models that Generalize [7.707887663337803]
Recent deep learning based approaches promise vast speed-ups but do not generalize to new fluid domains.
We propose a novel physics-constrained training approach that generalizes to new fluid domains.
We present an interactive real-time demo to show the speed and generalization capabilities of our trained models.
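Physics-constrained training of incompressible fluid models typically penalizes violations of the governing equations instead of matching ground-truth data. A generic example of such a penalty is a discrete incompressibility (zero-divergence) loss; the sketch below is one such term under a simple central-difference assumption, not the paper's full training objective.

```python
import numpy as np

def divergence_loss(ux, uy, dx=1.0):
    """Mean squared discrete divergence of a 2D velocity field.

    ux, uy: velocity components on a uniform grid (axis 0 is x here);
    central differences on the interior. Illustrative physics-only
    penalty; a full objective would add momentum and boundary terms.
    """
    div = ((ux[2:, 1:-1] - ux[:-2, 1:-1]) +
           (uy[1:-1, 2:] - uy[1:-1, :-2])) / (2.0 * dx)
    return float(np.mean(div ** 2))
```

A constant field has zero loss, while a field whose x-component grows linearly in x is penalized, which is exactly the behavior the incompressibility constraint should enforce.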
arXiv Detail & Related papers (2020-06-15T20:59:28Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.