Hybrid Physical-Neural Simulator for Fast Cosmological Hydrodynamics
- URL: http://arxiv.org/abs/2510.26593v1
- Date: Thu, 30 Oct 2025 15:18:07 GMT
- Title: Hybrid Physical-Neural Simulator for Fast Cosmological Hydrodynamics
- Authors: Arne Thomsen, Tilman Tröster, François Lanusse
- Abstract summary: We propose a hybrid approach where gravitational forces are computed using a differentiable particle-mesh solver, while the hydrodynamics are parametrized by a neural network. The approach is furthermore highly data efficient, with a single reference simulation of cosmological structure formation being sufficient to constrain the neural pressure model.
- Score: 2.5807659587068534
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Cosmological field-level inference requires differentiable forward models that solve the challenging dynamics of gas and dark matter under hydrodynamics and gravity. We propose a hybrid approach where gravitational forces are computed using a differentiable particle-mesh solver, while the hydrodynamics are parametrized by a neural network that maps local quantities to an effective pressure field. We demonstrate that our method improves upon alternative approaches, such as an Enthalpy Gradient Descent baseline, both at the field and summary-statistic level. The approach is furthermore highly data efficient, with a single reference simulation of cosmological structure formation being sufficient to constrain the neural pressure model. This opens the door for future applications where the model is fit directly to observational data, rather than a training set of simulations.
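The abstract describes the hybrid scheme only at a high level: a neural network maps local quantities to an effective pressure field whose gradient supplies the hydrodynamic force alongside particle-mesh gravity. The sketch below is a hypothetical toy version of that idea in 1D (invented architecture and names, plain NumPy instead of a differentiable framework), not the paper's actual model.

```python
import numpy as np

def neural_pressure(delta, w1, b1, w2, b2):
    # Tiny MLP: local overdensity -> non-negative effective pressure
    # (hypothetical stand-in for the paper's neural pressure model).
    h = np.tanh(delta[:, None] * w1[None, :] + b1)  # (N, hidden)
    return np.maximum(h @ w2 + b2, 0.0)             # (N,)

def hydro_acceleration(delta, params, dx=1.0):
    # Effective hydro term a = -grad(P) / rho on a periodic 1D grid,
    # to be added to the gravitational force from the particle-mesh solver.
    P = neural_pressure(delta, *params)
    grad_P = (np.roll(P, -1) - np.roll(P, 1)) / (2.0 * dx)  # central difference
    return -grad_P / (1.0 + delta)  # rho = rho_bar * (1 + delta), rho_bar = 1
```

In the paper the pressure network is constrained by a single reference hydrodynamical simulation, presumably by backpropagating through the differentiable solver; only the forward evaluation is sketched here.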
Related papers
- Foundation Models for Discovery and Exploration in Chemical Space [57.97784111110166]
MIST is a family of molecular foundation models trained on large unlabeled datasets. We demonstrate the ability of these models to solve real-world problems across chemical space.
arXiv Detail & Related papers (2025-10-20T17:56:01Z)
- Neural Networks as Surrogate Solvers for Time-Dependent Accretion Disk Dynamics [25.142937286955547]
Accretion disks are ubiquitous in astrophysics, appearing in diverse environments from planet-forming systems to X-ray binaries and active galactic nuclei. Traditionally, modeling their dynamics requires computationally intensive (magneto)hydrodynamic simulations. Recently, Physics-Informed Neural Networks (PINNs) have emerged as a promising alternative. We for the first time demonstrate PINNs for solving the two-dimensional, time-dependent hydrodynamics of non-self-gravitating accretion disks.
arXiv Detail & Related papers (2025-09-24T18:01:04Z)
- Iterative Distillation for Reward-Guided Fine-Tuning of Diffusion Models in Biomolecular Design [58.8094854658848]
We address the problem of fine-tuning diffusion models for reward-guided generation in biomolecular design. We propose an iterative distillation-based fine-tuning framework that enables diffusion models to optimize for arbitrary reward functions. Our off-policy formulation, combined with KL divergence minimization, enhances training stability and sample efficiency compared to existing RL-based methods.
arXiv Detail & Related papers (2025-07-01T05:55:28Z)
- Physics-guided weak-form discovery of reduced-order models for trapped ultracold hydrodynamics [0.0]
We study the relaxation of a highly collisional, ultracold but nondegenerate gas of polar molecules.
The gas is subject to fluid-gas coupled dynamics that lead to a breakdown of first-order hydrodynamics.
We present substantially improved reduced-order models for these same observables.
arXiv Detail & Related papers (2024-06-11T17:50:04Z)
- Capturing dynamical correlations using implicit neural representations [85.66456606776552]
We develop an artificial intelligence framework which combines a neural network trained to mimic simulated data from a model Hamiltonian with automatic differentiation to recover unknown parameters from experimental data.
In doing so, we illustrate the ability to build and train a differentiable model only once, which then can be applied in real-time to multi-dimensional scattering data.
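The idea of training a differentiable surrogate once and then fitting unknown Hamiltonian parameters to data in real time can be illustrated with a toy example. Here a hypothetical smooth analytic function stands in for the trained network, and a central finite difference stands in for automatic differentiation; all names and the functional form are invented for illustration.

```python
import numpy as np

def surrogate(q, J):
    # Hypothetical smooth surrogate standing in for a neural network
    # trained on simulated scattering data from a model Hamiltonian.
    return 1.0 / (1.0 + (q * J) ** 2)

def recover_coupling(q, data, J0=0.5, lr=0.5, steps=200, eps=1e-5):
    # Least-squares fit of the unknown coupling J to measured data;
    # a central finite difference stands in for automatic differentiation.
    loss = lambda j: np.mean((surrogate(q, j) - data) ** 2)
    J = J0
    for _ in range(steps):
        g = (loss(J + eps) - loss(J - eps)) / (2.0 * eps)
        J -= lr * g
    return J
```

Because the surrogate is cheap to evaluate and differentiate, this inner fit can run at interactive speed on multi-dimensional data, which is the practical payoff the abstract describes.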
arXiv Detail & Related papers (2023-04-08T07:55:36Z)
- Physics-informed neural networks in the recreation of hydrodynamic simulations from dark matter [1.786053901581251]
This paper presents the first application of physics-informed neural networks to baryon inpainting.
We introduce a punitive prediction comparison based on the Kullback-Leibler divergence, which enforces scatter reproduction.
arXiv Detail & Related papers (2023-03-24T15:55:09Z)
- Implicit Geometry and Interaction Embeddings Improve Few-Shot Molecular Property Prediction [53.06671763877109]
We develop molecular embeddings that encode complex molecular characteristics to improve the performance of few-shot molecular property prediction.
Our approach leverages large amounts of synthetic data, namely the results of molecular docking calculations.
On multiple molecular property prediction benchmarks, training from the embedding space substantially improves Multi-Task, MAML, and Prototypical Network few-shot learning performance.
arXiv Detail & Related papers (2023-02-04T01:32:40Z)
- Physics-informed machine learning with differentiable programming for heterogeneous underground reservoir pressure management [64.17887333976593]
Avoiding over-pressurization in subsurface reservoirs is critical for applications like CO2 sequestration and wastewater injection.
Managing the pressures by controlling injection/extraction is challenging because of complex heterogeneity in the subsurface.
We use differentiable programming with a full-physics model and machine learning to determine the fluid extraction rates that prevent over-pressurization.
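Using gradients through a physics model to choose extraction rates can be reduced to a toy example. The linear reservoir response below is hypothetical (the paper uses a full-physics subsurface flow model), and a finite difference stands in for differentiable programming; all names and constants are invented.

```python
import numpy as np

def reservoir_pressure(extraction, injection=2.0, k=0.8):
    # Toy linear reservoir response: pressure rises with injection and
    # falls with extraction (hypothetical stand-in for a full-physics model).
    return injection - k * extraction

def required_extraction(p_target=1.2, lr=0.5, steps=200, eps=1e-6):
    # Gradient descent on (pressure - target)^2 to find the extraction
    # rate that holds reservoir pressure at the target level; a finite
    # difference stands in for gradients through the physics model.
    loss = lambda e: (reservoir_pressure(e) - p_target) ** 2
    e = 0.0
    for _ in range(steps):
        g = (loss(e + eps) - loss(e - eps)) / (2.0 * eps)
        e -= lr * g
    return e
```

With the toy constants above the analytic answer is (2.0 - 1.2) / 0.8 = 1.0, which the optimizer recovers; the same loop structure scales to heterogeneous models where no closed form exists.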
arXiv Detail & Related papers (2022-06-21T20:38:13Z)
- NeuroFluid: Fluid Dynamics Grounding with Particle-Driven Neural Radiance Fields [65.07940731309856]
Deep learning has shown great potential for modeling the physical dynamics of complex particle systems such as fluids.
In this paper, we consider a partially observable scenario known as fluid dynamics grounding.
We propose a differentiable two-stage network named NeuroFluid.
It is shown to reasonably estimate the underlying physics of fluids with different initial shapes, viscosities, and densities.
arXiv Detail & Related papers (2022-03-03T15:13:29Z)
- Physics informed machine learning with Smoothed Particle Hydrodynamics: Hierarchy of reduced Lagrangian models of turbulence [0.6542219246821327]
This manuscript develops a hierarchy of parameterized reduced Lagrangian models for turbulent flows.
It investigates the effects of enforcing physical structure through Smoothed Particle Hydrodynamics (SPH) versus relying on neural networks (NNs) as universal function approximators.
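The physical structure that SPH enforces starts from its kernel-weighted density estimate. The following minimal 1D sketch uses the standard cubic-spline (M4) smoothing kernel; it illustrates the generic SPH building block, not the paper's specific hierarchy of reduced models.

```python
import numpy as np

def cubic_spline_kernel(r, h):
    # Standard 1D cubic-spline (M4) SPH smoothing kernel,
    # normalization 2/(3h), compact support |r| < 2h.
    q = np.abs(r) / h
    sigma = 2.0 / (3.0 * h)
    return sigma * np.where(q < 1.0, 1.0 - 1.5 * q**2 + 0.75 * q**3,
                            np.where(q < 2.0, 0.25 * (2.0 - q) ** 3, 0.0))

def sph_density(x, m, h):
    # rho_i = sum_j m_j W(x_i - x_j, h): the kernel-weighted density
    # estimate at each particle position.
    r = x[:, None] - x[None, :]
    return (m[None, :] * cubic_spline_kernel(r, h)).sum(axis=1)
```

On a uniform particle lattice with h twice the spacing, this estimate reproduces a constant density in the interior, which is the consistency property that makes SPH a useful physical prior compared with an unconstrained NN.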
arXiv Detail & Related papers (2021-10-25T22:57:40Z)
- A Gradient-based Deep Neural Network Model for Simulating Multiphase Flow in Porous Media [1.5791732557395552]
We describe a gradient-based deep neural network (GDNN) constrained by the physics related to multiphase flow in porous media.
We demonstrate that GDNN can effectively predict the nonlinear patterns of subsurface responses.
arXiv Detail & Related papers (2021-04-30T02:14:00Z)
- Neural Ordinary Differential Equations for Data-Driven Reduced Order Modeling of Environmental Hydrodynamics [4.547988283172179]
We explore the use of Neural Ordinary Differential Equations for fluid flow simulation.
Test problems we consider include incompressible flow around a cylinder and real-world applications of shallow water hydrodynamics in riverine and estuarine systems.
Our findings indicate that Neural ODEs provide an elegant framework for stable and accurate evolution of latent-space dynamics with a promising potential of extrapolatory predictions.
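The Neural ODE framework evolves a latent state by integrating a learned vector field through time. The minimal sketch below uses explicit Euler as a stand-in for the adaptive solvers typically used with Neural ODEs; in practice `f` would be a trained network acting on the reduced-order latent state rather than the analytic test function used here.

```python
import numpy as np

def integrate(f, z0, t0=0.0, t1=1.0, n=100):
    # Explicit Euler integration of dz/dt = f(z, t). With a learned f,
    # this is the forward pass of a (fixed-step) Neural ODE; production
    # implementations use adaptive solvers and adjoint gradients instead.
    z = np.asarray(z0, dtype=float)
    dt = (t1 - t0) / n
    t = t0
    for _ in range(n):
        z = z + dt * f(z, t)
        t += dt
    return z
```

Because the state is advanced by a continuous-time vector field rather than discrete jumps, the latent trajectory stays smooth, which is the stability property the findings above refer to.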
arXiv Detail & Related papers (2021-04-22T19:20:47Z)
- A Near-Optimal Gradient Flow for Learning Neural Energy-Based Models [93.24030378630175]
We propose a novel numerical scheme to optimize the gradient flows for learning energy-based models (EBMs).
We derive a second-order Wasserstein gradient flow of the global relative entropy from Fokker-Planck equation.
Compared with existing schemes, Wasserstein gradient flow is a smoother and near-optimal numerical scheme to approximate real data densities.
arXiv Detail & Related papers (2019-10-31T02:26:20Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.