diffSPH: Differentiable Smoothed Particle Hydrodynamics for Adjoint Optimization and Machine Learning
- URL: http://arxiv.org/abs/2507.21684v1
- Date: Tue, 29 Jul 2025 10:54:27 GMT
- Title: diffSPH: Differentiable Smoothed Particle Hydrodynamics for Adjoint Optimization and Machine Learning
- Authors: Rene Winchenbach, Nils Thuerey
- Abstract summary: diffSPH is a differentiable Smoothed Particle Hydrodynamics (SPH) framework developed entirely in PyTorch with GPU acceleration. diffSPH is designed centrally around differentiation to facilitate optimization and machine learning (ML) applications in Computational Fluid Dynamics (CFD). We demonstrate the framework's unique capabilities through several applications, including addressing particle shifting via a novel, target-oriented approach.
- Score: 21.05257407408671
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We present diffSPH, a novel open-source differentiable Smoothed Particle Hydrodynamics (SPH) framework developed entirely in PyTorch with GPU acceleration. diffSPH is designed centrally around differentiation to facilitate optimization and machine learning (ML) applications in Computational Fluid Dynamics (CFD), including training neural networks and the development of hybrid models. Its differentiable SPH core, and schemes for compressible (with shock capturing and multi-phase flows), weakly compressible (with boundary handling and free-surface flows), and incompressible physics, enable a broad range of application areas. We demonstrate the framework's unique capabilities through several applications, including addressing particle shifting via a novel, target-oriented approach by minimizing physical and regularization loss terms, a task often intractable in traditional solvers. Further examples include optimizing initial conditions and physical parameters to match target trajectories, shape optimization, implementing a solver-in-the-loop setup to emulate higher-order integration, and demonstrating gradient propagation through hundreds of full simulation steps. Prioritizing readability, usability, and extensibility, this work offers a foundational platform for the CFD community to develop and deploy novel neural networks and adjoint optimization applications.
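As a rough illustration of the adjoint-optimization workflow described in the abstract, the sketch below unrolls a toy SPH-like step written in plain PyTorch and optimizes the initial particle velocities to match a target configuration. It does not use diffSPH's actual API; the Gaussian kernel, linear equation of state, particle count, step sizes, and the `target` configuration are all hypothetical choices made only to keep the example self-contained and runnable.

```python
import torch

def kernel(r, h):
    # Toy Gaussian smoothing kernel (a stand-in; diffSPH ships standard SPH kernels).
    return torch.exp(-(r / h) ** 2)

def grad_kernel(dx, r, h):
    # Gradient of the Gaussian kernel w.r.t. the first particle's position.
    return (-2.0 / h ** 2) * kernel(r, h).unsqueeze(-1) * dx

def sph_step(pos, vel, mass, h, dt, k=10.0, rho0=1.0):
    # One explicit step of a toy weakly compressible SPH model.
    dx = pos.unsqueeze(1) - pos.unsqueeze(0)            # (N, N, D) pairwise offsets
    r = dx.norm(dim=-1) + 1e-9                          # (N, N) pairwise distances
    rho = (mass * kernel(r, h)).sum(dim=1)              # summation density
    p_over_rho2 = k * (rho - rho0) / rho ** 2           # linear equation of state, P / rho^2
    coeff = mass * (p_over_rho2.unsqueeze(1) + p_over_rho2.unsqueeze(0))
    acc = -(coeff.unsqueeze(-1) * grad_kernel(dx, r, h)).sum(dim=1)  # symmetric pressure force
    vel = vel + dt * acc
    return pos + dt * vel, vel

torch.manual_seed(0)
pos0 = torch.rand(64, 2)
vel0 = torch.zeros(64, 2, requires_grad=True)           # initial condition to optimize
mass = torch.tensor(1.0 / 64)
target = pos0 + 0.05                                    # hypothetical target configuration

opt = torch.optim.Adam([vel0], lr=1e-2)
for _ in range(100):
    pos, vel = pos0, vel0
    for _ in range(20):                                  # gradients flow through the unrolled rollout
        pos, vel = sph_step(pos, vel, mass, h=0.2, dt=0.01)
    loss = ((pos - target) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```

The same pattern extends to the abstract's other examples, e.g. treating particle shifts or physical parameters as the optimization variables and minimizing a combination of physical and regularization loss terms.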
Related papers
- Self-Supervised Coarsening of Unstructured Grid with Automatic Differentiation [55.88862563823878]
In this work, we present an original algorithm to coarsen an unstructured grid based on the concepts of differentiable physics. We demonstrate the performance of the algorithm on two PDEs: a linear equation which governs slightly compressible fluid flow in porous media, and the wave equation. Our results show that in the considered scenarios, we reduced the number of grid points up to 10 times while preserving the modeled variable dynamics in the points of interest.
arXiv Detail & Related papers (2025-07-24T11:02:13Z) - HiLAB: A Hybrid Inverse-Design Framework [0.0]
HiLAB is a new paradigm for inverse design of nanophotonic structures. It addresses multi-functional device design by generating diverse freeform configurations at reduced simulation costs.
arXiv Detail & Related papers (2025-05-23T05:34:56Z) - Accelerating Multiscale Modeling with Hybrid Solvers: Coupling FEM and Neural Operators with Domain Decomposition [3.0635300721402228]
This work introduces a novel hybrid framework that integrates physics-informed neural operators (PI-NO) with the finite element method (FEM) through domain decomposition. The framework's efficacy has been validated across a range of problems spanning static, quasi-static, and dynamic regimes. Our study shows that the hybrid solver: (1) maintains solution continuity across subdomain interfaces, (2) reduces computational costs by eliminating fine-mesh requirements, (3) mitigates error accumulation in time-dependent simulations, and (4) enables automatic adaptation to evolving physical phenomena.
arXiv Detail & Related papers (2025-04-15T16:54:04Z) - Implicit Neural Differential Model for Spatiotemporal Dynamics [5.1854032131971195]
We introduce Im-PiNDiff, a novel implicit physics-integrated neural differentiable solver for spatiotemporal dynamics. Inspired by deep equilibrium models, Im-PiNDiff advances the state using implicit fixed-point layers, enabling robust long-term simulation. Im-PiNDiff achieves superior predictive performance, enhanced numerical stability, and substantial reductions in memory and cost.
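As a hedged sketch of the implicit fixed-point idea this summary refers to (deep-equilibrium-style layers), the toy PyTorch layer below iterates z = f(z, x) to approximate convergence without storing the graph, then re-attaches gradients through a single application of f. The layer definition, sizes, and this one-step gradient approximation are assumptions for illustration, not the paper's implementation.

```python
import torch
import torch.nn as nn

class FixedPointLayer(nn.Module):
    """Toy implicit layer: returns an approximate fixed point z* = f(z*, x)."""
    def __init__(self, dim, iters=50):
        super().__init__()
        self.f = nn.Sequential(nn.Linear(2 * dim, dim), nn.Tanh())
        self.iters = iters

    def forward(self, x):
        z = torch.zeros_like(x)
        with torch.no_grad():                    # fixed-point iteration, no graph stored
            for _ in range(self.iters):
                z = self.f(torch.cat([z, x], dim=-1))
        # re-attach gradients via one differentiable application of f
        # (a cheap approximation to the full implicit-function-theorem adjoint)
        return self.f(torch.cat([z.detach(), x], dim=-1))

layer = FixedPointLayer(dim=8)
x = torch.randn(4, 8, requires_grad=True)
layer(x).sum().backward()                        # gradients flow despite the long iteration
```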
arXiv Detail & Related papers (2025-04-03T04:07:18Z) - Go With the Flow: Fast Diffusion for Gaussian Mixture Models [16.07896640031724]
Schrödinger Bridges (SBs) are diffusion processes that steer, in finite time, a given initial distribution to another final one while minimizing a suitable cost functional. We propose an analytic parametrization of a set of feasible policies for solving low-dimensional problems. We showcase the potential of this approach in problems such as image-to-image translation in the latent space of an autoencoder, learning of cellular dynamics using multi-marginal momentum SB problems, and various other examples.
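For reference, the standard stochastic-control form of the Schrödinger bridge problem alluded to above can be written as follows (a textbook formulation, not a claim about the paper's specific parametrization):

```latex
\min_{u}\; \mathbb{E}\!\left[\int_{0}^{1} \tfrac{1}{2}\,\lVert u_t \rVert^{2}\,\mathrm{d}t\right]
\quad \text{s.t.} \quad
\mathrm{d}X_t = u_t\,\mathrm{d}t + \sqrt{\varepsilon}\,\mathrm{d}W_t,
\qquad X_0 \sim \rho_0,\; X_1 \sim \rho_1 .
```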
arXiv Detail & Related papers (2024-12-12T08:40:22Z) - Accelerate Neural Subspace-Based Reduced-Order Solver of Deformable Simulation by Lipschitz Optimization [9.364019847856714]
Reduced-order simulation is an emerging method for accelerating physical simulations with high DOFs.
We propose a method for finding optimized subspace mappings, enabling further acceleration of neural reduced-order simulations.
We demonstrate the effectiveness of our approach through general cases in both quasi-static and dynamic simulations.
arXiv Detail & Related papers (2024-09-05T12:56:03Z) - Efficient and Flexible Neural Network Training through Layer-wise Feedback Propagation [49.44309457870649]
Layer-wise Feedback Propagation (LFP) is a novel training principle for neural network-like predictors. LFP decomposes a reward to individual neurons based on their respective contributions. Our method then implements a greedy approach, reinforcing helpful parts of the network and weakening harmful ones.
arXiv Detail & Related papers (2023-08-23T10:48:28Z) - Differentiable Turbulence: Closure as a partial differential equation constrained optimization [1.8749305679160366]
We leverage the concept of differentiable turbulence, whereby an end-to-end differentiable solver is used in combination with physics-inspired choices of deep learning architectures.
We show that the differentiable physics paradigm is more successful than offline, a-priori learning, and that hybrid solver-in-the-loop approaches to deep learning offer an ideal balance between computational efficiency, accuracy, and generalization.
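A minimal, self-contained sketch of the solver-in-the-loop pattern mentioned above: a small correction network is applied inside each step of a differentiable coarse solver and trained through the unrolled rollout against a finer reference. The 1D heat equation, network architecture, and training setup are hypothetical stand-ins, not the paper's turbulence closure.

```python
import torch
import torch.nn as nn

def coarse_step(u, nu=0.1, dt=0.1):
    # Explicit finite-difference diffusion step on a periodic 1D grid.
    lap = torch.roll(u, 1, -1) - 2 * u + torch.roll(u, -1, -1)
    return u + dt * nu * lap

def fine_step(u, nu=0.1, dt=0.1, sub=10):
    # Reference trajectory: same model, smaller time steps.
    for _ in range(sub):
        u = coarse_step(u, nu, dt / sub)
    return u

corr = nn.Sequential(nn.Conv1d(1, 16, 3, padding=1, padding_mode="circular"),
                     nn.Tanh(),
                     nn.Conv1d(16, 1, 3, padding=1, padding_mode="circular"))
opt = torch.optim.Adam(corr.parameters(), lr=1e-3)

x = torch.linspace(0, 2 * torch.pi, 64)
u0 = torch.sin(x).unsqueeze(0).unsqueeze(0)   # (batch, channel, grid)

for _ in range(200):
    u_c, u_f, loss = u0, u0, 0.0
    for _ in range(8):                        # gradients flow through the unrolled rollout
        u_c = coarse_step(u_c) + corr(u_c)    # learned correction inside the solver loop
        u_f = fine_step(u_f)
        loss = loss + ((u_c - u_f) ** 2).mean()
    opt.zero_grad()
    loss.backward()
    opt.step()
```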
arXiv Detail & Related papers (2023-07-07T15:51:55Z) - Hybrid quantum physics-informed neural networks for simulating computational fluid dynamics in complex shapes [37.69303106863453]
We present a hybrid quantum physics-informed neural network that simulates laminar fluid flows in 3D Y-shaped mixers.
Our approach combines the expressive power of a quantum model with the flexibility of a physics-informed neural network, resulting in a 21% higher accuracy compared to a purely classical neural network.
arXiv Detail & Related papers (2023-04-21T20:49:29Z) - NeuralStagger: Accelerating Physics-constrained Neural PDE Solver with Spatial-temporal Decomposition [67.46012350241969]
This paper proposes a general acceleration methodology called NeuralStagger.
It decomposes the original learning task into several coarser-resolution subtasks.
We demonstrate the successful application of NeuralStagger on 2D and 3D fluid dynamics simulations.
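As a rough illustration of the decomposition idea (hypothetical tensors, not the paper's implementation), a fine-resolution field can be split into staggered coarse sub-fields, each of which becomes a lower-resolution learning subtask, and then reassembled exactly:

```python
import torch

u = torch.randn(1, 1, 64, 64)                 # fine-resolution field (B, C, H, W)

# four staggered coarse views, each at half resolution
subfields = [u[..., i::2, j::2] for i in (0, 1) for j in (0, 1)]

# reassemble the fine field from the coarse sub-fields
v = torch.zeros_like(u)
for (i, j), s in zip([(0, 0), (0, 1), (1, 0), (1, 1)], subfields):
    v[..., i::2, j::2] = s
assert torch.equal(u, v)                      # lossless decomposition/reassembly
```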
arXiv Detail & Related papers (2023-02-20T19:36:52Z) - Optimization-Derived Learning with Essential Convergence Analysis of Training and Hyper-training [52.39882976848064]
We design a Generalized Krasnoselskii-Mann (GKM) scheme based on fixed-point iterations as our fundamental ODL module.
Under the GKM scheme, a Bilevel Meta Optimization (BMO) algorithmic framework is constructed to solve the optimal training and hyper-training variables together.
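For concreteness, the Krasnoselskii-Mann update that the GKM module generalizes is the averaged fixed-point iteration x_{k+1} = (1 - a_k) x_k + a_k T(x_k); the short sketch below applies it to a toy affine operator (an illustrative choice, not the paper's ODL module).

```python
import torch

def km_iterate(T, x0, steps=200, alpha=0.5):
    # Averaged fixed-point update; converges for non-expansive T with a fixed point.
    x = x0
    for _ in range(steps):
        x = (1 - alpha) * x + alpha * T(x)
    return x

A = torch.eye(3) * 0.9                        # contraction, hence non-expansive
T = lambda x: A @ x + torch.ones(3)           # toy affine operator with a unique fixed point
x_star = km_iterate(T, torch.zeros(3))        # approaches the solution of x = 0.9 x + 1, i.e. 10
```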
arXiv Detail & Related papers (2022-06-16T01:50:25Z) - Efficient Differentiable Simulation of Articulated Bodies [89.64118042429287]
We present a method for efficient differentiable simulation of articulated bodies.
This enables integration of articulated body dynamics into deep learning frameworks.
We show that reinforcement learning with articulated systems can be accelerated using gradients provided by our method.
arXiv Detail & Related papers (2021-09-16T04:48:13Z) - PlasticineLab: A Soft-Body Manipulation Benchmark with Differentiable Physics [89.81550748680245]
We introduce a new differentiable physics benchmark called PlasticineLab.
In each task, the agent uses manipulators to deform the plasticine into the desired configuration.
We evaluate several existing reinforcement learning (RL) methods and gradient-based methods on this benchmark.
arXiv Detail & Related papers (2021-04-07T17:59:23Z) - Combining Differentiable PDE Solvers and Graph Neural Networks for Fluid Flow Prediction [79.81193813215872]
We develop a hybrid (graph) neural network that combines a traditional graph convolutional network with an embedded differentiable fluid dynamics simulator inside the network itself.
We show that we can both generalize well to new situations and benefit from the substantial speedup of neural network CFD predictions.
arXiv Detail & Related papers (2020-07-08T21:23:19Z)