Surrogate-Based Differentiable Pipeline for Shape Optimization
- URL: http://arxiv.org/abs/2511.10761v1
- Date: Thu, 13 Nov 2025 19:30:50 GMT
- Title: Surrogate-Based Differentiable Pipeline for Shape Optimization
- Authors: Andrin Rehmann, Nolan Black, Josiah Bjorgaard, Alessandro Angioi, Andrei Paleyes, Niklas Heim, Dion Häfner, Alexander Lavin
- Abstract summary: We propose replacing non-differentiable pipeline components with surrogate models, which are inherently differentiable. We demonstrate an end-to-end differentiable pipeline where a 3D U-Net full-field surrogate replaces both the meshing and simulation steps by training it on the mapping between the signed distance field (SDF) of the shape and the fields of interest.
- Score: 64.24199762940444
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Gradient-based optimization of engineering designs is limited by non-differentiable components in the typical computer-aided engineering (CAE) workflow, which calculates performance metrics from design parameters. While gradient-based methods could provide noticeable speed-ups in high-dimensional design spaces, codes for meshing, physical simulations, and other common components are not differentiable even if the math or physics underneath them is. We propose replacing non-differentiable pipeline components with surrogate models which are inherently differentiable. Using a toy example of aerodynamic shape optimization, we demonstrate an end-to-end differentiable pipeline where a 3D U-Net full-field surrogate replaces both meshing and simulation steps by training it on the mapping between the signed distance field (SDF) of the shape and the fields of interest. This approach enables gradient-based shape optimization without the need for differentiable solvers, which can be useful in situations where adjoint methods are unavailable and/or hard to implement.
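As an illustration of the idea, a minimal differentiable pipeline can be sketched in PyTorch. Everything here is a hypothetical stand-in, not the paper's code: a tiny convolutional network plays the role of the trained 3D U-Net surrogate, the SDF of a parameterized sphere plays the role of the shape, and the mean of the predicted field plays the role of the performance metric. The point is only that gradients flow from the objective back to the shape parameters without a differentiable solver.

```python
import torch
import torch.nn as nn

class TinySurrogate(nn.Module):
    """Stand-in for the paper's 3D U-Net: any differentiable map from an
    SDF grid to a field of interest suffices for this illustration."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv3d(1, 4, 3, padding=1), nn.ReLU(),
            nn.Conv3d(4, 1, 3, padding=1),
        )

    def forward(self, sdf):
        return self.net(sdf)

def sphere_sdf(center, radius, n=16):
    # Differentiable SDF of a sphere sampled on an n^3 grid over [-1, 1]^3.
    xs = torch.linspace(-1.0, 1.0, n)
    X, Y, Z = torch.meshgrid(xs, xs, xs, indexing="ij")
    pts = torch.stack([X, Y, Z], dim=-1)
    return (torch.linalg.norm(pts - center, dim=-1) - radius)[None, None]

# Shape parameters being optimized: center and radius of the sphere.
center = torch.zeros(3, requires_grad=True)
radius = torch.tensor(0.5, requires_grad=True)

surrogate = TinySurrogate()
field = surrogate(sphere_sdf(center, radius))  # surrogate replaces mesh + solve
objective = field.mean()                       # placeholder performance metric
objective.backward()                           # gradients reach the shape params
```

In the actual pipeline the surrogate would first be trained on (SDF, field) pairs produced by a conventional solver; it is left untrained here because only differentiability, not accuracy, is being demonstrated.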
Related papers
- Guided Diffusion by Optimized Loss Functions on Relaxed Parameters for Inverse Material Design [4.620353116375784]
Inverse design problems are common in engineering and materials science. We propose a novel inverse design method based on diffusion models. We show that our method can propose diverse designs within a 1% relative error margin.
arXiv Detail & Related papers (2026-02-17T15:15:28Z)
- Differentiation Strategies for Acoustic Inverse Problems: Admittance Estimation and Shape Optimization [0.0]
We show that JAX-FEM's automatic differentiation (AD) enables direct gradient-based estimation of complex boundary admittance from sparse pressure measurements. We apply randomized finite differences to acoustic shape optimization, combining JAX-FEM for forward simulation with PyTorch3D for mesh manipulation through AD.
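The randomized finite-difference gradient estimator mentioned above can be sketched generically (this is an illustration of the technique, not that paper's implementation): probing the objective along random Gaussian directions and averaging the resulting directional slopes gives a gradient estimate that needs only function evaluations.

```python
import numpy as np

def randomized_fd_grad(f, x, eps=1e-4, n_dirs=200, rng=None):
    """Estimate grad f(x) from central differences along random Gaussian
    directions u; since E[u u^T] = I, the average is unbiased as eps -> 0."""
    rng = np.random.default_rng(rng)
    g = np.zeros_like(x)
    for _ in range(n_dirs):
        u = rng.standard_normal(x.shape)
        g += (f(x + eps * u) - f(x - eps * u)) / (2 * eps) * u
    return g / n_dirs

# Sanity check on a quadratic, whose true gradient is 2x.
x = np.array([1.0, -2.0, 0.5])
g = randomized_fd_grad(lambda v: float(v @ v), x, n_dirs=20000, rng=0)
```

Only two function evaluations per direction are needed, and the directions are independent, so in a real pipeline the forward simulations can run in parallel.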
arXiv Detail & Related papers (2025-11-14T15:46:05Z)
- Adjoint-Based Aerodynamic Shape Optimization with a Manifold Constraint Learned by Diffusion Models [12.019764781438603]
We introduce an adjoint-based aerodynamic shape optimization framework that integrates a diffusion model trained on existing designs to learn a smooth manifold of aerodynamically viable shapes. We demonstrate how AI-generated priors integrate effectively with adjoint methods to enable robust, high-fidelity aerodynamic shape optimization through automatic differentiation.
arXiv Detail & Related papers (2025-07-31T11:21:20Z)
- diffSPH: Differentiable Smoothed Particle Hydrodynamics for Adjoint Optimization and Machine Learning [21.05257407408671]
diffSPH is a differentiable Smoothed Particle Hydrodynamics (SPH) framework developed entirely in PyTorch with GPU acceleration. diffSPH is designed centrally around differentiation to facilitate optimization and machine learning (ML) applications in Computational Fluid Dynamics (CFD). We demonstrate the framework's unique capabilities through several applications, including addressing particle shifting via a novel, target-oriented approach.
arXiv Detail & Related papers (2025-07-29T10:54:27Z)
- Geometric Operator Learning with Optimal Transport [77.16909146519227]
We propose integrating optimal transport (OT) into operator learning for partial differential equations (PDEs) on complex geometries. For 3D simulations focused on surfaces, our OT-based neural operator embeds the surface geometry into a 2D parameterized latent space. Experiments with Reynolds-averaged Navier-Stokes equations (RANS) on the ShapeNet-Car and DrivAerNet-Car datasets show that our method achieves better accuracy and also reduces computational expenses.
arXiv Detail & Related papers (2025-07-26T21:28:25Z)
- Compositional Generative Inverse Design [69.22782875567547]
Inverse design, where we seek to design input variables in order to optimize an underlying objective function, is an important problem.
We show that by instead optimizing over the learned energy function captured by the diffusion model, we can avoid such adversarial examples.
In an N-body interaction task and a challenging 2D multi-airfoil design task, we demonstrate that by composing the learned diffusion model at test time, our method allows us to design initial states and boundary shapes.
arXiv Detail & Related papers (2024-01-24T01:33:39Z)
- Flexible Differentiable Optimization via Model Transformations [1.081463830315253]
We introduce DiffOpt, a Julia library to differentiate through the solution of optimization problems with respect to arbitrary parameters present in the objective and/or constraints.
arXiv Detail & Related papers (2022-06-10T09:59:13Z)
- Differentiable Spline Approximations [48.10988598845873]
Differentiable programming has significantly enhanced the scope of machine learning.
Standard differentiable programming methods (such as autodiff) typically require that the machine learning models be differentiable.
We show that leveraging this redesigned Jacobian in the form of a differentiable "layer" in predictive models leads to improved performance in diverse applications.
arXiv Detail & Related papers (2021-10-04T16:04:46Z)
- ResNet-LDDMM: Advancing the LDDMM Framework Using Deep Residual Networks [86.37110868126548]
In this work, we make use of deep residual neural networks to solve the non-stationary ODE (flow equation) based on Euler's discretization scheme.
We illustrate these ideas on diverse registration problems of 3D shapes under complex topology-preserving transformations.
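The core idea, that each residual block acts as one Euler step of the flow ODE dx/dt = v(x, t), can be sketched in PyTorch. Layer sizes and the step count below are arbitrary illustrative choices, not the paper's architecture.

```python
import torch
import torch.nn as nn

class EulerFlow(nn.Module):
    """Residual network read as an Euler discretization of dx/dt = v(x, t):
    block k implements one step x_{k+1} = x_k + h * v_k(x_k)."""
    def __init__(self, dim=3, steps=8):
        super().__init__()
        self.h = 1.0 / steps  # integrate the flow over t in [0, 1]
        self.blocks = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 16), nn.Tanh(), nn.Linear(16, dim))
            for _ in range(steps)
        )

    def forward(self, x):
        for v in self.blocks:
            x = x + self.h * v(x)  # one Euler step of the ODE
        return x

pts = torch.randn(100, 3)   # e.g. points sampled from a 3D shape
warped = EulerFlow()(pts)   # smooth deformation of the point set
```

Training the per-step velocity fields v_k (here randomly initialized) is what aligns a source shape to a target in the registration setting.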
arXiv Detail & Related papers (2021-02-16T04:07:13Z)
- Efficient Learning of Generative Models via Finite-Difference Score Matching [111.55998083406134]
We present a generic strategy to efficiently approximate any-order directional derivative with finite difference.
Our approximation only involves function evaluations, which can be executed in parallel, and no gradient computations.
arXiv Detail & Related papers (2020-07-07T10:05:01Z)
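The finite-difference approximation of directional derivatives described above can be illustrated generically (this is not that paper's implementation): central stencils recover first- and second-order directional derivatives from function evaluations alone, and those evaluations are independent, so they can run in parallel.

```python
import numpy as np

def directional_derivative(f, x, v, order=1, eps=1e-3):
    """Central finite-difference stencils for first- and second-order
    directional derivatives of f at x along v; no gradients required."""
    if order == 1:
        return (f(x + eps * v) - f(x - eps * v)) / (2 * eps)
    if order == 2:
        return (f(x + eps * v) - 2 * f(x) + f(x - eps * v)) / eps**2
    raise ValueError("only orders 1 and 2 are sketched here")

# Example: f(x) = x0^2 * x1 at x = (1, 2) along v = (1, 0).
# True values: first derivative 2*x0*x1 = 4, second derivative 2*x1 = 4.
x = np.array([1.0, 2.0])
v = np.array([1.0, 0.0])
f = lambda z: float(z[0] ** 2 * z[1])
```

Because the test function is cubic, both central stencils are exact here up to floating-point rounding.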
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.