Generative Design of Ship Propellers using Conditional Flow Matching
- URL: http://arxiv.org/abs/2601.21637v1
- Date: Thu, 29 Jan 2026 12:40:37 GMT
- Title: Generative Design of Ship Propellers using Conditional Flow Matching
- Authors: Patrick Kruger, Rafael Diaz, Simon Hauschulz, Stefan Harries, Hanno Gottschalk
- Abstract summary: We explore the use of generative artificial intelligence (GenAI) for ship propeller design. We employ conditional flow matching to establish a bidirectional mapping between design parameters and simulated noise. We present examples of distinct propeller geometries that exhibit nearly identical performance characteristics.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we explore the use of generative artificial intelligence (GenAI) for ship propeller design. While traditional forward machine learning models predict the performance of mechanical components based on given design parameters, GenAI models aim to generate designs that achieve specified performance targets. In particular, we employ conditional flow matching to establish a bidirectional mapping between design parameters and simulated noise that is conditioned on performance labels. This approach enables the generation of multiple valid designs corresponding to the same performance targets by sampling over the noise vector. To support model training, we generate data using a vortex lattice method for numerical simulation and analyze the trade-off between model accuracy and the amount of available data. We further propose data augmentation using pseudo-labels derived from less data-intensive forward surrogate models, which can often improve overall model performance. Finally, we present examples of distinct propeller geometries that exhibit nearly identical performance characteristics, illustrating the versatility and potential of GenAI in engineering design.
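The conditional flow matching recipe described in the abstract can be sketched compactly. The example below is an assumption-laden toy, not the paper's implementation: it uses a 2-parameter "design", a scalar performance label, and a linear least-squares velocity model standing in for the neural network. It trains on linear interpolation paths between noise and data, then integrates the learned ODE from fresh noise, so different noise draws yield distinct designs conditioned on the same performance target.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "designs" x1 in R^2 with a scalar performance label c = x1[0] + x1[1].
n = 4000
x1 = rng.normal(size=(n, 2))
c = x1.sum(axis=1, keepdims=True)

# Conditional flow matching targets: along the linear path
# xt = (1 - t) * x0 + t * x1, the regression target is the
# constant velocity v = x1 - x0.
x0 = rng.normal(size=(n, 2))          # simulated noise
t = rng.uniform(size=(n, 1))
xt = (1 - t) * x0 + t * x1
v = x1 - x0

# Linear velocity model v_theta(xt, t, c) fit by least squares
# (a stand-in for the neural network a real implementation would use).
feats = np.hstack([xt, t, c, np.ones((n, 1))])
W, *_ = np.linalg.lstsq(feats, v, rcond=None)

def sample(c_target, n_steps=50, n_samples=3):
    """Integrate dx/dt = v_theta(x, t, c) from fresh noise (Euler steps)."""
    x = rng.normal(size=(n_samples, 2))
    cond = np.full((n_samples, 1), c_target)
    for k in range(n_steps):
        tk = np.full((n_samples, 1), k / n_steps)
        f = np.hstack([x, tk, cond, np.ones((n_samples, 1))])
        x = x + (f @ W) / n_steps
    return x

designs = sample(c_target=1.5)
# Each row is a distinct design; their label sums should lie roughly near
# the 1.5 target (the linear velocity model is only an approximation).
print(designs.sum(axis=1))
```

Sampling repeatedly with different noise draws mirrors the paper's observation that multiple distinct designs can map to nearly identical performance targets.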
Related papers
- TabPFN for Zero-shot Parametric Engineering Design Generation [8.681307193373241]
We propose a zero-shot generation framework for parametric engineering design based on TabPFN. The proposed method generates design parameters sequentially conditioned on target performance indicators. Compared with diffusion-based generative models, the proposed framework significantly reduces computational overhead and data requirements.
arXiv Detail & Related papers (2026-02-02T19:51:40Z) - Nonparametric Data Attribution for Diffusion Models [57.820618036556084]
Data attribution for generative models seeks to quantify the influence of individual training examples on model outputs. We propose a nonparametric attribution method that operates entirely on data, measuring influence via patch-level similarity between generated and training images.
arXiv Detail & Related papers (2025-10-16T03:37:16Z) - Jet: A Modern Transformer-Based Normalizing Flow [62.2573739835562]
We revisit the design of coupling-based normalizing flow models by carefully ablating prior design choices. We achieve state-of-the-art quantitative and qualitative performance with a much simpler architecture.
arXiv Detail & Related papers (2024-12-19T18:09:42Z) - Supervised Score-Based Modeling by Gradient Boosting [49.556736252628745]
We propose a Supervised Score-based Model (SSM), which can be viewed as a gradient boosting algorithm combined with score matching. We provide a theoretical analysis of learning and sampling for SSM to balance inference time and prediction accuracy. Our model outperforms existing models in both accuracy and inference time.
arXiv Detail & Related papers (2024-11-02T07:06:53Z) - Generative VS non-Generative Models in Engineering Shape Optimization [0.3749861135832073]
We compare the effectiveness and efficiency of generative and non-generative models in constructing design spaces.
Non-generative models produce robust latent spaces with no or significantly fewer invalid designs compared to generative models.
arXiv Detail & Related papers (2024-02-13T15:45:20Z) - Precision-Recall Divergence Optimization for Generative Modeling with GANs and Normalizing Flows [54.050498411883495]
We develop a novel training method for generative models, such as Generative Adversarial Networks and Normalizing Flows.
We show that achieving a specified precision-recall trade-off corresponds to minimizing a unique $f$-divergence from a family we call the PR-divergences.
Our approach improves the performance of existing state-of-the-art models like BigGAN in terms of either precision or recall when tested on datasets such as ImageNet.
arXiv Detail & Related papers (2023-05-30T10:07:17Z) - VTAE: Variational Transformer Autoencoder with Manifolds Learning [144.0546653941249]
Deep generative models have demonstrated successful applications in learning non-linear data distributions through a number of latent variables.
The nonlinearity of the generator implies that the latent space provides an unsatisfactory projection of the data space, which results in poor representation learning.
We show that geodesics and accurate computation can substantially improve the performance of deep generative models.
arXiv Detail & Related papers (2023-04-03T13:13:19Z) - Design Space Exploration and Explanation via Conditional Variational Autoencoders in Meta-model-based Conceptual Design of Pedestrian Bridges [52.77024349608834]
This paper provides a performance-driven design exploration framework to augment the human designer through a Conditional Variational Autoencoder (CVAE).
The CVAE is trained on 18,000 synthetically generated instances of a pedestrian bridge in Switzerland.
arXiv Detail & Related papers (2022-11-29T17:28:31Z) - Design Target Achievement Index: A Differentiable Metric to Enhance Deep Generative Models in Multi-Objective Inverse Design [4.091593765662773]
Design Target Achievement Index (DTAI) is a differentiable, tunable metric that scores a design's ability to achieve designer-specified minimum performance targets.
We apply DTAI to a Performance-Augmented Diverse GAN (PaDGAN) and demonstrate superior generative performance compared to a set of baseline Deep Generative Models.
arXiv Detail & Related papers (2022-05-06T04:14:34Z) - Early-Phase Performance-Driven Design using Generative Models [0.0]
This research introduces a novel method for performance-driven geometry generation that enables interaction directly in the 3D modeling environment.
The method uses Machine Learning techniques to train a generative model offline.
By navigating the generative model's latent space, geometries with the desired characteristics can be quickly generated.
arXiv Detail & Related papers (2021-07-19T01:25:11Z) - Multi-Objective Evolutionary Design of Composite Data-Driven Models [0.0]
The implemented approach is based on a parameter-free genetic algorithm for model design called GPComp@Free.
The experimental results confirm that a multi-objective approach to the model design allows achieving better diversity and quality of obtained models.
arXiv Detail & Related papers (2021-03-01T20:45:24Z) - Regularized Autoencoders via Relaxed Injective Probability Flow [35.39933775720789]
Invertible flow-based generative models are an effective method for learning to generate samples, while allowing for tractable likelihood computation and inference.
We propose a generative model based on probability flows that does away with the bijectivity requirement on the model and only assumes injectivity.
arXiv Detail & Related papers (2020-02-20T18:22:46Z)