Flow Battery Manifold Design with Heterogeneous Inputs Through Generative Adversarial Neural Networks
- URL: http://arxiv.org/abs/2508.08863v1
- Date: Tue, 12 Aug 2025 11:40:09 GMT
- Title: Flow Battery Manifold Design with Heterogeneous Inputs Through Generative Adversarial Neural Networks
- Authors: Eric Seng, Hugh O'Connor, Adam Boyce, Josh J. Bailey, Anton van Beek
- Abstract summary: We introduce a systematic framework for constructing training datasets tailored to generative models. We show how integrating generative models with Bayesian optimization can enhance the interpretability of the latent space of admissible designs. This work broadens the applicability of generative machine-learning models in system design by enhancing quality and reliability.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Generative machine learning has emerged as a powerful tool for design representation and exploration. However, its application is often constrained by the need for large datasets of existing designs and the lack of interpretability about what features drive optimality. To address these challenges, we introduce a systematic framework for constructing training datasets tailored to generative models and demonstrate how these models can be leveraged for interpretable design. The novelty of this work is twofold: (i) we present a systematic framework for generating archetypes with internally homogeneous but mutually heterogeneous inputs that can be used to generate a training dataset, and (ii) we show how integrating generative models with Bayesian optimization can enhance the interpretability of the latent space of admissible designs. These findings are validated by using the framework to design a flow battery manifold, demonstrating that it effectively captures the space of feasible designs, including novel configurations while enabling efficient exploration. This work broadens the applicability of generative machine-learning models in system designs by enhancing quality and reliability.
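The abstract's second contribution, Bayesian optimization over the latent space of a trained generative model, can be sketched in a few lines. The decoder and performance function below are hypothetical toy stand-ins (an identity decoder and a smooth quadratic objective), not the paper's flow-battery simulation; the loop shows only the general pattern of fitting a Gaussian-process surrogate over latent points and selecting the next latent sample by an upper-confidence-bound acquisition.

```python
import numpy as np

# Hypothetical stand-in for a trained generator's decoder: maps a 2-D latent
# vector to a "design". The real model would decode to a manifold geometry.
def decode(z):
    return z  # identity decoder keeps the sketch minimal

def performance(design):
    # Black-box objective (in the paper, a physics evaluation of the manifold);
    # here a toy quadratic with a known optimum at (0.3, -0.5).
    return -np.sum((design - np.array([0.3, -0.5])) ** 2)

def rbf(A, B, ls=0.5):
    # Squared-exponential kernel between two sets of latent points.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ls**2)

def gp_posterior(Z, y, Zq, noise=1e-6):
    # Zero-mean GP posterior mean and variance at query points Zq.
    K = rbf(Z, Z) + noise * np.eye(len(Z))
    Ks = rbf(Zq, Z)
    Kinv = np.linalg.inv(K)
    mu = Ks @ Kinv @ y
    var = 1.0 - np.einsum("ij,jk,ik->i", Ks, Kinv, Ks)
    return mu, np.clip(var, 1e-12, None)

rng = np.random.default_rng(0)
Z = rng.uniform(-1, 1, size=(5, 2))                 # initial latent samples
y = np.array([performance(decode(z)) for z in Z])

for _ in range(25):                                 # BO loop in latent space
    cand = rng.uniform(-1, 1, size=(256, 2))        # candidate latent points
    mu, var = gp_posterior(Z, y, cand)
    ucb = mu + 2.0 * np.sqrt(var)                   # upper-confidence bound
    z_next = cand[np.argmax(ucb)]                   # most promising latent point
    Z = np.vstack([Z, z_next])
    y = np.append(y, performance(decode(z_next)))

best = Z[np.argmax(y)]                              # best latent design found
```

Because the surrogate is fit in the latent space rather than the raw design space, inspecting which latent directions the acquisition favors is what gives the interpretability the abstract refers to.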
Related papers
- TabPFN for Zero-shot Parametric Engineering Design Generation [8.681307193373241]
We propose a zero-shot generation framework for parametric engineering design based on TabPFN. The proposed method generates design parameters sequentially, conditioned on target performance indicators. Compared with diffusion-based generative models, the proposed framework significantly reduces computational overhead and data requirements.
arXiv Detail & Related papers (2026-02-02T19:51:40Z) - GLUE: Generative Latent Unification of Expertise-Informed Engineering Models [3.005158583027536]
We introduce Generative Latent Unification of Expertise-Informed Engineering Models (GLUE). GLUE orchestrates pre-trained, frozen subsystem generators while enforcing system-level feasibility, optimality, and diversity. On a UAV design problem with five coupling constraints, we find that data-driven approaches yield diverse, high-performing designs but require large datasets to satisfy constraints reliably.
arXiv Detail & Related papers (2025-12-22T15:23:19Z) - High-Fidelity Scientific Simulation Surrogates via Adaptive Implicit Neural Representations [51.90920900332569]
Implicit neural representations (INRs) offer a compact and continuous framework for modeling spatially structured data. Recent approaches address this by introducing additional features along rigid geometric structures. We propose a simple yet effective alternative: Feature-Adaptive INR (FA-INR).
arXiv Detail & Related papers (2025-06-07T16:45:17Z) - Exploring the design space of deep-learning-based weather forecasting systems [56.129148006412855]
This paper systematically analyzes the impact of different design choices on deep-learning-based weather forecasting systems.
We study fixed-grid architectures such as UNet, fully convolutional architectures, and transformer-based models.
We propose a hybrid system that combines the strong performance of fixed-grid models with the flexibility of grid-invariant architectures.
arXiv Detail & Related papers (2024-10-09T22:25:50Z) - The Extrapolation Power of Implicit Models [2.3526338188342653]
Implicit models are put to the test across various extrapolation scenarios: out-of-distribution, geographical, and temporal shifts.
Our experiments consistently demonstrate significant performance advantage with implicit models.
arXiv Detail & Related papers (2024-07-19T16:01:37Z) - Implicitly Guided Design with PropEn: Match your Data to Follow the Gradient [52.2669490431145]
PropEn is inspired by 'matching', which enables implicit guidance without training a discriminator.
We show that training with a matched dataset approximates the gradient of the property of interest while remaining within the data distribution.
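The matching idea summarized above can be illustrated with a small numpy sketch. This is not the authors' code: the designs, the quadratic property, and the linear least-squares model standing in for the trained network are all hypothetical. Each design is paired with a nearby design of higher property value, a model is fit to predict the better partner, and applying that model repeatedly moves a design uphill without ever computing the property gradient explicitly.

```python
import numpy as np

rng = np.random.default_rng(2)

X = rng.uniform(-2, 2, size=(400, 2))      # toy designs
prop = -np.sum(X**2, axis=1)               # toy property: best at the origin

def match(X, prop, radius=0.6):
    # Pair each design with its best-property neighbor within `radius`.
    pairs = []
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        ok = (d < radius) & (prop > prop[i])
        if ok.any():
            j = np.argmax(np.where(ok, prop, -np.inf))
            pairs.append((i, j))
    return pairs

pairs = match(X, prop)
src = X[[i for i, _ in pairs]]
dst = X[[j for _, j in pairs]]

# Linear least-squares stand-in for the trained network: x' ~= W^T [x; 1].
A = np.hstack([src, np.ones((len(src), 1))])
W, *_ = np.linalg.lstsq(A, dst, rcond=None)

x = np.array([1.5, -1.5])                  # a poor starting design
for _ in range(10):                        # iterated refinement
    x = np.append(x, 1.0) @ W              # each step imitates the matched pairs
```

Because every training target lies inside the data distribution, the iterated map improves the property while staying near observed designs, which is the implicit-guidance property the summary describes.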
arXiv Detail & Related papers (2024-05-28T11:30:19Z) - Diffusion Model for Data-Driven Black-Box Optimization [54.25693582870226]
We focus on diffusion models, a powerful generative AI technology, and investigate their potential for black-box optimization.
We study two practical types of labels: 1) noisy measurements of a real-valued reward function and 2) human preference based on pairwise comparisons.
Our proposed method reformulates the design optimization problem into a conditional sampling problem, which allows us to leverage the power of diffusion models.
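The reformulation of design optimization as conditional sampling can be shown in miniature. The sketch below is a hypothetical toy, not the paper's method: instead of a learned diffusion model, the data distribution of valid designs is a known 1-D standard Gaussian, so its score is available in closed form, and a quadratic reward stands in for noisy measurements. Sampling from the reward-tilted distribution p(x) * exp(beta * r(x)) via Langevin dynamics illustrates how generation can be steered toward high-reward designs without leaving the data distribution.

```python
import numpy as np

rng = np.random.default_rng(1)

def score_prior(x):
    # Score (grad log-density) of the toy design prior N(0, 1).
    return -x

def grad_reward(x):
    # Gradient of the toy reward r(x) = -(x - 0.8)^2.
    return -2.0 * (x - 0.8)

beta, step, n_steps = 4.0, 0.01, 2000
x = rng.normal(size=500)                   # start from samples of the prior
for _ in range(n_steps):
    # Langevin update targeting p(x) * exp(beta * r(x)).
    g = score_prior(x) + beta * grad_reward(x)
    x = x + step * g + np.sqrt(2 * step) * rng.normal(size=x.shape)
```

For this Gaussian toy the tilted distribution is itself Gaussian with mean 2 * beta * 0.8 / (1 + 2 * beta) ~= 0.71 and variance 1 / (1 + 2 * beta), so the samples concentrate between the prior mode at 0 and the reward peak at 0.8; `beta` trades off reward against staying in-distribution.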
arXiv Detail & Related papers (2024-03-20T00:41:12Z) - Generative VS non-Generative Models in Engineering Shape Optimization [0.3749861135832073]
We compare the effectiveness and efficiency of generative and non-generative models in constructing design spaces.
Non-generative models produce robust latent spaces with no invalid designs, or significantly fewer, compared to generative models.
arXiv Detail & Related papers (2024-02-13T15:45:20Z) - Compositional Generative Inverse Design [69.22782875567547]
Inverse design, where we seek to design input variables in order to optimize an underlying objective function, is an important problem.
We show that by instead optimizing over the learned energy function captured by the diffusion model, we can avoid such adversarial examples.
In an N-body interaction task and a challenging 2D multi-airfoil design task, we demonstrate that by composing the learned diffusion model at test time, our method allows us to design initial states and boundary shapes.
arXiv Detail & Related papers (2024-01-24T01:33:39Z) - Latent Diffusion Models for Structural Component Design [11.342098118480802]
This paper proposes a framework for the generative design of structural components.
We employ a Latent Diffusion model to generate potential designs of a component that can satisfy a set of problem-specific loading conditions.
arXiv Detail & Related papers (2023-09-20T19:28:45Z) - Aligning Optimization Trajectories with Diffusion Models for Constrained Design Generation [17.164961143132473]
We introduce a learning framework that demonstrates the efficacy of aligning the sampling trajectory of diffusion models with the optimization trajectory derived from traditional physics-based methods.
Our method allows for generating feasible and high-performance designs in as few as two steps without the need for expensive preprocessing, external surrogate models, or additional labeled data.
Our results demonstrate that TA outperforms state-of-the-art deep generative models on in-distribution configurations and halves the inference computational cost.
arXiv Detail & Related papers (2023-05-29T09:16:07Z) - S2RMs: Spatially Structured Recurrent Modules [105.0377129434636]
We take a step towards models with dynamic structure that are capable of simultaneously exploiting both modular and temporal structures.
We find our models to be robust to the number of available views and better capable of generalization to novel tasks without additional training.
arXiv Detail & Related papers (2020-07-13T17:44:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.