Surrogate Modeling for the Design of Optimal Lattice Structures using Tensor Completion
- URL: http://arxiv.org/abs/2510.07474v1
- Date: Wed, 08 Oct 2025 19:20:59 GMT
- Title: Surrogate Modeling for the Design of Optimal Lattice Structures using Tensor Completion
- Authors: Shaan Pakala, Aldair E. Gongora, Brian Giera, Evangelos E. Papalexakis
- Abstract summary: We focus on the design of optimal lattice structures with regard to mechanical performance. In this work, we suggest the use of tensor completion as a surrogate model to accelerate the design of materials.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: When designing new materials, it is often necessary to design a material with specific desired properties. Unfortunately, as new design variables are added, the search space grows exponentially, which makes synthesizing and validating the properties of each material very impractical and time-consuming. In this work, we focus on the design of optimal lattice structures with regard to mechanical performance. Computational approaches, including the use of machine learning (ML) methods, have shown improved success in accelerating materials design. However, these ML methods still struggle in scenarios where training data (i.e., experimentally validated materials) are sampled non-uniformly across the design space. For example, an experimentalist might synthesize and validate certain materials more frequently because of convenience. For this reason, we suggest the use of tensor completion as a surrogate model to accelerate the design of materials in these atypical supervised learning scenarios. In our experiments, we show that tensor completion is superior to classic ML methods such as Gaussian Process and XGBoost under biased sampling of the search space, improving $R^2$ by around 5%. Furthermore, tensor completion still gives comparable performance under a uniformly random sampling of the entire search space.
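The abstract does not spell out which completion algorithm is used, so the following is only a rough sketch of the idea, with made-up toy data: a discretized three-variable design space is treated as a third-order tensor, only a subset of entries is "experimentally validated", and the missing properties are filled in by fitting a CP (CANDECOMP/PARAFAC) decomposition to the observed entries via alternating least squares.

```python
import numpy as np

def cp_complete(T, mask, rank=2, n_iters=50, seed=0):
    """Fill in missing entries of a 3rd-order tensor by fitting a rank-`rank`
    CP decomposition to the observed entries (mask == True) via alternating
    least squares."""
    rng = np.random.default_rng(seed)
    factors = [rng.standard_normal((dim, rank)) for dim in T.shape]
    for _ in range(n_iters):
        for mode in range(3):
            Tm = np.moveaxis(T, mode, 0)        # bring target mode to front
            Mm = np.moveaxis(mask, mode, 0)
            others = [f for m, f in enumerate(factors) if m != mode]
            # Design matrix: one row per combination of the other two modes
            design = np.einsum('jr,kr->jkr', *others).reshape(-1, rank)
            for i in range(Tm.shape[0]):
                obs = Mm[i].ravel()
                if obs.any():                    # least squares on observed rows only
                    factors[mode][i], *_ = np.linalg.lstsq(
                        design[obs], Tm[i].ravel()[obs], rcond=None)
    return np.einsum('ir,jr,kr->ijk', *factors)

# Hypothetical design space: 3 design variables discretized to an 8 x 9 x 7
# grid, with an exactly rank-2 property tensor as ground truth.
rng = np.random.default_rng(1)
truth = np.einsum('ir,jr,kr->ijk',
                  rng.random((8, 2)), rng.random((9, 2)), rng.random((7, 2)))
mask = rng.random(truth.shape) < 0.4   # only ~40% of designs are "validated"
T_hat = cp_complete(np.where(mask, truth, 0.0), mask, rank=2)
rmse_unseen = np.sqrt(np.mean((T_hat - truth)[~mask] ** 2))
```

Even if the mask is biased rather than uniform (e.g., one corner of the design grid is oversampled), the shared low-rank factors tie unobserved regions to the observed ones, which is the intuition behind using completion as a surrogate under non-uniform sampling.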
Related papers
- Tensor Methods: A Unified and Interpretable Approach for Material Design
We suggest the use of tensor completion methods as an all-in-one approach for interpretability and predictions. We study the effects of both types of surrogate models when we encounter training data from a non-uniform sampling of the design space. We find the best generalization comes from a tensor model, which is able to improve upon the baseline ML methods by up to 5% on aggregate $R^2$, and halve the error in some out-of-distribution regions.
arXiv Detail & Related papers (2026-02-11T00:30:39Z) - Reward driven discovery of the optimal microstructure representations with invariant variational autoencoders
Variational Autoencoders (VAEs) provide a powerful means of constructing such low-dimensional representations. VAEs are often optimized through trial-and-error and empirical analysis. We investigated reward-based strategies for evaluating latent space representations.
arXiv Detail & Related papers (2025-09-30T20:15:42Z) - Enhancing Experimental Efficiency in Materials Design: A Comparative Study of Taguchi and Machine Learning Methods
Materials design problems often require optimizing multiple variables, rendering full factorial exploration impractical. In this work, we demonstrate how machine learning (ML) methods can be used to overcome these limitations. We compare the performance of the Taguchi method against an active-learning-based Gaussian process regression (GPR) model in a wire arc additive manufacturing (WAAM) process.
arXiv Detail & Related papers (2025-06-04T13:04:29Z) - A Materials Foundation Model via Hybrid Invariant-Equivariant Architectures
Machine learning interatomic potentials (MLIPs) can predict energy, force, and stress of materials. A key design choice in MLIPs involves the trade-off between invariant and equivariant architectures. HIENet is a hybrid invariant-equivariant materials interatomic potential model that integrates both invariant and equivariant message passing layers.
arXiv Detail & Related papers (2025-02-25T18:01:05Z) - Tensor Completion for Surrogate Modeling of Material Property Prediction
We model the optimization of certain material properties as a tensor completion problem. We leverage the structure of our datasets and navigate the vast number of combinations of material configurations. Across a variety of material property prediction tasks, our experiments show tensor completion methods achieving 10-20% decreased error.
arXiv Detail & Related papers (2025-01-30T04:59:21Z) - Compositional Generative Inverse Design
Inverse design, where we seek to design input variables in order to optimize an underlying objective function, is an important problem.
We show that by instead optimizing over the learned energy function captured by the diffusion model, we can avoid such adversarial examples.
In an N-body interaction task and a challenging 2D multi-airfoil design task, we demonstrate that by composing the learned diffusion model at test time, our method allows us to design initial states and boundary shapes.
arXiv Detail & Related papers (2024-01-24T01:33:39Z) - FAENet: Frame Averaging Equivariant GNN for Materials Modeling
We introduce a flexible framework relying on stochastic frame averaging (SFA) to make any model E(3)-equivariant or invariant through data transformations.
We prove the validity of our method theoretically and empirically demonstrate its superior accuracy and computational scalability in materials modeling.
arXiv Detail & Related papers (2023-04-28T21:48:31Z) - A Pareto-optimal compositional energy-based model for sampling and optimization of protein sequences
Deep generative models have emerged as a popular machine learning-based approach for inverse problems in the life sciences.
These problems often require sampling new designs that satisfy multiple properties of interest in addition to learning the data distribution.
arXiv Detail & Related papers (2022-10-19T19:04:45Z) - Uncertainty-aware Mixed-variable Machine Learning for Materials Design
We survey frequentist and Bayesian approaches to uncertainty quantification of machine learning with mixed variables.
We examine the efficacy of the two models in the optimization of mathematical functions, as well as properties of structural and functional materials.
Our results provide practical guidance on choosing between frequentist and Bayesian uncertainty-aware machine learning models for mixed-variable BO in materials design.
arXiv Detail & Related papers (2022-07-11T16:37:17Z) - Toward Learning Robust and Invariant Representations with Alignment Regularization and Data Augmentation
This paper is motivated by a proliferation of options of alignment regularizations.
We evaluate the performances of several popular design choices along the dimensions of robustness and invariance.
We also formally analyze the behavior of alignment regularization to complement our empirical study under assumptions we consider realistic.
arXiv Detail & Related papers (2022-06-04T04:29:19Z) - Goal-directed Generation of Discrete Structures with Conditional Generative Models
We introduce a novel approach to directly optimize a reinforcement learning objective, maximizing an expected reward.
We test our methodology on two tasks: generating molecules with user-defined properties and identifying short Python expressions which evaluate to a given target value.
arXiv Detail & Related papers (2020-10-05T20:03:13Z)
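The last entry's objective, directly maximizing an expected reward over generated discrete structures, can be sketched with a toy REINFORCE loop. Everything here is a made-up stand-in (a 10-token vocabulary and a reward that pays off only for token 7) for scoring real molecules or Python expressions:

```python
import numpy as np

rng = np.random.default_rng(0)
vocab_size = 10
logits = np.zeros(vocab_size)   # trivial "generator": one categorical choice
lr = 0.5

def reward(token):
    # Hypothetical black-box score; real tasks would score a whole
    # generated molecule or expression instead of a single token.
    return 1.0 if token == 7 else 0.0

for _ in range(300):
    p = np.exp(logits - logits.max())
    p /= p.sum()
    tok = rng.choice(vocab_size, p=p)
    # Single-sample REINFORCE estimate of the gradient of E[reward]
    # with respect to the logits: reward * (onehot(tok) - p).
    g = -p
    g[tok] += 1.0
    logits += lr * reward(tok) * g

p = np.exp(logits - logits.max())
p /= p.sum()   # the policy now concentrates its mass on the rewarded token
```

The same update rule applies when the policy is a sequence model and the reward comes from a property predictor; the only change is that the log-probability gradient is taken over the whole generated sequence.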
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.