Sample-Efficient Bayesian Transfer Learning for Online Machine Parameter Optimization
- URL: http://arxiv.org/abs/2503.15928v2
- Date: Fri, 21 Mar 2025 10:32:21 GMT
- Title: Sample-Efficient Bayesian Transfer Learning for Online Machine Parameter Optimization
- Authors: Philipp Wagner, Tobias Nagel, Philipp Leube, Marco F. Huber
- Abstract summary: This work introduces a method to optimize the machine parameters in the system itself using a Bayesian optimization algorithm. By leveraging existing machine data, we use a transfer learning approach in order to identify an optimum with minimal iterations. We validate our approach on a laser machine for cutting sheet metal in the real world.
- Score: 5.467297536043163
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Correctly setting the parameters of a production machine is essential to improve product quality, increase efficiency, and reduce production costs while also supporting sustainability goals. Identifying optimal parameters involves an iterative process of producing an object and evaluating its quality. Minimizing the number of iterations is, therefore, desirable to reduce the costs associated with unsuccessful attempts. This work introduces a method to optimize the machine parameters in the system itself using a Bayesian optimization algorithm. By leveraging existing machine data, we use a transfer learning approach in order to identify an optimum with minimal iterations, resulting in a cost-effective transfer learning algorithm. We validate our approach on a laser machine for cutting sheet metal in the real world.
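The transfer-learning idea in the abstract can be sketched in a few lines: a Gaussian-process surrogate whose prior mean comes from the old machine's data, so Bayesian optimization on the new machine needs only a handful of trials. This is an illustrative toy sketch, not the authors' implementation; the cost curves, kernel, lengthscale, and the lower-confidence-bound acquisition rule are all assumptions made for the example.

```python
import numpy as np

def rbf(a, b, ls=0.3):
    # squared-exponential kernel on 1-D inputs (assumed lengthscale)
    return np.exp(-0.5 * ((a[:, None] - b[None, :]) / ls) ** 2)

def source_prior_mean(x):
    # mean function "learned" from the old machine's data; here we
    # simply reuse the old machine's (hypothetical) cost curve as prior
    return (x - 0.60) ** 2

def target_cost(x):
    # unknown cost of the new machine: slightly shifted optimum at 0.55
    return (x - 0.55) ** 2 + 0.05

def gp_posterior(xs, ys, xq, noise=1e-6):
    # GP regression with the transferred prior mean subtracted out
    m_s, m_q = source_prior_mean(xs), source_prior_mean(xq)
    K = rbf(xs, xs) + noise * np.eye(len(xs))
    Ks = rbf(xq, xs)
    mu = m_q + Ks @ np.linalg.solve(K, ys - m_s)
    var = 1.0 - np.einsum("ij,ji->i", Ks, np.linalg.solve(K, Ks.T))
    return mu, np.clip(var, 1e-12, None)

grid = np.linspace(0.0, 1.0, 101)
xs, ys = [0.0], [target_cost(0.0)]        # one initial trial on the new machine
for _ in range(8):                        # few iterations = few wasted parts
    mu, var = gp_posterior(np.array(xs), np.array(ys), grid)
    lcb = mu - 2.0 * np.sqrt(var)         # optimistic lower confidence bound
    lcb[np.isin(grid, xs)] = np.inf       # never re-run an already tried setting
    x_next = grid[np.argmin(lcb)]
    xs.append(x_next)
    ys.append(target_cost(x_next))

best = int(np.argmin(ys))
print(f"best parameter ~ {xs[best]:.2f}, cost ~ {ys[best]:.3f}")
```

Because the prior mean already points near the good region, the loop homes in on the new machine's optimum in under ten evaluations, whereas an uninformed search over the same grid would spend most of its budget exploring.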
Related papers
- Autocorrelated Optimize-via-Estimate: Predict-then-Optimize versus Finite-sample Optimal [2.0228793142608588]
Models that directly optimize for out-of-sample performance in the finite-sample regime have emerged as a promising alternative to traditional estimate-then-optimize approaches. We compare their performance in the context of autocorrelated uncertainties, specifically, under a Vector Autoregressive Moving Average (VARMA(p,q)) process.
arXiv Detail & Related papers (2026-02-02T09:49:51Z) - A Collaborative Process Parameter Recommender System for Fleets of Networked Manufacturing Machines -- with Application to 3D Printing [4.886682562411186]
3D printing farms consist of multiple networked 3D printers operating in parallel. Optimizing process parameters across a fleet of manufacturing machines, even of the same type, remains a challenge due to machine-to-machine variability. We introduce a machine learning-based collaborative recommender system that optimizes process parameters for each machine in a fleet by modeling the problem as a sequential matrix completion task.
arXiv Detail & Related papers (2025-06-13T21:56:53Z) - Self-Steering Optimization: Autonomous Preference Optimization for Large Language Models [79.84205827056907]
We present Self-Steering Optimization ($SSO$), an algorithm that autonomously generates high-quality preference data. $SSO$ employs a specialized optimization objective to build a data generator from the policy model itself, which is used to produce accurate and on-policy data. Our evaluation shows that $SSO$ consistently outperforms baselines in human preference alignment and reward optimization.
arXiv Detail & Related papers (2024-10-22T16:04:03Z) - Cost-Sensitive Multi-Fidelity Bayesian Optimization with Transfer of Learning Curve Extrapolation [55.75188191403343]
We introduce a utility function, predefined by each user, that describes the trade-off between cost and performance in BO.
We validate our algorithm on various LC datasets and find that it outperforms all previous multi-fidelity BO and transfer-BO baselines we consider.
arXiv Detail & Related papers (2024-05-28T07:38:39Z) - Adaptive Bayesian Optimization for High-Precision Motion Systems [2.073673208115137]
We propose a real-time, purely data-driven, model-free approach to adaptive control that tunes low-level controller parameters online.
We base our algorithm on GoOSE, an algorithm for safe and sample-efficient Bayesian optimization.
We evaluate the algorithm's performance on a real precision-motion system utilized in semiconductor industry applications.
arXiv Detail & Related papers (2024-04-22T21:58:23Z) - End-to-End Learning for Fair Multiobjective Optimization Under Uncertainty [55.04219793298687]
The Predict-Then-Optimize (PtO) paradigm in machine learning aims to maximize downstream decision quality.
This paper extends the PtO methodology to optimization problems with nondifferentiable Ordered Weighted Averaging (OWA) objectives.
It shows how optimization of OWA functions can be effectively integrated with parametric prediction for fair and robust optimization under uncertainty.
arXiv Detail & Related papers (2024-02-12T16:33:35Z) - TranDRL: A Transformer-Driven Deep Reinforcement Learning Enabled Prescriptive Maintenance Framework [58.474610046294856]
Industrial systems demand reliable predictive maintenance strategies to enhance operational efficiency and reduce downtime.
This paper introduces an integrated framework that leverages the capabilities of the Transformer model-based neural networks and deep reinforcement learning (DRL) algorithms to optimize system maintenance actions.
arXiv Detail & Related papers (2023-09-29T02:27:54Z) - Landscape-Sketch-Step: An AI/ML-Based Metaheuristic for Surrogate Optimization Problems [0.0]
We introduce a new method for global optimization in scenarios where extensive evaluations of the cost function are expensive, inaccessible, or even prohibitive.
The method, which we call Landscape-Sketch-and-Step (LSS), combines Machine Learning, Replica Optimization, and Reinforcement Learning techniques.
arXiv Detail & Related papers (2023-09-14T01:53:45Z) - Effective Pre-Training Objectives for Transformer-based Autoencoders [97.99741848756302]
We study trade-offs between efficiency, cost and accuracy of Transformer encoders.
We combine features of common objectives and create new effective pre-training approaches.
arXiv Detail & Related papers (2022-10-24T18:39:44Z) - Advanced Manufacturing Configuration by Sample-efficient Batch Bayesian Optimization [5.766036473197784]
The framework unifies a tailored acquisition function, a parallel acquisition procedure, and the integration of process information.
We apply the optimization approach to atmospheric plasma spraying in simulation and experiments.
arXiv Detail & Related papers (2022-05-24T06:45:06Z) - Optimal Parameter-free Online Learning with Switching Cost [47.415099037249085]
Parameter-freeness in online learning refers to the adaptivity of an algorithm with respect to the optimal decision in hindsight.
In this paper, we design such algorithms in the presence of switching cost - the latter penalizes the optimistic updates required by parameter-freeness.
We propose a simple yet powerful algorithm for Online Linear Optimization (OLO) with switching cost, which improves the existing suboptimal regret bound [ZCP22a] to the optimal rate.
arXiv Detail & Related papers (2022-05-13T18:44:27Z) - Bayesian Optimization for Selecting Efficient Machine Learning Models [53.202224677485525]
We present a unified Bayesian Optimization framework for jointly optimizing models for both prediction effectiveness and training efficiency.
Experiments on model selection for recommendation tasks indicate that models selected this way significantly improve training efficiency.
arXiv Detail & Related papers (2020-08-02T02:56:30Z) - Automatically Learning Compact Quality-aware Surrogates for Optimization Problems [55.94450542785096]
Solving optimization problems with unknown parameters requires learning a predictive model to predict the values of the unknown parameters and then solving the problem using these values.
Recent work has shown that including the optimization problem as a layer in the training pipeline improves the quality of downstream decisions.
We show that we can improve solution quality by learning a low-dimensional surrogate model of a large optimization problem.
arXiv Detail & Related papers (2020-06-18T19:11:54Z) - Towards Automatic Bayesian Optimization: A first step involving acquisition functions [0.0]
Bayesian optimization is the state-of-the-art technique for the optimization of black-box functions, i.e., functions whose analytical expression is not accessible.
We propose a first attempt at automatic Bayesian optimization by exploring several techniques that automatically tune the acquisition function.
arXiv Detail & Related papers (2020-03-21T12:22:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.