Learning Low-Dimensional Embeddings for Black-Box Optimization
- URL: http://arxiv.org/abs/2505.01112v1
- Date: Fri, 02 May 2025 08:46:14 GMT
- Title: Learning Low-Dimensional Embeddings for Black-Box Optimization
- Authors: Riccardo Busetto, Manas Mejari, Marco Forgione, Alberto Bemporad, Dario Piga
- Abstract summary: Black-box optimization (BBO) provides a valuable alternative to gradient-based methods. BBO often struggles with high-dimensional problems and limited trial budgets. We propose a novel approach based on meta-learning to pre-compute a reduced-dimensional manifold where optimal points lie.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: When gradient-based methods are impractical, black-box optimization (BBO) provides a valuable alternative. However, BBO often struggles with high-dimensional problems and limited trial budgets. In this work, we propose a novel approach based on meta-learning to pre-compute a reduced-dimensional manifold where optimal points lie for a specific class of optimization problems. When optimizing a new problem instance sampled from the class, black-box optimization is carried out in the reduced-dimensional space, effectively reducing the effort required for finding near-optimal solutions.
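To make the two-stage recipe concrete, here is a minimal sketch. The specific choices below are illustrative assumptions, not the paper's method: a linear manifold learned by PCA on the optimizers of past instances, a toy quadratic problem class, and plain random search as the black-box optimizer in the reduced space.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Meta-training phase: gather optimal points from past instances. ---
# Toy problem class: quadratics whose minima all lie on a hidden d-dim
# linear subspace of R^D (known in closed form here for brevity).
D, d, n_tasks = 20, 2, 200
basis = rng.standard_normal((D, d)) / np.sqrt(D)              # hidden structure
past_optima = (basis @ rng.standard_normal((d, n_tasks))).T   # (n_tasks, D)

# Learn a low-dimensional manifold of optima (linear here, via PCA).
mean = past_optima.mean(axis=0)
_, _, Vt = np.linalg.svd(past_optima - mean, full_matrices=False)
P = Vt[:d]                                                    # (d, D) basis

# --- Online phase: optimize a new instance in the reduced space. ---
x_star = basis @ rng.standard_normal(d)                       # unknown optimum
f = lambda x: float(np.sum((x - x_star) ** 2))                # black-box objective

best_val = np.inf
for _ in range(100):                                          # small trial budget
    z = rng.uniform(-3, 3, size=d)                            # search in R^d, not R^D
    best_val = min(best_val, f(mean + P.T @ z))               # decode, then evaluate

print(f"best value after 100 trials in {d}-D: {best_val:.4f}")
```

Because the search happens in the d-dimensional latent space rather than the original D-dimensional one, the trial budget goes much further; the paper's contribution is learning such a manifold via meta-learning, not the PCA/random-search stand-ins used here.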
Related papers
- An Adaptive Dropout Approach for High-Dimensional Bayesian Optimization [0.0]
We propose AdaDropout to tackle high-dimensional challenges and improve solution quality. It achieves superior results when compared with state-of-the-art high-dimensional Bayesian optimization approaches.
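As a hedged illustration of the dimension-dropout pattern (AdaDropout's adaptive rule is not reproduced; a fixed number of active dimensions and random search stand in for the Bayesian optimization machinery):

```python
import numpy as np

rng = np.random.default_rng(1)
D, k, budget = 50, 5, 200
f = lambda x: float(np.sum(x ** 2))     # toy high-dimensional objective

incumbent = rng.uniform(-3, 3, D)
best_val = f(incumbent)
for _ in range(budget):
    active = rng.choice(D, size=k, replace=False)   # dims kept this round
    cand = incumbent.copy()                         # dropped dims <- incumbent
    cand[active] = rng.uniform(-3, 3, size=k)       # search only k dims
    val = f(cand)
    if val < best_val:
        incumbent, best_val = cand, val

print(f"best value: {best_val:.4f}")
```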
arXiv Detail & Related papers (2025-04-15T16:23:25Z)
- Scalable Min-Max Optimization via Primal-Dual Exact Pareto Optimization
We propose a smooth variant of the min-max problem based on the augmented Lagrangian. The proposed algorithm scales better with the number of objectives than subgradient-based strategies.
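The paper builds its smooth surrogate from the augmented Lagrangian; as a rough stand-in, the sketch below smooths the inner max with a log-sum-exp bound instead and minimizes it by finite-difference gradient descent (all constants are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Two competing objectives; the min-max problem is min_x max_i f_i(x).
anchors = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
fs = lambda x: np.array([np.sum((x - a) ** 2) for a in anchors])

tau = 0.05                                    # smoothing temperature

def smooth_max(v):
    """Stable log-sum-exp: a smooth upper bound on max(v)."""
    m = v.max()
    return m + tau * np.log(np.sum(np.exp((v - m) / tau)))

x, h, lr = rng.standard_normal(2), 1e-5, 0.01
for _ in range(2000):                         # finite-difference descent
    g = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (smooth_max(fs(x + e)) - smooth_max(fs(x - e))) / (2 * h)
    x -= lr * g

print("approx. min-max point:", np.round(x, 3))   # near (0.5, 0.5)
```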
arXiv Detail & Related papers (2025-03-16T11:05:51Z)
- Posterior Inference with Diffusion Models for High-dimensional Black-box Optimization
Generative models have emerged to solve black-box optimization problems. We introduce DiBO, a novel framework for solving high-dimensional black-box optimization problems. Our method outperforms state-of-the-art baselines across various synthetic and real-world black-box optimization tasks.
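A heavily simplified sketch of the posterior-inference loop: fit a cheap surrogate to the evaluated points, treat exp(surrogate / T) as an unnormalized posterior over promising inputs, and sample candidates from it. Random-walk Metropolis and a ridge surrogate stand in for the paper's diffusion model; every choice here is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(3)
D = 10
f = lambda x: -float(np.sum(x ** 2))          # maximize (toy black box)

# Points evaluated so far
X = rng.uniform(-2, 2, size=(64, D))
y = np.array([f(x) for x in X])

# Crude surrogate: ridge regression on (x, x^2) features.
Phi = np.hstack([X, X ** 2, np.ones((len(X), 1))])
w = np.linalg.solve(Phi.T @ Phi + 1e-3 * np.eye(Phi.shape[1]), Phi.T @ y)
surrogate = lambda x: float(np.hstack([x, x ** 2, 1.0]) @ w)

# Posterior over inputs ~ exp(surrogate / T); sample with random-walk
# Metropolis (a stand-in for the paper's diffusion-based sampler).
T, x = 0.1, X[y.argmax()].copy()
for _ in range(500):
    prop = x + 0.2 * rng.standard_normal(D)
    if np.log(rng.uniform()) < (surrogate(prop) - surrogate(x)) / T:
        x = prop

print("surrogate value at next candidate:", surrogate(x))
```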
arXiv Detail & Related papers (2025-02-24T04:19:15Z)
- I3S: Importance Sampling Subspace Selection for Low-Rank Optimization in LLM Pretraining [50.89661053183944]
Low-rank optimization has emerged as a promising approach to enabling memory-efficient training of large language models (LLMs). Existing low-rank optimization methods typically project gradients onto a low-rank subspace, reducing the memory cost of storing optimizer states. We propose importance sampling subspace selection (I3S) for low-rank optimization, which theoretically offers a convergence rate comparable to the dominant subspace approach.
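A minimal numpy sketch of the subspace-selection step only (the sampling distribution proportional to squared singular values is an assumption for illustration, and the surrounding optimizer-state machinery is omitted):

```python
import numpy as np

rng = np.random.default_rng(4)
m, n, r = 64, 32, 4
G = rng.standard_normal((m, n))              # stand-in gradient matrix

U, s, Vt = np.linalg.svd(G, full_matrices=False)

# The dominant-subspace choice would be idx = np.arange(r); importance
# sampling instead draws r directions with probability ~ s_i^2.
p = s ** 2 / np.sum(s ** 2)
idx = rng.choice(len(s), size=r, replace=False, p=p)

P = U[:, idx]                                # (m, r) sampled basis
G_low = P.T @ G                              # low-rank projected gradient
G_back = P @ G_low                           # lift back for the weight update

print("projection error:", np.linalg.norm(G - G_back))
```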
arXiv Detail & Related papers (2025-02-09T06:30:19Z)
- High-Dimensional Bayesian Optimization Using Both Random and Supervised Embeddings [0.6291443816903801]
This paper proposes a high-dimensional optimization method incorporating linear embedding subspaces of small dimension. The resulting BO method adaptively combines both random and supervised linear embeddings. The obtained results show the high potential of EGORSE to solve high-dimensional black-box optimization problems.
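A toy sketch of alternating between the two embedding types; random search around the incumbent stands in for the EGO/BO inner loop, and PCA on the best evaluated points stands in for the supervised embedding (both are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(5)
D, d, budget = 30, 3, 150
x_star = rng.uniform(-1, 1, D)
f = lambda x: float(np.sum((x - x_star) ** 2))

X, y = [], []
best_x, best_val = np.zeros(D), f(np.zeros(D))
for t in range(budget):
    if t % 2 == 0 or len(X) < 10:
        A = rng.standard_normal((D, d)) / np.sqrt(d)    # random embedding
    else:
        top = np.argsort(y)[: max(5, len(y) // 5)]      # best points so far
        pts = np.array(X)[top]
        _, _, Vt = np.linalg.svd(pts - pts.mean(0), full_matrices=False)
        A = Vt[:d].T                                    # supervised embedding
    z = rng.uniform(-2, 2, d)
    x = np.clip(best_x + A @ z, -2, 2)                  # search near incumbent
    val = f(x)
    X.append(x); y.append(val)
    if val < best_val:
        best_x, best_val = x, val

print(f"best value: {best_val:.4f}")
```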
arXiv Detail & Related papers (2025-02-02T16:57:05Z)
- BOIDS: High-dimensional Bayesian Optimization via Incumbent-guided Direction Lines and Subspace Embeddings
We introduce BOIDS, a novel high-dimensional BO algorithm that guides optimization by a sequence of one-dimensional direction lines. We also propose an adaptive selection technique to identify the most promising lines for each round of line-based optimization. Our experimental results show that BOIDS outperforms state-of-the-art baselines on various synthetic and real-world benchmark problems.
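A stripped-down sketch of line-based search: each round builds a line through the incumbent and optimizes along it. Grid search replaces the paper's per-line BO, and the random line construction below is a simplification of the incumbent-guided and adaptive-selection machinery:

```python
import numpy as np

rng = np.random.default_rng(6)
D, budget = 20, 30
x_star = rng.uniform(-1, 1, D)
f = lambda x: float(np.sum((x - x_star) ** 2))

incumbent = rng.uniform(-2, 2, D)
best_val = f(incumbent)
for _ in range(budget):
    # Incumbent-guided line: from a random probe toward the incumbent.
    probe = rng.uniform(-2, 2, D)
    d_vec = incumbent - probe
    d_vec /= np.linalg.norm(d_vec)
    # One-dimensional search along the line through the incumbent.
    ts = np.linspace(-1.5, 1.5, 11)
    cands = incumbent + ts[:, None] * d_vec
    vals = np.array([f(c) for c in cands])
    if vals.min() < best_val:
        best_val = float(vals.min())
        incumbent = cands[vals.argmin()]

print(f"best value: {best_val:.4f}")
```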
arXiv Detail & Related papers (2024-12-17T13:51:24Z)
- Sharpness-Aware Black-Box Optimization
We propose a Sharpness-Aware Black-box Optimization (SABO) algorithm, which applies a sharpness-aware minimization strategy to improve model generalization.
Empirically, extensive experiments on black-box prompt fine-tuning tasks demonstrate the effectiveness of the proposed SABO method in improving model generalization performance.
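SABO itself operates on a reparameterized Gaussian distribution over solutions; the sketch below shows only the underlying sharpness-aware pattern combined with zeroth-order (Gaussian-smoothing) gradient estimates, which is an illustrative simplification:

```python
import numpy as np

rng = np.random.default_rng(7)
D = 10
f = lambda x: float(np.sum(x ** 2) + 0.1 * np.sum(np.sin(20 * x)))  # sharp wiggles

def zo_grad(x, n=20, mu=1e-2):
    """Zeroth-order gradient estimate via Gaussian smoothing."""
    g = np.zeros_like(x)
    for _ in range(n):
        u = rng.standard_normal(D)
        g += (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u
    return g / n

x, rho, lr = rng.uniform(-1, 1, D), 0.05, 0.02
for _ in range(200):
    g = zo_grad(x)
    eps = rho * g / (np.linalg.norm(g) + 1e-12)   # ascend to a sharp neighbor
    g_sam = zo_grad(x + eps)                      # gradient at perturbed point
    x -= lr * g_sam                               # sharpness-aware step

print(f"f(x) = {f(x):.4f}")
```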
arXiv Detail & Related papers (2024-10-16T11:08:06Z)
- Learning Joint Models of Prediction and Optimization
The Predict-Then-Optimize framework uses machine learning models to predict unknown parameters of an optimization problem from features before solving.
This paper proposes an alternative method, in which optimal solutions are learned directly from the observable features by joint predictive models.
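A toy contrast with the two-stage pipeline: instead of predicting the problem parameters and then solving, a joint model maps features straight to solutions and is trained on the downstream objective. The quadratic program below makes the optimal solution available in closed form, which keeps the sketch short; all of it is illustrative:

```python
import numpy as np

rng = np.random.default_rng(8)
n, p, d = 200, 5, 3

# Per-instance problem: min_x ||x - c(z)||^2, whose optimum is x* = c(z),
# where the parameter c(z) depends on observed features z.
C = rng.standard_normal((d, p))
Z = rng.standard_normal((n, p))
c_true = Z @ C.T                         # (n, d) true per-instance parameters

# Joint model: predict the *solution* x directly from features z, trained
# on the downstream objective rather than on parameter prediction error.
W = np.zeros((d, p))
lr = 0.1
for _ in range(500):
    X_hat = Z @ W.T                      # predicted solutions
    grad = 2 * (X_hat - c_true).T @ Z / n   # gradient of mean objective in W
    W -= lr * grad

print("mean objective:", np.mean(np.sum((Z @ W.T - c_true) ** 2, axis=1)))
```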
arXiv Detail & Related papers (2024-09-07T19:52:14Z)
- Discovering Preference Optimization Algorithms with and for Large Language Models
Offline preference optimization is a key method for enhancing and controlling the quality of Large Language Model (LLM) outputs.
We perform objective discovery to automatically find new state-of-the-art preference optimization algorithms without (expert) human intervention.
Experiments demonstrate the state-of-the-art performance of DiscoPOP, a novel algorithm that adaptively blends logistic and exponential losses.
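An illustrative take on such a blended loss (the gating below is loosely modeled on DiscoPOP's discovered log-ratio-modulated loss; the exact published form may differ):

```python
import numpy as np

def sigmoid(t):
    return 1.0 / (1.0 + np.exp(-t))

def blended_preference_loss(rho, tau=0.05):
    """Illustrative blend of logistic and exponential preference losses.

    rho: beta-scaled log-ratio margin between chosen and rejected responses.
    The mixing weight is a sigmoid gate on rho (an assumption here).
    """
    logistic = -np.log(sigmoid(rho))          # DPO-style logistic loss
    exponential = np.exp(-rho)                # exponential loss
    w = sigmoid(rho / tau)                    # margin-dependent gate
    return w * logistic + (1.0 - w) * exponential

rho = np.linspace(-2, 2, 5)
print(np.round(blended_preference_loss(rho), 3))
```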
arXiv Detail & Related papers (2024-06-12T16:58:41Z)
- End-to-End Learning for Fair Multiobjective Optimization Under Uncertainty [55.04219793298687]
The Predict-Then-Optimize (PtO) paradigm in machine learning aims to maximize downstream decision quality.
This paper extends the PtO methodology to optimization problems with nondifferentiable Ordered Weighted Averaging (OWA) objectives.
It shows how optimization of OWA functions can be effectively integrated with parametric prediction for fair and robust optimization under uncertainty.
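For reference, the OWA aggregation itself is a one-liner: sort the outcomes, then take a weighted sum, so decreasing weights emphasize the worst case, and the function is piecewise-linear and nondifferentiable at ties (the weights below are illustrative):

```python
import numpy as np

def owa(costs, weights):
    """Ordered Weighted Averaging: weights apply to the *sorted* costs
    (largest first), so decreasing weights emphasize the worst outcome,
    yielding a fairness-inducing objective."""
    return float(np.dot(weights, np.sort(costs)[::-1]))

# Decreasing weights interpolate between min-max (w = [1, 0, 0]) and the
# plain mean (uniform weights).
w = np.array([0.5, 0.3, 0.2])
print(owa(np.array([4.0, 1.0, 2.0]), w))   # 0.5*4 + 0.3*2 + 0.2*1 = 2.8
```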
arXiv Detail & Related papers (2024-02-12T16:33:35Z)