Improving Performance Insensitivity of Large-scale Multiobjective
Optimization via Monte Carlo Tree Search
- URL: http://arxiv.org/abs/2304.04071v2
- Date: Fri, 14 Apr 2023 04:46:23 GMT
- Title: Improving Performance Insensitivity of Large-scale Multiobjective
Optimization via Monte Carlo Tree Search
- Authors: Haokai Hong, Min Jiang, and Gary G. Yen
- Abstract summary: We propose an evolutionary algorithm for solving large-scale multiobjective optimization problems based on Monte Carlo tree search.
The proposed method samples the decision variables to construct new nodes on the Monte Carlo tree for optimization and evaluation.
It selects nodes with good evaluation for further search to reduce the performance sensitivity caused by large-scale decision variables.
- Score: 7.34812867861951
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The large-scale multiobjective optimization problem (LSMOP) is characterized
by simultaneously optimizing multiple conflicting objectives and involving
hundreds of decision variables. Many real-world engineering applications can
be modeled as LSMOPs; at the same time, such applications demand performance
insensitivity. This requirement means not only that each run of the algorithm
should produce good results, but also that the performance across multiple
runs should not fluctuate much, i.e., the algorithm should exhibit good
insensitivity. Considering that each run demands substantial computational
resources, it is essential to improve both the performance and the
insensitivity of large-scale multiobjective optimization algorithms. However,
existing large-scale multiobjective optimization algorithms solely focus on
improving the performance of the algorithms, leaving the insensitivity
characteristics unattended. In this work, we propose an evolutionary algorithm
for solving LSMOPs based on Monte Carlo tree search, called LMMOCTS,
which aims to improve the performance and insensitivity for large-scale
multiobjective optimization problems. The proposed method samples the decision
variables to construct new nodes on the Monte Carlo tree for optimization and
evaluation. It selects nodes with good evaluation for further search to reduce
the performance sensitivity caused by large-scale decision variables. We
compare the proposed algorithm with several state-of-the-art designs on
different benchmark functions. We also propose two metrics to measure the
sensitivity of the algorithm. The experimental results confirm the
effectiveness and performance insensitivity of the proposed design for solving
large-scale multiobjective optimization problems.
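The abstract describes the core loop only at a high level: tree nodes are built by sampling decision variables, candidates are evaluated at each node, and well-evaluated nodes are selected for further expansion. The sketch below is an assumption-laden illustration of that loop, not the authors' LMMOCTS implementation; the node structure, the scalarized toy objective, and all parameter names are invented for illustration.

```python
import random

DIM = 100     # large-scale setting: hundreds of decision variables
SAMPLE = 10   # number of variables re-sampled when a child node is created


def evaluate(x):
    """Toy bi-objective problem, scalarized for the sketch."""
    f1 = sum(v * v for v in x)
    f2 = sum((v - 1.0) ** 2 for v in x)
    return f1 + f2


class Node:
    """A tree node holding one candidate solution and its evaluation."""

    def __init__(self, solution):
        self.solution = solution
        self.score = evaluate(solution)

    def expand(self, rng):
        """Create a child by re-sampling a small subset of variables."""
        child = list(self.solution)
        for i in rng.sample(range(DIM), SAMPLE):
            child[i] = rng.uniform(0.0, 1.0)
        return Node(child)


def search(iterations=200, branch=4, seed=0):
    """Repeatedly expand the best-evaluated node, as the abstract outlines."""
    rng = random.Random(seed)
    root = Node([rng.uniform(0.0, 1.0) for _ in range(DIM)])
    frontier = [root]
    for _ in range(iterations):
        # Select the node with the best evaluation for further search.
        best = min(frontier, key=lambda n: n.score)
        frontier.extend(best.expand(rng) for _ in range(branch))
    return min(n.score for n in frontier)
```

Because each expansion touches only a small sample of the decision variables, the search perturbs the huge decision space in controlled increments, which is the mechanism the abstract credits for reducing performance sensitivity.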
Related papers
- Iterative or Innovative? A Problem-Oriented Perspective for Code Optimization [81.88668100203913]
Large language models (LLMs) have demonstrated strong capabilities in solving a wide range of programming tasks.
In this paper, we explore code optimization with a focus on performance enhancement, specifically aiming to optimize code for minimal execution time.
arXiv Detail & Related papers (2024-06-17T16:10:10Z) - Equitable and Fair Performance Evaluation of Whale Optimization
Algorithm [4.0814527055582746]
It is essential that all algorithms are evaluated exhaustively and intelligently.
Evaluating the effectiveness of optimization algorithms equitably and fairly is not an easy process, for various reasons.
arXiv Detail & Related papers (2023-09-04T06:32:02Z) - An Empirical Evaluation of Zeroth-Order Optimization Methods on
AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD)
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
arXiv Detail & Related papers (2022-10-27T01:58:10Z) - Improved Algorithms for Neural Active Learning [74.89097665112621]
We improve the theoretical and empirical performance of neural-network(NN)-based active learning algorithms for the non-parametric streaming setting.
We introduce two regret metrics, defined by minimizing the population loss, that are more suitable for active learning than the one used in state-of-the-art (SOTA) related work.
arXiv Detail & Related papers (2022-10-02T05:03:38Z) - Efficient Non-Parametric Optimizer Search for Diverse Tasks [93.64739408827604]
We present the first efficient, scalable, and general framework that can directly search on the tasks of interest.
Inspired by the innate tree structure of the underlying math expressions, we re-arrange the spaces into a super-tree.
We adopt an adaptation of the Monte Carlo method to tree search, equipped with rejection sampling and equivalent-form detection.
arXiv Detail & Related papers (2022-09-27T17:51:31Z) - Balancing Exploration and Exploitation for Solving Large-scale
Multiobjective Optimization via Attention Mechanism [18.852491892952514]
We propose a large-scale multiobjective optimization algorithm based on the attention mechanism, called LMOAM.
The attention mechanism will assign a unique weight to each decision variable, and LMOAM will use this weight to strike a balance between exploration and exploitation from the decision variable level.
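The blurb above states only that each decision variable receives a weight that balances exploration and exploitation at the variable level. The snippet below is a hypothetical illustration of one way such weights could act, not LMOAM itself: here a high weight shrinks a variable's mutation step (exploitation) while a low weight leaves large steps (exploration), an assumed rule chosen purely for the sketch.

```python
import random

def weighted_mutation(x, weights, rng, step=0.5):
    """Perturb each variable with a magnitude shrunk by its weight.

    weight 0.0 -> full-step exploration; weight 1.0 -> no perturbation
    (pure exploitation of the current value).
    """
    assert len(x) == len(weights)
    return [
        xi + step * (1.0 - wi) * rng.uniform(-1.0, 1.0)
        for xi, wi in zip(x, weights)
    ]

rng = random.Random(1)
x = [0.5] * 8
# Explore the first four variables, exploit (freeze) the last four.
w = [0.0, 0.0, 0.0, 0.0, 1.0, 1.0, 1.0, 1.0]
y = weighted_mutation(x, w, rng)
```

The per-variable granularity is the point: a single scalar exploration rate would perturb all hundreds of variables uniformly, whereas a weight vector lets the algorithm explore some dimensions while holding others steady.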
arXiv Detail & Related papers (2022-05-20T09:45:49Z) - A Simple Evolutionary Algorithm for Multi-modal Multi-objective
Optimization [0.0]
We introduce a steady-state evolutionary algorithm for solving multi-modal, multi-objective optimization problems (MMOPs).
We report its performance on 21 MMOPs from various test suites that are widely used for benchmarking using a low computational budget of 1000 function evaluations.
arXiv Detail & Related papers (2022-01-18T03:31:11Z) - Solving Large-Scale Multi-Objective Optimization via Probabilistic
Prediction Model [10.916384208006157]
An efficient LSMOP algorithm should have the ability to escape the local optimal solution from the huge search space.
Maintaining the diversity of the population is one of the effective ways to improve search efficiency.
We propose a probabilistic prediction model based on a trend prediction model and a generating-filtering strategy, called LT-PPM, to tackle the LSMOP.
arXiv Detail & Related papers (2021-07-16T09:43:35Z) - Towards Optimally Efficient Tree Search with Deep Learning [76.64632985696237]
This paper investigates the classical integer least-squares problem, which estimates integer signals from linear models.
The problem is NP-hard and often arises in diverse applications such as signal processing, bioinformatics, communications and machine learning.
We propose a general hyper-accelerated tree search (HATS) algorithm that employs a deep neural network to estimate the optimal heuristic for the underlying simplified memory-bounded A* algorithm.
arXiv Detail & Related papers (2021-01-07T08:00:02Z) - Generalized and Scalable Optimal Sparse Decision Trees [56.35541305670828]
We present techniques that produce optimal decision trees over a variety of objectives.
We also introduce a scalable algorithm that produces provably optimal results in the presence of continuous variables.
arXiv Detail & Related papers (2020-06-15T19:00:11Z) - Multi-objective beetle antennae search algorithm [4.847470451539327]
The proposed multi-objective beetle antennae search algorithm is tested using four well-selected benchmark functions.
The results show that the proposed multi-objective beetle antennae search algorithm has higher computational efficiency with satisfactory accuracy.
arXiv Detail & Related papers (2020-02-24T06:34:32Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.