A Tailored NSGA-III Instantiation for Flexible Job Shop Scheduling
- URL: http://arxiv.org/abs/2004.06564v1
- Date: Tue, 14 Apr 2020 14:49:36 GMT
- Title: A Tailored NSGA-III Instantiation for Flexible Job Shop Scheduling
- Authors: Yali Wang, Bas van Stein, Michael T.M. Emmerich, Thomas Bäck
- Abstract summary: A customized evolutionary algorithm is proposed to solve the multi-objective flexible job shop scheduling problem.
Different local search strategies are employed to explore the neighborhood for better solutions.
The experimental results show excellent performance at a smaller computing budget.
- Score: 18.401817124823832
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A customized multi-objective evolutionary algorithm (MOEA) is proposed for
the multi-objective flexible job shop scheduling problem (FJSP). It uses smart
initialization approaches to enrich the first generated population, and
proposes various crossover operators to create a better diversity of offspring.
In particular, the MIP-EGO configurator, which can tune algorithm parameters, is
adopted to automatically tune operator probabilities. Furthermore, different
local search strategies are employed to explore the neighborhood for better
solutions. In general, the algorithm enhancement strategy can be integrated
with any standard EMO algorithm. In this paper, it has been combined with
NSGA-III to solve benchmark multi-objective FJSPs, which an off-the-shelf
implementation of NSGA-III is not capable of solving. The experimental
results show excellent performance at a smaller computing budget.
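The paper does not publish code, but the problem setup can be illustrated with a common FJSP encoding that MOEAs such as the tailored NSGA-III operate on: an operation-sequence vector plus a machine-assignment vector, decoded into a schedule. The instance, names, and decoder below are a hypothetical toy sketch, not the authors' implementation:

```python
import random

# Hypothetical toy FJSP instance: JOBS[j][o] maps machine -> processing time
# for operation o of job j (flexible: each operation has several machine options).
JOBS = [
    [{0: 3, 1: 5}, {1: 2, 2: 4}],   # job 0: two operations
    [{0: 4, 2: 3}, {0: 2, 1: 6}],   # job 1: two operations
]

def random_chromosome(jobs):
    """Two-vector encoding: an operation sequence (job ids, each repeated
    once per operation) and a machine assignment for every operation."""
    seq = [j for j, ops in enumerate(jobs) for _ in ops]
    random.shuffle(seq)
    assign = [[random.choice(list(op)) for op in ops] for ops in jobs]
    return seq, assign

def makespan(jobs, seq, assign):
    """Decode: schedule operations in sequence order, respecting job
    precedence and machine availability; return the makespan."""
    job_ready = [0] * len(jobs)   # earliest start of each job's next operation
    mach_ready = {}               # time each machine becomes free
    next_op = [0] * len(jobs)
    for j in seq:
        o = next_op[j]
        m = assign[j][o]
        start = max(job_ready[j], mach_ready.get(m, 0))
        end = start + jobs[j][o][m]
        job_ready[j] = end
        mach_ready[m] = end
        next_op[j] = o + 1
    return max(job_ready)

random.seed(0)
seq, assign = random_chromosome(JOBS)
print(makespan(JOBS, seq, assign))
```

An evolutionary algorithm would evolve `(seq, assign)` pairs with crossover and mutation, using decoders like this one to evaluate objectives such as makespan.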
Related papers
- Decoding-Time Language Model Alignment with Multiple Objectives [116.42095026960598]
Existing methods primarily focus on optimizing LMs for a single reward function, limiting their adaptability to varied objectives.
Here, we propose $\textbf{multi-objective decoding (MOD)}$, a decoding-time algorithm that outputs the next token from a linear combination of predictions.
We show why existing approaches can be sub-optimal even in natural settings and obtain optimality guarantees for our method.
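The core decoding-time idea can be sketched in a few lines: combine the per-objective next-token scores linearly and pick the best token. This toy treats each objective as a fixed log-probability vector rather than a reward-conditioned model, so it only illustrates the mixing step, not the MOD method itself:

```python
import math

def mixed_next_token(logprob_sets, weights):
    """Decoding-time mixing sketch: form the weighted linear combination of
    per-objective next-token log-probabilities and greedily pick the argmax.
    logprob_sets[k][t] is objective k's log-probability of token t."""
    vocab = len(logprob_sets[0])
    scores = [
        sum(w * lp[t] for w, lp in zip(weights, logprob_sets))
        for t in range(vocab)
    ]
    return max(range(vocab), key=scores.__getitem__)

# Toy example: two objectives over a 3-token vocabulary.
helpful = [math.log(0.7), math.log(0.2), math.log(0.1)]
concise = [math.log(0.1), math.log(0.3), math.log(0.6)]
print(mixed_next_token([helpful, concise], [0.5, 0.5]))
```

Changing the weights at decoding time shifts which objective dominates token choice, without retraining any model.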
arXiv Detail & Related papers (2024-06-27T02:46:30Z)
- A new simplified MOPSO based on Swarm Elitism and Swarm Memory: MO-ETPSO [0.0]
Elitist PSO (MO-ETPSO) is adapted for multi-objective optimization problems.
The proposed algorithm integrates core strategies from the well-established NSGA-II approach.
A novel aspect of the algorithm is the introduction of a swarm memory and swarm elitism.
arXiv Detail & Related papers (2024-02-20T09:36:18Z)
- Multi-objective Binary Coordinate Search for Feature Selection [0.24578723416255746]
We propose the binary multi-objective coordinate search (MOCS) algorithm to solve large-scale feature selection problems.
Results indicate the significant superiority of our method over NSGA-II, on five real-world large-scale datasets.
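The coordinate-search idea behind such feature-selection methods is simple to sketch: flip one feature in or out at a time and keep improving flips. The MOCS paper handles multiple objectives; this hypothetical toy minimizes a single scalar objective to show just the coordinate-flip loop:

```python
def coordinate_search(evaluate, n_bits, max_sweeps=10):
    """Single-objective sketch of binary coordinate search: start from the
    all-zeros feature mask, flip one coordinate at a time, and keep the
    flip whenever it improves (lowers) the objective."""
    mask = [0] * n_bits
    best = evaluate(mask)
    for _ in range(max_sweeps):
        improved = False
        for i in range(n_bits):
            mask[i] ^= 1                 # flip feature i in or out
            score = evaluate(mask)
            if score < best:
                best, improved = score, True
            else:
                mask[i] ^= 1             # revert a non-improving flip
        if not improved:
            break                        # a full sweep changed nothing
    return mask, best

# Toy objective: want exactly features {1, 3} selected.
target = [0, 1, 0, 1]
cost = lambda m: sum(a != b for a, b in zip(m, target))
print(coordinate_search(cost, 4))
```

Each sweep costs one evaluation per feature, which is what makes the approach attractive for large-scale feature selection.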
arXiv Detail & Related papers (2024-02-20T00:50:26Z)
- PS-AAS: Portfolio Selection for Automated Algorithm Selection in Black-Box Optimization [4.842307002343618]
The performance of automated algorithm selection depends on the portfolio of algorithms to choose from.
In practice, probably the most common way to choose the algorithms for the portfolio is a greedy selection of the algorithms that perform well in some reference tasks of interest.
Our proposed method creates algorithm behavior meta-representations, constructs a graph from a set of algorithms based on their meta-representation similarity, and applies a graph algorithm to select a final portfolio of diverse, representative, and non-redundant algorithms.
arXiv Detail & Related papers (2023-10-14T12:13:41Z)
- Maximize to Explore: One Objective Function Fusing Estimation, Planning, and Exploration [87.53543137162488]
We propose an easy-to-implement online reinforcement learning (online RL) framework called MEX.
MEX integrates estimation and planning components while automatically balancing exploration and exploitation.
It can outperform baselines by a stable margin in various MuJoCo environments with sparse rewards.
arXiv Detail & Related papers (2023-05-29T17:25:26Z)
- HKNAS: Classification of Hyperspectral Imagery Based on Hyper Kernel Neural Architecture Search [104.45426861115972]
We propose to directly generate structural parameters by utilizing the specifically designed hyper kernels.
We obtain three kinds of networks to separately conduct pixel-level or image-level classifications with 1-D or 3-D convolutions.
A series of experiments on six public datasets demonstrate that the proposed methods achieve state-of-the-art results.
arXiv Detail & Related papers (2023-04-23T17:27:40Z)
- OFA$^2$: A Multi-Objective Perspective for the Once-for-All Neural Architecture Search [79.36688444492405]
Once-for-All (OFA) is a Neural Architecture Search (NAS) framework designed to address the problem of searching efficient architectures for devices with different resource constraints.
We aim to give one step further in the search for efficiency by explicitly conceiving the search stage as a multi-objective optimization problem.
arXiv Detail & Related papers (2023-03-23T21:30:29Z)
- A Simple Evolutionary Algorithm for Multi-modal Multi-objective Optimization [0.0]
We introduce a steady-state evolutionary algorithm for solving multi-modal, multi-objective optimization problems (MMOPs).
We report its performance on 21 MMOPs from various test suites that are widely used for benchmarking using a low computational budget of 1000 function evaluations.
arXiv Detail & Related papers (2022-01-18T03:31:11Z)
- Result Diversification by Multi-objective Evolutionary Algorithms with Theoretical Guarantees [94.72461292387146]
We propose to reformulate the result diversification problem as a bi-objective search problem and solve it with a multi-objective evolutionary algorithm (EA).
We theoretically prove that the GSEMO can achieve the optimal polynomial-time approximation ratio, $1/2$.
When the objective function changes dynamically, the GSEMO can maintain this approximation ratio in polynomial running time, addressing the open question posed by Borodin et al.
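GSEMO itself is a very small algorithm and can be sketched directly: keep an archive of mutually non-dominated solutions, mutate a random archive member by independent bit flips, and insert the child if nothing in the archive dominates it. The bi-objective toy problem below is hypothetical, chosen only so the sketch runs end to end:

```python
import random

def dominates(a, b):
    """Pareto dominance for maximization: a is at least as good everywhere
    and strictly better somewhere."""
    return all(x >= y for x, y in zip(a, b)) and a != b

def gsemo(objectives, n_bits, iters=2000, seed=0):
    """Sketch of GSEMO on binary strings."""
    rng = random.Random(seed)
    x = [0] * n_bits
    archive = [(tuple(x), objectives(x))]
    for _ in range(iters):
        parent = list(rng.choice(archive)[0])
        # standard bit-wise mutation: flip each bit with probability 1/n
        child = [b ^ (rng.random() < 1 / n_bits) for b in parent]
        fc = objectives(child)
        if any(dominates(f, fc) for _, f in archive):
            continue                      # child is dominated: discard
        archive = [(s, f) for s, f in archive if not dominates(fc, f)]
        if (tuple(child), fc) not in archive:
            archive.append((tuple(child), fc))
    return archive

# Toy bi-objective trade-off: maximize number of ones vs. leading zeros.
f = lambda x: (sum(x), next((i for i, b in enumerate(x) if b), len(x)))
front = gsemo(f, 8)
print(sorted(fr for _, fr in front))
```

The archive converges toward the Pareto front; in the result-diversification setting, the two objectives would instead be the diversification quality measures from the paper.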
arXiv Detail & Related papers (2021-10-18T14:00:22Z)
- ES-Based Jacobian Enables Faster Bilevel Optimization [53.675623215542515]
Bilevel optimization (BO) has arisen as a powerful tool for solving many modern machine learning problems.
Existing gradient-based methods require second-order derivative approximations via Jacobian- and/or Hessian-vector computations.
We propose a novel BO algorithm, which adopts Evolution Strategies (ES) based method to approximate the response Jacobian matrix in the hypergradient of BO.
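The building block here is the classic ES derivative estimate: perturb the input with Gaussian noise and correlate the function change with the perturbation, avoiding explicit second-order computations. The sketch below estimates a plain gradient with antithetic sampling; the paper applies the same zeroth-order idea to the response Jacobian inside the hypergradient, which this toy does not reproduce:

```python
import random

def es_gradient(f, x, sigma=0.1, samples=200, seed=0):
    """Antithetic Evolution Strategies gradient estimate:
    average of (f(x + s*u) - f(x - s*u)) / (2*s) * u over Gaussian u."""
    rng = random.Random(seed)
    n = len(x)
    grad = [0.0] * n
    for _ in range(samples):
        u = [rng.gauss(0, 1) for _ in range(n)]
        plus = f([xi + sigma * ui for xi, ui in zip(x, u)])
        minus = f([xi - sigma * ui for xi, ui in zip(x, u)])
        scale = (plus - minus) / (2 * sigma * samples)
        for i in range(n):
            grad[i] += scale * u[i]     # accumulate the directional estimate
    return grad

# Sanity check on a quadratic: f(x) = sum(x_i^2) has gradient 2x.
g = es_gradient(lambda v: sum(t * t for t in v), [1.0, -2.0])
print(g)
```

Only function evaluations are needed, which is why the ES estimate can replace Jacobian-vector products when those are expensive or unavailable.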
arXiv Detail & Related papers (2021-10-13T19:36:50Z) - Hybrid Henry Gas Solubility Optimization Algorithm with Dynamic
Cluster-to-Algorithm Mapping for Search-based Software Engineering Problems [1.0323063834827413]
This paper discusses a new variant of the Henry Gas Solubility Optimization (HGSO) Algorithm, called Hybrid HGSO (HHGSO).
Unlike its predecessor, HHGSO allows multiple clusters serving different individual meta-heuristic algorithms to coexist within the same population.
Exploiting the dynamic cluster-to-algorithm mapping via a penalty-and-reward model with an adaptive switching factor, HHGSO offers a novel approach to meta-heuristic hybridization.
arXiv Detail & Related papers (2021-05-31T12:42:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the generated information and is not responsible for any consequences of its use.