Addressing The Knapsack Challenge Through Cultural Algorithm
Optimization
- URL: http://arxiv.org/abs/2401.03324v1
- Date: Mon, 30 Oct 2023 17:05:19 GMT
- Title: Addressing The Knapsack Challenge Through Cultural Algorithm
Optimization
- Authors: Mohammad Saleh Vahdatpour
- Abstract summary: We introduce a novel variant of Cultural Algorithms tailored specifically for solving 0-1 knapsack problems.
Our proposed algorithm incorporates a belief space to refine the population and introduces two vital functions for dynamically adjusting the crossover and mutation rates.
We provide compelling evidence of the algorithm's remarkable efficiency in consistently locating the global optimum.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The "0-1 knapsack problem" stands as a classical combinatorial optimization
conundrum, necessitating the selection of a subset of items from a given set.
Each item possesses inherent values and weights, and the primary objective is
to formulate a selection strategy that maximizes the total value while adhering
to a predefined capacity constraint. In this research paper, we introduce a
novel variant of Cultural Algorithms tailored specifically for solving 0-1
knapsack problems. Our
proposed algorithm incorporates a belief space to refine the population and
introduces two vital functions for dynamically adjusting the crossover and
mutation rates during the evolutionary process. Through extensive
experimentation, we provide compelling evidence of the algorithm's remarkable
efficiency in consistently locating the global optimum, even in knapsack
problems characterized by high dimensions and intricate constraints.
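To make the approach concrete: the 0-1 knapsack task is to choose $x_i \in \{0,1\}$ maximizing $\sum_i v_i x_i$ subject to $\sum_i w_i x_i \le W$, and the abstract describes evolving candidate bit-strings while a belief space distilled from good solutions steers variation and two functions adjust the crossover and mutation rates over the run. The Python sketch below is only an illustration of that scheme under stated assumptions: the instance data, the zero-fitness penalty for infeasible solutions, the elite-frequency belief space, and the schedules adapt_crossover_rate / adapt_mutation_rate are invented here and are not the paper's exact operators or parameter settings.

```python
import random

# Hypothetical 8-item instance (values, weights, capacity) -- not taken from the paper.
VALUES = [60, 100, 120, 80, 30, 70, 90, 40]
WEIGHTS = [10, 20, 30, 15, 5, 25, 35, 10]
CAPACITY = 80


def fitness(solution):
    """Total value of selected items; infeasible selections score 0 (a simple penalty, not a repair step)."""
    weight = sum(w for w, bit in zip(WEIGHTS, solution) if bit)
    value = sum(v for v, bit in zip(VALUES, solution) if bit)
    return value if weight <= CAPACITY else 0


def adapt_crossover_rate(generation, max_generations):
    """Illustrative dynamic schedule: heavy recombination early, less later."""
    return 0.9 - 0.4 * generation / max_generations


def adapt_mutation_rate(generation, max_generations):
    """Illustrative dynamic schedule: mutation decays as the search converges."""
    return 0.01 + 0.10 * (1.0 - generation / max_generations)


def update_belief_space(belief, elites):
    """Situational knowledge: smoothed per-bit inclusion frequency among elite solutions."""
    n = len(belief)
    freq = [sum(e[i] for e in elites) / len(elites) for i in range(n)]
    return [0.5 * belief[i] + 0.5 * freq[i] for i in range(n)]


def influenced_mutation(solution, belief, mutation_rate):
    """Mutate bits; resampled bits are drawn from the belief space's frequencies (influence step)."""
    return [
        (1 if random.random() < belief[i] else 0) if random.random() < mutation_rate else bit
        for i, bit in enumerate(solution)
    ]


def one_point_crossover(parent_a, parent_b):
    point = random.randint(1, len(parent_a) - 1)
    return parent_a[:point] + parent_b[point:]


def cultural_knapsack(pop_size=40, max_generations=200, elite_fraction=0.2, seed=0):
    random.seed(seed)
    n = len(VALUES)
    population = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    belief = [0.5] * n  # uninformative prior over item inclusion
    best = max(population, key=fitness)

    for gen in range(max_generations):
        cx_rate = adapt_crossover_rate(gen, max_generations)
        mut_rate = adapt_mutation_rate(gen, max_generations)

        # Acceptance: the best individuals refine the belief space.
        elite_count = max(2, int(elite_fraction * pop_size))
        elites = sorted(population, key=fitness, reverse=True)[:elite_count]
        belief = update_belief_space(belief, elites)

        # Influence: the belief space biases how the next generation is produced.
        offspring = []
        while len(offspring) < pop_size:
            parent_a, parent_b = random.sample(elites, 2)
            child = one_point_crossover(parent_a, parent_b) if random.random() < cx_rate else parent_a[:]
            offspring.append(influenced_mutation(child, belief, mut_rate))
        population = offspring
        best = max([best] + population, key=fitness)

    return best, fitness(best)


if __name__ == "__main__":
    selection, value = cultural_knapsack()
    print("best selection:", selection, "value:", value)
```

On this 8-item toy instance the sketch usually reaches the instance optimum (value 390) well within the 200-generation budget; it is a minimal sketch, not a reproduction of the paper's reported experiments on high-dimensional instances.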
Related papers
- Training Greedy Policy for Proposal Batch Selection in Expensive Multi-Objective Combinatorial Optimization [52.80408805368928]
We introduce a novel greedy-style subset selection algorithm for batch acquisition.
Our experiments on red fluorescent proteins show that the proposed method reaches the baseline performance with 1.69x fewer queries.
arXiv Detail & Related papers (2024-06-21T05:57:08Z)
- A quantum algorithm for the solution of the 0-1 Knapsack problem [0.0]
We introduce the "Quantum Tree Generator", an approach to generate, in superposition, all feasible solutions of a given instance.
We also introduce a high-level simulation strategy that exploits logging data from COMBO.
arXiv Detail & Related papers (2023-10-10T13:40:30Z)
- Sample Complexity for Quadratic Bandits: Hessian Dependent Bounds and Optimal Algorithms [64.10576998630981]
We show the first tight characterization of the optimal Hessian-dependent sample complexity.
A Hessian-independent algorithm universally achieves the optimal sample complexities for all Hessian instances.
The optimal sample complexities achieved by our algorithm remain valid for heavy-tailed noise distributions.
arXiv Detail & Related papers (2023-06-21T17:03:22Z)
- Evolutionary Multi-Objective Algorithms for the Knapsack Problems with Stochastic Profits [13.026567958569965]
We consider a version of the knapsack problem with stochastic profits, where the aim is to guarantee a certain level of confidence in the profit of a solution.
We introduce multi-objective formulations of the profit chance constrained knapsack problem and design three bi-objective fitness evaluation methods.
We show the effectiveness of our approaches on several benchmarks for both settings.
arXiv Detail & Related papers (2023-03-03T03:28:51Z)
- Tree ensemble kernels for Bayesian optimization with known constraints over mixed-feature spaces [54.58348769621782]
Tree ensembles can be well-suited for black-box optimization tasks such as algorithm tuning and neural architecture search.
Two well-known challenges in using tree ensembles for black-box optimization are (i) effectively quantifying model uncertainty for exploration and (ii) optimizing over the piece-wise constant acquisition function.
Our framework performs as well as state-of-the-art methods for unconstrained black-box optimization over continuous/discrete features and outperforms competing methods for problems combining mixed-variable feature spaces and known input constraints.
arXiv Detail & Related papers (2022-07-02T16:59:37Z)
- High Probability Complexity Bounds for Non-Smooth Stochastic Optimization with Heavy-Tailed Noise [51.31435087414348]
It is essential to theoretically guarantee that algorithms provide small objective residual with high probability.
Existing methods for non-smooth convex optimization have complexity bounds that depend on the confidence level.
We propose novel stepsize rules for two methods with gradient clipping.
arXiv Detail & Related papers (2021-06-10T17:54:21Z)
- Adaptive Sampling of Pareto Frontiers with Binary Constraints Using Regression and Classification [0.0]
We present a novel adaptive optimization algorithm for black-box multi-objective optimization problems with binary constraints.
Our method is based on probabilistic regression and classification models, which act as a surrogate for the optimization goals.
We also present a novel ellipsoid truncation method to speed up the expected hypervolume calculation.
arXiv Detail & Related papers (2020-08-27T09:15:02Z)
- Convergence of adaptive algorithms for weakly convex constrained optimization [59.36386973876765]
We prove the $\widetilde{\mathcal{O}}(t^{-1/4})$ rate of convergence for the norm of the gradient of the Moreau envelope (its definition is recalled in a note after this list).
Our analysis works with mini-batch size of $1$, constant first and second order moment parameters, and possibly smooth optimization domains.
arXiv Detail & Related papers (2020-06-11T17:43:19Z)
- Specific Single- and Multi-Objective Evolutionary Algorithms for the Chance-Constrained Knapsack Problem [14.352521012951865]
We introduce a new effective multi-objective model for the chance-constrained knapsack problem.
Our experimental results show that this leads to significant performance improvements when using the approach in evolutionary multi-objective algorithms.
arXiv Detail & Related papers (2020-04-07T08:46:51Z)
- Cross Entropy Hyperparameter Optimization for Constrained Problem Hamiltonians Applied to QAOA [68.11912614360878]
Hybrid quantum-classical algorithms such as the Quantum Approximate Optimization Algorithm (QAOA) are considered among the most promising approaches for taking advantage of near-term quantum computers in practical applications.
Such algorithms are usually implemented in a variational form, combining a classical optimization method with a quantum machine to find good solutions to an optimization problem.
In this study we apply a Cross-Entropy method to shape this landscape, which allows the classical optimizer to find better parameters more easily and hence results in improved performance.
arXiv Detail & Related papers (2020-03-11T13:52:41Z)
- Evolutionary Bi-objective Optimization for the Dynamic Chance-Constrained Knapsack Problem Based on Tail Bound Objectives [12.634782111072585]
We consider the dynamic chance-constrained knapsack problem, where the weight of each item is stochastic and the total weight is subject to a capacity constraint.
The objective is to maximize the total profit subject to a bound on the probability that the total weight exceeds the capacity.
We introduce an additional objective which estimates the minimal capacity bound for a given solution that still meets the chance constraint.
arXiv Detail & Related papers (2020-02-17T04:36:15Z)
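A note on the stationarity measure mentioned in the adaptive-algorithms entry above: it is the gradient norm of the Moreau envelope, the standard measure for weakly convex problems. The display below only recalls the textbook definition, with a generic smoothing parameter $\lambda > 0$ that is not a value taken from that paper.

```latex
% Moreau envelope of f with smoothing parameter \lambda > 0 (textbook definition)
f_{\lambda}(x) = \min_{y}\Big\{ f(y) + \frac{1}{2\lambda}\,\lVert y - x \rVert^{2} \Big\},
\qquad \text{near-stationarity of } x \text{ is measured by } \lVert \nabla f_{\lambda}(x) \rVert .
```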