Social Distancing Induced Coronavirus Optimization Algorithm (COVO): Application to Multimodal Function Optimization and Noise Removal
- URL: http://arxiv.org/abs/2411.17282v1
- Date: Tue, 26 Nov 2024 10:09:36 GMT
- Title: Social Distancing Induced Coronavirus Optimization Algorithm (COVO): Application to Multimodal Function Optimization and Noise Removal
- Authors: Om Ramakisan Varma, Mala Kalra
- Abstract summary: The pace of propagation of the coronavirus can indeed be slowed by maintaining social distance.
The main aim of COVO is to reach global solutions to complex problems with fast convergence across a range of applications.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Metaheuristic optimization techniques have attracted growing attention for handling complex optimization problems, and numerous algorithms inspired by natural phenomena have been developed over the last few years. Recently, the spread of COVID-19 placed a heavy burden on public health systems and caused many deaths. Vaccination, masks, and social distancing are the major measures taken to curb the spread of the virus. Since maintaining social distance slows the pace at which the coronavirus propagates, this work builds on that idea to propose a novel bio-inspired metaheuristic, termed the Social Distancing Induced Coronavirus Optimization Algorithm (COVO). COVO is evaluated on thirteen benchmark functions covering discrete, continuous, and complex problems, and its performance is compared with that of other well-known optimization algorithms. The main aim of COVO is to reach global solutions to complex problems with fast convergence across a range of applications. The results show that the proposed algorithm achieves reasonable and acceptable performance.
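The abstract does not spell out COVO's update equations, so the following is only a minimal illustrative sketch of a social-distancing-style metaheuristic: candidates move toward the best solution found so far while being repelled from neighbours closer than an assumed "safe distance". All function names, parameters, and the benchmark objective are hypothetical placeholders, not the paper's actual formulation.

```python
# Toy sketch of a social-distancing-inspired population update (assumptions only).
import numpy as np

def sphere(x):
    """One of the usual benchmark objectives; stands in for the 13 test functions."""
    return np.sum(x ** 2)

def optimize(obj, dim=10, pop_size=30, iters=200, safe_dist=0.5,
             attract=0.7, repel=0.3, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([obj(p) for p in pop])
    best = pop[np.argmin(fitness)].copy()

    for _ in range(iters):
        for i in range(pop_size):
            # Attraction toward the best solution found so far.
            step = attract * rng.random(dim) * (best - pop[i])
            # "Social distancing": repulsion from any neighbour inside safe_dist.
            for j in range(pop_size):
                if i == j:
                    continue
                diff = pop[i] - pop[j]
                d = np.linalg.norm(diff)
                if 1e-12 < d < safe_dist:
                    step += repel * diff / d
            pop[i] = np.clip(pop[i] + step, lo, hi)
            fitness[i] = obj(pop[i])
        best = pop[np.argmin(fitness)].copy()
    return best, obj(best)

best, val = optimize(sphere)
print(val)  # drifts toward 0 on the sphere function
```

On the sphere benchmark this toy loop drives the objective toward zero; it is meant only to show the mechanics of a distance-keeping population update, not the paper's algorithm.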
Related papers
- Goat Optimization Algorithm: A Novel Bio-Inspired Metaheuristic for Global Optimization [1.2289361708127877]
This paper presents a novel bio-inspired metaheuristic modeled on goats' adaptive foraging, strategic movement, and parasite-avoidance behaviors.
The algorithm's performance is evaluated on standard unimodal benchmark functions.
The findings suggest that GOA is a promising advancement in bio-inspired optimization techniques.
arXiv Detail & Related papers (2025-03-04T06:44:07Z) - Random-Key Algorithms for Optimizing Integrated Operating Room Scheduling [0.16385815610837165]
This study introduces a novel Random-Key (RKO) concept, rigorously tested on literature instances and on new real-world-inspired instances.
The optimization problem considered incorporates multi-room scheduling, equipment scheduling, and complex availability constraints.
The RKO approach represents solutions as points in a continuous space, which are then mapped into the problem's solution space via a deterministic function known as a decoder, as sketched below.
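As a rough illustration of that random-key idea (not the paper's scheduling model), the sketch below encodes a candidate as a vector in the unit hypercube and uses a deterministic decoder to turn it into a job ordering; the objective and problem data are invented for illustration.

```python
# Minimal random-key decoding sketch (toy data, hypothetical objective).
import numpy as np

def decoder(keys):
    """Deterministic decoder: sort the keys to obtain a job ordering."""
    return np.argsort(keys)

def weighted_completion_time(order, durations, weights):
    """Toy objective that actually depends on the decoded ordering."""
    t, total = 0.0, 0.0
    for j in order:
        t += durations[j]
        total += weights[j] * t
    return total

rng = np.random.default_rng(1)
durations = rng.integers(1, 10, size=8).astype(float)
weights = rng.random(8)

keys = rng.random(8)       # continuous "random-key" chromosome in [0, 1)
order = decoder(keys)      # mapped into the discrete solution space
print(order, weighted_completion_time(order, durations, weights))
```

Any continuous optimizer can then search over the key vectors, since the decoder handles feasibility in the discrete space.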
arXiv Detail & Related papers (2025-01-17T15:11:30Z) - Preventing Local Pitfalls in Vector Quantization via Optimal Transport [77.15924044466976]
We introduce OptVQ, a novel vector quantization method that employs the Sinkhorn algorithm to optimize the optimal transport problem.
Our experiments on image reconstruction tasks demonstrate that OptVQ achieves 100% codebook utilization and surpasses current state-of-the-art VQNs in reconstruction quality.
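A hedged sketch of how Sinkhorn iterations can assign encoder outputs to codebook entries via entropic optimal transport, in the spirit of the OptVQ description above; the cost normalization, uniform marginals, and hyperparameters are assumptions, not the paper's exact formulation.

```python
# Entropic-OT assignment of vectors to codes via Sinkhorn iterations (sketch).
import numpy as np

def sinkhorn_assign(z, codebook, eps=0.05, iters=100):
    """Assign each row of z (n,d) to a codebook entry (m,d) via entropic OT."""
    n, m = z.shape[0], codebook.shape[0]
    cost = ((z[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    cost = cost / cost.max()            # normalize for numerical stability
    K = np.exp(-cost / eps)
    r = np.full(n, 1.0 / n)             # uniform marginal over batch items
    c = np.full(m, 1.0 / m)             # uniform marginal over codes -> full usage
    u = np.ones(n)
    for _ in range(iters):
        v = c / (K.T @ u)
        u = r / (K @ v)
    plan = u[:, None] * K * v[None, :]  # transport plan; rows are soft assignments
    return plan.argmax(axis=1)          # hard code index per input vector

rng = np.random.default_rng(0)
z = rng.normal(size=(16, 4))
codebook = rng.normal(size=(8, 4))
print(sinkhorn_assign(z, codebook))
```

The uniform marginal over codes is what pushes every codebook entry to be used, which is the intuition behind the 100% utilization claim above.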
arXiv Detail & Related papers (2024-12-19T18:58:14Z) - Advancements in Optimization: Adaptive Differential Evolution with Diversification Strategy [0.0]
The study employs single-objective optimization in a two-dimensional space and runs ADEDS on each of the benchmark functions with multiple iterations.
ADEDS consistently outperforms standard DE for a variety of optimization challenges, including functions with numerous local optima, plate-shaped, valley-shaped, stretched-shaped, and noisy functions.
arXiv Detail & Related papers (2023-10-02T10:05:41Z) - A Data-Driven Evolutionary Transfer Optimization for Expensive Problems in Dynamic Environments [9.098403098464704]
Data-driven, a.k.a. surrogate-assisted, evolutionary optimization has been recognized as an effective approach for tackling expensive black-box optimization problems.
This paper proposes a simple but effective transfer learning framework to empower data-driven evolutionary optimization to solve dynamic optimization problems.
Experiments on synthetic benchmark test problems and a real-world case study demonstrate the effectiveness of our proposed algorithm.
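A compact sketch of the generic surrogate-assisted loop referred to above: an archive of expensive evaluations feeds a cheap model that pre-screens offspring, so only one candidate per generation pays for a true evaluation. The inverse-distance surrogate and toy objective are stand-ins; the paper's transfer-learning component for dynamic environments is not reproduced.

```python
# Surrogate-assisted (data-driven) evolutionary optimization, minimal sketch.
import numpy as np

def expensive_f(x):                    # placeholder for the expensive black box
    return float(np.sum((x - 1.0) ** 2))

def surrogate(x, X, y, k=5):
    """Predict f(x) from the k nearest archived points (inverse-distance weights)."""
    d = np.linalg.norm(X - x, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] + 1e-9)
    return float(np.dot(w, y[idx]) / w.sum())

rng = np.random.default_rng(0)
dim, pop_size, gens = 5, 10, 40
X = rng.uniform(-5, 5, size=(pop_size, dim))   # archive of evaluated points
y = np.array([expensive_f(x) for x in X])

for _ in range(gens):
    parents = X[np.argsort(y)[:pop_size]]       # best points found so far
    # Generate many cheap candidates by mutating parents.
    cand = parents[rng.integers(pop_size, size=50)] + rng.normal(0, 0.3, (50, dim))
    scores = [surrogate(c, X, y) for c in cand] # pre-screen with the surrogate
    best_cand = cand[int(np.argmin(scores))]
    X = np.vstack([X, best_cand])               # one expensive evaluation per generation
    y = np.append(y, expensive_f(best_cand))

print(y.min())
```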
arXiv Detail & Related papers (2022-11-05T11:19:50Z) - An Empirical Evaluation of Zeroth-Order Optimization Methods on AI-driven Molecule Optimization [78.36413169647408]
We study the effectiveness of various ZO optimization methods for optimizing molecular objectives.
We show the advantages of ZO sign-based gradient descent (ZO-signGD).
We demonstrate the potential effectiveness of ZO optimization methods on widely used benchmark tasks from the Guacamol suite.
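For orientation, a minimal sketch of ZO sign-based gradient descent: the gradient is estimated from function queries alone via random-direction finite differences, and the update uses only the sign of the estimate. The toy quadratic replaces the Guacamol molecular objectives, and all parameter values are assumptions.

```python
# Zeroth-order sign-based gradient descent (ZO-signGD), minimal sketch.
import numpy as np

def zo_sign_gd(f, x0, lr=0.05, mu=0.01, n_dirs=20, iters=300, seed=0):
    rng = np.random.default_rng(seed)
    x = x0.astype(float).copy()
    for _ in range(iters):
        g = np.zeros_like(x)
        for _ in range(n_dirs):
            u = rng.normal(size=x.shape)
            # Two-point zeroth-order gradient estimate along direction u.
            g += (f(x + mu * u) - f(x - mu * u)) / (2.0 * mu) * u
        g /= n_dirs
        x -= lr * np.sign(g)           # sign-based update: only query access needed
    return x

f = lambda x: np.sum((x - 2.0) ** 2)   # toy objective standing in for a molecule score
x = zo_sign_gd(f, np.zeros(8))
print(f(x))
```

With a fixed step size the iterate settles in a small neighbourhood of the optimum rather than converging exactly; a decaying step size would tighten this.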
arXiv Detail & Related papers (2022-10-27T01:58:10Z) - Discrete Stochastic Optimization for Public Health Interventions with Constraints [1.8275108630751844]
This paper addresses aspects of the 2009 H1N1 and COVID-19 pandemics, with the spread of disease modeled by open-source Monte Carlo simulations.
The objective of the optimization is to determine the combination of intervention strategies that minimizes the economic loss to society.
arXiv Detail & Related papers (2022-06-27T21:21:25Z) - Resource Planning for Hospitals Under Special Consideration of the COVID-19 Pandemic: Optimization and Sensitivity Analysis [87.31348761201716]
Crises like the COVID-19 pandemic pose a serious challenge to health-care institutions.
BaBSim.Hospital is a tool for capacity planning based on discrete event simulation.
We aim to investigate and optimize the simulation parameters in order to improve BaBSim.Hospital.
arXiv Detail & Related papers (2021-05-16T12:38:35Z) - Speeding up Computational Morphogenesis with Online Neural Synthetic Gradients [51.42959998304931]
A wide range of modern science and engineering applications are formulated as optimization problems with a system of partial differential equations (PDEs) as constraints.
These PDE-constrained optimization problems are typically solved in a standard discretize-then-optimize approach.
We propose a general framework to speed up PDE-constrained optimization using online neural synthetic gradients (ONSG) with a novel two-scale optimization scheme.
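A hedged sketch of the two-scale idea described above: expensive "true" gradients (standing in for an adjoint/PDE solve) are computed only every K steps and used to refit a cheap gradient predictor, while intermediate steps use the predicted (synthetic) gradient. A least-squares linear model replaces the paper's online neural network, and a toy quadratic replaces the PDE-constrained objective.

```python
# Two-scale optimization with a cheap synthetic-gradient model (assumptions only).
import numpy as np

rng = np.random.default_rng(0)
dim, K, iters, lr = 6, 10, 300, 0.05
Q = np.diag(rng.uniform(0.5, 2.0, dim))          # toy "physics"
b = rng.normal(size=dim)

def true_gradient(x):                             # expensive solve in the real setting
    return Q @ x - b

x = rng.normal(size=dim)
X_train, G_train = [], []                         # data for the gradient predictor
W = np.zeros((dim, dim + 1))                      # linear model: g ≈ W @ [x, 1]

for t in range(iters):
    if t % K == 0:                                # coarse scale: pay for a true gradient
        g = true_gradient(x)
        X_train.append(np.append(x, 1.0))
        G_train.append(g)
        if len(X_train) >= 2:
            # Refit the synthetic-gradient model by least squares.
            W = np.linalg.lstsq(np.array(X_train), np.array(G_train), rcond=None)[0].T
    else:                                         # fine scale: cheap synthetic gradient
        g = W @ np.append(x, 1.0)
    x -= lr * g

print(np.linalg.norm(true_gradient(x)))           # gradient norm shrinks as the model improves
```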
arXiv Detail & Related papers (2021-04-25T22:43:51Z) - Balancing Common Treatment and Epidemic Control in Medical Procurement during COVID-19: Transform-and-Divide Evolutionary Optimization [10.29490155247067]
Balancing common disease treatment and epidemic control is a key objective of medical supplies procurement in hospitals during a pandemic such as COVID-19.
We present an approach that first transforms the original high-dimensional, constrained multiobjective optimization problem to a low-dimensional, unconstrained multiobjective optimization problem.
We show that the proposed approach exhibits significantly better performance than that of directly solving the original problem.
arXiv Detail & Related papers (2020-08-02T04:47:34Z) - EOS: a Parallel, Self-Adaptive, Multi-Population Evolutionary Algorithm for Constrained Global Optimization [68.8204255655161]
EOS is a global optimization algorithm for constrained and unconstrained problems of real-valued variables.
It implements a number of improvements to the well-known Differential Evolution (DE) algorithm.
Results show that EOS achieves better performance than state-of-the-art single-population self-adaptive DE algorithms.
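For context, a sketch of the classic DE/rand/1/bin step that EOS builds on; the self-adaptive, multi-population, and parallel extensions of EOS are not reproduced, and the objective and control parameters here are placeholders.

```python
# One generation of DE/rand/1/bin (the base algorithm EOS improves upon).
import numpy as np

def de_step(pop, fitness, obj, F=0.8, CR=0.9, rng=None):
    rng = rng or np.random.default_rng()
    n, d = pop.shape
    new_pop, new_fit = pop.copy(), fitness.copy()
    for i in range(n):
        # Mutation: combine three distinct random members (rand/1).
        a, b, c = pop[rng.choice([j for j in range(n) if j != i], 3, replace=False)]
        mutant = a + F * (b - c)
        # Binomial crossover, forcing at least one mutant component.
        mask = rng.random(d) < CR
        mask[rng.integers(d)] = True
        trial = np.where(mask, mutant, pop[i])
        # Greedy selection.
        f_trial = obj(trial)
        if f_trial < fitness[i]:
            new_pop[i], new_fit[i] = trial, f_trial
    return new_pop, new_fit

obj = lambda x: np.sum(x ** 2)
rng = np.random.default_rng(0)
pop = rng.uniform(-5, 5, size=(20, 10))
fit = np.array([obj(p) for p in pop])
for _ in range(200):
    pop, fit = de_step(pop, fit, obj, rng=rng)
print(fit.min())
```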
arXiv Detail & Related papers (2020-07-09T10:19:22Z) - A Novel Meta-Heuristic Optimization Algorithm Inspired by the Spread of Viruses [0.0]
A novel nature-inspired meta-heuristic optimization algorithm called virus spread optimization (VSO) is proposed.
VSO loosely mimics the spread of viruses among hosts, and can be effectively applied to solving many challenging and continuous optimization problems.
arXiv Detail & Related papers (2020-06-11T09:35:28Z) - Global Optimization of Gaussian processes [52.77024349608834]
We propose a reduced-space formulation with Gaussian processes trained on few data points.
The approach also leads to a significantly smaller and computationally cheaper subproblem solver for lower bounding.
In total, the proposed method reduces the time to convergence by orders of magnitude.
arXiv Detail & Related papers (2020-05-21T20:59:11Z) - Distributionally Robust Bayesian Optimization [121.71766171427433]
We present a novel distributionally robust Bayesian optimization algorithm (DRBO) for zeroth-order, noisy optimization.
Our algorithm provably obtains sub-linear robust regret in various settings.
We demonstrate the robust performance of our method on both synthetic and real-world benchmarks.
arXiv Detail & Related papers (2020-02-20T22:04:30Z)