A Comparative Study on Parameter Estimation in Software Reliability
Modeling using Swarm Intelligence
- URL: http://arxiv.org/abs/2003.04770v1
- Date: Sun, 8 Mar 2020 16:35:42 GMT
- Title: A Comparative Study on Parameter Estimation in Software Reliability
Modeling using Swarm Intelligence
- Authors: Najla Akram AL-Saati, Marrwa Abd-AlKareem Alabajee
- Abstract summary: This work focuses on a comparison between the performances of two well-known Swarm algorithms: Cuckoo Search (CS) and Firefly Algorithm (FA)
All algorithms are evaluated on real software failure data; tests are performed and the obtained results are compared to show the performance of each algorithm.
Experimental results show that CS is more efficient in estimating the parameters of SRGMs, outperforming FA as well as PSO and ACO on the selected data sets and employed models.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work focuses on a comparison between the performances of two well-known
Swarm algorithms: Cuckoo Search (CS) and Firefly Algorithm (FA), in estimating
the parameters of Software Reliability Growth Models. This study is further
reinforced using Particle Swarm Optimization (PSO) and Ant Colony Optimization
(ACO). All algorithms are evaluated on real software failure data; tests are
performed and the obtained results are compared to show the performance of each
algorithm. Furthermore, CS and FA are also compared with each other on the
basis of execution time and iteration count. Experimental results show that CS
is more efficient in estimating the parameters of SRGMs, outperforming FA as
well as PSO and ACO on the selected data sets and employed models.
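To illustrate the kind of estimation the paper compares, here is a minimal sketch of one of its four algorithms, PSO, fitting the Goel-Okumoto growth model m(t) = a(1 - e^(-bt)) by minimizing squared error. The model choice, parameter bounds, PSO coefficients, and synthetic data are assumptions for illustration; the paper uses real failure data sets and its own fitness criteria.

```python
import math
import random

def mean_value(t, a, b):
    # Goel-Okumoto mean value function: m(t) = a * (1 - exp(-b * t))
    return a * (1.0 - math.exp(-b * t))

def sse(params, data):
    # Sum of squared errors between observed cumulative failures and the model
    a, b = params
    return sum((m - mean_value(t, a, b)) ** 2 for t, m in data)

def pso_fit(data, n_particles=30, iters=200,
            bounds=((1.0, 500.0), (1e-4, 1.0)), seed=0):
    # Standard global-best PSO over the (a, b) parameter space
    rng = random.Random(seed)
    dim = len(bounds)
    pos = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [sse(p, data) for p in pos]
    g = min(range(n_particles), key=pbest_f.__getitem__)
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    w, c1, c2 = 0.7, 1.5, 1.5  # inertia and acceleration coefficients (assumed)
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                lo, hi = bounds[d]
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)
            f = sse(pos[i], data)
            if f < pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f < gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

# Synthetic failure data generated from a known model (a=100, b=0.05),
# standing in for the real failure data used in the paper
data = [(t, mean_value(t, 100.0, 0.05)) for t in range(1, 21)]
(a_hat, b_hat), err = pso_fit(data)
```

CS, FA, and ACO would plug into the same fitness function `sse`; the paper's comparison is over which search strategy reaches a lower fitting error in fewer iterations and less time.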
Related papers
- Parameter optimization comparison in QAOA using Stochastic Hill Climbing with Random Re-starts and Local Search with entangled and non-entangled mixing operators
This study investigates the efficacy of Stochastic Hill Climbing with Random Restarts (SHC-RR) compared to Local Search (LS) strategies.
Our results consistently show that SHC-RR outperforms LS approaches, showcasing superior efficacy despite its ostensibly simpler optimization mechanism.
arXiv Detail & Related papers (2024-05-14T20:12:17Z) - Online Variational Sequential Monte Carlo
We build upon the variational sequential Monte Carlo (VSMC) method, which provides computationally efficient and accurate model parameter estimation and Bayesian latent-state inference.
Online VSMC is capable of performing efficiently, entirely on-the-fly, both parameter estimation and particle proposal adaptation.
arXiv Detail & Related papers (2023-12-19T21:45:38Z) - Federated Conditional Stochastic Optimization
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and AML.
This paper proposes algorithms for conditional stochastic optimization in the federated learning setting.
arXiv Detail & Related papers (2023-10-04T01:47:37Z) - Comparative Evaluation of Metaheuristic Algorithms for Hyperparameter
Selection in Short-Term Weather Forecasting
This paper explores the application of metaheuristic algorithms, namely Genetic Algorithm (GA), Differential Evolution (DE), and Particle Swarm Optimization (PSO), to hyperparameter selection in short-term weather forecasting.
We evaluate their performance based on metrics such as Mean Squared Error (MSE) and Mean Absolute Percentage Error (MAPE).
arXiv Detail & Related papers (2023-09-05T22:13:35Z) - Revisiting the Evaluation of Image Synthesis with GANs
This study presents an empirical investigation into the evaluation of synthesis performance, with generative adversarial networks (GANs) as a representative of generative models.
In particular, we make in-depth analyses of various factors, including how to represent a data point in the representation space, how to calculate a fair distance using selected samples, and how many instances to use from each set.
arXiv Detail & Related papers (2023-04-04T17:54:32Z) - Improved Algorithms for Neural Active Learning
We improve the theoretical and empirical performance of neural-network(NN)-based active learning algorithms for the non-parametric streaming setting.
We introduce two regret metrics, defined by minimizing the population loss, that are more suitable for active learning than the metric used in state-of-the-art (SOTA) related work.
arXiv Detail & Related papers (2022-10-02T05:03:38Z) - Forecasting Algorithms for Causal Inference with Panel Data
We adapt a deep neural architecture for time series forecasting (the N-BEATS algorithm) to more accurately impute the counterfactual evolution of a treated unit had treatment not occurred.
Across a range of settings, the resulting estimator ("SyNBEATS") significantly outperforms commonly employed methods.
An implementation of this estimator is available for public use.
arXiv Detail & Related papers (2022-08-06T10:23:38Z) - Duck swarm algorithm: theory, numerical optimization, and applications
A swarm intelligence-based optimization algorithm, named Duck Swarm Algorithm (DSA), is proposed in this study.
Two rules are modeled from ducks' food finding and foraging behavior, which correspond to the exploration and exploitation phases of the proposed DSA.
Results show that DSA is a high-performance optimization method in terms of convergence speed and exploration-exploitation balance.
arXiv Detail & Related papers (2021-12-27T04:53:36Z) - Learning to Hash Robustly, with Guarantees
In this paper, we design an NNS algorithm for the Hamming space that has worst-case guarantees essentially matching that of theoretical algorithms.
We evaluate the algorithm's ability to optimize for a given dataset both theoretically and practically.
Our algorithm achieves 1.8x and 2.1x better recall on the worst-performing queries of the MNIST and ImageNet datasets, respectively.
arXiv Detail & Related papers (2021-08-11T20:21:30Z) - Bilevel Optimization: Convergence Analysis and Enhanced Design
Bilevel optimization is a tool for many machine learning problems.
We propose a novel stochastic bilevel optimizer, stocBiO, built on a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z) - Cat Swarm Optimization Algorithm -- A Survey and Performance Evaluation
Cat Swarm Optimization (CSO) algorithm is a robust and powerful metaheuristic swarm-based optimization approach.
This paper presents an in-depth survey and performance evaluation of CSO algorithm.
arXiv Detail & Related papers (2020-01-10T18:18:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed content (including all information) and is not responsible for any consequences of its use.