Comparative Evaluation of Metaheuristic Algorithms for Hyperparameter
Selection in Short-Term Weather Forecasting
- URL: http://arxiv.org/abs/2309.02600v1
- Date: Tue, 5 Sep 2023 22:13:35 GMT
- Title: Comparative Evaluation of Metaheuristic Algorithms for Hyperparameter
Selection in Short-Term Weather Forecasting
- Authors: Anuvab Sen, Arul Rhik Mazumder, Dibyarup Dutta, Udayon Sen, Pathikrit
Syam and Sandipan Dhar
- Abstract summary: This paper explores the application of metaheuristic algorithms, namely Genetic Algorithm (GA), Differential Evolution (DE) and Particle Swarm Optimization (PSO)
We evaluate their performance in weather forecasting based on metrics such as Mean Squared Error (MSE) and Mean Absolute Percentage Error (MAPE)
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Weather forecasting plays a vital role in numerous sectors, but accurately
capturing the complex dynamics of weather systems remains a challenge for
traditional statistical models. Apart from Auto Regressive time forecasting
models like ARIMA, deep learning techniques (Vanilla ANNs, LSTM and GRU
networks), have shown promise in improving forecasting accuracy by capturing
temporal dependencies. This paper explores the application of metaheuristic
algorithms, namely Genetic Algorithm (GA), Differential Evolution (DE), and
Particle Swarm Optimization (PSO), to automate the search for optimal
hyperparameters in these model architectures. Metaheuristic algorithms excel in
global optimization, offering robustness, versatility, and scalability in
handling non-linear problems. We present a comparative analysis of different
model architectures integrated with metaheuristic optimization, evaluating
their performance in weather forecasting based on metrics such as Mean Squared
Error (MSE) and Mean Absolute Percentage Error (MAPE). The results demonstrate
the potential of metaheuristic algorithms in enhancing weather forecasting
accuracy and help in determining the optimal set of hyperparameters for each
model. The paper underscores the importance of harnessing advanced optimization
techniques to select the most suitable metaheuristic algorithm for the given
weather forecasting task.
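As a rough illustration of the pipeline described in the abstract (not the authors' implementation), the sketch below wraps SciPy's Differential Evolution around the training of a small scikit-learn forecaster on a synthetic temperature-like series. Each candidate hyperparameter vector is scored by validation MSE = (1/n) Σ (y_i − ŷ_i)², and MAPE = (100/n) Σ |y_i − ŷ_i| / |y_i| is reported for the best candidate. The data, model, search ranges, and budget are all assumptions made for the example.

# Hedged sketch only: metaheuristic hyperparameter search for a small neural
# forecaster. Data, model, and search ranges are illustrative assumptions,
# not the setup used in the paper.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
t = np.arange(2000)
# Synthetic stand-in for a temperature series: yearly cycle plus noise.
series = 15.0 + 10.0 * np.sin(2 * np.pi * t / 365.0) + rng.normal(0.0, 1.5, t.size)

def make_windows(y, lag):
    # Lagged inputs -> next-step targets for one-step-ahead forecasting.
    X = np.stack([y[i:i + lag] for i in range(len(y) - lag)])
    return X, y[lag:]

def fit_and_score(params):
    # Decode a candidate: window length, hidden units, log10 learning rate.
    lag, hidden, log_lr = int(round(params[0])), int(round(params[1])), params[2]
    X, y = make_windows(series, lag)
    split = int(0.8 * len(X))
    model = MLPRegressor(hidden_layer_sizes=(hidden,),
                         learning_rate_init=10.0 ** log_lr,
                         max_iter=300, random_state=0)
    model.fit(X[:split], y[:split])
    pred = model.predict(X[split:])
    mse = float(np.mean((y[split:] - pred) ** 2))                          # MSE
    mape = float(100.0 * np.mean(np.abs((y[split:] - pred) / y[split:])))  # MAPE (%)
    return mse, mape

def objective(params):
    # Fitness used by the metaheuristic: validation MSE of the candidate.
    return fit_and_score(params)[0]

# Search space: lag in [3, 30], hidden units in [4, 64], log10(lr) in [-4, -1].
bounds = [(3, 30), (4, 64), (-4, -1)]
# Small budget (maxiter, popsize) purely for illustration.
result = differential_evolution(objective, bounds, maxiter=5, popsize=6,
                                seed=0, polish=False)
best_mse, best_mape = fit_and_score(result.x)
print("best hyperparameters (lag, hidden, lr):",
      int(round(result.x[0])), int(round(result.x[1])), 10.0 ** result.x[2])
print(f"validation MSE={best_mse:.3f}, MAPE={best_mape:.2f}%")

Swapping in GA or PSO would only change the outer search strategy; the objective (validation error of a trained forecaster) stays the same.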
Related papers
- Testing the Efficacy of Hyperparameter Optimization Algorithms in Short-Term Load Forecasting [0.0]
We use the Panama Electricity dataset to evaluate HPO algorithms' performance on a surrogate forecasting algorithm, XGBoost, in terms of accuracy (i.e., MAPE, $R^2$) and runtime.
Results reveal significant runtime advantages for HPO algorithms over Random Search.
arXiv Detail & Related papers (2024-10-19T09:08:52Z)
- RHiOTS: A Framework for Evaluating Hierarchical Time Series Forecasting Algorithms [0.393259574660092]
RHiOTS is designed to assess the robustness of hierarchical time series forecasting models and algorithms on real-world datasets.
RHiOTS incorporates an innovative visualization component, turning complex, multidimensional robustness evaluation results into intuitive, easily interpretable visuals.
Our findings show that traditional statistical methods are more robust than state-of-the-art deep learning algorithms, except when the transformation effect is highly disruptive.
arXiv Detail & Related papers (2024-08-06T18:52:15Z)
- End-to-End Learning for Fair Multiobjective Optimization Under Uncertainty [55.04219793298687]
The Predict-Then-Optimize (PtO) paradigm in machine learning aims to maximize downstream decision quality.
This paper extends the PtO methodology to optimization problems with nondifferentiable Ordered Weighted Averaging (OWA) objectives.
It shows how optimization of OWA functions can be effectively integrated with parametric prediction for fair and robust optimization under uncertainty.
arXiv Detail & Related papers (2024-02-12T16:33:35Z)
- Embedded feature selection in LSTM networks with multi-objective evolutionary ensemble learning for time series forecasting [49.1574468325115]
We present a novel feature selection method embedded in Long Short-Term Memory networks.
Our approach optimizes the weights and biases of the LSTM in a partitioned manner.
Experimental evaluations on air quality time series data from Italy and southeast Spain demonstrate that our method substantially improves the generalization ability of conventional LSTMs.
arXiv Detail & Related papers (2023-12-29T08:42:10Z)
- Fine-Tuning Adaptive Stochastic Optimizers: Determining the Optimal Hyperparameter $ε$ via Gradient Magnitude Histogram Analysis [0.7366405857677226]
We introduce a new framework based on the empirical probability density function of the gradient magnitude, termed the "gradient magnitude histogram".
We propose a novel algorithm that uses gradient magnitude histograms to automatically estimate a refined and accurate search space for the optimal safeguard hyperparameter $ε$.
arXiv Detail & Related papers (2023-11-20T04:34:19Z)
- Federated Conditional Stochastic Optimization [110.513884892319]
Conditional stochastic optimization has found applications in a wide range of machine learning tasks, such as invariant learning, AUPRC maximization, and MAML.
This paper proposes conditional stochastic optimization algorithms for federated learning.
arXiv Detail & Related papers (2023-10-04T01:47:37Z)
- Applications of Nature-Inspired Metaheuristic Algorithms for Tackling Optimization Problems Across Disciplines [12.664160352147293]
This paper demonstrates the usefulness of nature-inspired metaheuristic algorithms for solving a variety of challenging optimization problems in statistics.
The main goal of this paper is to show that a typical metaheuristic algorithm, like CSO-MA, is efficient for tackling many different types of optimization problems in statistics.
arXiv Detail & Related papers (2023-08-08T16:41:33Z)
- Multi-objective hyperparameter optimization with performance uncertainty [62.997667081978825]
This paper presents results on multi-objective hyperparameter optimization with uncertainty on the evaluation of Machine Learning algorithms.
We combine the sampling strategy of Tree-structured Parzen Estimators (TPE) with the metamodel obtained after training a Gaussian Process Regression (GPR) with heterogeneous noise.
Experimental results on three analytical test functions and three ML problems show the improvement over multi-objective TPE and GPR.
arXiv Detail & Related papers (2022-09-09T14:58:43Z)
- Adaptive LASSO estimation for functional hidden dynamic geostatistical model [69.10717733870575]
We propose a novel model selection algorithm based on a penalized maximum likelihood estimator (PMLE) for functional hidden dynamic geostatistical models (f-HDGM).
The algorithm is based on iterative optimisation and uses an adaptive least absolute shrinkage and selection operator (GMSOLAS) penalty function, wherein the weights are obtained from the unpenalised f-HDGM maximum-likelihood estimators.
arXiv Detail & Related papers (2022-08-10T19:17:45Z)
- Stochastic batch size for adaptive regularization in deep network optimization [63.68104397173262]
We propose a first-order optimization algorithm incorporating adaptive regularization applicable to machine learning problems in the deep learning framework.
We empirically demonstrate the effectiveness of our algorithm using an image classification task based on conventional network models applied to commonly used benchmark datasets.
arXiv Detail & Related papers (2020-04-14T07:54:53Z)