Memory Enhanced Fractional-Order Dung Beetle Optimization for Photovoltaic Parameter Identification
- URL: http://arxiv.org/abs/2508.06841v1
- Date: Sat, 09 Aug 2025 05:48:30 GMT
- Title: Memory Enhanced Fractional-Order Dung Beetle Optimization for Photovoltaic Parameter Identification
- Authors: Yiwei Li, Zhihua Allen-Zhao, Yuncheng Xu, Sanyang Liu
- Abstract summary: This paper proposes a Memory Enhanced Fractional-Order Dung Beetle Optimization (MFO-DBO) algorithm that integrates three coordinated strategies. It consistently outperforms advanced DBO variants, FO-based algorithms, enhanced classical algorithms, and recent metaheuristics in terms of accuracy, robustness, and convergence speed.
- Score: 8.924286864388922
- License: http://creativecommons.org/licenses/by-nc-sa/4.0/
- Abstract: Accurate parameter identification in photovoltaic (PV) models is crucial for performance evaluation but remains challenging due to their nonlinear, multimodal, and high-dimensional nature. Although the Dung Beetle Optimization (DBO) algorithm has shown potential in addressing such problems, it often suffers from premature convergence. To overcome this limitation, this paper proposes a Memory Enhanced Fractional-Order Dung Beetle Optimization (MFO-DBO) algorithm that integrates three coordinated strategies. Firstly, fractional-order (FO) calculus introduces memory into the search process, enhancing convergence stability and solution quality. Secondly, a fractional-order logistic chaotic map improves population diversity during initialization. Thirdly, a chaotic perturbation mechanism helps elite solutions escape local optima. Numerical results on the CEC2017 benchmark suite and the PV parameter identification problem demonstrate that MFO-DBO consistently outperforms advanced DBO variants, CEC competition winners, FO-based optimizers, enhanced classical algorithms, and recent metaheuristics in terms of accuracy, robustness, and convergence speed, while also maintaining an excellent balance between exploration and exploitation compared to the standard DBO algorithm.
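The three strategies from the abstract can be sketched on a toy single-diode PV fitting problem. This is a minimal illustration, not the paper's algorithm: the DBO role-based update rules are replaced by a generic best-guided step, the chaotic map is the classical (integer-order) logistic map rather than the fractional-order variant, and all function names, bounds, and the synthetic "measured" data are our own assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Toy single-diode PV model (standard formulation); the "measured"
# --- data below are synthetic, generated from hand-picked parameters.
Vt = 0.0258                                   # thermal voltage (~300 K), volts
V_meas = np.linspace(0.0, 0.55, 20)
true_p = np.array([0.76, 3.2e-7, 0.036, 54.0, 1.48])  # Iph, I0, Rs, Rsh, n

def diode_current(V, I, p):
    """Implicit single-diode equation evaluated at the point (V, I)."""
    Iph, I0, Rs, Rsh, n = p
    return Iph - I0 * np.expm1((V + I * Rs) / (n * Vt)) - (V + I * Rs) / Rsh

I_meas = np.full_like(V_meas, true_p[0])
for _ in range(200):                          # fixed-point iteration for I(V)
    I_meas = diode_current(V_meas, I_meas, true_p)

def rmse(p):
    """RMS residual of the implicit equation (the usual PV fitting metric)."""
    return np.sqrt(np.mean((diode_current(V_meas, I_meas, p) - I_meas) ** 2))

def logistic_map_init(pop_size, dim, lb, ub, x0=0.7, r=4.0):
    """Strategy 2 (simplified): chaotic initialization via the logistic map."""
    seq = np.empty(pop_size * dim)
    x = x0
    for k in range(seq.size):
        x = r * x * (1.0 - x)                 # logistic recurrence in (0, 1)
        seq[k] = x
    return lb + seq.reshape(pop_size, dim) * (ub - lb)

def fractional_memory(increments, alpha=0.6):
    """Strategy 1: truncated Grunwald-Letnikov series -- a decaying weighted
    sum of recent position increments (increments[0] is the newest)."""
    w, v = alpha, np.zeros_like(increments[0])
    for k, dx in enumerate(increments):
        v += w * dx
        w *= (k + 1 - alpha) / (k + 2)        # w_{k+1} = w_k (k+1-alpha)/(k+2)
    return v

lb = np.array([0.5, 1e-9, 0.0, 10.0, 1.0])    # bounds for Iph, I0, Rs, Rsh, n
ub = np.array([1.0, 1e-5, 0.1, 100.0, 2.0])
pop = logistic_map_init(30, 5, lb, ub)
fit = np.array([rmse(p) for p in pop])
init_best = fit.min()
hist = [np.zeros_like(pop) for _ in range(4)] # last four accepted increments

for t in range(300):
    best = pop[fit.argmin()]
    step = 0.5 * rng.random(pop.shape) * (best - pop) + fractional_memory(hist)
    cand = np.clip(pop + step, lb, ub)
    cfit = np.array([rmse(p) for p in cand])
    improved = cfit < fit
    hist = [np.where(improved[:, None], cand - pop, 0.0)] + hist[:3]
    pop[improved], fit[improved] = cand[improved], cfit[improved]
    z = rng.random()
    for _ in range(3):
        z = 4.0 * z * (1.0 - z)               # Strategy 3: chaotic elite jitter
    trial = np.clip(pop[fit.argmin()] + 0.01 * (2 * z - 1) * (ub - lb), lb, ub)
    if rmse(trial) < fit.min():               # accept by replacing the worst
        worst = fit.argmax()
        pop[worst], fit[worst] = trial, rmse(trial)

print(f"initial best RMSE: {init_best:.3e}, final best RMSE: {fit.min():.3e}")
```

Because both acceptance rules are greedy, the best RMSE is monotonically non-increasing; the memory term reuses past successful increments, which is what distinguishes the fractional-order update from a memoryless one.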
Related papers
- Beyond Confidence: Adaptive and Coherent Decoding for Diffusion Language Models [64.92045568376705]
Coherent Contextual Decoding (CCD) is a novel inference framework built upon two core innovations. CCD employs a trajectory rectification mechanism that leverages historical context to enhance sequence coherence. Instead of rigid allocations based on diffusion steps, we introduce an adaptive sampling strategy that dynamically adjusts the unmasking budget for each step.
arXiv Detail & Related papers (2025-11-26T09:49:48Z) - Problem-Parameter-Free Decentralized Bilevel Optimization [31.15538292038612]
Decentralized bilevel optimization has garnered significant attention due to its critical role in solving large-scale machine learning problems. In this paper, we propose AdaSDBO, a fully problem-parameter-free algorithm for decentralized bilevel optimization with a single-loop structure. Through rigorous theoretical analysis, we establish that AdaSDBO achieves a convergence rate of $\widetilde{\mathcal{O}}\left(\frac{1}{T}\right)$, matching the performance of well-tuned state-of-the-art methods up to polylogarithmic factors.
arXiv Detail & Related papers (2025-10-28T10:50:04Z) - Nonlinear Dimensionality Reduction Techniques for Bayesian Optimization [0.9303501974597549]
We investigate nonlinear dimensionality reduction techniques that reduce the problem to a sequence of low-dimensional Latent-Space BO (LSBO) problems. We propose some changes to their implementation, originally designed for tasks such as molecule generation, and reformulate the algorithm for broader optimisation purposes. We then couple LSBO with Sequential Domain Reduction (SDR) directly in the latent space (SDR-LSBO), yielding an algorithm that narrows the latent search domains as evidence accumulates.
arXiv Detail & Related papers (2025-10-17T08:45:38Z) - Adam assisted Fully informed Particle Swarm Optimization (Adam-FIPSO) based Parameter Prediction for the Quantum Approximate Optimization Algorithm (QAOA) [1.024113475677323]
The Quantum Approximate Optimization Algorithm (QAOA) is a prominent variational algorithm used for solving optimization problems such as the Max-Cut problem. A key challenge in QAOA lies in efficiently identifying suitable parameters that lead to high-quality solutions.
arXiv Detail & Related papers (2025-06-07T13:14:41Z) - A Memetic Algorithm based on Variational Autoencoder for Black-Box Discrete Optimization with Epistasis among Parameters [2.6774008509841005]
Black-box discrete optimization (BB-DO) problems arise in many real-world applications. A key challenge in BB-DO is epistasis among parameters, where multiple variables must be modified simultaneously. We propose a new memetic algorithm that combines VAE-based sampling with local search.
arXiv Detail & Related papers (2025-04-30T05:56:22Z) - A Stochastic Approach to Bi-Level Optimization for Hyperparameter Optimization and Meta Learning [74.80956524812714]
We tackle the general differentiable meta learning problem that is ubiquitous in modern deep learning.
These problems are often formalized as Bi-Level optimizations (BLO)
We introduce a novel perspective by turning a given BLO problem into a stochastic optimization, where the inner loss function becomes a smooth distribution and the outer loss becomes an expected loss over the inner distribution.
arXiv Detail & Related papers (2024-10-14T12:10:06Z) - Memory-Efficient Optimization with Factorized Hamiltonian Descent [11.01832755213396]
We introduce H-Fac, a novel adaptive optimizer that incorporates a memory-efficient factorization approach to address this challenge.
By employing a rank-1 parameterization for both momentum and scaling parameter estimators, H-Fac reduces memory costs to a sublinear level.
We develop our algorithms based on principles derived from Hamiltonian dynamics, providing robust theoretical underpinnings in optimization dynamics and convergence guarantees.
arXiv Detail & Related papers (2024-06-14T12:05:17Z) - SGD with Partial Hessian for Deep Neural Networks Optimization [18.78728272603732]
We propose a compound optimizer that combines a second-order method, using a precise partial Hessian matrix to update channel-wise parameters, with the first-order stochastic gradient descent (SGD) algorithm for updating the other parameters.
Compared with first-order methods, it adopts a certain amount of information from the Hessian matrix to assist optimization; compared with existing second-order methods, it retains the good generalization performance of first-order methods.
arXiv Detail & Related papers (2024-03-05T06:10:21Z) - Model-based Causal Bayesian Optimization [78.120734120667]
We propose model-based causal Bayesian optimization (MCBO).
MCBO learns a full system model instead of only modeling intervention-reward pairs.
Unlike in standard Bayesian optimization, our acquisition function cannot be evaluated in closed form.
arXiv Detail & Related papers (2022-11-18T14:28:21Z) - Nesterov Meets Optimism: Rate-Optimal Separable Minimax Optimization [108.35402316802765]
We propose a new first-order optimization algorithm -- Accelerated Gradient-Optimistic Gradient (AG-OG) Descent Ascent.
We show that AG-OG achieves the optimal convergence rate (up to a constant) for a variety of settings.
We extend our algorithm to the stochastic setting and achieve the optimal convergence rate in both bi-SC-SC and bi-C-SC settings.
arXiv Detail & Related papers (2022-10-31T17:59:29Z) - Exploring the Algorithm-Dependent Generalization of AUPRC Optimization with List Stability [107.65337427333064]
Optimization of the Area Under the Precision-Recall Curve (AUPRC) is a crucial problem for machine learning.
In this work, we present the first trial in the algorithm-dependent generalization of AUPRC optimization.
Experiments on three image retrieval datasets speak to the effectiveness and soundness of our framework.
arXiv Detail & Related papers (2022-09-27T09:06:37Z) - Bilevel Optimization: Convergence Analysis and Enhanced Design [63.64636047748605]
Bilevel optimization is a tool for many machine learning problems.
We propose stocBiO, a novel stochastic bilevel optimizer built on a sample-efficient hypergradient estimator.
arXiv Detail & Related papers (2020-10-15T18:09:48Z) - Distributionally Robust Bayesian Optimization [121.71766171427433]
We present a novel distributionally robust Bayesian optimization algorithm (DRBO) for zeroth-order, noisy optimization.
Our algorithm provably obtains sub-linear robust regret in various settings.
We demonstrate the robust performance of our method on both synthetic and real-world benchmarks.
arXiv Detail & Related papers (2020-02-20T22:04:30Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.