GBO: A Multi-Granularity Optimization Algorithm via Granular-ball for Continuous Problems
- URL: http://arxiv.org/abs/2303.12807v2
- Date: Tue, 18 Feb 2025 10:07:23 GMT
- Title: GBO: A Multi-Granularity Optimization Algorithm via Granular-ball for Continuous Problems
- Authors: Shuyin Xia, Xinyu Lin, Guan Wang, De-Gang Chen, Sen Zhao, Guoyin Wang, Jing Liang,
- Abstract summary: This paper proposes a new multi-granularity evolutionary optimization method, namely the Granular-ball Optimization (GBO) algorithm.
Using granular-balls instead of traditional points for optimization increases the diversity and robustness of the random search process.
Experiments on commonly used benchmarks have shown that GBO outperforms popular and advanced evolutionary algorithms.
- Score: 18.229944332377755
- License:
- Abstract: Optimization problems aim to find the optimal solution and are becoming increasingly complex and difficult to solve. Traditional evolutionary optimization methods typically overlook the granular characteristics of the solution space. In many real optimization scenarios, the solution space can be partitioned into sub-regions with varying distributions, and these sub-regions exhibit different granularity characteristics in terms of search potential and difficulty. Considering the granular characteristics of the solution space, the number of coarse-grained regions is smaller than the number of points, so computation over them is more efficient; moreover, coarse-grained characteristics are not easily perturbed by individual fine-grained sample points, so the computation is also more robust. To this end, this paper proposes a new multi-granularity evolutionary optimization method, the Granular-ball Optimization (GBO) algorithm, which characterizes and searches the solution space from coarse to fine. Specifically, using granular-balls instead of traditional points for optimization increases the diversity and robustness of the random search process. At the same time, the search range in different iterations is limited by the radius of the granular-balls, covering the solution space from large to small. A granular-ball splitting mechanism continuously splits and evolves large granular-balls into smaller ones to refine the solution space. Extensive experiments on commonly used benchmarks show that GBO outperforms popular and advanced evolutionary algorithms. The code can be found in the supporting materials.
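The abstract describes the search loop at a high level: maintain granular-balls (a center plus a radius) rather than single points, sample candidates inside each ball, shrink the radius across iterations to move from coarse to fine, and split promising balls into smaller child balls. The Python sketch below illustrates that coarse-to-fine idea; it is not the authors' GBO implementation, and the splitting rule, shrink factor, and population sizes are illustrative assumptions.

```python
# Minimal sketch of coarse-to-fine search with granular-balls (center, radius).
# Illustrative only: the splitting rule, shrink factor, and population sizes
# below are assumptions, not the authors' GBO algorithm.
import numpy as np

def sphere(x):
    """Toy objective to minimize."""
    return float(np.sum(x ** 2))

def sample_in_ball(center, radius, n, rng):
    """Draw n points uniformly from the ball with the given center and radius."""
    d = center.size
    directions = rng.normal(size=(n, d))
    directions /= np.linalg.norm(directions, axis=1, keepdims=True)
    r = radius * rng.random(n) ** (1.0 / d)   # uniform radii inside the ball
    return center + directions * r[:, None]

def granular_ball_search(f, bounds, dim, n_balls=5, samples_per_ball=20,
                         shrink=0.7, n_iters=50, seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    radius = 0.5 * (hi - lo)                           # start coarse
    centers = rng.uniform(lo, hi, size=(n_balls, dim))
    best_x, best_f = None, np.inf

    for _ in range(n_iters):
        children = []
        for c in centers:
            pts = np.clip(sample_in_ball(c, radius, samples_per_ball, rng), lo, hi)
            vals = np.array([f(p) for p in pts])
            order = np.argsort(vals)
            if vals[order[0]] < best_f:
                best_f, best_x = vals[order[0]], pts[order[0]]
            # "split": the two best samples become centers of smaller child balls
            children.extend(pts[order[:2]])
        children = np.array(children)
        scores = np.array([f(c) for c in children])
        centers = children[np.argsort(scores)[:n_balls]]  # keep promising balls
        radius *= shrink                                   # refine: coarse -> fine
    return best_x, best_f

if __name__ == "__main__":
    x, fx = granular_ball_search(sphere, bounds=(-5.0, 5.0), dim=10)
    print("best value found:", fx)
```

The shrinking radius plays the role described in the abstract: early iterations cover large regions cheaply, while later iterations concentrate the smaller balls around promising areas.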
Related papers
- GBFRS: Robust Fuzzy Rough Sets via Granular-ball Computing [48.33779268699777]
Fuzzy rough set theory is effective for processing datasets with complex attributes.
Most existing models operate at the finest granularity, rendering them inefficient and sensitive to noise.
This paper proposes integrating multi-granularity granular-ball computing into fuzzy rough set theory, using granular-balls to replace sample points.
arXiv Detail & Related papers (2025-01-30T15:09:26Z) - Scalable Bayesian Optimization via Focalized Sparse Gaussian Processes [8.40647440727154]
We argue that Bayesian optimization algorithms with sparse GPs can more efficiently allocate their representational power to relevant regions of the search space.
We show that FocalBO can efficiently leverage large amounts of offline and online data to achieve state-of-the-art performance on robot morphology design and to control a 585-dimensional musculoskeletal system.
arXiv Detail & Related papers (2024-12-29T06:36:15Z) - Accelerated Distributed Aggregative Optimization [5.5491171448159715]
We propose two novel algorithms named DAGT-HB and DAGT-NES for solving the distributed aggregative optimization problem.
We show that the DAGT-HB and DAGT-NES algorithms converge to an optimal solution at a global $\mathbf{R}$-linear convergence rate.
arXiv Detail & Related papers (2023-04-17T08:11:01Z) - Research on Efficient Fuzzy Clustering Method Based on Local Fuzzy Granular balls [67.33923111887933]
In this paper, the data is fuzzily iterated using granular-balls, and the membership degree of each data point considers only the two granular-balls in which it is located.
The resulting set of fuzzy granular-balls supports a wider range of processing methods for different data scenarios.
arXiv Detail & Related papers (2023-03-07T01:52:55Z) - An Improved Greedy Algorithm for Subset Selection in Linear Estimation [5.994412766684842]
We consider a subset selection problem in a spatial field where we seek to find a set of k locations whose observations provide the best estimate of the field value at a finite set of prediction locations.
One approach for observation selection is to perform a grid discretization of the space and obtain an approximate solution using the greedy algorithm.
We propose a method to reduce the computational complexity by considering a search space consisting only of prediction locations and centroids of cliques formed by the prediction locations.
arXiv Detail & Related papers (2022-03-30T05:52:16Z) - Density-Based Clustering with Kernel Diffusion [59.4179549482505]
A naive density corresponding to the indicator function of a unit $d$-dimensional Euclidean ball is commonly used in density-based clustering algorithms.
We propose a new kernel diffusion density function, which is adaptive to data of varying local distributional characteristics and smoothness.
arXiv Detail & Related papers (2021-10-11T09:00:33Z) - Intermediate Layer Optimization for Inverse Problems using Deep Generative Models [86.29330440222199]
ILO is a novel optimization algorithm for solving inverse problems with deep generative models.
We empirically show that our approach outperforms state-of-the-art methods introduced in StyleGAN-2 and PULSE for a wide range of inverse problems.
arXiv Detail & Related papers (2021-02-15T06:52:22Z) - Bayesian Variational Optimization for Combinatorial Spaces [0.0]
Broad applications include the study of molecules, proteins, DNA, device structures and quantum circuit designs.
Optimization over categorical spaces is needed to find optimal or Pareto-optimal solutions.
We introduce a variational Bayesian optimization method that combines variational optimization and continuous relaxations.
arXiv Detail & Related papers (2020-11-03T20:56:13Z) - Sequential Subspace Search for Functional Bayesian Optimization Incorporating Experimenter Intuition [63.011641517977644]
Our algorithm generates a sequence of finite-dimensional random subspaces of functional space spanned by a set of draws from the experimenter's Gaussian Process.
Standard Bayesian optimisation is applied on each subspace, and the best solution found is used as a starting point (origin) for the next subspace.
We test our algorithm in simulated and real-world experiments, namely blind function matching, finding the optimal precipitation-strengthening function for an aluminium alloy, and learning rate schedule optimisation for deep networks.
arXiv Detail & Related papers (2020-09-08T06:54:11Z) - Private Stochastic Convex Optimization: Optimal Rates in Linear Time [74.47681868973598]
We study the problem of minimizing the population loss given i.i.d. samples from a distribution over convex loss functions.
A recent work of Bassily et al. has established the optimal bound on the excess population loss achievable given $n$ samples.
We describe two new techniques for deriving convex optimization algorithms, both achieving the optimal bound on excess loss and using $O(\min\{n, n^2/d\})$ gradient computations.
arXiv Detail & Related papers (2020-05-10T19:52:03Z)