Accelerated Fuzzy C-Means Clustering Based on New Affinity Filtering and
Membership Scaling
- URL: http://arxiv.org/abs/2302.07060v1
- Date: Tue, 14 Feb 2023 14:20:31 GMT
- Title: Accelerated Fuzzy C-Means Clustering Based on New Affinity Filtering and
Membership Scaling
- Authors: Dong Li, Shuisheng Zhou, and Witold Pedrycz
- Abstract summary: Fuzzy C-Means (FCM) is a widely used clustering method.
FCM has low efficiency in the mid-to-late stage of the clustering process.
FCM based on new affinity filtering and membership scaling (AMFCM) is proposed to accelerate the whole convergence process.
- Score: 74.85538972921917
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Fuzzy C-Means (FCM) is a widely used clustering method. However, FCM and its
many accelerated variants have low efficiency in the mid-to-late stage of the
clustering process. In this stage, all samples are involved in the update of
their non-affinity centers, and the fuzzy membership grades of most samples, whose
assignments are unchanged, are still updated by computing the sample-center
distances. Both of these factors cause the algorithms to converge slowly.
In this paper, a new affinity filtering technique is developed to recognize a
complete set of non-affinity centers for each sample at a low computational cost.
Then, a new membership scaling technique is suggested to set the membership
grades between each sample and its non-affinity centers to 0 and maintain the
fuzzy membership grades for the remaining centers. By integrating these two techniques, FCM
based on new affinity filtering and membership scaling (AMFCM) is proposed to
accelerate the whole convergence process of FCM. Experimental results on
synthetic and real-world data sets demonstrate the feasibility and
efficiency of the proposed algorithm. Compared with state-of-the-art
algorithms, AMFCM is significantly faster and more effective. For example,
AMFCM reduces the number of FCM iterations by 80% on average.
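For intuition, the sketch below shows one FCM iteration that applies the idea described above: memberships between a sample and its filtered-out (non-affinity) centers are set to 0, so those centers no longer receive contributions from that sample. This is a minimal NumPy illustration, not the authors' implementation: the distance-ratio test used as the affinity filter, the renormalization of the remaining memberships, and the function name fcm_step_with_scaling are illustrative assumptions; the paper's actual filtering criterion and scaling rule are given in the full text.

```python
# Minimal NumPy sketch of one fuzzy C-means (FCM) iteration with the
# "zero out non-affinity memberships" idea from the abstract. The
# distance-ratio affinity filter and the row-wise renormalization are
# placeholder assumptions, not the rules derived in the paper.
import numpy as np

def fcm_step_with_scaling(X, C, m=2.0, ratio=4.0):
    """X: (n, d) samples, C: (c, d) current centers, m: fuzzifier (> 1)."""
    # Squared sample-center distances, shape (n, c).
    d2 = ((X[:, None, :] - C[None, :, :]) ** 2).sum(axis=-1) + 1e-12

    # Standard FCM membership update: u_ij is proportional to d_ij^(-2/(m-1)).
    u = d2 ** (-1.0 / (m - 1.0))
    u /= u.sum(axis=1, keepdims=True)

    # Placeholder affinity filter: keep centers whose squared distance is
    # within `ratio` times the nearest center's squared distance.
    affinity = d2 <= ratio * d2.min(axis=1, keepdims=True)

    # Zero the memberships of non-affinity centers; rescale the rest so
    # each row sums to 1 (an assumed convention for the scaling step).
    u = np.where(affinity, u, 0.0)
    u /= u.sum(axis=1, keepdims=True)

    # Center update now only aggregates samples with nonzero membership.
    w = u ** m
    C_new = (w.T @ X) / w.sum(axis=0)[:, None]
    return u, C_new
```

In the full algorithm, skipping the distance computations for filtered-out sample-center pairs is what accelerates the mid-to-late iterations.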
Related papers
- Beyond adaptive gradient: Fast-Controlled Minibatch Algorithm for large-scale optimization [1.6749379740049926]
We introduce F-CMA, a Fast-Controlled Mini-batch Algorithm with a random reshuffling method featuring a sufficient decrease condition and a line-search procedure to ensure loss reduction per epoch.
Tests show significant improvements, including a decrease in overall training time by 68%, an increase in per-epoch efficiency by up to 20%, and an increase in model accuracy by up to 5%.
arXiv Detail & Related papers (2024-11-24T11:46:47Z) - Fuzzy K-Means Clustering without Cluster Centroids [21.256564324236333]
Fuzzy K-Means clustering is a critical technique in unsupervised data analysis.
This paper proposes a novel Fuzzy K-Means clustering algorithm that entirely eliminates the reliance on cluster centroids.
arXiv Detail & Related papers (2024-04-07T12:25:03Z) - DFedADMM: Dual Constraints Controlled Model Inconsistency for
Decentralized Federated Learning [52.83811558753284]
Decentralized federated learning (DFL) discards the central server and establishes a decentralized communication network.
Existing DFL methods still suffer from two major challenges: local inconsistency and local overfitting.
arXiv Detail & Related papers (2023-08-16T11:22:36Z) - Large-scale Optimization of Partial AUC in a Range of False Positive
Rates [51.12047280149546]
The area under the ROC curve (AUC) is one of the most widely used performance measures for classification models in machine learning.
We develop an efficient approximated gradient descent method based on a recent practical envelope smoothing technique.
Our proposed algorithm can also be used to minimize the sum of ranked range loss, which also lacks efficient solvers.
arXiv Detail & Related papers (2022-03-03T03:46:18Z) - Faster One-Sample Stochastic Conditional Gradient Method for Composite
Convex Minimization [61.26619639722804]
We propose a conditional gradient method (CGM) for minimizing convex finite-sum objectives formed as a sum of smooth and non-smooth terms.
The proposed method, equipped with a stochastic average gradient (SAG) estimator, requires only one sample per iteration. Nevertheless, it guarantees fast convergence rates on par with more sophisticated variance reduction techniques.
arXiv Detail & Related papers (2022-02-26T19:10:48Z) - Self-supervised Symmetric Nonnegative Matrix Factorization [82.59905231819685]
Symmetric nonnegative matrix factorization (SNMF) has been shown to be a powerful method for data clustering.
Inspired by ensemble clustering, which aims to seek better clustering results, we propose self-supervised SNMF (S$^3$NMF).
We take advantage of the sensitivity to code characteristic of SNMF, without relying on any additional information.
arXiv Detail & Related papers (2021-03-02T12:47:40Z) - Interval Type-2 Enhanced Possibilistic Fuzzy C-Means Clustering for Gene
Expression Data Analysis [0.6445605125467573]
Both FCM and PCM clustering methods have been widely applied to pattern recognition and data clustering.
PFCM extends the PCM model by combining FCM and PCM, but it still suffers from the weaknesses of both methods.
In the current paper, the weaknesses of the PFCM algorithm are corrected and the enhanced possibilistic fuzzy c-means (EPFCM) clustering algorithm is presented.
arXiv Detail & Related papers (2021-01-01T19:29:24Z) - Modified Possibilistic Fuzzy C-Means Algorithm for Clustering Incomplete
Data Sets [0.0]
The possibilistic fuzzy c-means (PFCM) algorithm has been proposed to address the weaknesses of two popular clustering algorithms, fuzzy c-means (FCM) and possibilistic c-means (PCM).
arXiv Detail & Related papers (2020-07-09T16:12:11Z) - A Centroid Auto-Fused Hierarchical Fuzzy c-Means Clustering [30.709797128259236]
We present a Centroid Auto-Fused Hierarchical Fuzzy c-means method (CAF-HFCM) whose optimization procedure can automatically agglomerate to form a cluster hierarchy.
Our proposed CAF-HFCM method can be straightforwardly extended to various variants of FCM.
arXiv Detail & Related papers (2020-04-27T12:59:22Z) - Kullback-Leibler Divergence-Based Fuzzy $C$-Means Clustering
Incorporating Morphological Reconstruction and Wavelet Frames for Image
Segmentation [152.609322951917]
We come up with a Kullback-Leibler (KL) divergence-based Fuzzy C-Means (FCM) algorithm by incorporating a tight wavelet frame transform and a morphological reconstruction operation.
The proposed algorithm works well and achieves better segmentation performance than comparable algorithms.
arXiv Detail & Related papers (2020-02-21T05:19:10Z)
This list is automatically generated from the titles and abstracts of the papers on this site.