A Parameter-free Adaptive Resonance Theory-based Topological Clustering
Algorithm Capable of Continual Learning
- URL: http://arxiv.org/abs/2305.01507v2
- Date: Wed, 3 May 2023 01:58:25 GMT
- Title: A Parameter-free Adaptive Resonance Theory-based Topological Clustering
Algorithm Capable of Continual Learning
- Authors: Naoki Masuyama, Takanori Takebayashi, Yusuke Nojima, Chu Kiong Loo,
Hisao Ishibuchi, Stefan Wermter
- Abstract summary: We propose a new parameter-free ART-based topological clustering algorithm capable of continual learning by introducing parameter estimation methods.
Experimental results with synthetic and real-world datasets show that the proposed algorithm has superior clustering performance to the state-of-the-art clustering algorithms without any parameter pre-specifications.
- Score: 20.995946115633963
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In general, a similarity threshold (i.e., a vigilance parameter) for a node
learning process in Adaptive Resonance Theory (ART)-based algorithms has a
significant impact on clustering performance. In addition, an edge deletion
threshold in a topological clustering algorithm plays an important role in
adaptively generating well-separated clusters during a self-organizing process.
In this paper, we propose a new parameter-free ART-based topological clustering
algorithm capable of continual learning by introducing parameter estimation
methods. Experimental results with synthetic and real-world datasets show that
the proposed algorithm has superior clustering performance to the
state-of-the-art clustering algorithms without any parameter
pre-specifications.
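The abstract does not spell out the estimation methods, but the two thresholds it refers to can be made concrete. Below is a minimal Python sketch of one ART-style topological node-learning step, assuming Euclidean distances; the estimators shown (median pairwise node distance for the vigilance, mean edge age for edge deletion) are illustrative placeholders, not the paper's actual formulas, and all function and variable names are hypothetical.

```python
# Sketch of an ART-style topological node-learning step (NOT the authors'
# exact algorithm): a vigilance test decides whether an input is absorbed
# by an existing node or spawns a new one, and an edge-deletion threshold
# prunes stale topological links. The threshold estimates used here are
# placeholders for the paper's parameter estimation methods.
import numpy as np

def learn_sample(x, nodes, edges, ages, lr=0.1):
    """One self-organizing step on input vector x.

    nodes : list of np.ndarray weight vectors
    edges : set of frozenset({i, j}) topological links
    ages  : dict mapping each edge to its age counter
    """
    x = np.asarray(x, dtype=float)
    if len(nodes) < 2:
        nodes.append(x.copy())
        return

    dists = np.array([np.linalg.norm(x - w) for w in nodes])
    s1, s2 = np.argsort(dists)[:2]          # first and second winners

    # Placeholder vigilance (similarity) threshold: a statistic of the
    # current node-to-node distance distribution.
    pairwise = [np.linalg.norm(a - b) for i, a in enumerate(nodes)
                for b in nodes[i + 1:]]
    vigilance = np.median(pairwise)

    if dists[s1] > vigilance:
        nodes.append(x.copy())               # resonance failed: new node
        return

    # Resonance: move the winner toward x and link the two winners.
    nodes[s1] = nodes[s1] + lr * (x - nodes[s1])
    e = frozenset((int(s1), int(s2)))
    edges.add(e)
    ages[e] = 0
    for other in list(edges - {e}):          # age the winner's other edges
        if int(s1) in other:
            ages[other] = ages.get(other, 0) + 1

    # Placeholder edge-deletion threshold estimated from current edge ages.
    max_age = 2 * np.mean(list(ages.values())) + 1
    for old in [k for k, v in ages.items() if v > max_age]:
        edges.discard(old)
        ages.pop(old)
```

With the paper's online estimates in place of these placeholder statistics, the vigilance and edge-deletion thresholds adapt to the data stream, which is what allows the algorithm to run without any pre-specified parameters.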
Related papers
- A3S: A General Active Clustering Method with Pairwise Constraints [66.74627463101837]
A3S features strategic active clustering adjustment on the initial cluster result, which is obtained by an adaptive clustering algorithm.
In extensive experiments across diverse real-world datasets, A3S achieves desired results with significantly fewer human queries.
arXiv Detail & Related papers (2024-07-14T13:37:03Z)
- Improved Algorithms for Neural Active Learning [74.89097665112621]
We improve the theoretical and empirical performance of neural-network(NN)-based active learning algorithms for the non-parametric streaming setting.
We introduce two regret metrics, defined by minimizing the population loss, that are more suitable for active learning than those used in state-of-the-art (SOTA) related work.
arXiv Detail & Related papers (2022-10-02T05:03:38Z)
- Class-wise Classifier Design Capable of Continual Learning using Adaptive Resonance Theory-based Topological Clustering [4.772368796656325]
This paper proposes a supervised classification algorithm capable of continual learning by utilizing an Adaptive Resonance Theory (ART)-based growing self-organizing clustering algorithm.
The ART-based clustering algorithm is theoretically capable of continual learning.
The proposed algorithm has superior classification performance compared with state-of-the-art clustering-based classification algorithms capable of continual learning.
arXiv Detail & Related papers (2022-03-18T11:43:12Z)
- Gradient Based Clustering [72.15857783681658]
We propose a general approach for distance-based clustering, using the gradient of the cost function that measures clustering quality.
The approach is an iterative two step procedure (alternating between cluster assignment and cluster center updates) and is applicable to a wide range of functions.
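As a concrete illustration of the two-step procedure described above, here is a minimal sketch of one iteration under a squared-Euclidean cost (the paper targets a wider class of cost functions); the function and variable names are hypothetical.

```python
# One (assignment, gradient-update) iteration of gradient-based clustering,
# sketched for the squared-Euclidean cost as an illustrative assumption.
import numpy as np

def gradient_clustering_step(X, centers, step=0.5):
    """X : (n, d) data matrix; centers : (k, d) current cluster centers."""
    # Step 1: assign each point to its nearest center.
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    assign = d2.argmin(axis=1)

    # Step 2: move each center along the negative gradient of the cost
    # J = 0.5 * sum_i ||x_i - c_{assign(i)}||^2 with respect to that center.
    new_centers = centers.copy()
    for j in range(centers.shape[0]):
        members = X[assign == j]
        if len(members):
            grad = (centers[j] - members).sum(axis=0)   # dJ/dc_j
            new_centers[j] = centers[j] - step * grad / len(members)
    return new_centers, assign
```

With step = 1 this gradient update reduces to the familiar k-means mean update, which shows how the gradient view generalizes distance-based clustering.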
arXiv Detail & Related papers (2022-02-01T19:31:15Z)
- Adaptive Resonance Theory-based Topological Clustering with a Divisive Hierarchical Structure Capable of Continual Learning [8.581682204722894]
This paper proposes an ART-based topological clustering algorithm with a mechanism that automatically estimates a similarity threshold from a distribution of data points.
To improve information extraction performance, a divisive hierarchical clustering algorithm capable of continual learning is also proposed.
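The entry above does not state the estimator, so the following is only a hedged sketch of the general idea of deriving a similarity threshold from a distribution of data points, here via the mean nearest-neighbor distance over a small buffer of recent samples; the estimator actually used in the paper may differ, and the function name is hypothetical.

```python
# Illustrative only: estimate a similarity threshold from the data
# distribution as the mean nearest-neighbor distance in a sample buffer.
import numpy as np

def estimate_similarity_threshold(buffer):
    """buffer : (m, d) array of recent data points, m >= 2."""
    d = np.linalg.norm(buffer[:, None, :] - buffer[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)            # ignore self-distances
    return d.min(axis=1).mean()            # mean nearest-neighbor distance
```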
arXiv Detail & Related papers (2022-01-26T02:34:52Z)
- An iterative clustering algorithm for the Contextual Stochastic Block Model with optimality guarantees [4.007017852999008]
We propose a new iterative algorithm to cluster networks with side information for nodes.
We show that our algorithm is optimal under the Contextual Symmetric Block Model.
arXiv Detail & Related papers (2021-12-20T12:04:07Z)
- Fractal Structure and Generalization Properties of Stochastic Optimization Algorithms [71.62575565990502]
We prove that the generalization error of an optimization algorithm can be bounded in terms of the complexity of the fractal structure that underlies its generalization measure.
We further specialize our results to specific problems (e.g., linear/logistic regression, one-hidden-layer neural networks) and algorithms.
arXiv Detail & Related papers (2021-06-09T08:05:36Z)
- Unsupervised Clustered Federated Learning in Complex Multi-source Acoustic Environments [75.8001929811943]
We introduce a realistic and challenging multi-source, multi-room acoustic environment.
We present an improved clustering control strategy that takes into account the variability of the acoustic scene.
The proposed approach is optimized using clustering-based measures and validated via a network-wide classification task.
arXiv Detail & Related papers (2021-06-07T14:51:39Z)
- Online Deterministic Annealing for Classification and Clustering [0.0]
We introduce an online prototype-based learning algorithm for clustering and classification.
We show that the proposed algorithm constitutes a competitive-learning neural network, the learning rule of which is formulated as an online approximation algorithm.
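As a rough sketch of such an online prototype-based rule in the deterministic-annealing style: each prototype is pulled toward a sample in proportion to a temperature-controlled soft membership, and the temperature is lowered over time. The exact learning rule and schedules are in the paper; everything below is an assumption, with hypothetical names.

```python
# One online stochastic-approximation step of a soft competitive-learning
# prototype update with a temperature parameter (deterministic-annealing
# style); lowering the temperature over time hardens the assignments.
import numpy as np

def oda_update(x, prototypes, temperature, lr):
    """x : (d,) sample; prototypes : (k, d); returns updated prototypes."""
    d2 = ((prototypes - x) ** 2).sum(axis=1)
    w = np.exp(-(d2 - d2.min()) / temperature)   # Gibbs soft memberships
    w /= w.sum()
    # Competitive-learning rule: soft winner-take-all prototype update.
    return prototypes + lr * w[:, None] * (x - prototypes)
```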
arXiv Detail & Related papers (2021-02-11T04:04:21Z)
- A self-adaptive and robust fission clustering algorithm via heat diffusion and maximal turning angle [4.246818236277977]
A novel and fast clustering algorithm, called fission clustering, was recently proposed.
We propose a robust fission clustering (RFC) algorithm and a self-adaptive noise identification method.
arXiv Detail & Related papers (2021-02-07T13:16:47Z)
- Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z)