Adaptive Resonance Theory-based Topological Clustering with a Divisive
Hierarchical Structure Capable of Continual Learning
- URL: http://arxiv.org/abs/2201.10713v1
- Date: Wed, 26 Jan 2022 02:34:52 GMT
- Title: Adaptive Resonance Theory-based Topological Clustering with a Divisive
Hierarchical Structure Capable of Continual Learning
- Authors: Naoki Masuyama, Narito Amako, Yuna Yamada, Yusuke Nojima, Hisao
Ishibuchi
- Abstract summary: This paper proposes an ART-based topological clustering algorithm with a mechanism that automatically estimates a similarity threshold from a distribution of data points.
To improve information extraction performance, a divisive hierarchical clustering algorithm capable of continual learning is also proposed.
- Score: 8.581682204722894
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Thanks to its ability to handle the plasticity-stability dilemma,
Adaptive Resonance Theory (ART) is considered an effective approach for
realizing continual learning. In general, however, the clustering performance
of ART-based algorithms strongly depends on a similarity threshold, i.e., a
vigilance parameter, which is data-dependent and specified by hand. This paper
proposes an ART-based topological clustering algorithm with a mechanism that
automatically estimates a similarity threshold from the distribution of data
points. In addition, to improve information extraction performance, a divisive
hierarchical clustering algorithm capable of continual learning is proposed by
introducing a hierarchical structure into the proposed algorithm. Simulation
experiments show that the proposed algorithm achieves clustering performance
comparable to that of recently proposed hierarchical clustering algorithms.
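The core ART idea described above, comparing each incoming sample to its nearest prototype node and creating a new node when similarity falls below a threshold, can be illustrated with a minimal sketch. The code below is a simplified, hypothetical ART-style clusterer whose threshold is estimated from the median pairwise distance within a small buffer of recent samples; the class name, buffer rule, and update step are illustrative assumptions, not the authors' algorithm, which derives its threshold from the data distribution differently and additionally builds a topological network and a divisive hierarchy.

```python
import numpy as np


class SimpleARTClustering:
    """Illustrative ART-style clustering sketch (not the paper's exact algorithm).

    Each incoming sample is compared to its nearest prototype node. If the
    distance exceeds an automatically estimated similarity threshold, a new
    node is created; otherwise the winning node is updated incrementally.
    """

    def __init__(self, buffer_size=50):
        self.nodes = []    # prototype vectors (list of np.ndarray)
        self.counts = []   # number of samples assigned to each node
        self.buffer = []   # recent samples used to estimate the threshold
        self.buffer_size = buffer_size

    def _estimate_threshold(self):
        # Hypothetical estimation rule: median pairwise distance among the
        # buffered samples. The paper derives its threshold from the data
        # distribution in a different, more principled way.
        X = np.asarray(self.buffer)
        dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
        return np.median(dists[np.triu_indices(len(X), k=1)])

    def partial_fit(self, x):
        """Process one sample from the stream and return its node index."""
        x = np.asarray(x, dtype=float)
        self.buffer.append(x)
        if len(self.buffer) > self.buffer_size:
            self.buffer.pop(0)

        if not self.nodes:
            self.nodes.append(x.copy())
            self.counts.append(1)
            return 0

        threshold = self._estimate_threshold()
        distances = [np.linalg.norm(x - w) for w in self.nodes]
        winner = int(np.argmin(distances))

        if distances[winner] <= threshold:
            # Resonance: pull the winning prototype toward the sample.
            self.counts[winner] += 1
            self.nodes[winner] += (x - self.nodes[winner]) / self.counts[winner]
            return winner

        # Mismatch: the sample is too dissimilar, so a new node is created.
        self.nodes.append(x.copy())
        self.counts.append(1)
        return len(self.nodes) - 1


# Toy usage on a two-cluster stream presented sequentially (continual setting).
rng = np.random.default_rng(0)
stream = np.vstack([rng.normal(0.0, 0.1, (100, 2)),
                    rng.normal(3.0, 0.1, (100, 2))])
model = SimpleARTClustering()
assignments = [model.partial_fit(x) for x in stream]
print(len(model.nodes))  # number of prototype nodes created
```

Because no vigilance value is specified by hand, the number of nodes adapts to the scale of the streamed data; the real algorithm refines this idea considerably.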
Related papers
- Quantized Hierarchical Federated Learning: A Robust Approach to
Statistical Heterogeneity [3.8798345704175534]
We present a novel hierarchical federated learning algorithm that incorporates quantization for communication-efficiency.
We offer a comprehensive analytical framework to evaluate its optimality gap and convergence rate.
Our findings reveal that our algorithm consistently achieves high learning accuracy over a range of parameters.
arXiv Detail & Related papers (2024-03-03T15:40:24Z)
- A Parameter-free Adaptive Resonance Theory-based Topological Clustering
Algorithm Capable of Continual Learning [20.995946115633963]
We propose a new parameter-free ART-based topological clustering algorithm capable of continual learning by introducing parameter estimation methods.
Experimental results with synthetic and real-world datasets show that the proposed algorithm has superior clustering performance to the state-of-the-art clustering algorithms without any parameter pre-specifications.
arXiv Detail & Related papers (2023-05-01T01:04:07Z)
- Improved Algorithms for Neural Active Learning [74.89097665112621]
We improve the theoretical and empirical performance of neural-network(NN)-based active learning algorithms for the non-parametric streaming setting.
We introduce two regret metrics, defined by minimizing the population loss, that are more suitable for active learning than the one used in state-of-the-art (SOTA) related work.
arXiv Detail & Related papers (2022-10-02T05:03:38Z)
- Deep Equilibrium Assisted Block Sparse Coding of Inter-dependent
Signals: Application to Hyperspectral Imaging [71.57324258813675]
A dataset of inter-dependent signals is defined as a matrix whose columns demonstrate strong dependencies.
A neural network is employed to act as a structure prior and reveal the underlying signal interdependencies.
Deep unrolling and Deep equilibrium based algorithms are developed, forming highly interpretable and concise deep-learning-based architectures.
arXiv Detail & Related papers (2022-03-29T21:00:39Z)
- Class-wise Classifier Design Capable of Continual Learning using
Adaptive Resonance Theory-based Topological Clustering [4.772368796656325]
This paper proposes a supervised classification algorithm capable of continual learning by utilizing an Adaptive Resonance Theory (ART)-based growing self-organizing clustering algorithm.
The ART-based clustering algorithm is theoretically capable of continual learning.
The proposed algorithm has superior classification performance compared with state-of-the-art clustering-based classification algorithms capable of continual learning.
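As a rough illustration of that class-wise idea, the hypothetical wrapper below keeps one instance of the SimpleARTClustering sketch shown earlier per class label and predicts by nearest prototype node. This is a loose sketch of the general approach under that assumption, not the paper's actual classifier, and it depends on the SimpleARTClustering class defined above.

```python
import numpy as np


class ClassWiseClassifier:
    """Hypothetical class-wise wrapper: one clustering model per class label."""

    def __init__(self):
        self.models = {}  # class label -> SimpleARTClustering instance

    def partial_fit(self, x, y):
        # Train only the clustering model belonging to the observed class label.
        self.models.setdefault(y, SimpleARTClustering()).partial_fit(x)

    def predict(self, x):
        # Return the class whose nearest prototype node is closest to the query.
        x = np.asarray(x, dtype=float)
        best_label, best_dist = None, float("inf")
        for label, model in self.models.items():
            d = min(np.linalg.norm(x - w) for w in model.nodes)
            if d < best_dist:
                best_label, best_dist = label, d
        return best_label
```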
arXiv Detail & Related papers (2022-03-18T11:43:12Z)
- Amortized Implicit Differentiation for Stochastic Bilevel Optimization [53.12363770169761]
We study a class of algorithms for solving bilevel optimization problems in both deterministic and stochastic settings.
We exploit a warm-start strategy to amortize the estimation of the exact gradient.
By using this framework, our analysis shows these algorithms to match the computational complexity of methods that have access to an unbiased estimate of the gradient.
arXiv Detail & Related papers (2021-11-29T15:10:09Z)
- Scalable Intervention Target Estimation in Linear Models [52.60799340056917]
Current approaches to causal structure learning either work with known intervention targets or use hypothesis testing to discover the unknown intervention targets.
This paper proposes a scalable and efficient algorithm that consistently identifies all intervention targets.
The proposed algorithm can be used to also update a given observational Markov equivalence class into the interventional Markov equivalence class.
arXiv Detail & Related papers (2021-11-15T03:16:56Z)
- Unsupervised Clustered Federated Learning in Complex Multi-source
Acoustic Environments [75.8001929811943]
We introduce a realistic and challenging, multi-source and multi-room acoustic environment.
We present an improved clustering control strategy that takes into account the variability of the acoustic scene.
The proposed approach is optimized using clustering-based measures and validated via a network-wide classification task.
arXiv Detail & Related papers (2021-06-07T14:51:39Z)
- HAWKS: Evolving Challenging Benchmark Sets for Cluster Analysis [2.5329716878122404]
Comprehensive benchmarking of clustering algorithms is difficult.
There is no consensus regarding the best practice for rigorous benchmarking.
We demonstrate the important role evolutionary algorithms play to support flexible generation of such benchmarks.
arXiv Detail & Related papers (2021-02-13T15:01:34Z)
- Online Deterministic Annealing for Classification and Clustering [0.0]
We introduce an online prototype-based learning algorithm for clustering and classification.
We show that the proposed algorithm constitutes a competitive-learning neural network, the learning rule of which is formulated as an online approximation algorithm.
arXiv Detail & Related papers (2021-02-11T04:04:21Z)
- Scalable Hierarchical Agglomerative Clustering [65.66407726145619]
Existing scalable hierarchical clustering methods sacrifice quality for speed.
We present a scalable, agglomerative method for hierarchical clustering that does not sacrifice quality and scales to billions of data points.
arXiv Detail & Related papers (2020-10-22T15:58:35Z)
This list is automatically generated from the titles and abstracts of the papers on this site.