EnHDC: Ensemble Learning for Brain-Inspired Hyperdimensional Computing
- URL: http://arxiv.org/abs/2203.13542v1
- Date: Fri, 25 Mar 2022 09:54:00 GMT
- Title: EnHDC: Ensemble Learning for Brain-Inspired Hyperdimensional Computing
- Authors: Ruixuan Wang, Dongning Ma, Xun Jiao
- Abstract summary: This paper presents the first effort in exploring ensemble learning in the context of hyperdimensional computing.
We propose the first ensemble HDC model referred to as EnHDC.
We show that EnHDC can achieve on average 3.2% accuracy improvement over a single HDC classifier.
- Score: 2.7462881838152913
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Ensemble learning is a classical learning method utilizing a group of weak
learners to form a strong learner, which aims to increase the accuracy of the
model. Recently, brain-inspired hyperdimensional computing (HDC) has emerged as a computational paradigm that has achieved success in various domains such as human activity recognition, voice recognition, and bio-medical signal classification. HDC mimics brain cognition and leverages high-dimensional
vectors (e.g., 10000 dimensions) with fully distributed holographic
representation and (pseudo-)randomness. This paper presents the first effort in
exploring ensemble learning in the context of HDC and proposes the first
ensemble HDC model referred to as EnHDC. EnHDC uses a majority voting-based
mechanism to synergistically integrate the prediction outcomes of multiple base
HDC classifiers. To enhance the diversity of base classifiers, we vary the
encoding mechanisms, dimensions, and data width settings among base
classifiers. Applying EnHDC to a wide range of applications, we show that EnHDC achieves an average accuracy improvement of 3.2% over a single HDC classifier. Further, we show that EnHDC with reduced dimensionality, e.g., 1,000 dimensions, can match or even surpass the accuracy of a baseline HDC with higher dimensionality, e.g., 10,000 dimensions. This leads to a 20% reduction in the storage requirement of the HDC model, which is key to enabling HDC on low-power computing platforms.
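To make the voting mechanism concrete, here is a minimal Python sketch of a majority-voting HDC ensemble in the spirit of the abstract. The random-projection encoder, bipolar hypervectors, toy dataset, and all names are illustrative assumptions, not the authors' implementation; EnHDC also diversifies encoding mechanisms and data widths, which this sketch omits.

```python
# Minimal sketch of a majority-voting HDC ensemble (illustrative only).
import numpy as np

class BaseHDC:
    """Toy HDC classifier: random-projection encoding + class prototypes."""
    def __init__(self, dim, n_features, seed):
        rng = np.random.default_rng(seed)
        # Random bipolar projection matrix serves as the encoder.
        self.proj = rng.choice([-1.0, 1.0], size=(n_features, dim))
        self.prototypes = {}

    def encode(self, x):
        return np.sign(x @ self.proj)          # bipolar hypervector

    def fit(self, X, y):
        for label in np.unique(y):
            # Class prototype = sign of the bundled (summed) hypervectors.
            self.prototypes[label] = np.sign(
                sum(self.encode(x) for x in X[y == label]))

    def predict(self, x):
        h = self.encode(x)
        # Nearest prototype by dot-product similarity.
        return max(self.prototypes, key=lambda c: h @ self.prototypes[c])

class EnsembleHDC:
    """Majority voting over base classifiers diversified by dimension."""
    def __init__(self, dims, n_features):
        self.members = [BaseHDC(d, n_features, seed=i)
                        for i, d in enumerate(dims)]

    def fit(self, X, y):
        for m in self.members:
            m.fit(X, y)

    def predict(self, x):
        votes = [m.predict(x) for m in self.members]
        return max(set(votes), key=votes.count)  # majority vote

# Toy usage: two Gaussian blobs, three base classifiers.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 1, (50, 16)), rng.normal(2, 1, (50, 16))])
y = np.array([0] * 50 + [1] * 50)
ens = EnsembleHDC(dims=[1000, 2000, 4000], n_features=16)
ens.fit(X, y)
print(ens.predict(X[0]), ens.predict(X[-1]))   # likely 0 and 1
```

Diversifying the members (here only by dimension) is what gives the vote its value: identical classifiers would make identical errors, and voting would change nothing.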
Related papers
- HEAL: Brain-inspired Hyperdimensional Efficient Active Learning [13.648600396116539] (2024-02-17)
We introduce Hyperdimensional Efficient Active Learning (HEAL), a novel Active Learning framework tailored for HDC classification.
HEAL proactively annotates unlabeled data points via uncertainty- and diversity-guided acquisition, leading to more efficient dataset annotation and lower labor costs.
Our evaluation shows that HEAL surpasses a diverse set of baselines in AL quality and achieves notably faster acquisition than many BNN-powered or diversity-guided AL methods.
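HEAL's acquisition function is not specified in the summary above. As a hedged sketch, one common way to combine uncertainty and diversity for an HDC classifier is a similarity-margin score plus a distance-to-selected term; the margin criterion, the 0.1 weight, and all names below are assumptions.

```python
# Hedged sketch of uncertainty- and diversity-guided acquisition for an
# HDC classifier; HEAL's actual acquisition function may differ.
import numpy as np

def acquisition_scores(sim, pool, selected):
    """sim: (n, n_classes) similarities to class prototypes;
    pool: (n, d) encoded hypervectors; selected: indices already picked."""
    top2 = np.sort(sim, axis=1)[:, -2:]
    uncertainty = -(top2[:, 1] - top2[:, 0])     # small margin = uncertain
    if selected:
        # Diversity: distance to the nearest already-selected point.
        d = np.min(np.linalg.norm(
            pool[:, None, :] - pool[selected][None, :, :], axis=-1), axis=1)
    else:
        d = np.zeros(len(pool))
    return uncertainty + 0.1 * d

def select_batch(sim, pool, budget):
    picked = []
    for _ in range(budget):
        scores = acquisition_scores(sim, pool, picked)
        scores[picked] = -np.inf                 # never re-pick a point
        picked.append(int(np.argmax(scores)))
    return picked

rng = np.random.default_rng(0)
pool = rng.normal(size=(100, 32))      # toy encoded hypervectors
sim = rng.normal(size=(100, 4))        # toy similarities to 4 prototypes
print(select_batch(sim, pool, budget=5))
```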
- A Weighted K-Center Algorithm for Data Subset Selection [70.49696246526199] (2023-12-17)
Subset selection is a fundamental problem that can play a key role in identifying smaller portions of the training data.
We develop a novel factor-3 approximation algorithm to compute subsets based on the weighted sum of both k-center and uncertainty sampling objective functions.
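The paper's factor-3 approximation is not reproduced here. The following is a generic greedy farthest-point heuristic that mixes k-center coverage with an uncertainty weight, purely to illustrate the combined objective; alpha and all names are assumptions.

```python
# Generic greedy heuristic mixing k-center coverage with uncertainty;
# this is NOT the paper's factor-3 approximation algorithm.
import numpy as np

def greedy_weighted_kcenter(X, uncertainty, k, alpha=0.5):
    centers = [int(np.argmax(uncertainty))]      # seed with most uncertain
    dist = np.linalg.norm(X - X[centers[0]], axis=1)
    for _ in range(k - 1):
        score = alpha * dist + (1 - alpha) * uncertainty
        score[centers] = -np.inf                 # exclude chosen points
        nxt = int(np.argmax(score))
        centers.append(nxt)
        # Maintain each point's distance to its nearest chosen center.
        dist = np.minimum(dist, np.linalg.norm(X - X[nxt], axis=1))
    return centers

X = np.random.default_rng(1).normal(size=(200, 8))
print(greedy_weighted_kcenter(X, np.ones(200), k=10))
```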
- Dynamic Conceptional Contrastive Learning for Generalized Category Discovery [76.82327473338734] (2023-03-30)
Generalized category discovery (GCD) aims to automatically cluster partially labeled data.
Unlabeled data contain instances that are not only from known categories of the labeled data but also from novel categories.
One effective approach to GCD is applying self-supervised learning to learn discriminative representations for unlabeled data.
We propose a Dynamic Conceptional Contrastive Learning framework, which can effectively improve clustering accuracy.
- Efficient Hyperdimensional Computing [4.8915861089531205] (2023-01-26)
We develop HDC models that use binary hypervectors with dimensions orders of magnitude lower than those of state-of-the-art HDC models.
For instance, on the MNIST dataset, we achieve 91.12% HDC accuracy in image classification with a dimension of only 64.
- An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing [62.997667081978825] (2022-05-16)
Hyperdimensional Computing (HDC) is a computation framework based on properties of high-dimensional random spaces.
We present a study on basis-hypervector sets, which leads to practical contributions to HDC in general.
We introduce a method to learn from circular data, an important type of information never before addressed in machine learning with HDC.
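The summary does not spell out the circular encoding. One standard way to obtain hypervectors whose similarity depends only on angular distance is a phasor encoding with integer random frequencies, sketched below as an assumption-laden illustration rather than the paper's basis-hypervector construction.

```python
# Hedged illustration of a circular encoding: phasor hypervectors with
# integer random frequencies, so similarity wraps at 2*pi.
import numpy as np

rng = np.random.default_rng(0)
D = 10_000
freqs = rng.integers(1, 8, size=D)     # integer frequencies -> periodic

def encode_angle(theta):
    """Complex phasor hypervector for a circular variable (radians)."""
    return np.exp(1j * freqs * theta)

def similarity(a, b):
    return np.real(np.vdot(a, b)) / len(a)

h0 = encode_angle(0.0)
print(similarity(h0, encode_angle(0.1)))        # high: close angles
print(similarity(h0, encode_angle(np.pi)))      # low: opposite angles
print(similarity(h0, encode_angle(2 * np.pi)))  # ~1.0: wraps around
```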
- GraphHD: Efficient graph classification using hyperdimensional computing [58.720142291102135] (2022-05-16)
We present a baseline approach for graph classification with HDC.
We evaluate GraphHD on real-world graph classification problems.
Our results show that the proposed model achieves accuracy comparable to state-of-the-art Graph Neural Networks (GNNs).
- LeHDC: Learning-Based Hyperdimensional Computing Classifier [14.641707790969914] (2022-03-18)
We propose a new HDC framework, called LeHDC, which leverages a principled learning approach to improve the model accuracy.
Experimental validation shows that LeHDC outperforms previous HDC training strategies and can improve the inference accuracy by over 15% on average.
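As a hedged sketch of the general idea of training class prototypes discriminatively instead of bundling them once, the snippet below treats the prototypes as the weights of a linear layer over encoded hypervectors and fits them with softmax cross-entropy. LeHDC's actual binary-network formulation is more involved; all names here are assumptions.

```python
# Hedged sketch: discriminative prototype training as plain softmax
# regression over encoded hypervectors.
import numpy as np

def train_prototypes(H, y, n_classes, lr=0.01, epochs=100):
    """H: (n, d) encoded hypervectors; y: int labels.
    Returns trained class prototypes W of shape (n_classes, d)."""
    n, d = H.shape
    W = np.zeros((n_classes, d))
    Y = np.eye(n_classes)[y]                         # one-hot labels
    for _ in range(epochs):
        logits = H @ W.T
        logits -= logits.max(axis=1, keepdims=True)  # numerical stability
        P = np.exp(logits)
        P /= P.sum(axis=1, keepdims=True)
        W -= lr * (P - Y).T @ H / n                  # cross-entropy gradient
    return W

# Prediction is unchanged: argmax similarity to the trained prototypes,
# e.g. np.argmax(H_test @ W.T, axis=1).
```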
- A Brain-Inspired Low-Dimensional Computing Classifier for Inference on Tiny Devices [17.976792694929063] (2022-03-09)
We propose a low-dimensional computing (LDC) alternative to hyperdimensional computing (HDC).
We map our LDC classifier to an equivalent neural network and optimize our model using a principled training approach.
Our LDC classifier offers an overwhelming advantage over the existing brain-inspired HDC models and is particularly suitable for inference on tiny devices.
- Understanding Hyperdimensional Computing for Parallel Single-Pass Learning [47.82940409267635] (2022-02-10)
We propose a new encoding method based on random Fourier features (RFF) and a new class of VSAs, finite group VSAs, which surpass the limits of HDC.
Experimental results show that our RFF method and group VSAs can each outperform the state-of-the-art HDC model by up to 7.6% while maintaining hardware efficiency.
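A random-Fourier-feature encoder in this style can be sketched as follows; the binarization, the kernel width sigma, and all names below are illustrative assumptions, not the paper's exact construction.

```python
# Hedged sketch of an RFF-style encoder with binarization.
import numpy as np

rng = np.random.default_rng(0)
d_in, D, sigma = 16, 10_000, 1.0
W = rng.normal(0.0, 1.0 / sigma, size=(d_in, D))  # random frequencies
b = rng.uniform(0.0, 2 * np.pi, size=D)           # random phases

def rff_encode(x):
    """Map an input to a bipolar hypervector via random Fourier features;
    pre-binarization, inner products approximate a Gaussian kernel."""
    return np.sign(np.cos(x @ W + b))

print(rff_encode(np.ones(d_in))[:10])
```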
- No Fear of Heterogeneity: Classifier Calibration for Federated Learning with Non-IID Data [78.69828864672978] (2021-06-09)
A central challenge in training classification models in real-world federated systems is learning with non-IID data.
We propose a novel and simple algorithm called Classifier Calibration with Virtual Representations (CCVR), which adjusts the classifier using virtual representations sampled from an approximated Gaussian mixture model.
Experimental results demonstrate that CCVR achieves state-of-the-art performance on popular federated learning benchmarks including CIFAR-10, CIFAR-100, and CINIC-10.
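A hedged sketch of the calibration idea: fit a per-class Gaussian over feature vectors, sample virtual representations, and refit only the classifier head. The diagonal-covariance simplification and all names below are assumptions.

```python
# Hedged sketch of sampling virtual representations for classifier
# calibration (diagonal Gaussian per class).
import numpy as np

def sample_virtual_features(feats, labels, n_per_class, seed=0):
    rng = np.random.default_rng(seed)
    vf, vy = [], []
    for c in np.unique(labels):
        F = feats[labels == c]
        mu, sd = F.mean(axis=0), F.std(axis=0) + 1e-6
        vf.append(rng.normal(mu, sd, size=(n_per_class, F.shape[1])))
        vy.append(np.full(n_per_class, c))
    return np.vstack(vf), np.concatenate(vy)

# The classifier head is then re-trained on the virtual features while
# the feature extractor stays frozen.
```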
- HDXplore: Automated Blackbox Testing of Brain-Inspired Hyperdimensional Computing [2.3549478726261883] (2021-05-26)
HDC is an emerging computing scheme based on the working mechanism of the brain that computes with deep and abstract patterns of neural activity instead of actual numbers.
Compared with traditional ML algorithms such as DNNs, HDC is more memory-centric, granting it advantages such as a relatively smaller model size, lower cost, and one-shot learning.
We develop HDXplore, a blackbox differential testing-based framework to expose the unexpected or incorrect behaviors of HDC models.
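Differential testing in this spirit can be sketched as training several HDC models that should agree and flagging inputs where their predictions diverge. The Gaussian perturbation scheme and the duck-typed .predict interface below are illustrative assumptions, not HDXplore's actual mechanics.

```python
# Hedged sketch of differential testing for HDC models.
import numpy as np

def differential_test(models, X, perturb_scale=0.05, seed=0):
    """models: objects exposing .predict(x); X: (n, d) inputs.
    Returns (original, perturbed, predictions) for each disagreement."""
    rng = np.random.default_rng(seed)
    disagreements = []
    for x in X:
        x_p = x + rng.normal(0, perturb_scale, size=x.shape)
        preds = {int(m.predict(x_p)) for m in models}
        if len(preds) > 1:           # divergence exposes a corner case
            disagreements.append((x, x_p, preds))
    return disagreements
```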