Efficient Hyperdimensional Computing
- URL: http://arxiv.org/abs/2301.10902v2
- Date: Thu, 12 Oct 2023 05:21:21 GMT
- Title: Efficient Hyperdimensional Computing
- Authors: Zhanglu Yan, Shida Wang, Kaiwen Tang, Weng-Fai Wong
- Abstract summary: We develop HDC models that use binary hypervectors with dimensions orders of magnitude lower than those of state-of-the-art HDC models.
For instance, on the MNIST dataset, we achieve 91.12% HDC accuracy in image classification with a dimension of only 64.
- Score: 4.8915861089531205
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hyperdimensional computing (HDC) is a method to perform classification that
uses binary vectors with high dimensions and the majority rule. This approach
has the potential to be energy-efficient and hence deemed suitable for
resource-limited platforms due to its simplicity and massive parallelism.
However, in order to achieve high accuracy, HDC sometimes uses hypervectors
with tens of thousands of dimensions. This potentially negates its efficiency
advantage. In this paper, we examine the necessity of such high dimensions and
conduct a detailed theoretical analysis of the relationship between hypervector
dimensions and accuracy. Our results demonstrate that as the dimension of the
hypervectors increases, the worst-case/average-case HDC prediction accuracy
with the majority rule decreases. Building on this insight, we develop HDC
models that use binary hypervectors with dimensions orders of magnitude lower
than those of state-of-the-art HDC models while maintaining equivalent or even
improved accuracy and efficiency. For instance, on the MNIST dataset, we
achieve 91.12% HDC accuracy in image classification with a dimension of only
64. Our methods require only 0.35% of the operations of other HDC models with
dimensions of 10,000. Furthermore, we evaluate our methods on ISOLET, UCI-HAR,
and Fashion-MNIST datasets and investigate the limits of HDC computing.
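The abstract's core pipeline, binary hypervectors bundled with the majority rule and classified by Hamming distance, can be illustrated with a minimal sketch. This is not the paper's encoding scheme: the random per-feature hypervectors, the activation threshold, and the toy sizes (`N_FEATURES`, the tiny training set) are illustrative assumptions; only the dimension D = 64 echoes the paper's MNIST result.

```python
import numpy as np

rng = np.random.default_rng(0)

D = 64            # hypervector dimension (the paper reports 91.12% on MNIST at D = 64)
N_FEATURES = 16   # toy input size (hypothetical)

# One random binary hypervector per input feature (illustrative encoding choice).
feature_hvs = rng.integers(0, 2, size=(N_FEATURES, D))

def encode(x, threshold=0.5):
    """Bundle the hypervectors of active features; binarize with the majority rule."""
    active = feature_hvs[np.asarray(x) > threshold]
    if len(active) == 0:
        return np.zeros(D, dtype=int)
    # Majority rule: a bit is 1 iff at least half of the bundled vectors set it.
    return (2 * active.sum(axis=0) >= len(active)).astype(int)

def train(samples, labels, n_classes):
    """Each class prototype is the majority-rule bundle of its training encodings."""
    prototypes = np.zeros((n_classes, D), dtype=int)
    for c in range(n_classes):
        encs = np.array([encode(x) for x, y in zip(samples, labels) if y == c])
        prototypes[c] = (2 * encs.sum(axis=0) >= len(encs)).astype(int)
    return prototypes

def predict(x, prototypes):
    """Classify by minimum Hamming distance to the class prototypes."""
    return int(np.argmin(np.count_nonzero(prototypes != encode(x), axis=1)))
```

Because every operation is a bitwise comparison or a popcount, the cost scales linearly with D, which is why shrinking D from 10,000 to 64 translates directly into the operation-count savings the abstract claims.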
Related papers
- Computation-Aware Gaussian Processes: Model Selection And Linear-Time Inference
We show that model selection for computation-aware GPs trained on 1.8 million data points can be done within a few hours on a single GPU.
As a result of this work, Gaussian processes can be trained on large-scale datasets without significantly compromising their ability to quantify uncertainty.
arXiv Detail & Related papers (2024-11-01T21:11:48Z)
- MicroHD: An Accuracy-Driven Optimization of Hyperdimensional Computing Algorithms for TinyML systems
Hyperdimensional computing (HDC) is emerging as a promising AI approach that can effectively target TinyML applications.
Previous works on HDC showed that limiting the standard 10k dimensions of the hyperdimensional space to much lower values is possible.
arXiv Detail & Related papers (2024-03-24T02:45:34Z)
- uHD: Unary Processing for Lightweight and Dynamic Hyperdimensional Computing
Hyperdimensional computing (HDC) is a novel computational paradigm that operates on long-dimensional vectors known as hypervectors.
In this paper, we show how to generate intensity and position hypervectors in HDC using low-discrepancy sequences.
For the first time in the literature, our proposed approach employs lightweight vector generators utilizing unary bit-streams for efficient encoding of data.
arXiv Detail & Related papers (2023-11-16T06:28:19Z)
- Improved Distribution Matching for Dataset Condensation
We propose a novel dataset condensation method based on distribution matching.
Our simple yet effective method outperforms most previous optimization-oriented methods with much fewer computational resources.
arXiv Detail & Related papers (2023-07-19T04:07:33Z)
- HDCC: A Hyperdimensional Computing compiler for classification on embedded systems and high-performance computing
This work introduces HDCC, the first open-source compiler that translates high-level descriptions of HDC classification methods into optimized C code.
HDCC is designed like a modern compiler, featuring an intuitive and descriptive input language, an intermediate representation (IR), and a retargetable backend.
To substantiate these claims, we conducted experiments with HDCC on several of the most popular datasets in the HDC literature.
arXiv Detail & Related papers (2023-04-24T19:16:03Z)
- An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing
Hyperdimensional Computing (HDC) is a computation framework based on properties of high-dimensional random spaces.
We present a study on basis-hypervector sets, which leads to practical contributions to HDC in general.
We introduce a method to learn from circular data, an important type of information never before addressed in machine learning with HDC.
arXiv Detail & Related papers (2022-05-16T18:04:55Z)
- EnHDC: Ensemble Learning for Brain-Inspired Hyperdimensional Computing
This paper presents the first effort in exploring ensemble learning in the context of hyperdimensional computing.
We propose the first ensemble HDC model referred to as EnHDC.
We show that EnHDC can achieve on average 3.2% accuracy improvement over a single HDC classifier.
arXiv Detail & Related papers (2022-03-25T09:54:00Z)
- Understanding Hyperdimensional Computing for Parallel Single-Pass Learning
We propose a new class of VSAs, finite group VSAs, which surpass the limits of HDC.
Experimental results show that our RFF method and group VSAs can both outperform the state-of-the-art HDC model by up to 7.6% while maintaining hardware efficiency.
arXiv Detail & Related papers (2022-02-10T02:38:56Z)
- Visual Cluster Separation Using High-Dimensional Sharpened Dimensionality Reduction
'High-Dimensional Sharpened DR' (HD-SDR) is tested on both synthetic and real-world data sets.
Our method achieves good quality (measured by quality metrics) and scales computationally well with large high-dimensional data.
To illustrate its concrete applications, we further apply HD-SDR on a recent astronomical catalog.
arXiv Detail & Related papers (2021-10-01T11:13:51Z)
- Hypervector Design for Efficient Hyperdimensional Computing on Edge Devices
This paper presents a technique to minimize the hypervector dimension while maintaining the accuracy and improving the robustness of the classifier.
The proposed approach decreases the hypervector dimension by more than $32\times$ while maintaining or increasing the accuracy achieved by conventional HDC.
Experiments on a commercial hardware platform show that the proposed approach achieves more than one order of magnitude reduction in model size, inference time, and energy consumption.
arXiv Detail & Related papers (2021-03-08T05:25:45Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.