Classification using Hyperdimensional Computing: A Review
- URL: http://arxiv.org/abs/2004.11204v1
- Date: Sun, 19 Apr 2020 23:51:44 GMT
- Title: Classification using Hyperdimensional Computing: A Review
- Authors: Lulu Ge and Keshab K. Parhi
- Abstract summary: This paper introduces the background of HD computing, and reviews the data representation, data transformation, and similarity measurement.
Evaluations indicate that HD computing shows great potential in addressing problems using data in the form of letters, signals and images.
- Score: 16.329917143918028
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Hyperdimensional (HD) computing is built upon its unique data type referred
to as hypervectors. The dimension of these hypervectors is typically in the
range of tens of thousands. Proposed to solve cognitive tasks, HD computing
aims at calculating similarity among its data. Data transformation is realized
by three operations, including addition, multiplication and permutation. Its
ultra-wide data representation introduces redundancy against noise. Since
information is evenly distributed over every bit of the hypervectors, HD
computing is inherently robust. Additionally, due to the nature of those three
operations, HD computing leads to fast learning ability, high energy efficiency
and acceptable accuracy in learning and classification tasks. This paper
introduces the background of HD computing, and reviews the data representation,
data transformation, and similarity measurement. The orthogonality in high
dimensions presents opportunities for flexible computing. To balance the
tradeoff between accuracy and efficiency, strategies include but are not
limited to encoding, retraining, binarization and hardware acceleration.
Evaluations indicate that HD computing shows great potential in addressing
problems using data in the form of letters, signals and images. HD computing
especially shows significant promise to replace machine learning algorithms as a
lightweight classifier in the field of the Internet of Things (IoT).
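The three operations named in the abstract (addition for bundling, multiplication for binding, and permutation) and the near-orthogonality of random hypervectors can be sketched in a few lines. This is a minimal bipolar-hypervector illustration, not the paper's specific encoding; the variable names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # typical hypervector dimension: tens of thousands

# Random bipolar hypervectors; in high dimensions, independently drawn
# vectors are nearly orthogonal (cosine similarity close to 0).
a = rng.choice([-1, 1], size=D)
b = rng.choice([-1, 1], size=D)

def cosine(x, y):
    """Similarity measurement used throughout HD computing."""
    return float(x @ y) / (np.linalg.norm(x) * np.linalg.norm(y))

# Binding (elementwise multiplication): result is dissimilar to both operands.
bound = a * b

# Bundling (elementwise addition, thresholded back to +/-1):
# result stays similar to both operands; a random vector breaks ties.
bundled = np.sign(a + b + rng.choice([-1, 1], size=D))

# Permutation (cyclic shift): encodes position/sequence information.
shifted = np.roll(a, 1)

print(round(cosine(a, b), 3))        # near 0: random hypervectors are quasi-orthogonal
print(round(cosine(bundled, a), 3))  # clearly positive: bundling preserves similarity
```

Because information is spread evenly across all D components, flipping a small fraction of bits barely changes these similarities, which is the source of the robustness the abstract mentions.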
Related papers
- Efficient and accurate neural field reconstruction using resistive memory [52.68088466453264]
Traditional signal reconstruction methods on digital computers face both software and hardware challenges.
We propose a systematic approach with software-hardware co-optimizations for signal reconstruction from sparse inputs.
This work advances the AI-driven signal restoration technology and paves the way for future efficient and robust medical AI and 3D vision applications.
arXiv Detail & Related papers (2024-04-15T09:33:09Z)
- uHD: Unary Processing for Lightweight and Dynamic Hyperdimensional Computing [1.7118124088316602]
Hyperdimensional computing (HDC) is a novel computational paradigm that operates on high-dimensional vectors known as hypervectors.
In this paper, we show how to generate intensity and position hypervectors in HDC using low-discrepancy sequences.
For the first time in the literature, our proposed approach employs lightweight vector generators utilizing unary bit-streams for efficient encoding of data.
arXiv Detail & Related papers (2023-11-16T06:28:19Z)
- Streaming Encoding Algorithms for Scalable Hyperdimensional Computing [12.829102171258882]
Hyperdimensional computing (HDC) is a paradigm for data representation and learning originating in computational neuroscience.
In this work, we explore a family of streaming encoding techniques based on hashing.
We show formally that these methods enjoy comparable guarantees on performance for learning applications while being substantially more efficient than existing alternatives.
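The paper's exact hashing construction is not reproduced here, but the general idea of a streaming, hash-based encoder can be sketched: each hypervector component of a token is derived on the fly from a hash instead of a stored codebook, so a stream is bundled in O(D) memory regardless of its length. The scheme below is a toy assumption for illustration:

```python
import hashlib

D = 1_000  # reduced dimension for illustration

def hv_component(key: str, i: int) -> int:
    # Derive the i-th +/-1 component of key's hypervector from a hash,
    # so no codebook needs to be stored (hypothetical scheme).
    h = hashlib.blake2b(f"{key}:{i}".encode(), digest_size=8).digest()
    return 1 if h[0] & 1 else -1

def encode_stream(tokens):
    # Bundle a token stream one item at a time; memory stays O(D)
    # no matter how long the stream is.
    acc = [0] * D
    for t in tokens:
        for i in range(D):
            acc[i] += hv_component(t, i)
    return acc

v = encode_stream(["the", "quick", "fox"])
```

Because the components are recomputed deterministically from the hash, two passes over the same stream produce identical encodings, which is what makes the approach suitable for streaming settings.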
arXiv Detail & Related papers (2022-09-20T17:25:14Z)
- An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing [62.997667081978825]
Hyperdimensional Computing (HDC) is a computation framework based on properties of high-dimensional random spaces.
We present a study on basis-hypervector sets, which leads to practical contributions to HDC in general.
We introduce a method to learn from circular data, an important type of information never before addressed in machine learning with HDC.
arXiv Detail & Related papers (2022-05-16T18:04:55Z)
- GraphHD: Efficient graph classification using hyperdimensional computing [58.720142291102135]
We present a baseline approach for graph classification with HDC.
We evaluate GraphHD on real-world graph classification problems.
Our results show that when compared to the state-of-the-art Graph Neural Networks (GNNs) the proposed model achieves comparable accuracy.
arXiv Detail & Related papers (2022-05-16T17:32:58Z)
- Kubric: A scalable dataset generator [73.78485189435729]
Kubric is a Python framework that interfaces with PyBullet and Blender to generate photo-realistic scenes, with rich annotations, and seamlessly scales to large jobs distributed over thousands of machines.
We demonstrate the effectiveness of Kubric by presenting a series of 13 different generated datasets for tasks ranging from studying 3D NeRF models to optical flow estimation.
arXiv Detail & Related papers (2022-03-07T18:13:59Z)
- Transformer Networks for Data Augmentation of Human Physical Activity Recognition [61.303828551910634]
State-of-the-art models like Recurrent Generative Adversarial Networks (RGAN) are used to generate realistic synthetic data.
In this paper, transformer based generative adversarial networks which have global attention on data, are compared on PAMAP2 and Real World Human Activity Recognition data sets with RGAN.
arXiv Detail & Related papers (2021-09-02T16:47:29Z)
- A Theoretical Perspective on Hyperdimensional Computing [17.50442191930551]
Hyperdimensional (HD) computing is a set of neurally inspired methods for obtaining high-dimensional, low-precision, distributed representations of data.
HD computing has recently garnered significant interest from the computer hardware community as an energy-efficient, low-latency, and noise-robust tool for solving learning problems.
arXiv Detail & Related papers (2020-10-14T22:39:11Z)
- SHEARer: Highly-Efficient Hyperdimensional Computing by Software-Hardware Enabled Multifold Approximation [7.528764144503429]
We propose SHEARer, an algorithm-hardware co-optimization to improve the performance and energy consumption of HD computing.
SHEARer achieves an average throughput boost of 104,904x (15.7x) and energy savings of up to 56,044x (301x) compared to state-of-the-art encoding methods.
We also develop a software framework that enables training HD models by emulating the proposed approximate encodings.
arXiv Detail & Related papers (2020-07-20T07:58:44Z)
- Spatial Information Guided Convolution for Real-Time RGBD Semantic Segmentation [79.78416804260668]
We propose Spatial information guided Convolution (S-Conv), which allows efficient RGB feature and 3D spatial information integration.
S-Conv is competent to infer the sampling offset of the convolution kernel guided by the 3D spatial information.
We further embed S-Conv into a semantic segmentation network, called Spatial information Guided convolutional Network (SGNet)
arXiv Detail & Related papers (2020-04-09T13:38:05Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.