An Encoding Framework for Binarized Images using HyperDimensional
Computing
- URL: http://arxiv.org/abs/2312.00454v1
- Date: Fri, 1 Dec 2023 09:34:28 GMT
- Title: An Encoding Framework for Binarized Images using HyperDimensional Computing
- Authors: Laura Smets, Werner Van Leekwijck, Ing Jyh Tsang, and Steven Latré
- Abstract summary: This article proposes a novel lightweight approach to encoding binarized images that preserves the similarity of patterns at nearby locations.
The method reaches an accuracy of 97.35% on the test set for the MNIST data set and 84.12% for the Fashion-MNIST data set.
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hyperdimensional Computing (HDC) is a brain-inspired and lightweight machine
learning method. It has received significant attention in the literature as a
candidate for the wearable Internet of Things, near-sensor artificial
intelligence applications, and on-device processing. HDC is
computationally less complex than traditional deep learning algorithms and
typically achieves moderate to good classification performance. A key aspect
that determines the performance of HDC is the encoding of the input data to the
hyperdimensional (HD) space. This article proposes a novel lightweight
approach, relying only on native HD arithmetic vector operations, to encode
binarized images; it preserves the similarity of patterns at nearby locations
through point-of-interest selection and local linear mapping. The method reaches
an accuracy of 97.35% on the test set for the MNIST data set and 84.12% for the
Fashion-MNIST data set. These results outperform other studies using baseline
HDC with different encoding approaches and are on par with more complex hybrid
HDC models. The proposed encoding approach also demonstrates a higher
robustness to noise and blur compared to the baseline encoding.
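The core encoding idea in the abstract can be sketched in a few lines: position hypervectors are built so that nearby rows and columns share most of their bits (local linear mapping), and the image is encoded by binding x- and y-position vectors at each foreground pixel (point of interest) and bundling the results. This is a minimal illustration under assumed choices, not the authors' implementation: the dimensionality, the slice-flipping scheme for the linear mapping, and the bipolar {-1, +1} representation are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000          # hypervector dimensionality (assumed)
W = H = 28          # image size (MNIST-like)

def linear_mapped_vectors(levels, d, rng):
    """Position hypervectors 0..levels-1 with local linear mapping:
    start from a random bipolar vector and flip one fresh slice of
    d/(2*levels) bits per step, so nearby positions stay similar and
    the two endpoints become roughly orthogonal."""
    base = rng.choice([-1, 1], size=d)
    vecs = [base.copy()]
    step = d // (2 * levels)
    for i in range(1, levels):
        v = vecs[-1].copy()
        v[(i - 1) * step : i * step] *= -1  # flip one fresh slice
        vecs.append(v)
    return np.stack(vecs)

X = linear_mapped_vectors(H, D, rng)   # row-position hypervectors
Y = linear_mapped_vectors(W, D, rng)   # column-position hypervectors

def encode(img_bin):
    """Bind row and column hypervectors at each point of interest
    (foreground pixel) and bundle by summation plus sign (majority)."""
    acc = np.zeros(D)
    for r, c in np.argwhere(img_bin > 0):
        acc += X[r] * Y[c]      # binding = elementwise multiply
    return np.sign(acc + 1e-9)  # bundling = majority vote, ties -> +1
```

Because positions one step apart differ in only a small slice of bits, two images with the same pattern shifted by a pixel map to nearby hypervectors, which is the similarity-preservation property the abstract describes.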
Related papers
- DistHD: A Learner-Aware Dynamic Encoding Method for Hyperdimensional Classification [10.535034643999344]
We propose DistHD, a novel dynamic encoding technique for HDC adaptive learning.
Our proposed algorithm DistHD successfully achieves the desired accuracy with considerably lower dimensionality.
arXiv Detail & Related papers (2023-04-11T21:18:52Z)
- Streaming Encoding Algorithms for Scalable Hyperdimensional Computing [12.829102171258882]
Hyperdimensional computing (HDC) is a paradigm for data representation and learning originating in computational neuroscience.
In this work, we explore a family of streaming encoding techniques based on hashing.
We show formally that these methods enjoy comparable guarantees on performance for learning applications while being substantially more efficient than existing alternatives.
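A streaming, hashing-based encoder of the kind this summary describes can be sketched as follows. The construction here is illustrative, not the paper's method: the CRC-based seeding, the dimensionality, and the bipolar representation are assumptions. The key point is that feature hypervectors are regenerated deterministically from a hash, so no codebook needs to be stored and memory stays constant as the stream grows.

```python
import zlib
import numpy as np

D = 4096  # hypervector dimensionality (assumed)

def feature_hv(feature, d=D):
    """Pseudorandom bipolar hypervector derived from a hash of the
    feature: the same feature always regenerates the same vector,
    so no codebook has to be kept in memory."""
    seed = zlib.crc32(str(feature).encode())
    rng = np.random.default_rng(seed)
    return rng.choice([-1, 1], size=d)

def encode_stream(features, d=D):
    """Bundle hashed feature vectors one at a time; memory stays
    O(d) regardless of the length of the stream."""
    acc = np.zeros(d)
    for f in features:
        acc += feature_hv(f, d)
    return np.sign(acc + 1e-9)  # majority vote, ties -> +1
```

Distinct features hash to approximately orthogonal hypervectors, while the bundled stream vector remains measurably similar to each of its members, which is what makes the encoding usable for downstream learning.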
arXiv Detail & Related papers (2022-09-20T17:25:14Z)
- HDTorch: Accelerating Hyperdimensional Computing with GP-GPUs for Design Space Exploration [4.783565770657063]
We introduce HDTorch, an open-source, PyTorch-based HDC library with extensions for hypervector operations.
We analyze four HDC benchmark datasets in terms of accuracy, runtime, and memory consumption.
We perform the first-ever HD training and inference analysis of the entirety of the CHB-MIT EEG epilepsy database.
arXiv Detail & Related papers (2022-06-09T19:46:08Z)
- An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing [62.997667081978825]
Hyperdimensional Computing (HDC) is a computation framework based on properties of high-dimensional random spaces.
We present a study on basis-hypervector sets, which leads to practical contributions to HDC in general.
We introduce a method to learn from circular data, an important type of information never before addressed in machine learning with HDC.
arXiv Detail & Related papers (2022-05-16T18:04:55Z)
- Efficient VVC Intra Prediction Based on Deep Feature Fusion and Probability Estimation [57.66773945887832]
We propose to optimize Versatile Video Coding (VVC) complexity at intra-frame prediction, with a two-stage framework of deep feature fusion and probability estimation.
Experimental results on a standard database demonstrate the superiority of the proposed method, especially for High Definition (HD) and Ultra-HD (UHD) video sequences.
arXiv Detail & Related papers (2022-05-07T08:01:32Z)
- EnHDC: Ensemble Learning for Brain-Inspired Hyperdimensional Computing [2.7462881838152913]
This paper presents the first effort in exploring ensemble learning in the context of hyperdimensional computing.
We propose the first ensemble HDC model referred to as EnHDC.
We show that EnHDC can achieve on average 3.2% accuracy improvement over a single HDC classifier.
arXiv Detail & Related papers (2022-03-25T09:54:00Z)
- Small Lesion Segmentation in Brain MRIs with Subpixel Embedding [105.1223735549524]
We present a method to segment MRI scans of the human brain into ischemic stroke lesion and normal tissues.
We propose a neural network architecture in the form of a standard encoder-decoder where predictions are guided by a spatial expansion embedding network.
arXiv Detail & Related papers (2021-09-18T00:21:17Z)
- Dynamic Neural Representational Decoders for High-Resolution Semantic Segmentation [98.05643473345474]
We propose a novel decoder, termed the dynamic neural representational decoder (NRD).
As each location on the encoder's output corresponds to a local patch of the semantic labels, in this work, we represent these local patches of labels with compact neural networks.
This neural representation enables our decoder to leverage the smoothness prior in the semantic label space, and thus makes our decoder more efficient.
arXiv Detail & Related papers (2021-07-30T04:50:56Z)
- Neural Distributed Source Coding [59.630059301226474]
We present a framework for lossy DSC that is agnostic to the correlation structure and can scale to high dimensions.
We evaluate our method on multiple datasets and show that it can handle complex correlations while achieving state-of-the-art PSNR.
arXiv Detail & Related papers (2021-06-05T04:50:43Z)
- SHEARer: Highly-Efficient Hyperdimensional Computing by Software-Hardware Enabled Multifold Approximation [7.528764144503429]
We propose SHEARer, an algorithm-hardware co-optimization to improve the performance and energy consumption of HD computing.
SHEARer achieves an average throughput boost of 104,904x (15.7x) and energy savings of up to 56,044x (301x) compared to state-of-the-art encoding methods.
We also develop a software framework that enables training HD models by emulating the proposed approximate encodings.
arXiv Detail & Related papers (2020-07-20T07:58:44Z)
- Auto-Encoding Twin-Bottleneck Hashing [141.5378966676885]
This paper proposes an efficient and adaptive code-driven graph.
It is updated by decoding in the context of an auto-encoder.
Experiments on benchmark datasets clearly show the superiority of our framework over state-of-the-art hashing methods.
arXiv Detail & Related papers (2020-02-27T05:58:12Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.