An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing
- URL: http://arxiv.org/abs/2205.07920v1
- Date: Mon, 16 May 2022 18:04:55 GMT
- Title: An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing
- Authors: Igor Nunes, Mike Heddes, Tony Givargis, Alexandru Nicolau
- Abstract summary: Hyperdimensional Computing (HDC) is a computation framework based on properties of high-dimensional random spaces.
We present a study on basis-hypervector sets, which leads to practical contributions to HDC in general.
We introduce a method to learn from circular data, an important type of information never before addressed in machine learning with HDC.
- Score: 62.997667081978825
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Hyperdimensional Computing (HDC) is a computation framework based on
properties of high-dimensional random spaces. It is particularly useful for
machine learning in resource-constrained environments, such as embedded systems
and IoT, as it achieves a good balance between accuracy, efficiency, and
robustness. The mapping of information to the hyperspace, called encoding, is
the most important stage in HDC. At its heart are basis-hypervectors,
responsible for representing the smallest units of meaningful information. In
this work we present a detailed study on basis-hypervector sets, which leads to
practical contributions to HDC in general: 1) we propose an improvement for
level-hypervectors, used to encode real numbers; 2) we introduce a method to
learn from circular data, an important type of information never before
addressed in machine learning with HDC. Empirical results indicate that these
contributions lead to considerably more accurate models for both classification
and regression with circular data.
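
Both contributions concern how scalar values are mapped to basis-hypervectors. The sketch below illustrates the underlying idea in plain NumPy, under two assumptions that are ours rather than the paper's: hypervectors are bipolar, and levels are built by flipping slices of coordinates. level_hvs builds the usual linear scale; circular_hvs shows one simple way to make similarity track circular distance (a sliding window of flipped slices), which may differ from the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality

def random_hv():
    """Random bipolar hypervector in {-1, +1}^D."""
    return rng.choice([-1, 1], size=D)

def level_hvs(m):
    """m level-hypervectors for an ordered scale: each step flips a fresh
    slice of ~D/(m-1) coordinates, so the similarity of two levels decays
    linearly with their distance on the scale."""
    order = rng.permutation(D)  # fixed order in which coordinates get flipped
    step = D // (m - 1)
    hvs = [random_hv()]
    for i in range(1, m):
        hv = hvs[-1].copy()
        hv[order[(i - 1) * step : i * step]] *= -1
        hvs.append(hv)
    return np.stack(hvs)

def circular_hvs(m):
    """m circular levels (illustrative; assumes m even and m divides D):
    partition the coordinates into m slices and flip, for level i, the m/2
    consecutive slices starting at slice i (mod m). The number of differing
    coordinates between two levels is then proportional to their circular
    distance, and opposite levels are maximally dissimilar."""
    slices = rng.permutation(D).reshape(m, -1)  # m disjoint coordinate slices
    base = random_hv()
    hvs = []
    for i in range(m):
        hv = base.copy()
        window = [(i + k) % m for k in range(m // 2)]
        hv[slices[window].ravel()] *= -1
        hvs.append(hv)
    return np.stack(hvs)

L, C = level_hvs(8), circular_hvs(8)
cos = lambda a, b: (a @ b) / D  # cosine similarity for bipolar vectors
print(cos(L[0], L[7]))  # ~ -1.0: linear extremes are maximally dissimilar
print(cos(C[0], C[7]))  # 0.5: same as cos(C[0], C[1]) -- the circle closes
```

With the circular set, cos(C[i], C[j]) depends only on the circular distance min(|i - j|, m - |i - j|), which is exactly the similarity profile an angle-valued feature (time of day, wind direction, hue) needs.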
Related papers
- Enabling High Data Throughput Reinforcement Learning on GPUs: A Domain Agnostic Framework for Data-Driven Scientific Research [90.91438597133211]
We introduce WarpSci, a framework designed to overcome crucial system bottlenecks in the application of reinforcement learning.
We eliminate the need for data transfer between the CPU and GPU, enabling the concurrent execution of thousands of simulations.
arXiv Detail & Related papers (2024-08-01T21:38:09Z)
- Streaming Encoding Algorithms for Scalable Hyperdimensional Computing [12.829102171258882]
Hyperdimensional computing (HDC) is a paradigm for data representation and learning originating in computational neuroscience.
In this work, we explore a family of streaming encoding techniques based on hashing.
We show formally that these methods enjoy comparable guarantees on performance for learning applications while being substantially more efficient than existing alternatives.
arXiv Detail & Related papers (2022-09-20T17:25:14Z)
- HDTorch: Accelerating Hyperdimensional Computing with GP-GPUs for Design Space Exploration [4.783565770657063]
We introduce HDTorch, an open-source, PyTorch-based HDC library with extensions for hypervector operations.
We analyze four HDC benchmark datasets in terms of accuracy, runtime, and memory consumption.
We perform the first-ever HD training and inference analysis of the entirety of the CHB-MIT EEG epilepsy database.
arXiv Detail & Related papers (2022-06-09T19:46:08Z)
- GraphHD: Efficient graph classification using hyperdimensional computing [58.720142291102135]
We present a baseline approach for graph classification with HDC.
We evaluate GraphHD on real-world graph classification problems.
Our results show that, when compared to state-of-the-art Graph Neural Networks (GNNs), the proposed model achieves comparable accuracy.
arXiv Detail & Related papers (2022-05-16T17:32:58Z)
- Inducing Gaussian Process Networks [80.40892394020797]
We propose inducing Gaussian process networks (IGN), a simple framework for simultaneously learning the feature space as well as the inducing points.
The inducing points, in particular, are learned directly in the feature space, enabling a seamless representation of complex structured domains.
We report on experimental results for real-world data sets showing that IGNs provide significant advances over state-of-the-art methods.
arXiv Detail & Related papers (2022-04-21T05:27:09Z)
- HDC-MiniROCKET: Explicit Time Encoding in Time Series Classification with Hyperdimensional Computing [14.82489178857542]
MiniROCKET is one of the best existing methods for time series classification.
We extend this approach to provide better global temporal encodings using hyperdimensional computing (HDC) mechanisms.
The extension with HDC can achieve considerably better results on datasets with high temporal dependence without increasing the computational effort for inference.
arXiv Detail & Related papers (2022-02-16T13:33:13Z)
- A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges [7.240104756698618]
Part I of this survey covered the historical context leading to the development of HDC/VSA.
Part II surveys existing applications, the role of HDC/VSA in cognitive computing and architectures, as well as directions for future work.
arXiv Detail & Related papers (2021-11-12T18:21:44Z)
- Rank-R FNN: A Tensor-Based Learning Model for High-Order Data Classification [69.26747803963907]
Rank-R Feedforward Neural Network (FNN) is a tensor-based nonlinear learning model that imposes Canonical/Polyadic decomposition on its parameters.
It handles inputs as multilinear arrays, bypassing the need for vectorization, and can thus fully exploit the structural information along every data dimension.
We establish the universal approximation and learnability properties of Rank-R FNN, and we validate its performance on real-world hyperspectral datasets.
arXiv Detail & Related papers (2021-04-11T16:37:32Z)
- Classification using Hyperdimensional Computing: A Review [16.329917143918028]
This paper introduces the background of HD computing and reviews data representation, data transformation, and similarity measurement (see the sketch after this list).
Evaluations indicate that HD computing shows great potential in addressing problems using data in the form of letters, signals and images.
arXiv Detail & Related papers (2020-04-19T23:51:44Z)
- On Coresets for Support Vector Machines [61.928187390362176]
A coreset is a small, representative subset of the original data points.
We show that our algorithm can be used to extend the applicability of any off-the-shelf SVM solver to streaming, distributed, and dynamic data settings.
arXiv Detail & Related papers (2020-02-15T23:25:12Z)
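
Several entries above, the Part II survey and the classification review in particular, revolve around the same three HDC primitives: binding, bundling, and similarity measurement. The sketch below shows them for bipolar hypervectors on a toy key-value record; the field names and the record example are illustrative assumptions, not taken from any of the cited papers.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000

def random_hv():
    """Random bipolar hypervector in {-1, +1}^D."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding (element-wise product): the result is dissimilar to both
    inputs, and for bipolar vectors binding is its own inverse."""
    return a * b

def bundle(hvs):
    """Bundling (coordinate-wise majority): the result stays similar to
    every input (coordinates that tie become 0, which is harmless here)."""
    return np.sign(np.sum(np.stack(hvs), axis=0))

def cosine(a, b):
    return (a @ b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Encode a record by binding each field's key to its value and bundling.
keys = {name: random_hv() for name in ("colour", "shape")}
vals = {name: random_hv() for name in ("red", "round")}
record = bundle([bind(keys["colour"], vals["red"]),
                 bind(keys["shape"], vals["round"])])

# Query: binding the record with a key yields a noisy copy of that field's
# value, which a nearest-neighbour search in item memory cleans up.
noisy = bind(record, keys["colour"])
print(max(vals, key=lambda n: cosine(noisy, vals[n])))  # -> red
```

Classification with HDC, as in GraphHD or HDC-MiniROCKET above, is typically assembled from exactly these pieces: encode each sample with bind and bundle, bundle each class's training samples into a prototype, and label a query by its most similar prototype.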
This list is automatically generated from the titles and abstracts of the papers on this site.