A Survey on Hyperdimensional Computing aka Vector Symbolic
Architectures, Part II: Applications, Cognitive Models, and Challenges
- URL: http://arxiv.org/abs/2112.15424v3
- Date: Tue, 1 Aug 2023 14:48:02 GMT
- Title: A Survey on Hyperdimensional Computing aka Vector Symbolic
Architectures, Part II: Applications, Cognitive Models, and Challenges
- Authors: Denis Kleyko, Dmitri A. Rachkovskij, Evgeny Osipov, Abbas Rahimi
- Abstract summary: Part I of this survey covered the historical context leading to the development of HDC/VSA.
Part II surveys existing applications, the role of HDC/VSA in cognitive computing and architectures, as well as directions for future work.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This is Part II of the two-part comprehensive survey devoted to a computing
framework most commonly known under the names Hyperdimensional Computing and
Vector Symbolic Architectures (HDC/VSA). Both names refer to a family of
computational models that use high-dimensional distributed representations and
rely on the algebraic properties of their key operations to incorporate the
advantages of structured symbolic representations and vector distributed
representations. Holographic Reduced Representations is an influential HDC/VSA
model that is well-known in the machine learning domain and often used to refer
to the whole family. However, for the sake of consistency, we use HDC/VSA to
refer to the field. Part I of this survey covered foundational aspects of the
field, such as the historical context leading to the development of HDC/VSA,
key elements of any HDC/VSA model, known HDC/VSA models, and the transformation
of input data of various types into high-dimensional vectors suitable for
HDC/VSA. This second part surveys existing applications, the role of HDC/VSA in
cognitive computing and architectures, as well as directions for future work.
Most of the applications lie within the Machine Learning/Artificial
Intelligence domain; however, we also cover other applications to provide a
complete picture. The survey is written to be useful for both newcomers and
practitioners.
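The abstract's core idea (high-dimensional distributed representations manipulated through algebraic operations that preserve the advantages of symbolic structure) can be made concrete with a small sketch. This is an illustrative example, not code from the survey: it uses random bipolar hypervectors with elementwise multiplication as binding and majority-sign summation as bundling, one common HDC/VSA instantiation; all names (`hv`, `bind`, `bundle`, `sim`) are ours.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervectors are very high-dimensional, e.g. 10,000-D

def hv():
    """Random bipolar hypervector; random pairs are quasi-orthogonal."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding (elementwise multiply): result is dissimilar to both inputs,
    and multiplication is its own inverse for bipolar vectors."""
    return a * b

def bundle(*vs):
    """Bundling (sign of the elementwise sum): result stays similar to
    each input (ties produce zeros, which only add a little noise)."""
    return np.sign(np.sum(vs, axis=0))

def sim(a, b):
    """Normalized dot product in [-1, 1]; near 0 for unrelated vectors."""
    return float(a @ b) / D

# Encode the record {color: red, shape: square} as a single hypervector
# via role-filler binding followed by bundling.
color, shape, red, square = hv(), hv(), hv(), hv()
record = bundle(bind(color, red), bind(shape, square))

# Query: unbinding with the role 'color' recovers a noisy version of 'red'.
noisy_red = bind(record, color)
print(sim(noisy_red, red))     # clearly positive: recognizable as 'red'
print(sim(noisy_red, square))  # near zero: unrelated filler
```

The point the abstract makes is visible here: the record is a flat vector of the same dimensionality as its parts, yet its symbolic structure remains queryable through the algebra of the operations.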
Related papers
- HyperVQ: MLR-based Vector Quantization in Hyperbolic Space
We study the use of hyperbolic spaces for vector quantization (HyperVQ).
We show that HyperVQ performs comparably in reconstruction and generative tasks while outperforming VQ in discriminative tasks and learning a highly disentangled latent space.
arXiv Detail & Related papers (2024-03-18T03:17:08Z)
- Full High-Dimensional Intelligible Learning In 2-D Lossless Visualization Space
This study explores a new methodology for machine learning classification tasks in 2-D visualization space (2-D ML).
It is shown that this is a full machine learning approach that does not require processing n-dimensional data in an abstract n-dimensional space.
It enables discovering n-D patterns in 2-D space without loss of n-D information using graph representations of n-D data in 2-D.
arXiv Detail & Related papers (2023-05-29T00:21:56Z)
- Learning Implicit Feature Alignment Function for Semantic Segmentation
Implicit Feature Alignment function (IFA) is inspired by the rapidly expanding topic of implicit neural representations.
We show that IFA implicitly aligns the feature maps at different levels and is capable of producing segmentation maps in arbitrary resolutions.
Our method can be combined with various architectures and achieves state-of-the-art accuracy trade-offs on common benchmarks.
arXiv Detail & Related papers (2022-06-17T09:40:14Z)
- An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing
Hyperdimensional Computing (HDC) is a computation framework based on properties of high-dimensional random spaces.
We present a study on basis-hypervector sets, which leads to practical contributions to HDC in general.
We introduce a method to learn from circular data, an important type of information never before addressed in machine learning with HDC.
arXiv Detail & Related papers (2022-05-16T18:04:55Z)
- Understanding Hyperdimensional Computing for Parallel Single-Pass Learning
We propose a new class of VSAs, finite group VSAs, which surpass the limits of HDC.
Experimental results show that our RFF method and group VSAs can both outperform the state-of-the-art HDC model by up to 7.6% while maintaining hardware efficiency.
arXiv Detail & Related papers (2022-02-10T02:38:56Z)
- A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part I: Models and Data Transformations
HDC/VSA is a highly interdisciplinary field with connections to computer science, electrical engineering, artificial intelligence, mathematics, and cognitive science.
This two-part comprehensive survey is written to be useful for both newcomers and practitioners.
arXiv Detail & Related papers (2021-11-11T07:14:22Z)
- HyperSeed: Unsupervised Learning with Vector Symbolic Architectures
This paper presents a novel unsupervised machine learning approach named Hyperseed.
It leverages Vector Symbolic Architectures (VSA) for fast learning of a topology-preserving feature map of unlabelled data.
The two distinctive novelties of the Hyperseed algorithm are 1) learning from only a few input data samples and 2) a learning rule based on a single vector operation.
arXiv Detail & Related papers (2021-10-15T20:05:43Z)
- Computing on Functions Using Randomized Vector Representations
We call this new function encoding and computing framework Vector Function Architecture (VFA).
Our analyses and results suggest that VFAs constitute a powerful new framework for representing and manipulating functions in distributed neural systems.
arXiv Detail & Related papers (2021-09-08T04:39:48Z)
- From Symbols to Embeddings: A Tale of Two Representations in Computational Social Science
The study of Computational Social Science (CSS) is data-driven and benefits significantly from the availability of online user-generated content and social networks.
To explore this question, we give a thorough review of data representations in CSS for both text and networks.
We present the applications of the above representations based on the investigation of more than 400 research articles from 6 top venues involved with CSS.
arXiv Detail & Related papers (2021-06-27T11:04:44Z)
- High-Dimensional Quadratic Discriminant Analysis under Spiked Covariance Model
We propose a novel quadratic classification technique, the parameters of which are chosen such that the Fisher discriminant ratio is maximized.
Numerical simulations show that the proposed classifier not only outperforms the classical R-QDA for both synthetic and real data but also requires lower computational complexity.
arXiv Detail & Related papers (2020-06-25T12:00:26Z)
This list is automatically generated from the titles and abstracts of the papers in this site.