A Survey on Hyperdimensional Computing aka Vector Symbolic
Architectures, Part I: Models and Data Transformations
- URL: http://arxiv.org/abs/2111.06077v2
- Date: Mon, 31 Jul 2023 22:40:51 GMT
- Title: A Survey on Hyperdimensional Computing aka Vector Symbolic
Architectures, Part I: Models and Data Transformations
- Authors: Denis Kleyko, Dmitri A. Rachkovskij, Evgeny Osipov, Abbas Rahimi
- Abstract summary: HDC/VSA is a highly interdisciplinary field with connections to computer science, electrical engineering, artificial intelligence, mathematics, and cognitive science.
This two-part comprehensive survey is written to be useful for both newcomers and practitioners.
- Score: 7.240104756698618
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This two-part comprehensive survey is devoted to a computing framework most
commonly known under the names Hyperdimensional Computing and Vector Symbolic
Architectures (HDC/VSA). Both names refer to a family of computational models
that use high-dimensional distributed representations and rely on the algebraic
properties of their key operations to incorporate the advantages of structured
symbolic representations and vector distributed representations. Notable models
in the HDC/VSA family are Tensor Product Representations, Holographic Reduced
Representations, Multiply-Add-Permute, Binary Spatter Codes, and Sparse Binary
Distributed Representations, but there are other models as well. HDC/VSA is a highly
interdisciplinary field with connections to computer science, electrical
engineering, artificial intelligence, mathematics, and cognitive science. This
fact makes it challenging to create a thorough overview of the field. However,
with a surge of new researchers joining the field in recent years, the need for
a comprehensive survey has become pressing. Therefore, among other aspects of
the field, this Part I surveys the known computational models of HDC/VSA and
the transformations of various input data types into high-dimensional distributed
representations. Part II of this survey is devoted to applications, cognitive
computing and architectures, as well as directions for future work. The survey
is written to be useful for both newcomers and practitioners.
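To make the algebraic operations mentioned in the abstract concrete, here is a minimal Python sketch in the spirit of the Multiply-Add-Permute (MAP) model: binding is element-wise multiplication, bundling is addition followed by a sign threshold, and permutation is a cyclic shift. The dimensionality, the sign-based normalization, and the role/filler names (color, shape, etc.) are illustrative assumptions, not prescriptions from the survey.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality (illustrative choice)

def random_hv():
    """Random bipolar {-1, +1} hypervector; independent ones are quasi-orthogonal."""
    return rng.choice([-1, 1], size=D)

def bind(a, b):
    """Binding: element-wise multiplication (self-inverse in MAP)."""
    return a * b

def bundle(*hvs):
    """Bundling (superposition): element-wise addition followed by the sign function."""
    return np.sign(np.sum(hvs, axis=0))

def permute(a, k=1):
    """Permutation: cyclic shift, e.g. to encode order or protect an operand."""
    return np.roll(a, k)

def sim(a, b):
    """Cosine similarity between hypervectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Encode the record {color: red, shape: round} as a single hypervector.
color, red, shape, round_ = (random_hv() for _ in range(4))
record = bundle(bind(color, red), bind(shape, round_))

# Unbinding the 'color' role from the record recovers something close to 'red'.
print(sim(bind(record, color), red))     # noticeably positive
print(sim(bind(record, color), round_))  # close to zero (quasi-orthogonal)
print(sim(permute(red), red))            # close to zero: permutation decorrelates
```

Because independent random bipolar hypervectors are quasi-orthogonal, unbinding a role from the bundled record yields a vector noticeably more similar to the correct filler than to unrelated hypervectors; this is the basic mechanism behind the structured encodings the survey discusses.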
Related papers
- Geometry Distributions [51.4061133324376] (arXiv, 2024-11-25)
We propose a novel geometric data representation that models geometry as distributions.
Our approach uses diffusion models with a novel network architecture to learn surface point distributions.
We evaluate our representation qualitatively and quantitatively across various object types, demonstrating its effectiveness in achieving high geometric fidelity.
- Exploring the Effectiveness of Object-Centric Representations in Visual Question Answering: Comparative Insights with Foundation Models [24.579822095003685] (arXiv, 2024-07-22)
We conduct an empirical study on representation learning for downstream Visual Question Answering (VQA).
We thoroughly investigate the benefits and trade-offs of OC models and alternative approaches.
- Neural Clustering based Visual Representation Learning [61.72646814537163] (arXiv, 2024-03-26)
Clustering is one of the most classic approaches in machine learning and data analysis.
We propose feature extraction with clustering (FEC), which views feature extraction as a process of selecting representatives from data.
FEC alternates between grouping pixels into individual clusters to abstract representatives and updating the deep features of pixels with current representatives.
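The alternation described in this entry can be pictured with a generic k-means-style step: assign pixels to representatives, recompute the representatives, and refresh pixel features from them. The sketch below is only such a generic illustration, not FEC's actual network or training procedure; the feature shapes, the averaging-based update, and all names are assumptions.

```python
import numpy as np

def fec_like_step(feats, centroids):
    """One generic alternation (illustrative only): group pixels around
    representatives, then refresh pixel features from their representative."""
    # feats: (N, C) per-pixel features; centroids: (K, C) representatives
    dists = ((feats[:, None, :] - centroids[None, :, :]) ** 2).sum(-1)  # (N, K)
    assign = dists.argmin(axis=1)                      # group pixels into clusters
    new_centroids = np.stack([
        feats[assign == k].mean(axis=0) if np.any(assign == k) else centroids[k]
        for k in range(len(centroids))
    ])
    feats = 0.5 * feats + 0.5 * new_centroids[assign]  # update features with representatives
    return feats, new_centroids, assign

# Toy usage: 100 "pixels" with 8-dimensional features and 4 representatives.
rng = np.random.default_rng(3)
feats = rng.normal(size=(100, 8))
centroids = rng.normal(size=(4, 8))
for _ in range(5):
    feats, centroids, assign = fec_like_step(feats, centroids)
print(assign[:10])
```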
- Learning Implicit Feature Alignment Function for Semantic Segmentation [51.36809814890326] (arXiv, 2022-06-17)
The Implicit Feature Alignment function (IFA) is inspired by the rapidly expanding topic of implicit neural representations.
We show that IFA implicitly aligns the feature maps at different levels and is capable of producing segmentation maps in arbitrary resolutions.
Our method can be combined with various architectures to improve them, and it achieves a state-of-the-art accuracy trade-off on common benchmarks.
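One aspect of this entry, producing segmentation maps at arbitrary resolution by querying features at continuous coordinates, can be illustrated with a generic sketch. The code below uses plain bilinear sampling of a low-resolution feature map plus a random linear head; it does not reproduce IFA's implicit alignment across feature levels, and all shapes and names are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

def query_features(feat, coords):
    """Bilinearly sample a low-resolution feature map (C, H, W) at continuous
    coordinates in [0, 1]^2, so features can be read out at any resolution."""
    C, H, W = feat.shape
    y = coords[:, 0] * (H - 1); x = coords[:, 1] * (W - 1)
    y0, x0 = np.floor(y).astype(int), np.floor(x).astype(int)
    y1, x1 = np.minimum(y0 + 1, H - 1), np.minimum(x0 + 1, W - 1)
    wy, wx = y - y0, x - x0
    f = (feat[:, y0, x0] * (1 - wy) * (1 - wx) + feat[:, y1, x0] * wy * (1 - wx)
         + feat[:, y0, x1] * (1 - wy) * wx + feat[:, y1, x1] * wy * wx)
    return f.T  # (N, C)

# A stand-in "segmentation head": a random linear map from features to class logits.
C, H, W, num_classes = 16, 32, 32, 5
feat = rng.normal(size=(C, H, W))
head = rng.normal(size=(C, num_classes))

# Query a 128x128 grid of continuous coordinates from the 32x32 feature map.
out_h = out_w = 128
ys, xs = np.meshgrid(np.linspace(0, 1, out_h), np.linspace(0, 1, out_w), indexing="ij")
coords = np.stack([ys.ravel(), xs.ravel()], axis=1)
logits = query_features(feat, coords) @ head           # (128*128, num_classes)
seg = logits.argmax(axis=1).reshape(out_h, out_w)      # arbitrary-resolution map
print(seg.shape)
```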
- An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing [62.997667081978825] (arXiv, 2022-05-16)
Hyperdimensional Computing (HDC) is a computation framework based on properties of high-dimensional random spaces.
We present a study on basis-hypervector sets, which leads to practical contributions to HDC in general.
We introduce a method to learn from circular data, an important type of information never before addressed in machine learning with HDC.
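As one way to picture hypervectors for circular data, the sketch below builds a ring of bipolar hypervectors whose pairwise similarity decays linearly with circular distance (down to -1 at the opposite point) and wraps around after a full turn. This is a minimal construction for intuition, not necessarily the exact method of the paper; the block-flipping scheme, the number of ring points, and the dimensionality are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
D = 10_000        # dimensionality (illustrative)
n = 8             # number of points on the circle; must be even in this sketch

def circular_hypervectors(D, n):
    """Build n bipolar hypervectors arranged on a ring: similarity falls off
    linearly with circular distance and wraps around after a full turn."""
    assert n % 2 == 0
    m = n // 2
    base = rng.choice([-1, 1], size=D)
    blocks = np.array_split(rng.permutation(D), m)  # m disjoint index blocks
    hvs = [base.copy()]
    # First half of the circle: flip one block per step.
    for b in blocks:
        hv = hvs[-1].copy(); hv[b] *= -1; hvs.append(hv)
    # Second half: un-flip the same blocks in the same order, returning to the start.
    for b in blocks[:-1]:
        hv = hvs[-1].copy(); hv[b] *= -1; hvs.append(hv)
    return np.stack(hvs)  # shape (n, D); one more un-flip would reproduce `base`

hvs = circular_hypervectors(D, n)
sim = hvs @ hvs[0] / D
print(np.round(sim, 2))  # [ 1.   0.5  0.  -0.5 -1.  -0.5  0.   0.5]
```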
- A Survey on Hyperdimensional Computing aka Vector Symbolic Architectures, Part II: Applications, Cognitive Models, and Challenges [7.240104756698618] (arXiv, 2021-11-12)
Part I of this survey covered the historical context leading to the development of HDC/VSA.
Part II surveys existing applications, the role of HDC/VSA in cognitive computing and architectures, as well as directions for future work.
- Computing on Functions Using Randomized Vector Representations [4.066849397181077] (arXiv, 2021-09-08)
We call this new function encoding and computing framework Vector Function Architecture (VFA).
Our analyses and results suggest that VFAs constitute a powerful new framework for representing and manipulating functions in distributed neural systems.
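VFA is built around fractional power encoding: a real value x is represented by raising a fixed random phasor (unit-modulus complex) base vector to the x-th power component-wise, so that similarity between encodings behaves like a shift-invariant kernel and binding adds the encoded values. Below is a small Python sketch of this idea; the dimensionality and the uniform phase distribution are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)
D = 10_000  # dimensionality (illustrative)

# Fractional power encoding (FPE): a random "base" phasor vector, whose
# component-wise x-th power encodes the real number x.
phases = rng.uniform(-np.pi, np.pi, size=D)

def encode(x):
    """z^x: component-wise fractional power of the base phasor vector."""
    return np.exp(1j * x * phases)

def sim(a, b):
    """Normalized inner product; for uniform phases it approximates sinc(x - y)."""
    return np.real(np.vdot(a, b)) / D

# Similarity depends only on the difference of the encoded values (a kernel).
print(round(sim(encode(0.0), encode(0.2)), 2))   # high: nearby values
print(round(sim(encode(0.0), encode(5.0)), 2))   # near zero: distant values

# Binding (Hadamard product) adds the encoded values: z^x * z^y = z^(x+y).
print(round(sim(encode(1.5) * encode(2.0), encode(3.5)), 2))  # ~1.0
```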
- From Symbols to Embeddings: A Tale of Two Representations in Computational Social Science [77.5409807529667] (arXiv, 2021-06-27)
The study of Computational Social Science (CSS) is data-driven and benefits significantly from the availability of online user-generated content and social networks.
To explore this question, we give a thorough review of data representations in CSS for both text and networks.
We present the applications of the above representations based on the investigation of more than 400 research articles from 6 top venues involved with CSS.
- The Immersion of Directed Multi-graphs in Embedding Fields. Generalisations [0.0] (arXiv, 2020-04-28)
This paper outlines a generalised model for representing hybrid-categorical, symbolic, perceptual-sensory and perceptual-latent data.
This variety of representation is currently used by various machine-learning models in computer vision and NLP/NLU.
It is achieved by endowing a directed, relational-typed multi-graph with at least some edge attributes that represent embeddings from various latent spaces.
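As a rough illustration of the data structure described here, the snippet below builds a directed multi-graph whose nodes are typed and whose parallel edges carry embedding attributes from different latent spaces. The node names, relation types, and embedding sizes are assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(5)

# A directed, typed multi-graph: parallel edges between the same pair of nodes
# are allowed, each with its own relation type and its own latent-space embedding.
nodes = {
    "img_42": {"type": "image"},
    "cat":    {"type": "concept"},
}
edges = [
    ("img_42", "cat", {"rel_type": "depicts",
                       "embedding": rng.normal(size=512)}),   # e.g. a visual latent space
    ("img_42", "cat", {"rel_type": "caption_mentions",
                       "embedding": rng.normal(size=300)}),   # e.g. a word-embedding space
]

# Symbolic queries and similarity queries can then be mixed over the same graph.
for src, dst, attr in edges:
    print(src, "->", dst, attr["rel_type"], attr["embedding"].shape)
```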
- A Theory of Usable Information Under Computational Constraints [103.5901638681034] (arXiv, 2020-02-25)
We propose a new framework for reasoning about information in complex systems.
Our foundation is based on a variational extension of Shannon's information theory.
We show that by incorporating computational constraints, $\mathcal{V}$-information can be reliably estimated from data.
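For reference, $\mathcal{V}$-information is defined relative to a predictive family $\mathcal{V}$ (the computational constraint), roughly as $H_{\mathcal{V}}(Y \mid X) = \inf_{f \in \mathcal{V}} \mathbb{E}\big[-\log f[X](Y)\big]$ and $I_{\mathcal{V}}(X \to Y) = H_{\mathcal{V}}(Y \mid \varnothing) - H_{\mathcal{V}}(Y \mid X)$, where $f[\varnothing]$ denotes prediction made without observing the side information $X$. Shannon mutual information is recovered when $\mathcal{V}$ is unrestricted, and it is the restriction of $\mathcal{V}$ that makes the quantity estimable from data.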