HyperSeed: Unsupervised Learning with Vector Symbolic Architectures
- URL: http://arxiv.org/abs/2110.08343v1
- Date: Fri, 15 Oct 2021 20:05:43 GMT
- Title: HyperSeed: Unsupervised Learning with Vector Symbolic Architectures
- Authors: Evgeny Osipov, Sachin Kahawala, Dilantha Haputhanthri, Thimal
Kempitiya, Daswin De Silva, Damminda Alahakoon, Denis Kleyko
- Abstract summary: This paper presents a novel unsupervised machine learning approach named Hyperseed.
It leverages Vector Symbolic Architectures (VSA) to quickly learn a topology-preserving feature map of unlabelled data.
The two distinctive novelties of the Hyperseed algorithm are: 1) learning from only a few input data samples and 2) a learning rule based on a single vector operation.
- Score: 5.258404928739212
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Motivated by recent innovations in biologically-inspired neuromorphic
hardware, this paper presents a novel unsupervised machine learning approach
named Hyperseed that leverages Vector Symbolic Architectures (VSA) for fast
learning of a topology-preserving feature map of unlabelled data. It relies on two
major capabilities of VSAs: the binding operation and computing in
superposition. In this paper, we introduce the algorithmic part of Hyperseed,
expressed within the Fourier Holographic Reduced Representations (FHRR) VSA model, which
is specifically suited for implementation on spiking neuromorphic hardware. The
two distinctive novelties of the Hyperseed algorithm are: 1) learning from only a
few input data samples and 2) a learning rule based on a single vector
operation. These properties are demonstrated on synthetic datasets as well as
on illustrative benchmark use-cases: IRIS classification and a language
identification task using n-gram statistics.
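As background for the two VSA capabilities named above, the sketch below shows FHRR-style binding (element-wise multiplication of unit-magnitude complex phasors) and computing in superposition (element-wise addition), with unbinding via the complex conjugate. It is a minimal illustration of these primitives under assumed dimensionality and helper names, not the Hyperseed algorithm itself.

```python
# Minimal sketch of FHRR primitives (dimensionality and names are assumptions).
import numpy as np

D = 1024  # hypervector dimensionality (illustrative choice, not from the paper)
rng = np.random.default_rng(0)

def random_fhrr(d=D):
    """Random FHRR hypervector: unit-magnitude complex phasors."""
    return np.exp(1j * rng.uniform(-np.pi, np.pi, d))

def bind(a, b):
    """Binding: element-wise complex multiplication (phases add)."""
    return a * b

def unbind(a, b):
    """Unbinding: bind with the complex conjugate, the inverse in FHRR."""
    return a * np.conj(b)

def similarity(a, b):
    """Normalized inner product; ~1 for identical, ~0 for random pairs."""
    return np.real(np.vdot(a, b)) / len(a)

x, y = random_fhrr(), random_fhrr()
s = bind(x, y) + random_fhrr()      # superposition of a bound pair and noise
print(similarity(unbind(s, y), x))  # high: x is recoverable from the superposition
print(similarity(x, y))             # near zero: random hypervectors are quasi-orthogonal
```

Since both primitives are single element-wise vector operations, this is consistent with the abstract's claim of a learning rule based on a single vector operation.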
Related papers
- A Walsh Hadamard Derived Linear Vector Symbolic Architecture [83.27945465029167]
Vector Symbolic Architectures (VSAs) are an approach to developing neuro-symbolic AI.
HLB (Hadamard-derived linear binding) is designed for favorable computational efficiency and efficacy in classic VSA tasks.
arXiv Detail & Related papers (2024-10-30T03:42:59Z)
- Queryable Prototype Multiple Instance Learning with Vision-Language Models for Incremental Whole Slide Image Classification [10.667645628712542]
This paper proposes the first Vision-Language-based framework with Queryable Prototype Multiple Instance Learning (QPMIL-VL) specially designed for incremental WSI classification.
Experiments on four TCGA datasets demonstrate that our QPMIL-VL framework is effective for incremental WSI classification.
arXiv Detail & Related papers (2024-10-14T14:49:34Z)
- Heterogeneous Directed Hypergraph Neural Network over Abstract Syntax Tree (AST) for Code Classification [9.01892294402701]
We propose to represent AST as a heterogeneous directed hypergraph (HDHG) and process the graph by heterogeneous directed hypergraph neural network (HDHGN) for code classification.
Our method improves code understanding and can represent high-order data correlations beyond paired interactions.
arXiv Detail & Related papers (2023-05-07T09:28:16Z)
- Learning Implicit Feature Alignment Function for Semantic Segmentation [51.36809814890326]
Implicit Feature Alignment function (IFA) is inspired by the rapidly expanding topic of implicit neural representations.
We show that IFA implicitly aligns the feature maps at different levels and is capable of producing segmentation maps in arbitrary resolutions.
Our method can be combined with various architectures to improve them, and it achieves a state-of-the-art computation-accuracy trade-off on common benchmarks.
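To make the implicit-alignment idea concrete, here is a toy PyTorch sketch of a coordinate-conditioned head that queries interpolated backbone features at arbitrary continuous positions, which is what decouples output resolution from feature-map resolution. The module name, layer sizes, and sampling choices are hypothetical illustrations, not the authors' IFA architecture.

```python
# Hypothetical sketch of an implicit segmentation head (not the authors' IFA).
import torch
import torch.nn as nn
import torch.nn.functional as F

class ImplicitHead(nn.Module):
    """Tiny MLP that predicts class logits at continuous (x, y) coordinates."""
    def __init__(self, feat_dim, num_classes):
        super().__init__()
        self.mlp = nn.Sequential(
            nn.Linear(feat_dim + 2, 256), nn.ReLU(),
            nn.Linear(256, num_classes),
        )

    def forward(self, feats, coords):
        # feats: (B, C, H, W) backbone features; coords: (B, N, 2) in [-1, 1]
        # Bilinearly sample features at the query coordinates.
        sampled = F.grid_sample(feats, coords.unsqueeze(1), align_corners=False)
        sampled = sampled.squeeze(2).transpose(1, 2)  # (B, C, 1, N) -> (B, N, C)
        return self.mlp(torch.cat([sampled, coords], dim=-1))  # (B, N, classes)

# Query a 64x64 feature map at 10 arbitrary points:
head = ImplicitHead(feat_dim=64, num_classes=21)
logits = head(torch.randn(1, 64, 64, 64), torch.rand(1, 10, 2) * 2 - 1)
print(logits.shape)  # torch.Size([1, 10, 21])
```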
arXiv Detail & Related papers (2022-06-17T09:40:14Z)
- Gluing Neural Networks Symbolically Through Hyperdimensional Computing [8.209945970790741]
We explore the notion of using binary hypervectors to encode the final classification outputs of neural networks.
This allows multiple neural networks to work together to solve a problem with little additional overhead.
We find that this approach matches or outperforms the state of the art.
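The following sketch illustrates the general flavor of such symbolic gluing under stated assumptions (a random binary class codebook, majority-vote bundling, Hamming-distance decoding); it is not the paper's exact encoding of output signals.

```python
# Hedged sketch: fuse several networks' predictions via binary hypervectors.
import numpy as np

D, NUM_CLASSES = 2048, 10  # dimensionality and class count are illustrative
rng = np.random.default_rng(1)
codebook = rng.integers(0, 2, (NUM_CLASSES, D))  # random {0,1} class hypervectors

def bundle(vectors):
    """Majority vote over binary hypervectors (ties broken toward 1)."""
    return (np.sum(vectors, axis=0) * 2 >= len(vectors)).astype(int)

def decode(v):
    """Nearest class by Hamming distance to the codebook."""
    return int(np.argmin(np.sum(codebook != v, axis=1)))

# Three hypothetical networks vote 7, 7, 3 on the same input:
preds = [7, 7, 3]
fused = bundle([codebook[p] for p in preds])
print(decode(fused))  # 7 -- the two agreeing networks dominate the bundle
```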
arXiv Detail & Related papers (2022-05-31T04:44:02Z)
- An Extension to Basis-Hypervectors for Learning from Circular Data in Hyperdimensional Computing [62.997667081978825]
Hyperdimensional Computing (HDC) is a computation framework based on properties of high-dimensional random spaces.
We present a study on basis-hypervector sets, which leads to practical contributions to HDC in general.
We introduce a method to learn from circular data, an important type of information never before addressed in machine learning with HDC.
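As a hedged illustration of why circular data needs special treatment, the sketch below encodes an angle with integer-frequency phasors so that similarity depends only on circular distance and wraps exactly at 2π. This is one common HDC-style construction under assumed parameters, not necessarily the basis-hypervector set introduced in the paper.

```python
# Illustrative circular encoding: integer frequencies make the code 2*pi-periodic.
import numpy as np

D = 1024  # dimensionality is an illustrative choice
rng = np.random.default_rng(2)
freqs = rng.integers(1, 8, D) * rng.choice([-1, 1], D)  # random integer frequencies

def encode_angle(theta):
    """Map an angle to a phasor hypervector; periodic in theta with period 2*pi."""
    return np.exp(1j * freqs * theta)

def similarity(a, b):
    """Normalized inner product; depends only on circular distance here."""
    return np.real(np.vdot(a, b)) / len(a)

print(similarity(encode_angle(0.1), encode_angle(0.1 + 2 * np.pi)))  # ~1: wraps around
print(similarity(encode_angle(0.0), encode_angle(np.pi)))            # low: opposite side
```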
arXiv Detail & Related papers (2022-05-16T18:04:55Z)
- Self-Supervised Graph Representation Learning for Neuronal Morphologies [75.38832711445421]
We present GraphDINO, a data-driven approach to learn low-dimensional representations of 3D neuronal morphologies from unlabeled datasets.
We show, in two different species and across multiple brain areas, that this method yields morphological cell type clusterings on par with manual feature-based classification by experts.
Our method could potentially enable data-driven discovery of novel morphological features and cell types in large-scale datasets.
arXiv Detail & Related papers (2021-12-23T12:17:47Z)
- ast2vec: Utilizing Recursive Neural Encodings of Python Programs [3.088385631471295]
We present ast2vec, a neural network that maps Python syntax trees to vectors and back.
Ast2vec has been trained on almost half a million programs of novice programmers.
arXiv Detail & Related papers (2021-03-22T06:53:52Z)
- Synbols: Probing Learning Algorithms with Synthetic Datasets [112.45883250213272]
Synbols is a tool for rapidly generating new datasets with a rich composition of latent features rendered in low-resolution images.
Our tool's high-level interface provides a language for rapidly generating new distributions on the latent features.
To showcase the versatility of Synbols, we use it to dissect the limitations and flaws in standard learning algorithms in various learning setups.
arXiv Detail & Related papers (2020-09-14T13:03:27Z)
- Tensor Relational Algebra for Machine Learning System Design [7.764107702934616]
We present an alternative implementation abstraction called the tensor relational algebra (TRA).
TRA is a set-based algebra grounded in the relational algebra.
Our empirical study shows that the optimized TRA-based back-end can significantly outperform alternatives for running ML in distributed clusters.
arXiv Detail & Related papers (2020-09-01T15:51:24Z)
- Closed Loop Neural-Symbolic Learning via Integrating Neural Perception, Grammar Parsing, and Symbolic Reasoning [134.77207192945053]
Prior methods learn the neural-symbolic models using reinforcement learning approaches.
We introduce the grammar model as a symbolic prior to bridge neural perception and symbolic reasoning.
We propose a novel back-search algorithm which mimics the top-down, human-like learning procedure to propagate the error.
arXiv Detail & Related papers (2020-06-11T17:42:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.