Automated Architecture Search for Brain-inspired Hyperdimensional
Computing
- URL: http://arxiv.org/abs/2202.05827v1
- Date: Fri, 11 Feb 2022 18:43:36 GMT
- Title: Automated Architecture Search for Brain-inspired Hyperdimensional
Computing
- Authors: Junhuan Yang, Yi Sheng, Sizhe Zhang, Ruixuan Wang, Kenneth Foreman,
Mikell Paige, Xun Jiao, Weiwen Jiang, Lei Yang
- Abstract summary: This paper represents the first effort to explore an automated architecture search for hyperdimensional computing (HDC).
The searched HDC architectures show competitive performance on case studies involving a drug discovery dataset and a language recognition task.
- Score: 5.489173080636452
- License: http://creativecommons.org/publicdomain/zero/1.0/
- Abstract: This paper represents the first effort to explore an automated architecture
search for hyperdimensional computing (HDC), a type of brain-inspired neural
network. Currently, HDC design is largely carried out in an
application-specific ad-hoc manner, which significantly limits its application.
Furthermore, the approach leads to inferior accuracy and efficiency, which
suggests that HDC cannot perform competitively against deep neural networks.
Herein, we present a thorough study to formulate an HDC architecture search
space. On top of the search space, we apply reinforcement learning to
automatically explore the HDC architectures. The searched HDC architectures
show competitive performance on case studies involving a drug discovery dataset
and a language recognition task. On the Clintox dataset, which tries to learn
features from developed drugs that passed/failed clinical trials for toxicity
reasons, the searched HDC architecture obtains the state-of-the-art ROC-AUC
scores, which are 0.80% higher than the manually designed HDC and 9.75% higher
than conventional neural networks. Similar results are achieved on the language
recognition task, with 1.27% higher performance than conventional methods.
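The paper does not spell out its HDC encoding in this abstract, so the following is a generic, illustrative sketch of how an HDC classifier for language recognition typically works (random bipolar item memory, n-gram binding via permutation, bundling into class prototypes, cosine-similarity inference), not the searched architecture itself. The dimensionality, n-gram size, and training strings are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # hypervector dimensionality (hypothetical choice)

# Random bipolar item memory: one hypervector per character
item_memory = {c: rng.choice([-1, 1], size=D)
               for c in "abcdefghijklmnopqrstuvwxyz "}

def encode(text, n=3):
    """Encode a string as the bundled sum of its n-gram hypervectors.

    Each n-gram is formed by binding (element-wise multiplying) permuted
    character vectors; the permutation (np.roll) encodes character position.
    """
    acc = np.zeros(D)
    for i in range(len(text) - n + 1):
        gram = np.ones(D)
        for j, c in enumerate(text[i:i + n]):
            gram *= np.roll(item_memory[c], j)
        acc += gram
    return acc

def cosine(a, b):
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-12)

# Train: bundle all examples of each class into a class prototype
train = {"en": ["the quick brown fox", "hello world"],
         "de": ["der schnelle braune fuchs", "hallo welt"]}
prototypes = {lang: sum(encode(s) for s in texts)
              for lang, texts in train.items()}

# Classify a new string by its nearest prototype
query = encode("hello there")
pred = max(prototypes, key=lambda lang: cosine(prototypes[lang], query))
print(pred)
```

An automated search of the kind the paper describes would explore choices left fixed here: the encoding operations, n-gram size, dimensionality, and how prototypes are built and retrained.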
Related papers
- EM-DARTS: Hierarchical Differentiable Architecture Search for Eye Movement Recognition [54.99121380536659]
Eye movement biometrics have received increasing attention thanks to their highly secure identification.
Deep learning (DL) models have recently been applied successfully to eye movement recognition.
However, the DL architecture is still determined by human prior knowledge.
We propose EM-DARTS, a hierarchical differentiable architecture search algorithm to automatically design the DL architecture for eye movement recognition.
arXiv Detail & Related papers (2024-09-22T13:11:08Z)
- DNA Family: Boosting Weight-Sharing NAS with Block-Wise Supervisions [121.05720140641189]
We develop a family of models with the distilling neural architecture (DNA) techniques.
Our proposed DNA models can rate all architecture candidates, as opposed to previous works that can only access a subset of the search space.
Our models achieve state-of-the-art top-1 accuracy of 78.9% and 83.6% on ImageNet for a mobile convolutional network and a small vision transformer, respectively.
arXiv Detail & Related papers (2024-03-02T22:16:47Z)
- DCP-NAS: Discrepant Child-Parent Neural Architecture Search for 1-bit CNNs [53.82853297675979]
1-bit convolutional neural networks (CNNs) with binary weights and activations show their potential for resource-limited embedded devices.
One natural approach is to use 1-bit CNNs to reduce the computation and memory cost of NAS.
We introduce Discrepant Child-Parent Neural Architecture Search (DCP-NAS) to efficiently search 1-bit CNNs.
arXiv Detail & Related papers (2023-06-27T11:28:29Z)
- Hyperdimensional Computing vs. Neural Networks: Comparing Architecture and Learning Process [3.244375684001034]
We present a comparative study between HDC and neural networks, offering a different angle from which HDC can be derived from an extremely compact neural network trained upfront.
Experimental results show that such a neural-network-derived HDC model can achieve up to 21% and 5% accuracy increases over conventional and learning-based HDC models, respectively.
arXiv Detail & Related papers (2022-07-24T21:23:50Z)
- Automating Neural Architecture Design without Search [3.651848964235307]
We study the automated architecture design from a new perspective that eliminates the need to sequentially evaluate each neural architecture generated during algorithm execution.
We implement the proposed approach using a graph neural network for link prediction and acquire the knowledge from NAS-Bench-101.
In addition, we utilize the learned knowledge from NAS-Bench-101 to automate architecture design in the DARTS search space, achieving 97.82% accuracy on CIFAR-10 and 76.51% top-1 accuracy on ImageNet while consuming only $2\times10^{-4}$ GPU days.
arXiv Detail & Related papers (2022-04-21T14:41:05Z)
- EnHDC: Ensemble Learning for Brain-Inspired Hyperdimensional Computing [2.7462881838152913]
This paper presents the first effort in exploring ensemble learning in the context of hyperdimensional computing.
We propose the first ensemble HDC model referred to as EnHDC.
We show that EnHDC can achieve on average 3.2% accuracy improvement over a single HDC classifier.
arXiv Detail & Related papers (2022-03-25T09:54:00Z)
- Probeable DARTS with Application to Computational Pathology [44.20005949950844]
We use differentiable architecture search (DARTS) for its efficiency.
We then apply our searching framework on CPath applications by searching for the optimum network architecture.
Results show that the searched network outperforms state-of-the-art networks in terms of prediction accuracy and complexity.
arXiv Detail & Related papers (2021-08-16T02:16:06Z)
- MS-RANAS: Multi-Scale Resource-Aware Neural Architecture Search [94.80212602202518]
We propose Multi-Scale Resource-Aware Neural Architecture Search (MS-RANAS).
We employ a one-shot architecture search approach in order to obtain a reduced search cost.
We achieve state-of-the-art results in terms of accuracy-speed trade-off.
arXiv Detail & Related papers (2020-09-29T11:56:01Z)
- Disentangled Neural Architecture Search [7.228790381070109]
We propose disentangled neural architecture search (DNAS) which disentangles the hidden representation of the controller into semantically meaningful concepts.
DNAS successfully disentangles the architecture representations, including operation selection, skip connections, and number of layers.
Dense-sampling leads to neural architecture search with higher efficiency and better performance.
arXiv Detail & Related papers (2020-09-24T03:35:41Z)
- Off-Policy Reinforcement Learning for Efficient and Effective GAN Architecture Search [50.40004966087121]
We introduce a new reinforcement learning based neural architecture search (NAS) methodology for generative adversarial network (GAN) architecture search.
The key idea is to formulate the GAN architecture search problem as a Markov decision process (MDP) for smoother architecture sampling.
We exploit an off-policy GAN architecture search algorithm that makes efficient use of the samples generated by previous policies.
arXiv Detail & Related papers (2020-07-17T18:29:17Z)
- A Semi-Supervised Assessor of Neural Architectures [157.76189339451565]
We employ an auto-encoder to discover meaningful representations of neural architectures.
A graph convolutional neural network is introduced to predict the performance of architectures.
arXiv Detail & Related papers (2020-05-14T09:02:33Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.