Highly-scalable stochastic neuron based on Ovonic Threshold Switch (OTS)
and its applications in Restricted Boltzmann Machine (RBM)
- URL: http://arxiv.org/abs/2010.10986v1
- Date: Wed, 21 Oct 2020 13:20:01 GMT
- Title: Highly-scalable stochastic neuron based on Ovonic Threshold Switch (OTS)
and its applications in Restricted Boltzmann Machine (RBM)
- Authors: Seong-il Im, Hyejin Lee, Jaesang Lee, Jae-Seung Jeong, Joon Young
Kwak, Keunsu Kim, Jeong Ho Cho, Hyunsu Ju, Suyoun Lee
- Abstract summary: We propose a highly-scalable neuron device based on Ovonic Threshold Switch (OTS).
As a candidate for a true random number generator (TRNG), it passes 15 of the 16 tests of the National Institute of Standards and Technology (NIST) Statistical Test Suite.
Reconstruction of noise-contaminated images is also successfully demonstrated, yielding images with the noise removed.
- Score: 0.2529563359433233
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Interest in Restricted Boltzmann Machine (RBM) is growing as a generative
stochastic artificial neural network to implement a novel energy-efficient
machine-learning (ML) technique. For a hardware implementation of the RBM, an
essential building block is a reliable stochastic binary neuron device that
generates random spikes following the Boltzmann distribution. Here, we propose
a highly-scalable stochastic neuron device based on Ovonic Threshold Switch
(OTS) which utilizes the random emission and capture process of traps as the
source of stochasticity. The switching probability is well described by the
Boltzmann distribution, which can be controlled by operating parameters. As a
candidate for a true random number generator (TRNG), it passes 15 of the 16
tests of the National Institute of Standards and Technology (NIST) Statistical
Test Suite (Special Publication 800-22). In addition, the recognition task of
handwritten digits (MNIST) is demonstrated using a simulated RBM network
consisting of the proposed device, with a maximum recognition accuracy of
86.07%. Furthermore, image reconstruction is successfully demonstrated on
noise-contaminated inputs, yielding images with the noise removed.
These results show the promising properties of OTS-based stochastic neuron
devices for applications in RBM systems.
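The abstract's core building block, a binary neuron that fires with a probability following the Boltzmann (logistic) distribution, can be sketched in software. The following is a minimal illustration of such a neuron driving one Gibbs-sampling step of an RBM; the function names, network sizes, and temperature parameter are illustrative assumptions, not the paper's device model.

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_neuron(activation, temperature=1.0):
    # Fire with Boltzmann (logistic) probability p = 1 / (1 + exp(-a/T)),
    # the distribution the OTS switching probability is reported to follow.
    p = 1.0 / (1.0 + np.exp(-np.asarray(activation) / temperature))
    return (rng.random(p.shape) < p).astype(float)

def gibbs_step(v, W, b_h, b_v):
    # One Gibbs sampling step of an RBM built from such binary neurons.
    h = stochastic_neuron(v @ W + b_h)          # sample hidden layer
    v_recon = stochastic_neuron(h @ W.T + b_v)  # reconstruct visible layer
    return h, v_recon

# Toy RBM: 6 visible units, 4 hidden units.
W = rng.normal(0.0, 0.1, size=(6, 4))
v0 = rng.integers(0, 2, size=(1, 6)).astype(float)
h, v1 = gibbs_step(v0, W, np.zeros(4), np.zeros(6))
```

In a hardware RBM, each call to `stochastic_neuron` would be replaced by an OTS device whose trap emission/capture randomness supplies the coin flip.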
Related papers
- Photonic probabilistic machine learning using quantum vacuum noise [8.194733686324204]
Probabilistic machine learning utilizes controllable sources of randomness to encode uncertainty and enable statistical modeling.
Here, we implement a photonic probabilistic computer consisting of a controllable photonic element.
Our work paves the way for scalable, ultrafast, and energy-efficient probabilistic machine learning hardware.
arXiv Detail & Related papers (2024-03-07T18:35:18Z)
- Synaptic Sampling of Neural Networks [0.14732811715354452]
This paper describes the scANN technique -- "sampling (by coinflips) artificial neural networks" -- which enables neural networks to be sampled directly by treating the weights as Bernoulli coin flips.
arXiv Detail & Related papers (2023-11-21T22:56:13Z)
- SMRD: SURE-based Robust MRI Reconstruction with Diffusion Models [76.43625653814911]
Diffusion models have gained popularity for accelerated MRI reconstruction due to their high sample quality.
They can effectively serve as rich data priors while incorporating the forward model flexibly at inference time.
We introduce SURE-based MRI Reconstruction with Diffusion models (SMRD) to enhance robustness during testing.
arXiv Detail & Related papers (2023-10-03T05:05:35Z)
- Neural Boltzmann Machines [2.179313476241343]
Conditional generative models are capable of using contextual information as input to create new imaginative outputs.
Conditional Restricted Boltzmann Machines (CRBMs) are one class of conditional generative models that have proven to be especially adept at modeling noisy discrete or continuous data.
We generalize CRBMs by converting each of the CRBM parameters to their own neural networks that are allowed to be functions of the conditional inputs.
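The generalization described above, replacing each fixed CRBM parameter with a network of the conditional input, can be sketched roughly as below; the tiny tanh network, its sizes, and the choice of the hidden bias as the converted parameter are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def param_net(c, W1, W2):
    # A small network mapping the conditional input c to a CRBM parameter.
    return np.tanh(c @ W1) @ W2

# In a plain CRBM the hidden bias b_h is a fixed vector; here it is
# produced as a function of the conditional input c instead.
c = rng.normal(size=(1, 3))               # conditional input
W1 = rng.normal(0.0, 0.1, size=(3, 8))
W2 = rng.normal(0.0, 0.1, size=(8, 5))
b_h = param_net(c, W1, W2)                # hidden bias now depends on c
```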
arXiv Detail & Related papers (2023-05-15T04:03:51Z)
- Bayesian Inference on Binary Spiking Networks Leveraging Nanoscale Device Stochasticity [27.046123432931207]
We introduce a novel Phase Change Memory (PCM)-based hardware implementation for BNNs with binary synapses.
We obtain hardware accuracy and expected calibration error matching that of an 8-bit fixed-point (FxP8) implementation, with projected savings of over 9× in core-area transistor count.
arXiv Detail & Related papers (2023-02-02T18:27:31Z)
- Learning Probabilistic Models from Generator Latent Spaces with Hat EBM [81.35199221254763]
This work proposes a method for using any generator network as the foundation of an Energy-Based Model (EBM)
Experiments show strong performance of the proposed method on (1) unconditional ImageNet synthesis at 128x128 resolution, (2) refining the output of existing generators, and (3) learning EBMs that incorporate non-probabilistic generators.
arXiv Detail & Related papers (2022-10-29T03:55:34Z)
- Multi-layered Discriminative Restricted Boltzmann Machine with Untrained Probabilistic Layer [0.0]
An extreme learning machine (ELM) is a three-layered feed-forward neural network having untrained parameters.
Inspired by ELM, a probabilistic untrained layer called a probabilistic-ELM layer is proposed.
It is combined with a discriminative restricted Boltzmann machine (DRBM) to solve classification problems.
arXiv Detail & Related papers (2022-10-27T13:56:17Z)
- Macroscopic noise amplification by asymmetric dyads in non-Hermitian optical systems for generative diffusion models [55.2480439325792]
Asymmetric non-Hermitian dyads are promising candidates for efficient sensors and ultra-fast random number generators.
Integrated light emission from such asymmetric dyads can be efficiently used for all-optical generative diffusion models of machine learning.
arXiv Detail & Related papers (2022-06-24T10:19:36Z)
- Adversarial Examples Detection with Bayesian Neural Network [57.185482121807716]
We propose a new framework to detect adversarial examples motivated by the observations that random components can improve the smoothness of predictors.
We propose a novel Bayesian adversarial example detector, short for BATer, to improve the performance of adversarial example detection.
arXiv Detail & Related papers (2021-05-18T15:51:24Z)
- And/or trade-off in artificial neurons: impact on adversarial robustness [91.3755431537592]
The presence of a sufficient number of OR-like neurons in a network can lead to classification brittleness and increased vulnerability to adversarial attacks.
We define AND-like neurons and propose measures to increase their proportion in the network.
Experimental results on the MNIST dataset suggest that our approach holds promise as a direction for further exploration.
arXiv Detail & Related papers (2021-02-15T08:19:05Z)
- Sampling asymmetric open quantum systems for artificial neural networks [77.34726150561087]
We present a hybrid sampling strategy which takes asymmetric properties explicitly into account, achieving fast convergence times and high scalability for asymmetric open systems.
We highlight the universal applicability of artificial neural networks to this setting.
arXiv Detail & Related papers (2020-12-20T18:25:29Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information provided and is not responsible for any consequences of its use.