Evaluating complexity and resilience trade-offs in emerging memory inference machines
- URL: http://arxiv.org/abs/2003.10396v1
- Date: Tue, 25 Feb 2020 21:40:08 GMT
- Title: Evaluating complexity and resilience trade-offs in emerging memory inference machines
- Authors: Christopher H. Bennett, Ryan Dellana, T. Patrick Xiao, Ben Feinberg,
Sapan Agarwal, Suma Cardwell, Matthew J. Marinella, William Severa, Brad
Aimone
- Abstract summary: We show that compact implementations of deep neural networks are unexpectedly susceptible to collapse from multiple system disturbances.
Our work proposes a middle path towards high performance and strong resilience utilizing the Mosaics framework.
- Score: 0.6970352368216021
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuromorphic-style inference only works well if limited hardware resources
are maximized properly, e.g. accuracy continues to scale with parameters and
complexity in the face of potential disturbance. In this work, we use realistic
crossbar simulations to highlight that compact implementations of deep neural
networks are unexpectedly susceptible to collapse from multiple system
disturbances. Our work proposes a middle path towards high performance and
strong resilience utilizing the Mosaics framework, and specifically by re-using
synaptic connections in a recurrent neural network implementation that
possesses a natural form of noise-immunity.
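The failure mode is easiest to see in a toy model. Below is a minimal numpy sketch, not the authors' crossbar simulator, of a crossbar-style matrix-vector product whose conductances carry multiplicative programming noise and a fraction of stuck-at-zero devices; all function names and noise parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def disturbed_crossbar_matvec(W, x, sigma=0.05, p_stuck=0.01):
    """Matrix-vector product through a crossbar with multiplicative
    programming noise and a random fraction of stuck-at-zero devices."""
    noise = rng.normal(1.0, sigma, size=W.shape)   # per-device programming error
    alive = rng.random(W.shape) >= p_stuck         # False = device stuck at zero
    return (W * noise * alive) @ x

W = rng.normal(0, 1, (64, 128))
x = rng.normal(0, 1, 128)
clean = W @ x
for sigma in (0.01, 0.05, 0.2):
    noisy = disturbed_crossbar_matvec(W, x, sigma=sigma)
    err = np.linalg.norm(noisy - clean) / np.linalg.norm(clean)
    print(f"sigma={sigma:.2f}  relative output error={err:.3f}")
```

The relative output error grows with the disturbance level; in a compact network with little parameter redundancy, this per-layer error is what drives the accuracy collapse the abstract describes.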
Related papers
- Contrastive Learning in Memristor-based Neuromorphic Systems [55.11642177631929]
Spiking neural networks have become an important family of neuron-based models that sidestep many of the key limitations facing modern-day backpropagation-trained deep networks.
In this work, we design and investigate a proof-of-concept instantiation of contrastive-signal-dependent plasticity (CSDP), a neuromorphic form of forward-forward-based, backpropagation-free learning.
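As a rough illustration of the forward-forward family this paper builds on (not the paper's spiking CSDP rule), here is a layer-local update that raises a "goodness" score for positive samples and lowers it for negative ones, with no backpropagation across layers; all names and constants are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def goodness(h):
    return np.sum(h ** 2)                          # layer-local goodness score

def local_update(W, x_pos, x_neg, lr=0.01, theta=2.0):
    """One forward-forward-style step: gradient ascent on
    sigmoid(sign * (goodness - theta)), using only this layer's activity."""
    for x, sign in ((x_pos, +1.0), (x_neg, -1.0)):
        h = np.maximum(W @ x, 0.0)                 # ReLU activity
        p = 1.0 / (1.0 + np.exp(-sign * (goodness(h) - theta)))
        W += lr * sign * p * (1.0 - p) * 2.0 * np.outer(h, x)
    return W

W = rng.normal(0, 0.1, (16, 32))
x_pos = rng.normal(0, 1, 32)                       # "real" sample
x_neg = rng.normal(0, 1, 32)                       # corrupted/negative sample
W = local_update(W, x_pos, x_neg)
```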
arXiv Detail & Related papers (2024-09-17T04:48:45Z)
- Heterogeneous Memory Augmented Neural Networks [84.29338268789684]
We introduce a novel heterogeneous memory augmentation approach for neural networks.
By introducing learnable memory tokens with an attention mechanism, we can effectively boost performance without large computational overhead.
We show our approach on various image and graph-based tasks under both in-distribution (ID) and out-of-distribution (OOD) conditions.
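A minimal sketch of the memory-token idea, assuming a single attention head with no learned projections (the paper's architecture will differ): learnable memory tokens are concatenated with the input tokens before attention, and only the input positions are read out.

```python
import numpy as np

rng = np.random.default_rng(2)
d, n_input, n_mem = 32, 10, 4

M = rng.normal(0, 0.02, (n_mem, d))      # learnable memory tokens (toy init)

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def attend_with_memory(X, M):
    """Self-attention over input tokens concatenated with memory tokens;
    outputs are returned for the input positions only."""
    T = np.concatenate([X, M], axis=0)            # (n_input + n_mem, d)
    A = softmax(T @ T.T / np.sqrt(d))             # single head, no projections
    return (A @ T)[: len(X)]

X = rng.normal(0, 1, (n_input, d))
print(attend_with_memory(X, M).shape)             # (10, 32)
```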
arXiv Detail & Related papers (2023-10-17T01:05:28Z)
- Leveraging Low-Rank and Sparse Recurrent Connectivity for Robust Closed-Loop Control [63.310780486820796]
We show how a parameterization of recurrent connectivity influences robustness in closed-loop settings.
We find that closed-form continuous-time neural networks (CfCs) with fewer parameters can outperform their full-rank, fully-connected counterparts.
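To make the low-rank parameterization concrete (this sketch shows generic low-rank recurrence, not the CfC dynamics themselves), the recurrent matrix is factored as W = U Vᵀ, cutting parameters from n² to 2nr; sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)
n, r = 128, 4                                # hidden size, rank (illustrative)

U = rng.normal(0, 1 / np.sqrt(n), (n, r))    # low-rank factors: W = U @ V.T
V = rng.normal(0, 1 / np.sqrt(n), (n, r))
Wx = rng.normal(0, 0.1, (n, 8))              # input projection

def rnn_step(h, x, U, V, Wx):
    """One recurrent step; applying V.T @ h first keeps the recurrence
    at O(n*r) cost instead of O(n^2)."""
    return np.tanh(U @ (V.T @ h) + Wx @ x)

h = np.zeros(n)
for t in range(5):
    h = rnn_step(h, rng.normal(0, 1, 8), U, V, Wx)
print(h.shape, f"params: full={n*n}, low-rank={2*n*r}")
```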
arXiv Detail & Related papers (2023-10-05T21:44:18Z)
- Stochastic resonance neurons in artificial neural networks [0.0]
We propose a new type of neural network that uses stochastic resonance as an inherent part of the architecture.
We show that such a neural network is more robust against the impact of noise.
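Stochastic resonance is the classic effect where a subthreshold signal becomes detectable only when noise is added to a thresholding unit. A minimal numpy demonstration, with illustrative threshold and noise levels (not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(4)

def sr_neuron(x, noise_std, threshold=1.0, n_trials=200):
    """Trial-averaged threshold unit: with zero noise a subthreshold input
    gives no response; moderate noise lets the mean output track it."""
    noisy = x + rng.normal(0.0, noise_std, size=(n_trials,) + np.shape(x))
    return (noisy > threshold).mean(axis=0)

x = 0.8 * np.sin(np.linspace(0, 2 * np.pi, 8))   # subthreshold signal (< 1.0)
for sigma in (0.0, 0.5, 5.0):
    y = sr_neuron(x, sigma)
    r = np.corrcoef(x, y)[0, 1] if y.std() > 0 else 0.0
    print(f"noise={sigma:.1f}  response range={y.max() - y.min():.3f}  corr={r:.3f}")
```

The response range is zero without noise, peaks at a moderate noise level, and washes out again when the noise dominates, which is the resonance curve the paper exploits.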
arXiv Detail & Related papers (2022-05-06T18:42:36Z)
- Learning to Modulate Random Weights: Neuromodulation-inspired Neural Networks For Efficient Continual Learning [1.9580473532948401]
We introduce a novel neural network architecture inspired by neuromodulation in biological nervous systems.
We show that this approach has strong learning performance per task despite the very small number of learnable parameters.
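A minimal sketch of the modulate-random-weights idea under stated assumptions (per-unit multiplicative gains on a frozen random projection; the paper's modulation scheme may differ): only the gain vector is trained, so learnable parameters scale with the number of units rather than the number of weights.

```python
import numpy as np

rng = np.random.default_rng(5)
n_in, n_out = 64, 16

W_fixed = rng.normal(0, 1 / np.sqrt(n_in), (n_out, n_in))  # frozen random weights
g = np.ones(n_out)                                          # learnable gains

def modulated_layer(x, W_fixed, g):
    """Output = learned neuromodulatory gain * fixed random projection."""
    return g * np.tanh(W_fixed @ x)

# tiny gradient descent on g for a least-squares target (illustrative)
x, target = rng.normal(0, 1, n_in), rng.normal(0, 1, n_out)
for _ in range(100):
    h = np.tanh(W_fixed @ x)
    g -= 0.1 * (g * h - target) * h    # d/dg of 0.5 * ||g*h - target||^2
print("learnable params:", g.size, "vs frozen:", W_fixed.size)
```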
arXiv Detail & Related papers (2022-04-08T21:12:13Z)
- Deep Impulse Responses: Estimating and Parameterizing Filters with Deep Networks [76.830358429947]
Impulse response estimation in high noise and in-the-wild settings is a challenging problem.
We propose a novel framework for parameterizing and estimating impulse responses based on recent advances in neural representation learning.
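One common way to parameterize an impulse response compactly, which a network can then regress from noisy audio, is a sum of exponentially decaying sinusoids; this modal sketch is an assumption for illustration, not necessarily the paper's parameterization.

```python
import numpy as np

def parametric_ir(amps, decays, freqs, phases, n=4096, sr=16000):
    """Impulse response as a sum of decaying sinusoids; a network would
    predict (amps, decays, freqs, phases) instead of raw samples."""
    t = np.arange(n) / sr
    modes = (amps[:, None] * np.exp(-decays[:, None] * t)
             * np.sin(2 * np.pi * freqs[:, None] * t + phases[:, None]))
    return modes.sum(axis=0)

rng = np.random.default_rng(6)
k = 8                                             # number of modes (assumption)
ir = parametric_ir(rng.uniform(0.1, 1, k), rng.uniform(2, 20, k),
                   rng.uniform(50, 4000, k), rng.uniform(0, np.pi, k))
wet = np.convolve(rng.normal(0, 1, 16000), ir)    # apply the estimated filter
print(ir.shape, wet.shape)
```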
arXiv Detail & Related papers (2022-02-07T18:57:23Z)
- Online Training of Spiking Recurrent Neural Networks with Phase-Change Memory Synapses [1.9809266426888898]
Training spiking recurrent neural networks (RNNs) on dedicated neuromorphic hardware is still an open challenge.
We present a simulation framework of differential-architecture arrays based on an accurate and comprehensive Phase-Change Memory (PCM) device model.
We train a spiking RNN whose weights are emulated in the presented simulation framework, using a recently proposed e-prop learning rule.
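The two PCM non-idealities such a device model must capture are write noise and power-law conductance drift. A toy numpy sketch of weight emulation (device constants and function names are illustrative assumptions, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(7)

def pcm_program(w_target, g_max=25e-6, write_sigma=0.05, nu=0.05, t=100.0):
    """Map target weights to PCM conductances with programming noise and
    power-law drift G(t) = G0 * (t/t0)^-nu, then read them back."""
    g0 = np.clip(np.abs(w_target), 0, 1) * g_max
    g0 *= rng.normal(1.0, write_sigma, size=g0.shape)   # write noise
    g_t = g0 * (t / 1.0) ** (-nu)                       # conductance drift
    # differential pairs (G+ - G-) usually encode signed weights;
    # here the sign is folded back in directly for brevity
    return np.sign(w_target) * g_t / g_max

w = rng.normal(0, 0.3, (64, 64))
w_emulated = pcm_program(w)
print("mean |error| after drift:", np.mean(np.abs(w_emulated - w)))
```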
arXiv Detail & Related papers (2021-08-04T01:24:17Z)
- Physical Constraint Embedded Neural Networks for inference and noise regulation [0.0]
We present methods for embedding even-odd symmetries and conservation laws in neural networks.
We demonstrate that it can accurately infer symmetries without prior knowledge.
We highlight the noise resilient properties of physical constraint embedded neural networks.
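Embedding an even or odd symmetry can be done exactly by symmetrizing an unconstrained network, f_even(x) = (g(x) + g(-x))/2; the symmetrization also averages out noise components that violate the symmetry, which is one route to the noise resilience mentioned above. A minimal sketch with a hypothetical tiny MLP as g:

```python
import numpy as np

rng = np.random.default_rng(8)
W1, W2 = rng.normal(0, 0.5, (32, 1)), rng.normal(0, 0.5, (1, 32))

def g(x):
    """Unconstrained network g(x) (a toy one-hidden-layer MLP here)."""
    return (W2 @ np.tanh(W1 @ np.atleast_2d(x))).ravel()

def f_even(x):
    return 0.5 * (g(x) + g(-x))    # f_even(-x) == f_even(x) by construction

def f_odd(x):
    return 0.5 * (g(x) - g(-x))    # f_odd(-x) == -f_odd(x) by construction

x = rng.normal(0, 1, 5)
assert np.allclose(f_even(x), f_even(-x))
assert np.allclose(f_odd(x), -f_odd(-x))
```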
arXiv Detail & Related papers (2021-05-19T14:07:20Z)
- Non-Singular Adversarial Robustness of Neural Networks [58.731070632586594]
Adversarial robustness has become an emerging challenge for neural networks owing to their over-sensitivity to small input perturbations.
We formalize the notion of non-singular adversarial robustness for neural networks through the lens of joint perturbations to data inputs as well as model weights.
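The key shift is evaluating the loss under bounded perturbations of the input and the weights jointly, not the input alone. A crude random-search surrogate in numpy (the paper uses an optimization formulation; bounds and names here are assumptions):

```python
import numpy as np

rng = np.random.default_rng(9)
W = rng.normal(0, 1, (3, 8))      # toy linear classifier
x = rng.normal(0, 1, 8)

def joint_worst_case_loss(W, x, y, eps_x=0.1, eps_w=0.05, n_samples=500):
    """Monte-Carlo surrogate: worst cross-entropy over random joint
    perturbations of data inputs and model weights."""
    losses = []
    for _ in range(n_samples):
        dx = rng.uniform(-eps_x, eps_x, x.shape)
        dW = rng.uniform(-eps_w, eps_w, W.shape)
        logits = (W + dW) @ (x + dx)
        losses.append(-logits[y] + np.log(np.exp(logits).sum()))
    return max(losses)

print("joint worst-of-500 loss:", joint_worst_case_loss(W, x, y=0))
```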
arXiv Detail & Related papers (2021-02-23T20:59:30Z)
- Network Diffusions via Neural Mean-Field Dynamics [52.091487866968286]
We propose a novel learning framework for inference and estimation problems of diffusion on networks.
Our framework is derived from the Mori-Zwanzig formalism to obtain an exact evolution of the node infection probabilities.
Our approach is versatile and robust to variations of the underlying diffusion network models.
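For orientation, the object being evolved is the vector of node infection probabilities under a mean-field ODE; the sketch below integrates a standard SIS-style mean-field system with plain Euler steps, simply omitting the learned Mori-Zwanzig memory closure that is the paper's contribution. Rates and network are illustrative.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 20
A = (rng.random((n, n)) < 0.2).astype(float)   # diffusion network adjacency
np.fill_diagonal(A, 0)

def simulate_mean_field(p0, A, beta=0.3, delta=0.1, dt=0.05, steps=200):
    """Euler integration of dp/dt = beta * (1 - p) * (A @ p) - delta * p,
    the leading (memoryless) term of the node infection probabilities."""
    p = p0.copy()
    for _ in range(steps):
        p += dt * (beta * (1 - p) * (A @ p) - delta * p)
        p = np.clip(p, 0.0, 1.0)
    return p

p0 = np.zeros(n)
p0[:2] = 1.0                                   # two seed nodes
print("final mean infection prob:", simulate_mean_field(p0, A).mean())
```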
arXiv Detail & Related papers (2020-06-16T18:45:20Z)
- Structural plasticity on an accelerated analog neuromorphic hardware system [0.46180371154032884]
We present a strategy to achieve structural plasticity by constantly rewiring the pre- and postsynaptic partners.
We implemented this algorithm on the analog neuromorphic system BrainScaleS-2.
We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology.
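A host-side toy of the prune-and-regrow loop (the BrainScaleS-2 implementation is on-chip and differs in detail; thresholds and sizes here are assumptions): each neuron keeps a fixed synapse budget, and its weakest near-silent synapse is rewired to a random new presynaptic partner.

```python
import numpy as np

rng = np.random.default_rng(11)
n_pre, n_post, fan_in = 100, 20, 8

# each postsynaptic neuron has a fixed budget of fan_in synapses
pre_ids = rng.integers(0, n_pre, (n_post, fan_in))
weights = rng.uniform(0, 1, (n_post, fan_in))

def rewire_step(pre_ids, weights, w_min=0.05):
    """Prune each neuron's weakest synapse if it is near-silent and regrow
    it at a random new presynaptic partner, keeping the count constant."""
    for j in range(n_post):
        k = np.argmin(weights[j])
        if weights[j, k] < w_min:
            pre_ids[j, k] = rng.integers(0, n_pre)
            weights[j, k] = rng.uniform(0, 0.2)   # re-seed with a small weight
    return pre_ids, weights

for _ in range(10):
    pre_ids, weights = rewire_step(pre_ids, weights)
print("synapses per neuron stays fixed at", pre_ids.shape[1])
```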
arXiv Detail & Related papers (2019-12-27T10:15:58Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.