hxtorch.snn: Machine-learning-inspired Spiking Neural Network Modeling
on BrainScaleS-2
- URL: http://arxiv.org/abs/2212.12210v1
- Date: Fri, 23 Dec 2022 08:56:44 GMT
- Authors: Philipp Spilger, Elias Arnold, Luca Blessing, Christian Mauch,
Christian Pehle, Eric Müller, Johannes Schemmel
- Abstract summary: hxtorch.snn is a machine learning-based modeling framework for the BrainScaleS-2 neuromorphic system.
hxtorch.snn enables the hardware-in-the-loop training of spiking neural networks within PyTorch.
We demonstrate the capabilities of hxtorch.snn on a classification task using the Yin-Yang dataset.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Neuromorphic systems require user-friendly software to support the design and
optimization of experiments. In this work, we address this need by presenting
our development of a machine learning-based modeling framework for the
BrainScaleS-2 neuromorphic system. This work represents an improvement over
previous efforts, which either focused on the matrix-multiplication mode of
BrainScaleS-2 or lacked full automation. Our framework, called hxtorch.snn,
enables the hardware-in-the-loop training of spiking neural networks within
PyTorch, including support for automatic differentiation in a fully automated
hardware experiment workflow. In addition, hxtorch.snn facilitates seamless
transitions between emulating on hardware and simulating in software. We
demonstrate the capabilities of hxtorch.snn on a classification task using the
Yin-Yang dataset employing a gradient-based approach with surrogate gradients
and densely sampled membrane observations from the BrainScaleS-2 hardware
system.
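The gradient-based approach with surrogate gradients mentioned in the abstract can be illustrated with a minimal sketch (pure Python; the function names and the fast-sigmoid surrogate are illustrative assumptions, not the actual hxtorch.snn API): the non-differentiable spike threshold is applied in the forward pass, while the backward pass substitutes a smooth surrogate derivative.

```python
def spike(v, v_th=1.0):
    # Forward pass: emit a spike (1.0) when the membrane potential v
    # crosses the threshold v_th, otherwise stay silent (0.0).
    # This Heaviside step has zero derivative almost everywhere.
    return 1.0 if v >= v_th else 0.0

def surrogate_grad(v, v_th=1.0, beta=10.0):
    # Backward pass: replace the step function's derivative with the
    # derivative of a fast sigmoid (a common SuperSpike-style choice),
    # which is smooth and peaks at the threshold.
    return 1.0 / (1.0 + beta * abs(v - v_th)) ** 2
```

In a framework such as PyTorch, this pair would typically be wrapped in a custom autograd function so that training can backpropagate through membrane observations recorded from the hardware.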
Related papers
- Neural Metamorphosis [72.88137795439407]
This paper introduces a new learning paradigm termed Neural Metamorphosis (NeuMeta), which aims to build self-morphable neural networks.
NeuMeta directly learns the continuous weight manifold of neural networks.
It sustains full-size performance even at a 75% compression rate.
arXiv Detail & Related papers (2024-10-10T14:49:58Z)
- jaxsnn: Event-driven Gradient Estimation for Analog Neuromorphic Hardware [0.844044431480226]
We present a novel library (jaxsnn) built on top of JAX that departs from conventional machine learning frameworks.
Our library facilitates the simulation of spiking neural networks and gradient estimation, with a focus on compatibility with time-continuous neuromorphic backends.
arXiv Detail & Related papers (2024-01-30T09:27:13Z)
- Scalable Network Emulation on Analog Neuromorphic Hardware [3.1934373544259813]
We present a novel software feature for the BrainScaleS-2 accelerated neuromorphic platform.
It facilitates the partitioned emulation of large-scale spiking neural networks.
We demonstrate the training of two deep spiking neural network models that exceed the physical size constraints of a single-chip BrainScaleS-2 system.
arXiv Detail & Related papers (2024-01-30T09:27:05Z)
- Experimental study of Neural ODE training with adaptive solver for dynamical systems modeling [72.84259710412293]
Some ODE solvers, known as adaptive solvers, can adapt their evaluation strategy to the complexity of the problem at hand.
This paper describes a simple set of experiments to show why adaptive solvers cannot be seamlessly leveraged as a black-box for dynamical systems modelling.
arXiv Detail & Related papers (2022-11-13T17:48:04Z)
- Variable Bitrate Neural Fields [75.24672452527795]
We present a dictionary method for compressing feature grids, reducing their memory consumption by up to 100x.
We formulate the dictionary optimization as a vector-quantized auto-decoder problem which lets us learn end-to-end discrete neural representations in a space where no direct supervision is available.
arXiv Detail & Related papers (2022-06-15T17:58:34Z)
- A Scalable Approach to Modeling on Accelerated Neuromorphic Hardware [0.0]
This work presents the software aspects of the BrainScaleS-2 system, a hybrid accelerated neuromorphic hardware architecture based on physical modeling.
We introduce key aspects of the BrainScaleS-2 Operating System: experiment workflow, API layering, software design, and platform operation.
The focus lies on novel system and software features such as multi-compartmental neurons, fast re-configuration for hardware-in-the-loop training, applications for the embedded processors, the non-spiking operation mode, interactive platform access, and sustainable hardware/software co-development.
arXiv Detail & Related papers (2022-03-21T16:30:18Z)
- Can we learn gradients by Hamiltonian Neural Networks? [68.8204255655161]
We propose a meta-learner based on ODE neural networks that learns gradients.
We demonstrate that our method outperforms a meta-learner based on LSTM for an artificial task and the MNIST dataset with ReLU activations in the optimizee.
arXiv Detail & Related papers (2021-10-31T18:35:10Z)
- Train your classifier first: Cascade Neural Networks Training from upper layers to lower layers [54.47911829539919]
We develop a novel top-down training method which can be viewed as an algorithm for searching for high-quality classifiers.
We tested this method on automatic speech recognition (ASR) tasks and language modelling tasks.
The proposed method consistently improves recurrent neural network ASR models on Wall Street Journal, self-attention ASR models on Switchboard, and AWD-LSTM language models on WikiText-2.
arXiv Detail & Related papers (2021-02-09T08:19:49Z)
- hxtorch: PyTorch for BrainScaleS-2 -- Perceptrons on Analog Neuromorphic Hardware [0.0]
We present software facilitating the usage of the BrainScaleS-2 analog neuromorphic hardware system as an inference accelerator.
We provide accelerator support for vector-matrix multiplications and convolutions and corresponding software-based autograd functionality.
As an application of the introduced framework, we present a model that classifies activities of daily living with smartphone sensor data.
arXiv Detail & Related papers (2020-06-23T16:33:49Z)
- Extending BrainScaleS OS for BrainScaleS-2 [0.0]
BrainScaleS OS is a software stack designed for the user-friendly operation of the BrainScaleS architecture.
We present and walk through the software enhancements that were introduced for the BrainScaleS-2 architecture.
arXiv Detail & Related papers (2020-03-30T18:58:55Z)
- AutoML-Zero: Evolving Machine Learning Algorithms From Scratch [76.83052807776276]
We show that it is possible to automatically discover complete machine learning algorithms just using basic mathematical operations as building blocks.
We demonstrate this by introducing a novel framework that significantly reduces human bias through a generic search space.
We believe these preliminary successes in discovering machine learning algorithms from scratch indicate a promising new direction in the field.
arXiv Detail & Related papers (2020-03-06T19:00:04Z)
This list is automatically generated from the titles and abstracts of the papers on this site.