Nengo and low-power AI hardware for robust, embedded neurorobotics
- URL: http://arxiv.org/abs/2007.10227v2
- Date: Sat, 29 Aug 2020 19:38:40 GMT
- Title: Nengo and low-power AI hardware for robust, embedded neurorobotics
- Authors: Travis DeWolf and Pawel Jaworski and Chris Eliasmith
- Abstract summary: We identify four primary challenges in building robust, embedded neurorobotic systems.
We present two examples using Nengo to develop neural networks that run on CPU, GPU, and Intel's neuromorphic chip, Loihi.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper we demonstrate how the Nengo neural modeling and simulation
libraries enable users to quickly develop robotic perception and action neural
networks for simulation on neuromorphic hardware using familiar tools, such as
Keras and Python. We identify four primary challenges in building robust,
embedded neurorobotic systems: 1) developing infrastructure for interfacing
with the environment and sensors; 2) processing task-specific sensory signals;
3) generating robust, explainable control signals; and 4) compiling neural
networks to run on target hardware. Nengo helps to address these challenges by:
1) providing the NengoInterfaces library, which defines a simple but powerful
API for users to interact with simulations and hardware; 2) providing the
NengoDL library, which lets users use the Keras and TensorFlow API to develop
Nengo models; 3) implementing the Neural Engineering Framework, which provides
white-box methods for implementing known functions and circuits; and 4)
providing multiple backend libraries, such as NengoLoihi, that enable users to
compile the same model to different hardware. We present two examples using
Nengo to develop neural networks that run on CPUs, GPUs, and Intel's
neuromorphic chip, Loihi, to demonstrate this workflow. The first example is an
end-to-end spiking neural network that controls a rover simulated in MuJoCo.
The network integrates a deep convolutional network that processes visual input
from mounted cameras to track a target, and a control system implementing
steering and drive functions to guide the rover to the target. The second
example augments a force-based operational space controller with neural
adaptive control to improve performance during a reaching task using a
real-world Kinova Jaco2 robotic arm. Code and details are provided with the
intent of enabling other researchers to build their own neurorobotic systems.
Related papers
- Adaptive Robotic Arm Control with a Spiking Recurrent Neural Network on a Digital Accelerator (2024-05-21)
  We present an overview of the system and a Python framework to use it on a Pynq ZU platform.
  We show that simulated accuracy is preserved, with a peak performance of 3.8M events processed per second.
- Codebook Features: Sparse and Discrete Interpretability for Neural Networks (2023-10-26)
  We explore whether we can train neural networks to have hidden states that are sparse, discrete, and more interpretable.
  Codebook features are produced by fine-tuning neural networks with vector quantization bottlenecks at each layer.
  We find that neural networks can operate under this extreme bottleneck with only modest degradation in performance.
- Intelligence Processing Units Accelerate Neuromorphic Learning (2022-11-19)
  Spiking neural networks (SNNs) have achieved orders-of-magnitude improvements in energy consumption and latency.
  We present an IPU-optimized release of our custom SNN Python package, snnTorch.
- Active Predictive Coding: Brain-Inspired Reinforcement Learning for Sparse Reward Robotic Control Problems (2022-09-19)
  We propose a backpropagation-free approach to robotic control through the neuro-cognitive computational framework of neural generative coding (NGC).
  We design an agent built entirely from predictive coding/processing circuits that facilitate dynamic, online learning from sparse rewards.
  We show that the proposed ActPC agent performs well in the face of sparse (extrinsic) reward signals and is competitive with, or outperforms, several powerful backprop-based RL approaches.
- SpikiLi: A Spiking Simulation of LiDAR-based Real-time Object Detection for Autonomous Driving (2022-06-06)
  Spiking neural networks are a neural network design approach that promises large improvements in power efficiency, computation efficiency, and processing latency.
  We illustrate the applicability of spiking neural networks to a complex deep learning task, namely LiDAR-based 3D object detection for automated driving.
- Neuromorphic Artificial Intelligence Systems (2022-05-25)
  Modern AI systems, based on the von Neumann architecture and classical neural networks, have a number of fundamental limitations compared with the brain.
  This article discusses these limitations and the ways they can be mitigated, and presents an overview of currently available neuromorphic AI projects in which they are overcome.
- FPGA-optimized Hardware Acceleration for Spiking Neural Networks (2022-01-18)
  This work presents a hardware accelerator for an SNN with off-line training, applied to an image recognition task.
  The design targets a Xilinx Artix-7 FPGA, using around 40% of the available hardware resources.
  It reduces classification time by three orders of magnitude, with a 4.5% impact on accuracy compared to its full-precision software counterpart.
- Learning from Event Cameras with Sparse Spiking Convolutional Neural Networks (2021-04-26)
  Convolutional neural networks (CNNs) are now the de facto solution for computer vision problems.
  We propose an end-to-end, biologically inspired approach using event cameras and spiking neural networks (SNNs).
  Our method enables training sparse spiking neural networks directly on event data, using the popular deep learning framework PyTorch.
- Deep Imitation Learning for Bimanual Robotic Manipulation (2020-10-11)
  We present a deep imitation learning framework for robotic bimanual manipulation.
  A core challenge is to generalize manipulation skills to objects in different locations.
  We propose to (i) decompose the multi-modal dynamics into elemental movement primitives, (ii) parameterize each primitive using a recurrent graph neural network to capture interactions, and (iii) integrate a high-level planner that composes primitives sequentially with a low-level controller that combines primitive dynamics and inverse kinematics control.
- Exposing Hardware Building Blocks to Machine Learning Frameworks (2020-04-10)
  We focus on how to design topologies that complement a view of neurons as unique functions.
  We develop a library that supports training a neural network with custom sparsity and quantization.
- Non-linear Neurons with Human-like Apical Dendrite Activations (2020-02-02)
  We show that a standard neuron followed by our novel apical dendrite activation (ADA) can learn the XOR logical function with 100% accuracy.
  We conduct experiments on six benchmark data sets from computer vision, signal processing, and natural language processing.
This list is automatically generated from the titles and abstracts of the papers in this site.