Using CSNNs to Perform Event-based Data Processing & Classification on ASL-DVS
- URL: http://arxiv.org/abs/2408.00611v1
- Date: Thu, 1 Aug 2024 14:49:43 GMT
- Title: Using CSNNs to Perform Event-based Data Processing & Classification on ASL-DVS
- Authors: Ria Patel, Sujit Tripathy, Zachary Sublett, Seoyoung An, Riya Patel
- Abstract summary: We develop a convolutional spiking neural network architecture to learn the spatial and temporal relations in the ASL-DVS gesture dataset.
We performed classification on a pre-processed subset of the full ASL-DVS dataset to identify letter signs and achieved 100% training accuracy.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Recent advancements in bio-inspired visual sensing and neuromorphic computing have led to the development of various highly efficient bio-inspired solutions with real-world applications. One notable application integrates event-based cameras with spiking neural networks (SNNs) to process event-based sequences, which are asynchronous and sparse and therefore difficult to handle. In this project, we develop a convolutional spiking neural network (CSNN) architecture that leverages convolutional operations and the recurrent properties of a spiking neuron to learn the spatial and temporal relations in the ASL-DVS gesture dataset. The ASL-DVS gesture dataset is a neuromorphic dataset containing hand gestures displaying 24 letters (A to Y, excluding J and Z due to the nature of their symbols) from American Sign Language (ASL). We performed classification on a pre-processed subset of the full ASL-DVS dataset to identify letter signs and achieved 100% training accuracy. Specifically, this was achieved by training on the Google Cloud compute platform with a learning rate of 0.0005, a batch size of 25 (20 batches in total), 200 iterations, and 10 epochs.
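The abstract reports the training hyperparameters but not the exact network topology. Below is a minimal sketch of how such a CSNN could be assembled in PyTorch; everything not stated in the abstract (layer sizes, the 32x32 binned event-frame input with two polarity channels, the fast-sigmoid surrogate gradient, and the soft-reset rule) is an illustrative assumption, not the authors' implementation.

```python
# Minimal CSNN sketch in PyTorch. Topology, frame size, and surrogate
# gradient are assumptions; only the hyperparameters below come from
# the abstract (lr=0.0005, batch size 25, 10 time steps is assumed).
import torch
import torch.nn as nn

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass, fast-sigmoid surrogate
    gradient in the backward pass (a common SNN training trick)."""
    @staticmethod
    def forward(ctx, mem):
        ctx.save_for_backward(mem)
        return (mem > 0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (mem,) = ctx.saved_tensors
        return grad_output / (1.0 + 10.0 * mem.abs()) ** 2

class CSNN(nn.Module):
    def __init__(self, num_classes=24, beta=0.9):
        super().__init__()
        self.beta = beta  # membrane potential decay per time step
        self.conv1 = nn.Conv2d(2, 16, 5, padding=2)  # 2 channels: ON/OFF event polarity
        self.conv2 = nn.Conv2d(16, 32, 5, padding=2)
        self.pool = nn.MaxPool2d(2)
        self.fc = nn.Linear(32 * 8 * 8, num_classes)  # assumes 32x32 event frames

    def forward(self, x):  # x: (time, batch, 2, 32, 32) binned event frames
        mem1 = mem2 = 0.0
        out = 0.0
        for t in range(x.shape[0]):
            cur1 = self.pool(self.conv1(x[t]))
            mem1 = self.beta * mem1 + cur1          # leaky integration
            spk1 = SurrogateSpike.apply(mem1 - 1.0)  # fire at threshold 1.0
            mem1 = mem1 - spk1                       # soft reset

            cur2 = self.pool(self.conv2(spk1))
            mem2 = self.beta * mem2 + cur2
            spk2 = SurrogateSpike.apply(mem2 - 1.0)
            mem2 = mem2 - spk2

            out = out + self.fc(spk2.flatten(1))
        return out / x.shape[0]  # average logits over time steps

# Training step with the hyperparameters reported in the abstract.
model = CSNN()
optimizer = torch.optim.Adam(model.parameters(), lr=0.0005)
frames = torch.rand(10, 25, 2, 32, 32)   # (T=10 steps, batch=25) dummy input
labels = torch.randint(0, 24, (25,))
loss = nn.CrossEntropyLoss()(model(frames), labels)
loss.backward()
optimizer.step()
```

Averaging the readout logits over time steps is one common way to let the loss see every spike; rate- or latency-based decodings would be equally valid choices here.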
Related papers
- Effects of Introducing Synaptic Scaling on Spiking Neural Network Learning [0.0]
Spiking neural networks (SNNs) employing unsupervised learning methods inspired by neural plasticity are expected to provide a new framework for artificial intelligence. We investigated the effect of multiple types of neural plasticity, such as spike-timing-dependent plasticity (STDP) and synaptic scaling, on learning in a winner-take-all (WTA) network composed of spiking neurons (both rules are sketched after this list).
arXiv Detail & Related papers (2026-01-16T13:11:14Z) - Integration of Contrastive Predictive Coding and Spiking Neural Networks [0.0]
This study examines the integration of Contrastive Predictive Coding (CPC) with spiking neural networks (SNNs). The goal is to develop a predictive coding model with greater biological plausibility by processing inputs and outputs in a spike-based system. The study demonstrates that CPC can be effectively combined with SNNs, showing that an SNN trained for classification tasks can also function as an encoding mechanism (a minimal InfoNCE sketch appears after this list).
arXiv Detail & Related papers (2025-06-10T19:23:08Z) - The Brain's Bitter Lesson: Scaling Speech Decoding With Self-Supervised Learning [3.649801602551928]
We develop a set of neuroscience-inspired self-supervised objectives, together with a neural architecture, for representation learning from heterogeneous recordings.
Results show that representations learned with these objectives scale with data, generalise across subjects, datasets, and tasks, and surpass comparable self-supervised approaches.
arXiv Detail & Related papers (2024-06-06T17:59:09Z) - On Pretraining Data Diversity for Self-Supervised Learning [57.91495006862553]
We explore the impact of training with more diverse datasets on the performance of self-supervised learning (SSL) under a fixed computational budget.
Our findings consistently demonstrate that increasing pretraining data diversity enhances SSL performance, albeit only when the distribution distance to the downstream data is minimal.
arXiv Detail & Related papers (2024-03-20T17:59:58Z) - Deep Learning for real-time neural decoding of grasp [0.0]
We present a Deep Learning-based approach to the decoding of neural signals for grasp type classification.
The main goal of the presented approach is to improve over state-of-the-art decoding accuracy without relying on any prior neuroscience knowledge.
arXiv Detail & Related papers (2023-11-02T08:26:29Z) - Visual Self-supervised Learning Scheme for Dense Prediction Tasks on X-ray Images [3.782392436834913]
Self-supervised learning (SSL) has led to considerable progress in natural language processing (NLP).
Likewise, incorporating contrastive learning into existing visual SSL models has yielded considerable gains, often surpassing supervised counterparts.
Here, we focus on dense prediction tasks using security-inspection X-ray images to evaluate our proposed model, Segment localization (SegLoc).
Based upon the Instance localization (InsLoc) model, SegLoc addresses one of the key challenges of contrastive learning, i.e., false negative pairs of query embeddings.
arXiv Detail & Related papers (2023-10-12T15:42:17Z) - Adaptive Axonal Delays in feedforward spiking neural networks for accurate spoken word recognition [0.0]
Spiking neural networks (SNNs) are a promising research avenue for building accurate and efficient automatic speech recognition systems.
Recent advances in audio-to-spike encoding and training algorithms enable SNNs to be applied to practical tasks.
Our work illustrates the potential of training axonal delays for tasks with complex temporal structures (a minimal delay-layer sketch appears after this list).
arXiv Detail & Related papers (2023-02-16T22:19:04Z) - GNN-SL: Sequence Labeling Based on Nearest Examples via GNN [50.55076156520809]
We introduce graph neural network-based sequence labeling (GNN-SL).
GNN-SL augments vanilla sequence labeling model output with similar tagging examples retrieved from the whole training set.
We conduct a variety of experiments on three typical sequence labeling tasks.
GNN-SL achieves results of 96.9 (+0.2) on PKU, 98.3 (+0.4) on CITYU, 98.5 (+0.2) on MSR, and 96.9 (+0.2) on AS for the CWS task.
arXiv Detail & Related papers (2022-12-05T04:22:00Z) - Neural Implicit Dictionary via Mixture-of-Expert Training [111.08941206369508]
We present a generic INR framework that achieves both data and training efficiency by learning a Neural Implicit Dictionary (NID).
Our NID assembles a group of coordinate-based implicit networks which are tuned to span the desired function space.
Our experiments show that NID can speed up reconstruction of 2D images or 3D scenes by two orders of magnitude while using up to 98% less input data.
arXiv Detail & Related papers (2022-07-08T05:07:19Z) - DATA: Domain-Aware and Task-Aware Pre-training [94.62676913928831]
We present DATA, a simple yet effective NAS approach specialized for self-supervised learning (SSL).
Our method achieves promising results across a wide range of computation costs on downstream tasks, including image classification, object detection and semantic segmentation.
arXiv Detail & Related papers (2022-03-17T02:38:49Z) - N-Omniglot: a Large-scale Neuromorphic Dataset for Spatio-Temporal Sparse Few-shot Learning [10.812738608234321]
We provide the first neuromorphic dataset for few-shot learning, N-Omniglot, recorded using a Dynamic Vision Sensor (DVS).
It contains 1623 categories of handwritten characters, with only 20 samples per class.
The dataset provides a powerful challenge and a suitable benchmark for developing SNN algorithms in the few-shot learning domain.
arXiv Detail & Related papers (2021-12-25T12:41:34Z) - Spiking Neural Networks with Improved Inherent Recurrence Dynamics for Sequential Learning [6.417011237981518]
Spiking neural networks (SNNs) with leaky integrate-and-fire (LIF) neurons can be operated in an event-driven manner.
We show that SNNs can be trained for sequential tasks and propose modifications to a network of LIF neurons.
We then develop a training scheme to train the proposed SNNs with improved inherent recurrence dynamics.
arXiv Detail & Related papers (2021-09-04T17:13:28Z) - Self-Supervised Learning of Graph Neural Networks: A Unified Review [50.71341657322391]
Self-supervised learning is emerging as a new paradigm for making use of large amounts of unlabeled data.
We provide a unified review of different ways of training graph neural networks (GNNs) using SSL.
Our treatment of SSL methods for GNNs sheds light on the similarities and differences of various methods, setting the stage for developing new methods and algorithms.
arXiv Detail & Related papers (2021-02-22T03:43:45Z) - Self-supervised Learning on Graphs: Deep Insights and New Direction [66.78374374440467]
Self-supervised learning (SSL) aims to create domain-specific pretext tasks on unlabeled data.
There is increasing interest in generalizing deep learning to the graph domain in the form of graph neural networks (GNNs).
arXiv Detail & Related papers (2020-06-17T20:30:04Z)
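For reference, the two plasticity rules named in the synaptic-scaling entry above are commonly written as follows. This is textbook pair-based STDP plus a multiplicative homeostatic scaling step, not the specific formulation of that paper; the constants A+, A-, the time constants, the rate eta, and the target rate r* are placeholders.

```latex
% Pair-based STDP: the weight change depends on the relative spike
% timing \Delta t = t_{\text{post}} - t_{\text{pre}}.
\Delta w =
\begin{cases}
  A_{+}\, e^{-\Delta t/\tau_{+}} & \Delta t > 0 \quad \text{(pre before post: potentiation)}\\
  -A_{-}\, e^{\Delta t/\tau_{-}} & \Delta t < 0 \quad \text{(post before pre: depression)}
\end{cases}

% Synaptic scaling: a neuron's afferent weights are rescaled
% multiplicatively so its firing rate r drifts toward a target r^{*}.
w_i \;\leftarrow\; w_i \left( 1 + \eta\, \frac{r^{*} - r}{r^{*}} \right)
```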
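For the CPC-with-SNNs entry above, the contrastive objective at the heart of CPC is InfoNCE. A self-contained sketch follows; the batch size, embedding dimension, temperature, and use of in-batch negatives are standard-practice assumptions, not details taken from that paper.

```python
# Hedged sketch of the InfoNCE objective used by Contrastive Predictive
# Coding (CPC); batch size, embedding dim, and temperature are assumptions.
import torch
import torch.nn.functional as F

def info_nce(context, future, temperature=0.1):
    """context, future: (batch, dim) paired embeddings. Each context
    must score its own future higher than the other futures in the
    batch, which are treated as negatives."""
    context = F.normalize(context, dim=-1)
    future = F.normalize(future, dim=-1)
    logits = context @ future.t() / temperature  # (batch, batch) similarities
    targets = torch.arange(context.shape[0])     # positives lie on the diagonal
    return F.cross_entropy(logits, targets)

loss = info_nce(torch.randn(25, 128), torch.randn(25, 128))
```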
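And for the adaptive-axonal-delays entry, one way to make per-channel spike delays trainable is to interpolate between the two nearest integer time shifts, so the gradient can reach the delay parameter through the fractional part. This is a generic sketch under that assumption, not the paper's actual delay module; max_delay and the initialization are placeholders.

```python
# Hedged sketch: learnable per-channel axonal delays for spike trains.
# The interpolation trick and max_delay are illustrative assumptions.
import torch
import torch.nn as nn

def shift(x, d):
    """Delay a (time, batch, channels) tensor by d whole time steps,
    padding the front with zeros (no spikes before the delay elapses)."""
    if d == 0:
        return x
    return torch.cat([torch.zeros_like(x[:d]), x[:-d]], dim=0)

class AxonalDelay(nn.Module):
    """Each channel learns a continuous delay; the output linearly
    interpolates the two nearest integer shifts, so gradients flow
    into the delay parameter via the fractional part."""
    def __init__(self, channels, max_delay=8):
        super().__init__()
        self.delay = nn.Parameter(torch.rand(channels) * max_delay)
        self.max_delay = max_delay

    def forward(self, spikes):  # spikes: (time, batch, channels)
        d = self.delay.clamp(0.0, self.max_delay - 1)
        d0 = d.floor()
        frac = d - d0  # differentiable w.r.t. self.delay
        cols = []
        for c in range(spikes.shape[-1]):
            lo = shift(spikes[..., c:c + 1], int(d0[c]))
            hi = shift(spikes[..., c:c + 1], int(d0[c]) + 1)
            cols.append((1 - frac[c]) * lo + frac[c] * hi)
        return torch.cat(cols, dim=-1)

layer = AxonalDelay(channels=4)
delayed = layer(torch.rand(20, 2, 4))  # 20 time steps, batch of 2, dummy spikes
```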
This list is automatically generated from the titles and abstracts of the papers on this site.