Neuromorphic Deployment of Spiking Neural Networks for Cognitive Load Classification in Air Traffic Control
- URL: http://arxiv.org/abs/2509.21345v2
- Date: Fri, 03 Oct 2025 14:35:40 GMT
- Title: Neuromorphic Deployment of Spiking Neural Networks for Cognitive Load Classification in Air Traffic Control
- Authors: Jiahui An, Chonghao Cai, Olympia Gallou, Sara Irina Fabrikant, Giacomo Indiveri, Elisa Donati
- Abstract summary: This paper presents a neuromorphic system for cognitive load classification in a real-world setting, an Air Traffic Control (ATC) task, using a hardware implementation of Spiking Neural Networks (SNNs).
EEG and eye-tracking features, extracted from an open-source dataset, were used to train and evaluate both conventional machine learning models and SNNs.
To enable deployment on neuromorphic hardware, the model was quantized and implemented on the mixed-signal DYNAP-SE chip.
- Score: 1.2545957939431476
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper presents a neuromorphic system for cognitive load classification in a real-world setting, an Air Traffic Control (ATC) task, using a hardware implementation of Spiking Neural Networks (SNNs). Electroencephalogram (EEG) and eye-tracking features, extracted from an open-source dataset, were used to train and evaluate both conventional machine learning models and SNNs. Among the SNN architectures explored, a minimalistic, single-layer model trained with a biologically inspired delta-rule learning algorithm achieved competitive performance (80.6%). To enable deployment on neuromorphic hardware, the model was quantized and implemented on the mixed-signal DYNAP-SE chip. Despite hardware constraints and analog variability, the chip-deployed SNN maintained a classification accuracy of up to 73.5% using spike-based input. These results demonstrate the feasibility of event-driven neuromorphic systems for ultra-low-power, embedded cognitive state monitoring in dynamic real-world scenarios.
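As a rough illustration of the training recipe described in the abstract, the sketch below trains a single-layer classifier with the delta rule on rate-coded features and then coarsely quantizes the weights, mimicking the precision constraints of mixed-signal hardware. The synthetic data, feature dimensions, coding window, and number of quantization levels are all assumptions for illustration, not the paper's actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in data: the paper uses EEG and eye-tracking features
# from an open-source ATC dataset; here we draw two synthetic feature
# clusters for a binary low/high cognitive-load task.
n, d = 200, 8
X = np.vstack([rng.normal(0.3, 0.1, (n // 2, d)),
               rng.normal(0.7, 0.1, (n // 2, d))]).clip(0, 1)
y = np.repeat([0, 1], n // 2)

# Rate-code each feature into a spike count over a fixed window of T steps.
T = 50
spike_counts = rng.binomial(T, X)

# Single-layer model trained with the delta rule:
#   w <- w + lr * (target - prediction) * input
w = np.zeros(d)
b = 0.0
lr = 0.01
for _ in range(100):
    for xi, yi in zip(spike_counts / T, y):
        pred = 1.0 if xi @ w + b > 0 else 0.0
        err = yi - pred
        w += lr * err * xi
        b += lr * err

acc = np.mean([(xi @ w + b > 0) == yi for xi, yi in zip(spike_counts / T, y)])

# Coarse weight quantization, standing in for the limited precision
# available when mapping weights onto a chip such as DYNAP-SE
# (the number of levels here is an arbitrary assumption).
levels = 7
wq = np.round(w / np.abs(w).max() * levels) / levels * np.abs(w).max()
acc_q = np.mean([(xi @ wq + b > 0) == yi for xi, yi in zip(spike_counts / T, y)])
print(f"float accuracy: {acc:.2f}, quantized accuracy: {acc_q:.2f}")
```

On well-separated synthetic clusters the quantized model typically loses little accuracy, loosely echoing the paper's reported drop from 80.6% in simulation to 73.5% on chip.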
Related papers
- Event-based Heterogeneous Information Processing for Online Vision-based Obstacle Detection and Localization [0.3499870393443268]
This paper introduces a novel framework for robotic vision-based navigation that integrates Hybrid Neural Networks (HNNs) with Spiking Neural Network (SNN)-based filtering.
By leveraging the complementary strengths of Artificial Neural Networks (ANNs) and SNNs, the system achieves both accurate environmental understanding and fast, energy-efficient processing.
arXiv Detail & Related papers (2026-01-19T23:09:23Z)
- Efficient Memristive Spiking Neural Networks Architecture with Supervised In-Situ STDP Method [0.0]
Memristor-based Spiking Neural Networks (SNNs) with temporal spike encoding enable ultra-low-energy computation.
This paper presents a circuit-level memristive spiking neural network (SNN) architecture trained using a proposed novel supervised in-situ learning algorithm.
arXiv Detail & Related papers (2025-07-28T17:09:48Z)
- EvSegSNN: Neuromorphic Semantic Segmentation for Event Data [0.6138671548064356]
EvSegSNN is a biologically plausible encoder-decoder U-shaped architecture relying on Parametric Leaky Integrate and Fire neurons.
We introduce an end-to-end biologically inspired semantic segmentation approach by combining Spiking Neural Networks with event cameras.
Experiments conducted on DDD17 demonstrate that EvSegSNN outperforms the closest state-of-the-art model in terms of MIoU.
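The Parametric Leaky Integrate and Fire neurons mentioned above extend the standard LIF model by making the membrane time constant learnable. As context, a minimal plain-LIF simulation, with all constants chosen arbitrarily for illustration:

```python
import numpy as np

def lif_neuron(input_current, tau=20.0, v_th=1.0, v_reset=0.0, dt=1.0):
    """Simulate a leaky integrate-and-fire neuron with Euler integration.

    Membrane dynamics: dV/dt = -V / tau + I(t); a spike is emitted and V
    is reset whenever V crosses v_th. (Plain LIF; the Parametric LIF used
    by EvSegSNN additionally learns tau during training.)
    """
    v = v_reset
    spikes = []
    for i_t in input_current:
        v += dt * (-v / tau + i_t)
        if v >= v_th:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return np.array(spikes)

# A constant drive whose steady-state voltage (tau * I = 1.6) exceeds the
# threshold produces a regular spike train.
spikes = lif_neuron(np.full(100, 0.08))
print(spikes.sum(), "spikes in 100 steps")
```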
arXiv Detail & Related papers (2024-06-20T10:36:24Z)
- Adaptive Robotic Arm Control with a Spiking Recurrent Neural Network on a Digital Accelerator [41.60361484397962]
We present an overview of the system, and a Python framework to use it on a Pynq ZU platform.
We show how the simulated accuracy is preserved with a peak performance of 3.8M events processed per second.
arXiv Detail & Related papers (2024-05-21T14:59:39Z)
- SpikingJelly: An open-source machine learning infrastructure platform for spike-based intelligence [51.6943465041708]
Spiking neural networks (SNNs) aim to realize brain-inspired intelligence on neuromorphic chips with high energy efficiency.
We contribute a full-stack toolkit for pre-processing neuromorphic datasets, building deep SNNs, optimizing their parameters, and deploying SNNs on neuromorphic chips.
arXiv Detail & Related papers (2023-10-25T13:15:17Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of LKCNN models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- A Hybrid Neural Coding Approach for Pattern Recognition with Spiking Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are grounded on homogeneous neurons that utilize a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z)
- PC-SNN: Predictive Coding-based Local Hebbian Plasticity Learning in Spiking Neural Networks [9.026880521552153]
Spiking Neural Networks (SNNs) emulate the brain's information processing with unparalleled biological plausibility.
We propose PC-SNN, a novel learning framework that integrates predictive coding with SNNs to enable biologically plausible, local Hebbian plasticity.
arXiv Detail & Related papers (2022-11-24T09:56:02Z)
- Convolutional Spiking Neural Networks for Detecting Anticipatory Brain Potentials Using Electroencephalogram [0.21847754147782888]
Spiking neural networks (SNNs) are receiving increased attention because they mimic synaptic connections in biological systems and produce spike trains.
Recently, the addition of convolutional layers to combine the feature extraction power of convolutional networks with the computational efficiency of SNNs has been introduced.
This paper studies the feasibility of using a convolutional spiking neural network (CSNN) to detect anticipatory slow cortical potentials.
arXiv Detail & Related papers (2022-08-14T19:04:15Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Exploiting Spiking Dynamics with Spatial-temporal Feature Normalization in Graph Learning [9.88508686848173]
Biological spiking neurons with intrinsic dynamics underlie the powerful representation and learning capabilities of the brain.
Despite recent tremendous progress in spiking neural networks (SNNs) for handling Euclidean-space tasks, it still remains challenging to exploit SNNs in processing non-Euclidean-space data.
Here we present a general spike-based modeling framework that enables the direct training of SNNs for graph learning.
arXiv Detail & Related papers (2021-06-30T11:20:16Z)
- Multi-Sample Online Learning for Probabilistic Spiking Neural Networks [43.8805663900608]
Spiking Neural Networks (SNNs) capture some of the efficiency of biological brains for inference and learning.
This paper introduces an online learning rule based on generalized expectation-maximization (GEM).
Experimental results on structured output memorization and classification on a standard neuromorphic data set demonstrate significant improvements in terms of log-likelihood, accuracy, and calibration.
arXiv Detail & Related papers (2020-07-23T10:03:58Z)
- Progressive Tandem Learning for Pattern Recognition with Deep Spiking Neural Networks [80.15411508088522]
Spiking neural networks (SNNs) have shown advantages over traditional artificial neural networks (ANNs) for low latency and high computational efficiency.
We propose a novel ANN-to-SNN conversion and layer-wise learning framework for rapid and efficient pattern recognition.
arXiv Detail & Related papers (2020-07-02T15:38:44Z)
- Recurrent Neural Network Learning of Performance and Intrinsic Population Dynamics from Sparse Neural Data [77.92736596690297]
We introduce a novel training strategy that allows learning not only the input-output behavior of an RNN but also its internal network dynamics.
We test the proposed method by training an RNN to simultaneously reproduce internal dynamics and output signals of a physiologically-inspired neural model.
Remarkably, we show that the reproduction of the internal dynamics is successful even when the training algorithm relies on the activities of a small subset of neurons.
arXiv Detail & Related papers (2020-05-05T14:16:54Z)