Efficient Eye-based Emotion Recognition via Neural Architecture Search of Time-to-First-Spike-Coded Spiking Neural Networks
- URL: http://arxiv.org/abs/2512.02459v1
- Date: Tue, 02 Dec 2025 06:35:49 GMT
- Title: Efficient Eye-based Emotion Recognition via Neural Architecture Search of Time-to-First-Spike-Coded Spiking Neural Networks
- Authors: Qianhui Liu, Jing Yang, Miao Yu, Trevor E. Carlson, Gang Pan, Haizhou Li, Zhumin Chen
- Abstract summary: Time-to-first-spike (TTFS)-coded spiking neural networks (SNNs) offer a promising solution for eye-based emotion recognition. TNAS-ER is the first neural architecture search framework tailored to TTFS SNNs for eye-based emotion recognition. When deployed on neuromorphic hardware, TNAS-ER attains a low latency of 48 ms and an energy consumption of 0.05 J.
- Score: 52.617096567601344
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Eye-based emotion recognition enables eyewear devices to perceive users' emotional states and support emotion-aware interaction, yet deploying such functionality on their resource-limited embedded hardware remains challenging. Time-to-first-spike (TTFS)-coded spiking neural networks (SNNs) offer a promising solution, as each neuron emits at most one binary spike, resulting in extremely sparse and energy-efficient computation. While prior works have primarily focused on improving TTFS SNN training algorithms, the impact of network architecture has been largely overlooked. In this paper, we propose TNAS-ER, the first neural architecture search (NAS) framework tailored to TTFS SNNs for eye-based emotion recognition. TNAS-ER presents a novel ANN-assisted search strategy that leverages a ReLU-based ANN counterpart sharing an identity mapping with the TTFS SNN to guide architecture optimization. TNAS-ER employs an evolutionary algorithm, with weighted and unweighted average recall jointly defined as fitness objectives for emotion recognition. Extensive experiments demonstrate that TNAS-ER achieves high recognition performance with significantly improved efficiency. Furthermore, when deployed on neuromorphic hardware, TNAS-ER attains a low latency of 48 ms and an energy consumption of 0.05 J, confirming its superior energy efficiency and strong potential for practical applications.
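The abstract's key efficiency argument rests on TTFS coding: each neuron fires at most one spike, and information is carried by *when* that spike occurs, with stronger inputs spiking earlier. A minimal, hypothetical sketch of this encoding (not the paper's implementation; `t_max` and the linear mapping are illustrative assumptions):

```python
def ttfs_encode(values, t_max=100.0):
    """Map normalized intensities in [0, 1] to first-spike times.

    Larger values spike earlier (smaller time); each input yields at
    most one spike, which is what makes TTFS computation so sparse.
    Inputs are clamped to [0, 1] before mapping.
    """
    return [t_max * (1.0 - min(max(v, 0.0), 1.0)) for v in values]


def ttfs_decode(times, t_max=100.0):
    """Invert the linear encoding: earlier spike -> larger value."""
    return [1.0 - t / t_max for t in times]


# A bright pixel (1.0) spikes immediately; a dark one (0.0) spikes
# only at the end of the window.
spike_times = ttfs_encode([0.0, 0.5, 1.0])  # [100.0, 50.0, 0.0]
recovered = ttfs_decode(spike_times)        # [0.0, 0.5, 1.0]
```

Because downstream neurons can react to the earliest spikes and ignore late or absent ones, most of the network's potential activity is never computed, which is the source of the energy savings the abstract claims.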
Related papers
- LightSNN: Lightweight Architecture Search for Sparse and Accurate Spiking Neural Networks [1.0485739694839666]
Spiking Neural Networks (SNNs) are highly regarded for their energy efficiency, inherent activation sparsity, and suitability for real-time processing in edge devices. Most current SNN methods adopt architectures resembling traditional artificial neural networks (ANNs). We present LightSNN, a rapid and efficient Neural Architecture Search (NAS) technique specifically tailored for SNNs.
arXiv Detail & Related papers (2025-03-27T16:38:13Z)
- Spiking Meets Attention: Efficient Remote Sensing Image Super-Resolution with Attention Spiking Neural Networks [86.28783985254431]
Spiking neural networks (SNNs) are emerging as a promising alternative to traditional artificial neural networks (ANNs). We propose SpikeSR, which achieves state-of-the-art performance across various remote sensing benchmarks such as AID, DOTA, and DIOR.
arXiv Detail & Related papers (2025-03-06T09:06:06Z) - DRiVE: Dynamic Recognition in VEhicles using snnTorch [0.0]
Spiking Neural Networks (SNNs) mimic biological brain activity, processing data efficiently through an event-driven design. This study combines SNNs with snnTorch, an adaptable PyTorch-based framework, to test their potential for image-based tasks. We introduce DRiVE, a vehicle detection model that uses spiking neuron dynamics to classify images, achieving 94.8% accuracy and a near-perfect 0.99 AUC score.
arXiv Detail & Related papers (2025-02-04T11:01:13Z) - NeuroNAS: Enhancing Efficiency of Neuromorphic In-Memory Computing for Intelligent Mobile Agents through Hardware-Aware Spiking Neural Architecture Search [6.006032394972252]
Spiking Neural Networks (SNNs) leverage event-based computation to enable ultra-low power/energy machine learning algorithms. NeuroNAS is a novel framework for developing energy-efficient neuromorphic IMC for intelligent mobile agents.
arXiv Detail & Related papers (2024-06-30T09:51:58Z) - LC-TTFS: Towards Lossless Network Conversion for Spiking Neural Networks
with TTFS Coding [55.64533786293656]
We show that our algorithm can achieve a near-perfect mapping between the activation values of an ANN and the spike times of an SNN on a number of challenging AI tasks.
The study paves the way for deploying ultra-low-power TTFS-based SNNs on power-constrained edge computing platforms.
arXiv Detail & Related papers (2023-10-23T14:26:16Z) - A Hybrid Neural Coding Approach for Pattern Recognition with Spiking
Neural Networks [53.31941519245432]
Brain-inspired spiking neural networks (SNNs) have demonstrated promising capabilities in solving pattern recognition tasks.
These SNNs are built on homogeneous neurons that use a uniform neural coding for information representation.
In this study, we argue that SNN architectures should be holistically designed to incorporate heterogeneous coding schemes.
arXiv Detail & Related papers (2023-05-26T02:52:12Z) - Hybrid SNN-ANN: Energy-Efficient Classification and Object Detection for
Event-Based Vision [64.71260357476602]
Event-based vision sensors encode local pixel-wise brightness changes in streams of events rather than image frames.
Recent progress in object recognition from event-based sensors has come from conversions of deep neural networks.
We propose a hybrid architecture for end-to-end training of deep neural networks for event-based pattern recognition and object detection.
arXiv Detail & Related papers (2021-12-06T23:45:58Z) - Training Energy-Efficient Deep Spiking Neural Networks with
Time-to-First-Spike Coding [29.131030799324844]
Spiking neural networks (SNNs) mimic the operations in the human brain.
The high energy consumption of deep neural networks (DNNs) has become a serious problem in deep learning.
This paper presents training methods for energy-efficient deep SNNs with TTFS coding.
arXiv Detail & Related papers (2021-06-04T16:02:27Z) - You Only Spike Once: Improving Energy-Efficient Neuromorphic Inference
to ANN-Level Accuracy [51.861168222799186]
Spiking Neural Networks (SNNs) are a type of neuromorphic, or brain-inspired network.
SNNs are sparse, accessing very few weights, and typically only use addition operations instead of the more power-intensive multiply-and-accumulate operations.
In this work, we aim to overcome the limitations of TTFS-encoded neuromorphic systems.
arXiv Detail & Related papers (2020-06-03T15:55:53Z) - Effective and Efficient Computation with Multiple-timescale Spiking
Recurrent Neural Networks [0.9790524827475205]
We show how a novel type of adaptive spiking recurrent neural network (SRNN) is able to achieve state-of-the-art performance.
We calculate a >100x energy improvement for our SRNNs over classical RNNs on the harder tasks.
arXiv Detail & Related papers (2020-05-24T01:04:53Z) - SiamSNN: Siamese Spiking Neural Networks for Energy-Efficient Object
Tracking [20.595208488431766]
SiamSNN is the first deep SNN tracker that achieves short latency and low precision loss on the visual object tracking benchmarks OTB2013, VOT2016, and GOT-10k.
SiamSNN notably achieves low energy consumption and real-time performance on the neuromorphic chip TrueNorth.
arXiv Detail & Related papers (2020-03-17T08:49:51Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.