EGNN-C+: Interpretable Evolving Granular Neural Network and Application
in Classification of Weakly-Supervised EEG Data Streams
- URL: http://arxiv.org/abs/2402.17792v1
- Date: Mon, 26 Feb 2024 15:11:41 GMT
- Title: EGNN-C+: Interpretable Evolving Granular Neural Network and Application
in Classification of Weakly-Supervised EEG Data Streams
- Authors: Daniel Leite, Alisson Silva, Gabriella Casalino, Arnab Sharma,
Danielle Fortunato, Axel-Cyrille Ngomo
- Abstract summary: We introduce a modified incremental learning algorithm for evolving Granular Neural Networks (eGNN-C+)
We focus on the classification of emotion-related patterns within electroencephalogram (EEG) signals.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: We introduce a modified incremental learning algorithm for evolving Granular
Neural Network Classifiers (eGNN-C+). We use double-boundary hyper-boxes to
represent granules, and customize the adaptation procedures to enhance the
robustness of outer boxes for data coverage and noise suppression, while
ensuring that inner boxes remain flexible to capture drifts. The classifier
evolves from scratch, incorporates new classes on the fly, and performs local
incremental feature weighting. As an application, we focus on the
classification of emotion-related patterns within electroencephalogram (EEG)
signals. Emotion recognition is crucial for enhancing the realism and
interactivity of computer systems. We extract features from the Fourier
spectrum of EEG signals obtained from 28 individuals engaged in playing
computer games -- a public dataset. Each game elicits a different predominant
emotion: boredom, calmness, horror, or joy. We analyze individual electrodes,
time window lengths, and frequency bands to assess the accuracy and
interpretability of resulting user-independent neural models. The findings
indicate that both brain hemispheres assist classification, especially
electrodes on the temporal (T8) and parietal (P7) areas, alongside
contributions from frontal and occipital electrodes. While patterns may
manifest in any band, the Alpha (8-13 Hz), Delta (1-4 Hz), and Theta (4-8 Hz)
bands, in this order, exhibited higher correspondence with the emotion classes.
The eGNN-C+ demonstrates effectiveness in learning from EEG data. It achieves
an accuracy of 81.7% and a 0.0029 II interpretability using 10-second time
windows, even in the face of a highly stochastic, time-varying 4-class
classification problem.
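The double-boundary idea in the abstract can be sketched in code. The following is a minimal, hypothetical Python illustration, not the paper's actual eGNN-C+ rules: the class names, the `slow`/`fast` adaptation rates, and the `rho` granularity threshold are inventions for this sketch, and the paper's local incremental feature weighting is omitted.

```python
import numpy as np

class Granule:
    """Toy double-boundary hyper-box granule (illustrative only)."""

    def __init__(self, x, label):
        x = np.asarray(x, dtype=float)
        # Inner and outer boxes both start as the degenerate box at x.
        self.inner_lo, self.inner_hi = x.copy(), x.copy()
        self.outer_lo, self.outer_hi = x.copy(), x.copy()
        self.label = label

    def covers(self, x):
        # Coverage test against the outer box.
        return bool(np.all(x >= self.outer_lo) and np.all(x <= self.outer_hi))

    def distance(self, x):
        # Distance to the outer box (zero when the point is inside it).
        return float(np.linalg.norm(np.maximum(self.outer_lo - x, 0.0)
                                    + np.maximum(x - self.outer_hi, 0.0)))

    def update(self, x, slow=0.1, fast=0.5):
        x = np.asarray(x, dtype=float)
        # Outer box only expands, and slowly: robust coverage, noise suppression.
        self.outer_lo += slow * np.minimum(x - self.outer_lo, 0.0)
        self.outer_hi += slow * np.maximum(x - self.outer_hi, 0.0)
        # Inner box tracks recent data quickly: stays flexible to drift.
        self.inner_lo += fast * (x - self.inner_lo)
        self.inner_hi += fast * (x - self.inner_hi)

class EvolvingBoxClassifier:
    """Evolves from scratch; unseen classes create granules on the fly."""

    def __init__(self):
        self.granules = []

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        if not self.granules:
            return None  # nothing learned yet
        return min(self.granules, key=lambda g: g.distance(x)).label

    def learn(self, x, label, rho=1.0):
        x = np.asarray(x, dtype=float)
        same = [g for g in self.granules if g.label == label]
        if same:
            g = min(same, key=lambda g: g.distance(x))
            if g.covers(x) or g.distance(x) <= rho:  # rho: granularity knob
                g.update(x)
                return
        self.granules.append(Granule(x, label))
```

Feeding the classifier labeled feature vectors one at a time mimics the incremental, stream-based setting: existing granules absorb nearby same-class samples, while distant or new-class samples spawn fresh granules.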
Related papers
- Assessing Neural Network Representations During Training Using
Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z)
- DGSD: Dynamical Graph Self-Distillation for EEG-Based Auditory Spatial
Attention Detection [49.196182908826565]
Auditory Attention Detection (AAD) aims to detect the target speaker from brain signals in a multi-speaker environment.
Current approaches primarily rely on traditional convolutional neural networks designed for processing Euclidean data such as images.
This paper proposes a dynamical graph self-distillation (DGSD) approach for AAD, which does not require speech stimuli as input.
arXiv Detail & Related papers (2023-09-07T13:43:46Z)
- Unveiling Emotions from EEG: A GRU-Based Approach [2.580765958706854]
The Gated Recurrent Unit (GRU) algorithm is tested to determine whether it can predict emotional states from EEG signals.
Our publicly accessible dataset consists of resting neutral data as well as EEG recordings from people who were exposed to stimuli evoking happy, neutral, and negative emotions.
arXiv Detail & Related papers (2023-07-20T11:04:46Z)
- EEGminer: Discovering Interpretable Features of Brain Activity with
Learnable Filters [72.19032452642728]
We propose a novel differentiable EEG decoding pipeline consisting of learnable filters and a pre-determined feature extraction module.
We demonstrate the utility of our model towards emotion recognition from EEG signals on the SEED dataset and on a new EEG dataset of unprecedented size.
The discovered features align with previous neuroscience studies and offer new insights, such as marked differences in the functional connectivity profile between left and right temporal areas during music listening.
arXiv Detail & Related papers (2021-10-19T14:22:04Z)
- SOUL: An Energy-Efficient Unsupervised Online Learning Seizure Detection
Classifier [68.8204255655161]
Implantable devices that record neural activity and detect seizures have been adopted to issue warnings or trigger neurostimulation to suppress seizures.
For an implantable seizure detection system, a low power, at-the-edge, online learning algorithm can be employed to dynamically adapt to neural signal drifts.
SOUL was fabricated in TSMC's 28 nm process, occupies 0.1 mm^2, and achieves 1.5 nJ/classification energy efficiency, at least 24x more efficient than the state of the art.
arXiv Detail & Related papers (2021-10-01T23:01:20Z)
- Adaptive Gaussian Fuzzy Classifier for Real-Time Emotion Recognition in
Computer Games [0.0]
Fuzzy eGFC is supported by an online semi-supervised learning algorithm to recognize emotion patterns.
We analyze the effect of individual electrodes, time window lengths, and frequency bands on the accuracy of user-independent eGs.
arXiv Detail & Related papers (2021-03-05T06:27:04Z)
- Emotional EEG Classification using Connectivity Features and
Convolutional Neural Networks [81.74442855155843]
We introduce a new classification system that utilizes brain connectivity with a CNN and validate its effectiveness via the emotional video classification.
The level of concentration of the brain connectivity related to the emotional property of the target video is correlated with classification performance.
arXiv Detail & Related papers (2021-01-18T13:28:08Z)
- Deep learning-based classification of fine hand movements from low
frequency EEG [5.414308305392762]
The classification of different fine hand movements from EEG signals represents a relevant research challenge.
We trained and tested a newly proposed convolutional neural network (CNN).
The CNN achieved good performance on both datasets, similar or superior to the baseline models.
arXiv Detail & Related papers (2020-11-13T07:16:06Z)
- A Novel Transferability Attention Neural Network Model for EEG Emotion
Recognition [51.203579838210885]
We propose a transferable attention neural network (TANN) for EEG emotion recognition.
TANN learns the emotional discriminative information by highlighting the transferable EEG brain regions data and samples adaptively.
This can be implemented by measuring the outputs of multiple brain-region-level discriminators and one single sample-level discriminator.
arXiv Detail & Related papers (2020-09-21T02:42:30Z)
- TSception: A Deep Learning Framework for Emotion Detection Using EEG [11.444502210936776]
We propose a deep learning framework, TSception, for emotion detection from electroencephalogram (EEG)
TSception consists of temporal and spatial convolutional layers, which learn discriminative representations in the time and channel domains simultaneously.
TSception achieves a high classification accuracy of 86.03%, which outperforms the prior methods significantly.
arXiv Detail & Related papers (2020-04-02T02:10:07Z)
- Effect of Analysis Window and Feature Selection on Classification of
Hand Movements Using EMG Signal [0.20999222360659603]
Recent research on myoelectric control based on pattern recognition (PR) shows promising results with the aid of machine learning classifiers.
By offering multiple movement classes and intuitive control, this method has the potential to enable amputees to perform everyday movements.
We show that effective data preprocessing and optimum feature selection helps to improve the classification accuracy of hand movements.
arXiv Detail & Related papers (2020-02-02T19:03:23Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.