EEG-ITNet: An Explainable Inception Temporal Convolutional Network for
Motor Imagery Classification
- URL: http://arxiv.org/abs/2204.06947v1
- Date: Thu, 14 Apr 2022 13:18:43 GMT
- Title: EEG-ITNet: An Explainable Inception Temporal Convolutional Network for
Motor Imagery Classification
- Authors: Abbas Salami, Javier Andreu-Perez and Helge Gillmeister
- Abstract summary: We propose an end-to-end deep learning architecture called EEG-ITNet.
Our model can extract rich spectral, spatial, and temporal information from multi-channel EEG signals.
EEG-ITNet shows up to 5.9% improvement in the classification accuracy in different scenarios.
- Score: 0.5616884466478884
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In recent years, neural networks and especially deep architectures have
received substantial attention for EEG signal analysis in the field of
brain-computer interfaces (BCIs). In this ongoing research area, end-to-end
models are favoured over traditional approaches that require signal
transformation before classification, since they eliminate the need for prior
expert knowledge and handcrafted feature extraction. However, although several
deep learning algorithms proposed in the literature achieve high accuracy in
classifying motor movements or mental tasks, they often lack interpretability
and are therefore not well received by the neuroscience community. Likely
reasons are the large number of parameters and the tendency of deep neural
networks to pick up tiny yet task-irrelevant discriminative features. We
propose an end-to-end deep learning architecture called EEG-ITNet, together
with a more comprehensible method for visualising the patterns the network has
learned. Using inception modules and causal convolutions with dilation, our
model can extract rich spectral, spatial, and temporal information from
multi-channel EEG signals with less complexity (in terms of the number of
trainable parameters) than other existing end-to-end architectures, such as
EEG-Inception and EEG-TCNet. In an exhaustive evaluation on dataset 2a from
BCI Competition IV and the OpenBMI motor imagery dataset, EEG-ITNet shows up to
5.9% improvement in classification accuracy over its competitors across
different scenarios, with statistical significance. We also comprehensively
explain and support the validity of the network visualisation from a
neuroscientific perspective. Our code is openly available at
https://github.com/AbbasSalami/EEG-ITNet
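As a rough, hedged illustration of the architectural ideas named in the abstract (inception-style parallel branches combined with dilated causal convolutions over the temporal axis), the PyTorch sketch below builds one such block for multi-channel EEG. It is not the authors' implementation (see the repository linked above); the kernel size, dilation rates, and branch channel counts are arbitrary assumptions.

# Minimal sketch only, not the EEG-ITNet code: an inception-style block whose
# parallel branches apply dilated causal temporal convolutions to EEG data.
import torch
import torch.nn as nn


class CausalConv1d(nn.Module):
    """1-D convolution that only sees past samples (left padding only)."""

    def __init__(self, in_ch, out_ch, kernel_size, dilation=1):
        super().__init__()
        self.pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_ch, out_ch, kernel_size, dilation=dilation)

    def forward(self, x):                          # x: (batch, channels, time)
        x = nn.functional.pad(x, (self.pad, 0))    # pad on the left only
        return self.conv(x)


class InceptionTemporalBlock(nn.Module):
    """Parallel dilated causal branches whose outputs are concatenated."""

    def __init__(self, in_ch, branch_ch=8, kernel_size=4, dilations=(1, 2, 4, 8)):
        super().__init__()
        self.branches = nn.ModuleList(
            nn.Sequential(
                CausalConv1d(in_ch, branch_ch, kernel_size, dilation=d),
                nn.BatchNorm1d(branch_ch),
                nn.ELU(),
            )
            for d in dilations
        )

    def forward(self, x):                          # x: (batch, channels, time)
        return torch.cat([b(x) for b in self.branches], dim=1)


if __name__ == "__main__":
    eeg = torch.randn(2, 22, 1000)                 # 22 EEG channels, 1000 samples
    block = InceptionTemporalBlock(in_ch=22)
    print(block(eeg).shape)                        # torch.Size([2, 32, 1000])

Stacking such blocks with growing dilation rates widens the temporal receptive field exponentially while keeping the parameter count modest, which is the general property the abstract attributes to dilated causal convolutions.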
Related papers
- Assessing Neural Network Representations During Training Using
Noise-Resilient Diffusion Spectral Entropy [55.014926694758195]
Entropy and mutual information in neural networks provide rich information on the learning process.
We leverage data geometry to access the underlying manifold and reliably compute these information-theoretic measures.
We show that they form noise-resistant measures of intrinsic dimensionality and relationship strength in high-dimensional simulated data.
arXiv Detail & Related papers (2023-12-04T01:32:42Z) - A Hybrid End-to-End Spatio-Temporal Attention Neural Network with
Graph-Smooth Signals for EEG Emotion Recognition [1.6328866317851187]
We introduce a deep neural network that learns interpretable representations through a hybrid structure of spatio-temporal encoding and recurrent attention blocks.
We demonstrate that our proposed architecture exceeds state-of-the-art results for emotion classification on the publicly available DEAP dataset.
arXiv Detail & Related papers (2023-07-06T15:35:14Z) - EEGSN: Towards Efficient Low-latency Decoding of EEG with Graph Spiking
Neural Networks [4.336065967298193]
A majority of spiking neural networks (SNNs) are trained based on inductive biases that are not necessarily a good fit for several critical tasks requiring low latency and power efficiency.
Here, we propose a graph spiking neural network architecture for multi-channel EEG classification (EEGSN) that learns the dynamic relational information present in the distributed EEG sensors.
Our method reduces the inference computational complexity by $\times 20$ compared to state-of-the-art SNNs, while achieving comparable accuracy on motor execution tasks.
arXiv Detail & Related papers (2023-04-15T23:30:17Z) - Deep comparisons of Neural Networks from the EEGNet family [0.0]
We compared five well-known neural networks (Shallow ConvNet, Deep ConvNet, EEGNet, EEGNet Fusion, MI-EEGNet) on open-access databases with many subjects, in addition to the BCI Competition IV 2a dataset.
Our metrics showed that researchers should not dismiss Shallow ConvNet and Deep ConvNet, as they can outperform the later-published members of the EEGNet family.
arXiv Detail & Related papers (2023-02-17T10:39:09Z) - An intertwined neural network model for EEG classification in
brain-computer interfaces [0.6696153817334769]
A brain-computer interface (BCI) is a non-stimulatory, direct, and occasionally bidirectional communication link between the brain and a computer or an external device.
We present a deep neural network architecture specifically engineered to provide state-of-the-art performance in multiclass motor imagery classification.
arXiv Detail & Related papers (2022-08-04T09:00:34Z) - Deep Architecture Connectivity Matters for Its Convergence: A
Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z) - Tensor-CSPNet: A Novel Geometric Deep Learning Framework for Motor
Imagery Classification [14.95694356964053]
We propose a geometric deep learning framework called Tensor-CSPNet to characterize EEG signals on symmetric positive definite (SPD) manifolds.
Tensor-CSPNet attains or slightly outperforms the current state-of-the-art performance in the cross-validation and holdout scenarios of two MI-EEG datasets.
arXiv Detail & Related papers (2022-02-05T02:52:23Z) - SignalNet: A Low Resolution Sinusoid Decomposition and Estimation
Network [79.04274563889548]
We propose SignalNet, a neural network architecture that detects the number of sinusoids and estimates their parameters from quantized in-phase and quadrature samples.
We introduce a worst-case learning threshold for comparing the results of our network relative to the underlying data distributions.
In simulation, we find that our algorithm is always able to surpass the threshold for three-bit data but often cannot exceed the threshold for one-bit data.
arXiv Detail & Related papers (2021-06-10T04:21:20Z) - EEG-Inception: An Accurate and Robust End-to-End Neural Network for
EEG-based Motor Imagery Classification [123.93460670568554]
This paper proposes a novel convolutional neural network (CNN) architecture for accurate and robust EEG-based motor imagery (MI) classification.
The proposed CNN model, namely EEG-Inception, is built on the backbone of the Inception-Time network.
The proposed network performs end-to-end classification, as it takes raw EEG signals as input and does not require complex EEG signal preprocessing.
arXiv Detail & Related papers (2021-01-24T19:03:10Z) - Emotional EEG Classification using Connectivity Features and
Convolutional Neural Networks [81.74442855155843]
We introduce a new classification system that utilizes brain connectivity with a CNN and validate its effectiveness via emotional video classification.
The level of concentration of the brain connectivity related to the emotional property of the target video is correlated with classification performance.
arXiv Detail & Related papers (2021-01-18T13:28:08Z) - Binary Graph Neural Networks [69.51765073772226]
Graph Neural Networks (GNNs) have emerged as a powerful and flexible framework for representation learning on irregular data.
In this paper, we present and evaluate different strategies for the binarization of graph neural networks.
We show that through careful design of the models, and control of the training process, binary graph neural networks can be trained at only a moderate cost in accuracy on challenging benchmarks.
arXiv Detail & Related papers (2020-12-31T18:48:58Z)
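For the Binary Graph Neural Networks entry above, the hedged PyTorch sketch below shows one generic way to binarise the weights of a graph-convolution layer with a straight-through estimator. It does not reproduce the specific binarisation strategies evaluated in that paper; the GCN-style propagation rule and the layer sizes are assumptions made purely for illustration.

# Hedged sketch: sign() binarisation of a graph-convolution weight matrix,
# trained with a straight-through estimator (identity gradient in backward).
import torch
import torch.nn as nn


class BinariseSTE(torch.autograd.Function):
    """sign() in the forward pass, identity gradient in the backward pass."""

    @staticmethod
    def forward(ctx, w):
        return torch.sign(w)

    @staticmethod
    def backward(ctx, grad_out):
        return grad_out                            # straight-through gradient


class BinaryGraphConv(nn.Module):
    """A GCN-style layer (A_hat @ X @ W) with weights mapped through sign()."""

    def __init__(self, in_feats, out_feats):
        super().__init__()
        self.weight = nn.Parameter(torch.randn(in_feats, out_feats) * 0.1)

    def forward(self, x, adj_norm):                # x: (N, in), adj_norm: (N, N)
        w_bin = BinariseSTE.apply(self.weight)     # weights mapped through sign()
        return adj_norm @ (x @ w_bin)


if __name__ == "__main__":
    n_nodes, in_feats, out_feats = 5, 16, 8
    x = torch.randn(n_nodes, in_feats)
    adj = torch.eye(n_nodes)                       # trivial graph for a smoke test
    layer = BinaryGraphConv(in_feats, out_feats)
    out = layer(x, adj)
    out.sum().backward()                           # gradients flow via the STE
    print(out.shape, layer.weight.grad.shape)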