EEG-GMACN: Interpretable EEG Graph Mutual Attention Convolutional Network
- URL: http://arxiv.org/abs/2412.17834v1
- Date: Sun, 15 Dec 2024 13:37:20 GMT
- Title: EEG-GMACN: Interpretable EEG Graph Mutual Attention Convolutional Network
- Authors: Haili Ye, Stephan Goerttler, Fei He
- Abstract summary: Graph Signal Processing has emerged as a promising method for EEG spatial-temporal analysis.
Existing GSP studies lack interpretability of electrode importance and credible estimates of prediction confidence.
This work proposes an EEG Graph Mutual Attention Convolutional Network (EEG-GMACN) to output interpretable electrode graph weights.
- Score: 2.6684288899870543
- Abstract: Electroencephalogram (EEG) is a valuable technique for recording brain electrical activity through electrodes placed on the scalp. Analyzing EEG signals contributes to the understanding of neurological conditions and the development of brain-computer interfaces. Graph Signal Processing (GSP) has emerged as a promising method for EEG spatial-temporal analysis by further considering the topological relationships between electrodes. However, existing GSP studies lack interpretability of electrode importance and credible estimates of prediction confidence. This work proposes an EEG Graph Mutual Attention Convolutional Network (EEG-GMACN), which introduces an 'Inverse Graph Weight Module' to output interpretable electrode graph weights, enhancing the clinical credibility and interpretability of EEG classification results. Additionally, we incorporate a mutual attention mechanism into the model to improve its capability to distinguish critical electrodes, and introduce credibility calibration to assess the uncertainty of prediction results. This study enhances the transparency and effectiveness of EEG analysis, paving the way for its widespread use in clinical and neuroscience research.
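The abstract names three ingredients: interpretable electrode graph weights, a mutual attention mechanism, and credibility calibration of prediction confidence. The paper's modules are not spelled out here, so the following PyTorch sketch is only a generic illustration of the underlying ideas, not the authors' implementation: a toy graph-attention layer over the electrode montage from which per-electrode weights can be read out, plus standard temperature scaling for confidence calibration. All class names, shapes, and hyperparameters are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ElectrodeGraphAttention(nn.Module):
    """Toy graph-attention layer over EEG electrodes (illustrative only).

    Returns updated electrode features plus per-electrode attention
    weights that can be inspected as a crude importance score.
    """

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.proj = nn.Linear(in_dim, out_dim, bias=False)
        self.attn = nn.Linear(2 * out_dim, 1, bias=False)

    def forward(self, x: torch.Tensor, adj: torch.Tensor):
        # x:   (batch, n_electrodes, in_dim)  per-electrode features
        # adj: (n_electrodes, n_electrodes)   binary electrode adjacency
        h = self.proj(x)                                   # (B, N, D)
        n = h.size(1)
        # Ensure self-loops so every electrode attends at least to itself.
        mask = (adj + torch.eye(n, device=adj.device)) > 0
        hi = h.unsqueeze(2).expand(-1, -1, n, -1)          # (B, N, N, D)
        hj = h.unsqueeze(1).expand(-1, n, -1, -1)          # (B, N, N, D)
        e = F.leaky_relu(self.attn(torch.cat([hi, hj], dim=-1)), 0.2).squeeze(-1)
        e = e.masked_fill(~mask.unsqueeze(0), float("-inf"))
        alpha = torch.softmax(e, dim=-1)                   # attention over neighbours
        out = torch.bmm(alpha, h)                          # (B, N, D)
        importance = alpha.mean(dim=1)                     # (B, N) attention each electrode receives
        return out, importance


def temperature_scale(logits: torch.Tensor, temperature: float) -> torch.Tensor:
    """Standard temperature scaling, a common way to calibrate softmax confidence."""
    return F.softmax(logits / temperature, dim=-1)


# Hypothetical usage: 32-electrode montage, 8 features per electrode, 3 classes.
layer = ElectrodeGraphAttention(in_dim=8, out_dim=16)
x = torch.randn(4, 32, 8)
adj = (torch.rand(32, 32) > 0.7).float()
feats, importance = layer(x, adj)                          # importance: (4, 32)
probs = temperature_scale(torch.randn(4, 3), temperature=1.5)
```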
Related papers
- CognitionCapturer: Decoding Visual Stimuli From Human EEG Signal With Multimodal Information [61.1904164368732]
We propose CognitionCapturer, a unified framework that fully leverages multimodal data to represent EEG signals.
Specifically, CognitionCapturer trains Modality Experts for each modality to extract cross-modal information from the EEG modality.
The framework does not require any fine-tuning of the generative models and can be extended to incorporate more modalities.
arXiv Detail & Related papers (2024-12-13T16:27:54Z)
- Dynamic GNNs for Precise Seizure Detection and Classification from EEG Data [6.401370088497331]
This paper introduces NeuroGNN, a dynamic Graph Neural Network (GNN) framework that captures the interplay between the EEG locations and the semantics of their corresponding brain regions.
Our experiments with real-world data demonstrate that NeuroGNN significantly outperforms existing state-of-the-art models.
arXiv Detail & Related papers (2024-05-08T21:36:49Z)
- Polar-Net: A Clinical-Friendly Model for Alzheimer's Disease Detection in OCTA Images [53.235117594102675]
Optical Coherence Tomography Angiography is a promising tool for detecting Alzheimer's disease (AD) by imaging the retinal microvasculature.
We propose a novel deep-learning framework called Polar-Net to provide interpretable results and leverage clinical prior knowledge.
We show that Polar-Net outperforms existing state-of-the-art methods and provides more valuable pathological evidence for the association between retinal vascular changes and AD.
arXiv Detail & Related papers (2023-11-10T11:49:49Z)
- A Knowledge-Driven Cross-view Contrastive Learning for EEG Representation [48.85731427874065]
This paper proposes a knowledge-driven cross-view contrastive learning framework (KDC2) to extract effective representations from EEG with limited labels.
The KDC2 method creates scalp and neural views of EEG signals, simulating the internal and external representation of brain activity.
By modeling prior neural knowledge based on neural information consistency theory, the proposed method extracts invariant and complementary neural knowledge to generate combined representations.
arXiv Detail & Related papers (2023-09-21T08:53:51Z)
- An Interpretable and Attention-based Method for Gaze Estimation Using Electroencephalography [8.09848629098117]
We leverage a large dataset of simultaneously measured electroencephalography (EEG) and eye-tracking data, proposing an interpretable model for gaze estimation from EEG data.
We present a novel attention-based deep learning framework for EEG signal analysis, which allows the network to focus on the most relevant information in the signal and discard problematic channels.
arXiv Detail & Related papers (2023-08-09T16:58:01Z)
- Optimized EEG based mood detection with signal processing and deep neural networks for brain-computer interface [0.0]
The aim of this study is to establish a smart decision-making model to identify the relationship between EEG signals and the subject's mood.
EEG signals from 28 healthy human subjects were recorded with consent, and attempts were made to study and recognise their moods.
Using these techniques, up to 96.01% detection accuracy was obtained.
arXiv Detail & Related papers (2023-03-30T15:23:24Z)
- fMRI from EEG is only Deep Learning away: the use of interpretable DL to unravel EEG-fMRI relationships [68.8204255655161]
We present an interpretable domain grounded solution to recover the activity of several subcortical regions from multichannel EEG data.
We recover individual spatial and time-frequency patterns of scalp EEG predictive of the hemodynamic signal in the subcortical nuclei.
arXiv Detail & Related papers (2022-10-23T15:11:37Z)
- Task-oriented Self-supervised Learning for Anomaly Detection in Electroencephalography [51.45515911920534]
A task-oriented self-supervised learning approach is proposed to train a more effective anomaly detector.
A specific two-branch convolutional neural network with larger kernels is designed as the feature extractor.
The effectively designed and trained feature extractor is shown to extract better feature representations from EEG signals.
arXiv Detail & Related papers (2022-07-04T13:15:08Z)
- A Novel Transferability Attention Neural Network Model for EEG Emotion Recognition [51.203579838210885]
We propose a transferable attention neural network (TANN) for EEG emotion recognition.
TANN learns emotionally discriminative information by adaptively highlighting transferable EEG brain-region data and samples.
This is implemented by measuring the outputs of multiple brain-region-level discriminators and a single sample-level discriminator.
arXiv Detail & Related papers (2020-09-21T02:42:30Z)
- Attention-based Graph ResNet for Motor Intent Detection from Raw EEG signals [8.775745069873558]
Previous studies on decoding electroencephalography (EEG) signals have not considered the topological relationships among EEG electrodes.
An attention-based graph residual network, a novel variant of the Graph Convolutional Neural Network (GCN), is presented to detect human motor intents (a minimal sketch of a graph residual block over the electrode graph follows this list).
Deep residual learning with a full-attention architecture is introduced to address the degradation problem of deeper networks on raw EEG motor imagery data.
arXiv Detail & Related papers (2020-06-25T09:29:48Z)
- GCNs-Net: A Graph Convolutional Neural Network Approach for Decoding Time-resolved EEG Motor Imagery Signals [8.19994663278877]
A novel deep learning framework based on graph convolutional neural networks (GCNs) is presented to enhance the decoding performance of raw EEG signals.
The introduced approach has been shown to converge for both personalized and group-wise predictions.
arXiv Detail & Related papers (2020-06-16T04:57:12Z)
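The last two entries (Attention-based Graph ResNet and GCNs-Net) both decode EEG by applying graph convolutions over the electrode montage, with residual connections to keep deeper stacks trainable. The sketch below is a minimal, generic illustration of that idea using standard Kipf-and-Welling-style graph convolution; it is not a reproduction of either paper's architecture, and all names and dimensions are hypothetical.

```python
import torch
import torch.nn as nn


class GraphConv(nn.Module):
    """Plain graph convolution: H' = A_norm @ H @ W."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim, bias=False)

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_electrodes, in_dim); adj_norm: (N, N) normalized adjacency
        return torch.matmul(adj_norm, self.linear(x))


class GraphResidualBlock(nn.Module):
    """Two graph convolutions with a skip connection over the electrode graph."""

    def __init__(self, dim: int):
        super().__init__()
        self.conv1 = GraphConv(dim, dim)
        self.conv2 = GraphConv(dim, dim)
        self.act = nn.ReLU()

    def forward(self, x: torch.Tensor, adj_norm: torch.Tensor) -> torch.Tensor:
        h = self.act(self.conv1(x, adj_norm))
        h = self.conv2(h, adj_norm)
        return self.act(x + h)   # residual connection eases training of deeper stacks


def normalize_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetric normalization D^{-1/2} (A + I) D^{-1/2}."""
    a = adj + torch.eye(adj.size(0), device=adj.device)
    d_inv_sqrt = a.sum(dim=-1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(-1) * a * d_inv_sqrt.unsqueeze(0)


# Hypothetical usage: 64-channel montage, 16-dimensional per-electrode features.
adj = (torch.rand(64, 64) > 0.8).float()
adj = ((adj + adj.t()) > 0).float()          # symmetrize the toy adjacency
block = GraphResidualBlock(dim=16)
out = block(torch.randn(2, 64, 16), normalize_adjacency(adj))   # (2, 64, 16)
```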