Progressive Graph Convolution Network for EEG Emotion Recognition
- URL: http://arxiv.org/abs/2112.09069v1
- Date: Tue, 14 Dec 2021 03:30:13 GMT
- Title: Progressive Graph Convolution Network for EEG Emotion Recognition
- Authors: Yijin Zhou, Fu Li, Yang Li, Youshuo Ji, Guangming Shi, Wenming Zheng,
Lijian Zhang, Yuanfang Chen, Rui Cheng
- Abstract summary: Studies in the area of neuroscience have revealed the relationship between emotional patterns and brain functional regions.
In EEG emotion recognition, we can observe that clearer boundaries exist between coarse-grained emotions than those between fine-grained emotions.
We propose a progressive graph convolution network (PGCN) for capturing this inherent characteristic in EEG emotional signals.
- Score: 35.08010382523394
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Studies in the area of neuroscience have revealed the relationship between
emotional patterns and brain functional regions, demonstrating that dynamic
relationships between different brain regions are an essential factor affecting
emotion recognition determined through electroencephalography (EEG). Moreover,
in EEG emotion recognition, we can observe that clearer boundaries exist
between coarse-grained emotions than those between fine-grained emotions, based
on the same EEG data; this indicates the concurrence of large coarse- and small
fine-grained emotion variations. Thus, the progressive classification process
from coarse- to fine-grained categories may be helpful for EEG emotion
recognition. Consequently, in this study, we propose a progressive graph
convolution network (PGCN) for capturing this inherent characteristic in EEG
emotional signals and progressively learning the discriminative EEG features.
To fit different EEG patterns, we constructed a dual-graph module to
characterize the intrinsic relationship between different EEG channels,
containing the dynamic functional connections and static spatial proximity
information of brain regions from neuroscience research. Moreover, motivated by
the observation of the relationship between coarse- and fine-grained emotions,
we adopt a dual-head module that enables the PGCN to progressively learn more
discriminative EEG features, from coarse-grained (easy) to fine-grained
categories (difficult), referring to the hierarchical characteristic of
emotion. To verify the performance of our model, extensive experiments were
conducted on two public datasets: SEED-IV and multi-modal physiological emotion
database (MPED).
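The dual-graph and dual-head ideas described in the abstract can be illustrated with a minimal numpy sketch. Everything here is an assumption for illustration: the fusion-by-summation, the random adjacency matrices, the pooling, and all layer sizes are hypothetical simplifications, not the paper's actual PGCN implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def normalize_adj(adj):
    """Symmetrically normalize an adjacency matrix: D^-1/2 (A + I) D^-1/2."""
    a = adj + np.eye(adj.shape[0])
    d = np.clip(a.sum(axis=1), 1e-8, None)
    d_inv_sqrt = 1.0 / np.sqrt(d)
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def dual_graph_conv(x, adj_func, adj_spatial, w_func, w_spatial):
    """One dual-graph convolution: propagate channel features over a dynamic
    functional-connectivity graph and a static spatial-proximity graph, then
    fuse the two branches by summation (a simplifying assumption)."""
    h_func = normalize_adj(adj_func) @ x @ w_func
    h_spatial = normalize_adj(adj_spatial) @ x @ w_spatial
    return np.maximum(h_func + h_spatial, 0.0)  # ReLU

# Toy setup: 62 EEG channels, 5 band-wise features per channel.
n_ch, n_feat, hidden = 62, 5, 16
x = rng.standard_normal((n_ch, n_feat))
adj_func = np.abs(rng.standard_normal((n_ch, n_ch)))          # e.g. |correlation|
adj_func = (adj_func + adj_func.T) / 2                        # keep it symmetric
adj_spatial = (rng.random((n_ch, n_ch)) < 0.1).astype(float)  # electrode proximity
adj_spatial = np.maximum(adj_spatial, adj_spatial.T)

w_func = rng.standard_normal((n_feat, hidden)) * 0.1
w_spatial = rng.standard_normal((n_feat, hidden)) * 0.1
h = dual_graph_conv(x, adj_func, adj_spatial, w_func, w_spatial)

# Dual-head readout: a coarse head (e.g. positive/neutral/negative) and a fine
# head that also sees the coarse logits, mimicking coarse-to-fine learning.
pooled = h.mean(axis=0)
w_coarse = rng.standard_normal((hidden, 3)) * 0.1
coarse_logits = pooled @ w_coarse
w_fine = rng.standard_normal((hidden + 3, 4)) * 0.1  # SEED-IV has 4 classes
fine_logits = np.concatenate([pooled, coarse_logits]) @ w_fine

print(h.shape, coarse_logits.shape, fine_logits.shape)
```

The point of the sketch is the structure: two adjacency matrices feed one convolution, and the fine-grained head is conditioned on the coarse-grained prediction.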
Related papers
- Smile upon the Face but Sadness in the Eyes: Emotion Recognition based on Facial Expressions and Eye Behaviors [63.194053817609024]
We introduce eye behaviors as important emotional cues for the creation of a new Eye-behavior-aided Multimodal Emotion Recognition dataset.
For the first time, we provide annotations for both Emotion Recognition (ER) and Facial Expression Recognition (FER) in the EMER dataset.
We specifically design a new EMERT architecture to concurrently enhance performance in both ER and FER.
arXiv Detail & Related papers (2024-11-08T04:53:55Z)
- A Comprehensive Survey on EEG-Based Emotion Recognition: A Graph-Based Perspective [12.712722204034606]
Electroencephalogram (EEG) based emotion recognition can intuitively respond to emotional patterns in the human brain.
A significant trend is the application of graphs to encapsulate such dependency.
There is neither a comprehensive review nor a tutorial for constructing emotion-relevant graphs in EEG-based emotion recognition.
arXiv Detail & Related papers (2024-08-12T09:29:26Z)
- Multi-modal Mood Reader: Pre-trained Model Empowers Cross-Subject Emotion Recognition [23.505616142198487]
We develop Mood Reader, a multimodal model based on a pre-trained backbone, for cross-subject emotion recognition.
The model learns universal latent representations of EEG signals through pre-training on a large-scale dataset.
Extensive experiments on public datasets demonstrate Mood Reader's superior performance in cross-subject emotion recognition tasks.
arXiv Detail & Related papers (2024-05-28T14:31:11Z)
- Graph Neural Networks in EEG-based Emotion Recognition: A Survey [7.967961714421288]
A significant trend is to develop Graph Neural Networks (GNNs) for EEG-based emotion recognition.
Brain region dependencies in emotional EEG have physiological bases that distinguish GNNs in this field from those in other time series fields.
We analyze and categorize methods from three stages in the framework to provide clear guidance on constructing GNNs in EEG-based emotion recognition.
arXiv Detail & Related papers (2024-02-02T04:30:58Z)
- A Knowledge-Driven Cross-view Contrastive Learning for EEG Representation [48.85731427874065]
This paper proposes a knowledge-driven cross-view contrastive learning framework (KDC2) to extract effective representations from EEG with limited labels.
The KDC2 method creates scalp and neural views of EEG signals, simulating the internal and external representation of brain activity.
By modeling prior neural knowledge based on neural information consistency theory, the proposed method extracts invariant and complementary neural knowledge to generate combined representations.
arXiv Detail & Related papers (2023-09-21T08:53:51Z)
- fMRI from EEG is only Deep Learning away: the use of interpretable DL to unravel EEG-fMRI relationships [68.8204255655161]
We present an interpretable domain grounded solution to recover the activity of several subcortical regions from multichannel EEG data.
We recover individual spatial and time-frequency patterns of scalp EEG predictive of the hemodynamic signal in the subcortical nuclei.
arXiv Detail & Related papers (2022-10-23T15:11:37Z)
- Contrastive Learning of Subject-Invariant EEG Representations for Cross-Subject Emotion Recognition [9.07006689672858]
We propose a contrastive learning method for Inter-Subject Alignment (ISA) to achieve reliable cross-subject emotion recognition.
ISA maximizes the similarity of EEG signals across subjects who received the same stimuli, in contrast to those who received different ones.
A convolutional neural network with depthwise spatial convolution and temporal convolution layers was applied to learn inter-subject representations from raw EEG signals.
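The inter-subject alignment objective can be sketched as an InfoNCE-style loss: representations of two subjects' trials for the same stimulus form positive pairs, and all other trial pairings act as negatives. The function below is a minimal numpy illustration of that idea, not the paper's exact objective; the temperature and synthetic data are assumptions.

```python
import numpy as np

def inter_subject_nce(z_a, z_b, temperature=0.1):
    """InfoNCE-style loss between two subjects' trial representations.
    Row i of z_a and z_b come from the same stimulus (positive pair);
    all other rows act as negatives."""
    z_a = z_a / np.linalg.norm(z_a, axis=1, keepdims=True)
    z_b = z_b / np.linalg.norm(z_b, axis=1, keepdims=True)
    logits = (z_a @ z_b.T) / temperature         # (trials, trials) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))           # matched trials sit on the diagonal

rng = np.random.default_rng(1)
shared = rng.standard_normal((8, 32))            # stimulus-driven component
z_subj1 = shared + 0.1 * rng.standard_normal((8, 32))
z_subj2 = shared + 0.1 * rng.standard_normal((8, 32))

aligned = inter_subject_nce(z_subj1, z_subj2)       # correctly matched trials
shuffled = inter_subject_nce(z_subj1, z_subj2[::-1])  # deliberately mismatched
print(aligned < shuffled)
```

Because the two subjects share a common stimulus-driven component, the loss for correctly matched trials is lower than for mismatched ones, which is exactly what minimizing the objective exploits.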
arXiv Detail & Related papers (2021-09-20T14:13:45Z)
- A Circular-Structured Representation for Visual Emotion Distribution Learning [82.89776298753661]
We propose a well-grounded circular-structured representation to utilize the prior knowledge for visual emotion distribution learning.
To be specific, we first construct an Emotion Circle to unify any emotional state within it.
On the proposed Emotion Circle, each emotion distribution is represented with an emotion vector, which is defined with three attributes.
arXiv Detail & Related papers (2021-06-23T14:53:27Z)
- A Novel Transferability Attention Neural Network Model for EEG Emotion Recognition [51.203579838210885]
We propose a transferable attention neural network (TANN) for EEG emotion recognition.
TANN learns emotionally discriminative information by adaptively highlighting transferable EEG brain-region data and samples.
This can be implemented by measuring the outputs of multiple brain-region-level discriminators and one single sample-level discriminator.
arXiv Detail & Related papers (2020-09-21T02:42:30Z)
- Investigating EEG-Based Functional Connectivity Patterns for Multimodal Emotion Recognition [8.356765961526955]
We investigate three functional connectivity network features: strength, clustering coefficient, and eigenvector centrality.
The discrimination ability of the EEG connectivity features in emotion recognition is evaluated on three public EEG datasets.
We construct a multimodal emotion recognition model by combining the functional connectivity features from EEG and the features from eye movements or physiological signals.
arXiv Detail & Related papers (2020-04-04T16:51:56Z)
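The three connectivity features named in the last summary (strength, clustering coefficient, eigenvector centrality) have standard graph-theoretic definitions. Below is a self-contained numpy sketch of each on a toy functional-connectivity matrix; the threshold, iteration count, and random matrix are illustrative assumptions, and the clustering coefficient is computed on a binarized graph for simplicity.

```python
import numpy as np

def strength(adj):
    """Node strength: the sum of a node's connection weights."""
    return adj.sum(axis=1)

def eigenvector_centrality(adj, iters=200):
    """Leading eigenvector of the adjacency matrix via power iteration."""
    v = np.ones(adj.shape[0])
    for _ in range(iters):
        v = adj @ v
        v /= np.linalg.norm(v)
    return v

def clustering_coefficient(adj, threshold=0.5):
    """Binary clustering coefficient after thresholding the weighted graph:
    the fraction of a node's neighbour pairs that are themselves connected."""
    a = (adj > threshold).astype(float)
    np.fill_diagonal(a, 0.0)
    deg = a.sum(axis=1)
    triangles = np.diag(a @ a @ a) / 2.0        # closed 3-walks / 2 = triangles
    possible = deg * (deg - 1) / 2.0
    return np.divide(triangles, possible, out=np.zeros_like(triangles),
                     where=possible > 0)

# Toy functional-connectivity matrix over 5 channels (symmetric, zero diagonal).
rng = np.random.default_rng(2)
w = rng.random((5, 5))
adj = (w + w.T) / 2
np.fill_diagonal(adj, 0.0)

print(strength(adj))
print(eigenvector_centrality(adj))
print(clustering_coefficient(adj))
```

In practice the connectivity matrix would come from EEG channel-pair measures such as correlation or coherence rather than random numbers, and these per-node features would then be fed to a classifier alongside eye-movement or physiological features.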
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.