Transformer-based EEG Decoding: A Survey
- URL: http://arxiv.org/abs/2507.02320v1
- Date: Thu, 03 Jul 2025 05:12:06 GMT
- Title: Transformer-based EEG Decoding: A Survey
- Authors: Haodong Zhang, Hongqi Li
- Abstract summary: The Transformer is renowned for its strong capability in handling sequential data via the attention mechanism.
Deep learning approaches have gradually revolutionized the field by providing end-to-end, long-cascaded architectures.
- Score: 2.3288585185469146
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Electroencephalography (EEG) is one of the most common signals used to capture the electrical activity of the brain, and decoding EEG to acquire user intent has been at the forefront of brain-computer/machine interface (BCI/BMI) research. Compared with traditional EEG analysis methods based on machine learning, the advent of deep learning approaches has gradually revolutionized the field by providing end-to-end, long-cascaded architectures that can automatically learn more discriminative features. Among these, the Transformer is renowned for its strong capability in handling sequential data via the attention mechanism, and its application to various EEG processing tasks is increasingly prevalent. This article presents a survey of the latest applications of Transformer models in EEG decoding since the model's introduction. The related advances are sorted and organized by following the evolution of the model architecture: we first elucidate the fundamentals of the Transformer that benefit EEG decoding and its direct applications. Then, common hybrid architectures that integrate the basic Transformer with other deep learning techniques (convolutional/recurrent/graph/spiking neural networks, generative adversarial networks, diffusion models, etc.) are overviewed in detail. Research advances that apply customized Transformers with modified intrinsic structures are also introduced. Finally, the current challenges and future development prospects of this rapidly evolving field are discussed. This paper aims to help readers gain a clear understanding of the current state of Transformer applications in EEG decoding and to provide valuable insights for future research endeavors.
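To make the survey's hybrid-architecture taxonomy concrete, below is a minimal, illustrative sketch of the CNN-Transformer pattern it covers: a convolutional front-end extracts local temporal and spatial EEG features, and a Transformer encoder models long-range temporal dependencies. This is not an architecture from the survey; the class name, layer sizes, kernel widths, and the 22-channel/4-class motor-imagery-style setup are all assumptions for illustration.

```python
import torch
import torch.nn as nn

class CNNTransformerEEG(nn.Module):
    """Hypothetical minimal CNN-Transformer hybrid for EEG classification."""
    def __init__(self, n_channels=22, n_classes=4, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # Temporal convolution along each channel, then a spatial convolution
        # across channels -- a common EEG front-end design (cf. EEGNet-style stems).
        self.cnn = nn.Sequential(
            nn.Conv2d(1, d_model, kernel_size=(1, 25), padding=(0, 12)),
            nn.Conv2d(d_model, d_model, kernel_size=(n_channels, 1)),
            nn.BatchNorm2d(d_model),
            nn.ELU(),
            nn.AvgPool2d(kernel_size=(1, 4)),  # downsample in time
        )
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, n_classes)

    def forward(self, x):
        # x: (batch, n_channels, n_timepoints) raw EEG
        x = x.unsqueeze(1)                # (batch, 1, channels, time)
        x = self.cnn(x)                   # (batch, d_model, 1, time')
        x = x.squeeze(2).transpose(1, 2)  # (batch, time', d_model) token sequence
        x = self.transformer(x)           # self-attention over time steps
        return self.head(x.mean(dim=1))   # average-pool tokens, then classify

logits = CNNTransformerEEG()(torch.randn(8, 22, 1000))  # 8 trials, 22 ch, 1000 samples
print(logits.shape)  # torch.Size([8, 4])
```

For brevity the sketch omits positional encodings and regularization (e.g., dropout), which most published EEG Transformers include.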
Related papers
- A Survey on Bridging EEG Signals and Generative AI: From Image and Text to Beyond [4.720913027054481]
Integration of Brain-Computer Interfaces (BCIs) and Generative Artificial Intelligence (GenAI) has opened new frontiers in brain signal decoding.
Recent advances in deep learning, including Generative Adversarial Networks (GANs) and Transformer-based Large Language Models (LLMs), have significantly improved EEG-based generation of images, text, and speech.
arXiv Detail & Related papers (2025-02-17T17:16:41Z)
- Enhancing Transformers for Generalizable First-Order Logical Entailment [51.04944136538266]
This paper studies the generalizable first-order logical reasoning ability of transformers with their parameterized knowledge.
We propose TEGA, a logic-aware architecture that significantly improves the performance in first-order logical entailment.
arXiv Detail & Related papers (2025-01-01T07:05:32Z)
- Comprehensive Review of EEG-to-Output Research: Decoding Neural Signals into Images, Videos, and Audio [0.0]
Recent advancements in machine learning and generative modeling have catalyzed the application of EEG in reconstructing perceptual experiences.
This paper systematically reviews EEG-to-output research, focusing on state-of-the-art generative methods, evaluation metrics, and data challenges.
arXiv Detail & Related papers (2024-12-28T03:50:56Z)
- Automatic Graph Topology-Aware Transformer [50.2807041149784]
We build a comprehensive graph Transformer search space with the micro-level and macro-level designs.
EGTAS evolves graph Transformer topologies at the macro level and graph-aware strategies at the micro level.
We demonstrate the efficacy of EGTAS across a range of graph-level and node-level tasks.
arXiv Detail & Related papers (2024-05-30T07:44:31Z)
- EEG-Deformer: A Dense Convolutional Transformer for Brain-computer Interfaces [17.524441950422627]
We introduce EEG-Deformer, which incorporates two main novel components into a CNN-Transformer.
EEG-Deformer learns from neurophysiologically meaningful brain regions for the corresponding cognitive tasks.
arXiv Detail & Related papers (2024-04-25T18:00:46Z)
- EEGEncoder: Advancing BCI with Transformer-Based Motor Imagery Classification [11.687193535939798]
Brain-computer interfaces (BCIs) harness electroencephalographic signals for direct neural control of devices.
Traditional machine learning methods for EEG-based motor imagery (MI) classification encounter challenges such as manual feature extraction and susceptibility to noise.
This paper introduces EEGEncoder, a deep learning framework that employs modified transformers and TCNs to surmount these limitations.
arXiv Detail & Related papers (2024-04-23T09:51:24Z)
- A Survey on Large Language Models from Concept to Implementation [4.219910716090213]
Recent advancements in Large Language Models (LLMs) have broadened the scope of natural language processing (NLP) applications.
This paper investigates the multifaceted applications of these models, with an emphasis on the GPT series.
This exploration focuses on the transformative impact of artificial intelligence (AI)-driven tools in revolutionizing traditional tasks like coding and problem-solving.
arXiv Detail & Related papers (2024-03-27T19:35:41Z)
- A Comprehensive Survey on Applications of Transformers for Deep Learning Tasks [60.38369406877899]
The Transformer is a deep neural network that employs a self-attention mechanism to comprehend the contextual relationships within sequential data.
Transformer models excel at handling long dependencies between input sequence elements and enable parallel processing.
Our survey encompasses the identification of the top five application domains for transformer-based models.
arXiv Detail & Related papers (2023-06-11T23:13:51Z)
- Full Stack Optimization of Transformer Inference: a Survey [58.55475772110702]
Transformer models achieve superior accuracy across a wide range of applications.
The amount of compute and bandwidth required for inference of recent Transformer models is growing at a significant rate.
There has been an increased focus on making Transformer models more efficient.
arXiv Detail & Related papers (2023-02-27T18:18:13Z)
- Exploring Structure-aware Transformer over Interaction Proposals for Human-Object Interaction Detection [119.93025368028083]
We design a novel Transformer-style Human-Object Interaction (HOI) detector, i.e., Structure-aware Transformer over Interaction Proposals (STIP).
STIP decomposes HOI set prediction into two sequential phases: interaction proposals are first generated and then transformed from non-parametric proposals into HOI predictions via a structure-aware Transformer.
The structure-aware Transformer upgrades the vanilla Transformer by additionally encoding the holistic semantic structure among interaction proposals as well as the local spatial structure of the human/object within each interaction proposal, so as to strengthen HOI predictions.
arXiv Detail & Related papers (2022-06-13T16:21:08Z)
- Transformers for prompt-level EMA non-response prediction [62.41658786277712]
Ecological Momentary Assessments (EMAs) are an important psychological data source for measuring cognitive states, affect, behavior, and environmental factors.
Non-response, in which participants fail to respond to EMA prompts, is an endemic problem.
The ability to accurately predict non-response could be utilized to improve EMA delivery and develop compliance interventions.
arXiv Detail & Related papers (2021-11-01T18:38:47Z)
- Transformer-based Spatial-Temporal Feature Learning for EEG Decoding [4.8276709243429]
We propose a novel EEG decoding method that mainly relies on the attention mechanism.
We reach state-of-the-art performance in EEG multi-classification with fewer parameters.
The method shows good potential to promote the practicality of brain-computer interfaces (BCIs).
arXiv Detail & Related papers (2021-06-11T00:48:18Z)
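For reference, the attention mechanism that this and several other decoders above rely on is, in its standard scaled dot-product form (Vaswani et al., 2017), given below; individual papers may use modified variants.

```latex
% Scaled dot-product attention: the core operation behind the
% Transformer-based EEG decoders surveyed above.
\[
  \mathrm{Attention}(Q, K, V) = \mathrm{softmax}\!\left(\frac{Q K^{\top}}{\sqrt{d_k}}\right) V
\]
% Q, K, V are learned linear projections of the input feature sequence;
% d_k is the key dimension, scaling the dot products before the softmax.
```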