Cross-Subject DD: A Cross-Subject Brain-Computer Interface Algorithm
- URL: http://arxiv.org/abs/2507.05268v1
- Date: Wed, 02 Jul 2025 09:53:15 GMT
- Title: Cross-Subject DD: A Cross-Subject Brain-Computer Interface Algorithm
- Authors: Xiaoyuan Li, Xinru Xue, Bohan Zhang, Ye Sun, Shoushuo Xi, Gang Liu,
- Abstract summary: Brain-computer interface (BCI) based on motor imagery enables direct control of external devices by decoding the electroencephalogram (EEG) generated in the brain during imagined movements. However, existing BCI models exhibit poor adaptability across subjects, limiting their generalizability and widespread application. This paper proposes a cross-subject BCI algorithm named Cross-Subject DD (CSDD), which constructs a universal BCI model by extracting common features across subjects.
- Score: 6.612070142961214
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: Brain-computer interface (BCI) based on motor imagery (MI) enables direct control of external devices by decoding the electroencephalogram (EEG) generated in the brain during imagined movements. However, due to inter-individual variability in brain activity, existing BCI models exhibit poor adaptability across subjects, thereby limiting their generalizability and widespread application. To address this issue, this paper proposes a cross-subject BCI algorithm named Cross-Subject DD (CSDD), which constructs a universal BCI model by extracting common features across subjects. The specific methods include: 1) training personalized models for each subject; 2) transforming personalized models into relation spectrums; 3) identifying common features through statistical analysis; and 4) constructing a cross-subject universal model based on common features. The experiments utilized the BCIC IV 2a dataset, involving nine subjects. Eight of these subjects were selected for training and extracting the common features, and the cross-subject decoding performance of the model was validated on the remaining subject. The results demonstrate that, compared with existing similar methods, our approach achieves a 3.28% improvement in performance. This paper introduces a novel method for extracting pure common features and constructing a universal cross-subject BCI model, thereby facilitating broader applications of BCI technology.
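The abstract's four-step pipeline can be illustrated with a minimal sketch. Everything concrete here is an assumption for illustration only: the per-subject "model" is reduced to a linear weight vector fit by least squares, the "relation spectrum" is sketched as normalized weight magnitudes, and "statistical analysis" is sketched as keeping the features with the lowest cross-subject coefficient of variation. The paper's actual formulations of these steps may differ substantially.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_features = 8, 16  # eight training subjects, as in the experiment

# Step 1 (assumed form): train one personalized linear model per subject.
def train_personalized(X, y):
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

models = []
for _ in range(n_subjects):
    X = rng.normal(size=(100, n_features))              # toy EEG features
    y = X @ rng.normal(size=n_features) + 0.1 * rng.normal(size=100)
    models.append(train_personalized(X, y))
W = np.stack(models)                                    # (subjects, features)

# Step 2 (assumed form): transform each model into a "relation spectrum",
# sketched here as the model's normalized weight magnitudes.
spectra = np.abs(W) / np.linalg.norm(W, axis=1, keepdims=True)

# Step 3 (assumed form): statistical analysis — keep features whose spectrum
# values vary little across subjects (low coefficient of variation).
cv = spectra.std(axis=0) / (spectra.mean(axis=0) + 1e-12)
common = np.argsort(cv)[: n_features // 2]              # most consistent half

# Step 4 (assumed form): build a universal model from the common features,
# e.g. by averaging the personalized weights on those features only.
universal = np.zeros(n_features)
universal[common] = W[:, common].mean(axis=0)
print(common.shape, universal.shape)                    # (8,) (16,)
```

The key idea the sketch preserves is that the universal model is assembled only from components that behave consistently across the training subjects, so it can be applied to an unseen subject without fine-tuning.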
Related papers
- AdaBrain-Bench: Benchmarking Brain Foundation Models for Brain-Computer Interface Applications [52.91583053243446]
Non-invasive Brain-Computer Interfaces (BCI) offer a safe and accessible means of connecting the human brain to external devices. Recently, the adoption of self-supervised pre-training has been transforming the landscape of non-invasive BCI research. AdaBrain-Bench is a standardized benchmark for evaluating brain foundation models on widespread non-invasive BCI tasks.
arXiv Detail & Related papers (2025-07-14T03:37:41Z) - CSBrain: A Cross-scale Spatiotemporal Brain Foundation Model for EEG Decoding [57.90382885533593]
We propose a Cross-scale Spatiotemporal Brain foundation model for generalized EEG decoding. We show that CSBrain consistently outperforms task-specific and foundation-model baselines. These results establish cross-scale modeling as a key inductive bias and position CSBrain as a robust backbone for future brain-AI research.
arXiv Detail & Related papers (2025-06-29T03:29:34Z) - AnyBody: A Benchmark Suite for Cross-Embodiment Manipulation [59.671764778486995]
Generalizing control policies to novel embodiments remains a fundamental challenge in enabling scalable and transferable learning in robotics. We introduce a benchmark for learning cross-embodiment manipulation, focusing on two foundational tasks, reach and push, across a diverse range of morphologies. We evaluate the ability of different RL policies to learn from multiple morphologies and to generalize to novel ones.
arXiv Detail & Related papers (2025-05-21T00:21:38Z) - UniBrain: A Unified Model for Cross-Subject Brain Decoding [22.49964298783508]
We present UniBrain, a unified brain decoding model that requires no subject-specific parameters. Our approach includes a group-based extractor to handle variable fMRI signal lengths, a mutual assistance embedder to capture cross-subject commonalities, and a bilevel feature alignment scheme for extracting subject-invariant features. We validate UniBrain on the brain decoding benchmark, achieving performance comparable to current state-of-the-art subject-specific models with far fewer parameters.
arXiv Detail & Related papers (2024-12-27T07:03:47Z) - Quantifying Spatial Domain Explanations in BCI using Earth Mover's Distance [6.038190786160174]
BCIs facilitate unique communication between humans and computers, benefiting severely disabled individuals.
It is crucial to assess and explain BCI performance, offering potential users clear explanations so they are not frustrated when the system does not work as expected.
arXiv Detail & Related papers (2024-05-02T13:35:15Z) - MindBridge: A Cross-Subject Brain Decoding Framework [60.58552697067837]
Brain decoding aims to reconstruct stimuli from acquired brain signals.
Currently, brain decoding is confined to a per-subject-per-model paradigm.
We present MindBridge, which achieves cross-subject brain decoding with a single model.
arXiv Detail & Related papers (2024-04-11T15:46:42Z) - Motor Imagery Decoding Using Ensemble Curriculum Learning and
Collaborative Training [11.157243900163376]
Multi-subject EEG datasets present several kinds of domain shifts.
These domain shifts impede robust cross-subject generalization.
We propose a two-stage model ensemble architecture built with multiple feature extractors.
We demonstrate that our model-ensembling approach combines the strengths of curriculum learning and collaborative training.
arXiv Detail & Related papers (2022-11-21T13:45:44Z) - 2021 BEETL Competition: Advancing Transfer Learning for Subject
Independence & Heterogenous EEG Data Sets [89.84774119537087]
We design two transfer learning challenges around diagnostics and Brain-Computer Interfacing (BCI).
Task 1 is centred on medical diagnostics, addressing automatic sleep stage annotation across subjects.
Task 2 is centred on Brain-Computer Interfacing (BCI), addressing motor imagery decoding across both subjects and data sets.
arXiv Detail & Related papers (2022-02-14T12:12:20Z) - Confidence-Aware Subject-to-Subject Transfer Learning for Brain-Computer
Interface [3.2550305883611244]
The inter/intra-subject variability of electroencephalography (EEG) makes the practical use of the brain-computer interface (BCI) difficult.
We propose a BCI framework that uses only high-confidence subjects for transfer learning (TL) training.
In our framework, a deep neural network selects useful subjects for the TL process and excludes noisy subjects, using a co-teaching algorithm based on the small-loss trick.
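The small-loss trick described above can be sketched as a simple selection rule: subjects whose data a model fits with low loss are treated as clean and kept for transfer learning, while high-loss (noisy) subjects are excluded. The function name, the per-subject loss values, and the `keep_ratio` hyperparameter below are all illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def select_subjects_small_loss(losses_per_subject, keep_ratio=0.75):
    """Keep the subjects whose mean sample loss is smallest.

    The small-loss trick assumes data with lower training loss is more
    likely to be clean, and therefore useful for transfer learning.
    `keep_ratio` (assumed hyperparameter) sets the fraction retained.
    """
    means = {s: float(np.mean(l)) for s, l in losses_per_subject.items()}
    k = max(1, int(round(keep_ratio * len(means))))
    ranked = sorted(means, key=means.get)   # ascending mean loss
    return ranked[:k]

# Toy example: subject "S3" has noisy labels, hence a high loss.
losses = {
    "S1": [0.20, 0.25, 0.22],
    "S2": [0.18, 0.21, 0.19],
    "S3": [0.90, 1.10, 0.95],   # noisy subject -> excluded
    "S4": [0.30, 0.28, 0.33],
}
print(select_subjects_small_loss(losses))  # ['S2', 'S1', 'S4']
```

In the co-teaching setting, two networks would each perform this selection on the other's behalf, so that neither network's own memorization of noisy data biases the choice.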
arXiv Detail & Related papers (2021-12-15T15:23:23Z) - Adversarial Sample Enhanced Domain Adaptation: A Case Study on
Predictive Modeling with Electronic Health Records [57.75125067744978]
We propose a data augmentation method to facilitate domain adaptation.
Adversarially generated samples are used during domain adaptation.
Results confirm the effectiveness of our method and its generality across different tasks.
arXiv Detail & Related papers (2021-01-13T03:20:20Z) - Performance of Dual-Augmented Lagrangian Method and Common Spatial
Patterns applied in classification of Motor-Imagery BCI [68.8204255655161]
Motor-imagery based brain-computer interfaces (MI-BCI) have the potential to become ground-breaking technologies for neurorehabilitation.
Due to the noisy nature of the EEG signal used, reliable BCI systems require specialized procedures for feature optimization and extraction.
arXiv Detail & Related papers (2020-10-13T20:50:13Z) - Deep brain state classification of MEG data [2.9048924265579124]
This paper uses Magnetoencephalography (MEG) data, provided by the Human Connectome Project (HCP), in combination with various deep artificial neural network models to perform brain decoding.
arXiv Detail & Related papers (2020-07-02T05:51:57Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.