EEGPT: Unleashing the Potential of EEG Generalist Foundation Model by Autoregressive Pre-training
- URL: http://arxiv.org/abs/2410.19779v1
- Date: Mon, 14 Oct 2024 12:17:54 GMT
- Title: EEGPT: Unleashing the Potential of EEG Generalist Foundation Model by Autoregressive Pre-training
- Authors: Tongtian Yue, Shuning Xue, Xuange Gao, Yepeng Tang, Longteng Guo, Jie Jiang, Jing Liu
- Abstract summary: EEGPT is the first generalist EEG foundation model designed to address these challenges.
First, we propose an electrode-wise modeling strategy that treats each electrode as a fundamental unit.
Second, we develop the first autoregressive EEG pre-trained model.
Third, we introduce a multi-task transfer learning paradigm using a learnable electrode graph network.
- Score: 9.57946371147345
- Abstract: Electroencephalogram (EEG) signals are pivotal in providing insights into spontaneous brain activity, underscoring their importance in neuroscience research. However, the exploration of versatile EEG models is constrained by diverse data formats, outdated pre-training paradigms, and limited transfer learning methods, leading only to specialist models confined to a single dataset. In this paper, we introduce EEGPT, the first generalist EEG foundation model designed to address these challenges. First, we propose an electrode-wise modeling strategy that treats each electrode as a fundamental unit, enabling the integration of diverse EEG datasets collected from up to 138 electrodes, amassing 37.5M pre-training samples. Second, we develop the first autoregressive EEG pre-trained model, moving away from traditional masked autoencoder approaches to a next-signal prediction task that better captures the sequential and temporal dependencies of EEG data. We also explore scaling laws with models of up to 1.1B parameters, the largest in EEG research to date. Third, we introduce a multi-task transfer learning paradigm using a learnable electrode graph network shared across tasks, which for the first time confirms multi-task compatibility and synergy. As the first generalist EEG foundation model, EEGPT shows broad compatibility with various signal acquisition devices, subjects, and tasks. It supports up to 138 electrodes and any combination thereof as input. Furthermore, we simultaneously evaluate it on 5 distinct tasks across 12 benchmarks. EEGPT consistently outperforms existing specialist models across all downstream tasks, with its effectiveness further validated through extensive ablation studies. This work sets a new direction for generalist EEG modeling, offering improved scalability, transferability, and adaptability for a wide range of EEG applications. The code and models will be released.
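The abstract states the pre-training idea only at a high level: each electrode is treated as a fundamental unit, and the model predicts the next stretch of signal rather than reconstructing masked segments. The sketch below is one hedged reading of such an objective in PyTorch; the patch length, model sizes, MSE loss, and the names ElectrodeAutoregressor / next_patch_loss are illustrative assumptions, not the authors' released implementation.

```python
# Minimal sketch of electrode-wise autoregressive pre-training (not the
# authors' code; patch length, sizes, and the MSE objective are assumptions).
import torch
import torch.nn as nn

class ElectrodeAutoregressor(nn.Module):
    def __init__(self, patch_len=200, d_model=256, n_heads=8, n_layers=4, max_patches=64):
        super().__init__()
        self.embed = nn.Linear(patch_len, d_model)        # one token per signal patch
        self.pos = nn.Embedding(max_patches, d_model)     # learned positions
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, patch_len)         # predict the next patch

    def forward(self, x):
        # x: (batch, n_electrodes, n_patches, patch_len); every electrode is
        # treated as its own sequence by folding it into the batch dimension.
        b, e, p, l = x.shape
        tok = self.embed(x.reshape(b * e, p, l))
        tok = tok + self.pos(torch.arange(p, device=x.device))
        causal = torch.triu(torch.full((p, p), float("-inf"), device=x.device), diagonal=1)
        h = self.encoder(tok, mask=causal)                # causal self-attention
        return self.head(h).reshape(b, e, p, l)

def next_patch_loss(model, x):
    # Predict patch t+1 from patches <= t: drop the last prediction (no target)
    # and the first target (no context).
    pred = model(x)
    return nn.functional.mse_loss(pred[:, :, :-1], x[:, :, 1:])

# Example: 2 recordings, 32 electrodes, 16 patches of 200 samples each.
batch = torch.randn(2, 32, 16, 200)
loss = next_patch_loss(ElectrodeAutoregressor(), batch)
```

Folding electrodes into the batch dimension is one way to let recordings with different electrode subsets (up to 138 in EEGPT's case) share a single model; the paper's actual tokenization and masking scheme may differ.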
Related papers
- FoME: A Foundation Model for EEG using Adaptive Temporal-Lateral Attention Scaling [19.85701025524892]
FoME (Foundation Model for EEG) is a novel approach using adaptive temporal-lateral attention scaling.
FoME is pre-trained on a diverse 1.7TB dataset of scalp and intracranial EEG recordings, comprising 745M parameters trained for 1,096k steps.
arXiv Detail & Related papers (2024-09-19T04:22:40Z)
- EEGMamba: Bidirectional State Space Model with Mixture of Experts for EEG Multi-task Classification [1.4004287903552533]
We introduce EEGMamba, the first universal EEG classification network to truly implement multi-task learning for EEG applications.
EEGMamba seamlessly integrates the Spatio-Temporal-Adaptive (ST-Adaptive) module, bidirectional Mamba, and Mixture of Experts (MoE) into a unified framework.
We evaluate our model on eight publicly available EEG datasets, and the experimental results demonstrate its superior performance across four types of tasks (a generic sketch of an MoE classification head appears after this list).
arXiv Detail & Related papers (2024-07-20T11:15:47Z)
- Geodesic Optimization for Predictive Shift Adaptation on EEG data [53.58711912565724]
Domain adaptation methods struggle when distribution shifts occur simultaneously in $X$ and $y$.
This paper proposes a novel method termed Geodesic Optimization for Predictive Shift Adaptation (GOPSA) to address test-time multi-source DA.
GOPSA has the potential to combine the advantages of mixed-effects modeling with machine learning for biomedical applications of EEG.
arXiv Detail & Related papers (2024-07-04T12:15:42Z)
- EEGFormer: Towards Transferable and Interpretable Large-Scale EEG Foundation Model [39.363511340878624]
We present a novel EEG foundation model, namely EEGFormer, pretrained on large-scale compound EEG data.
To validate the effectiveness of our model, we extensively evaluate it on various downstream tasks and assess the performance under different transfer settings.
arXiv Detail & Related papers (2024-01-11T17:36:24Z)
- hvEEGNet: exploiting hierarchical VAEs on EEG data for neuroscience applications [3.031375888004876]
Two main issues challenge the existing DL-based modeling methods for EEG.
High variability between subjects and a low signal-to-noise ratio make it difficult to ensure good quality in EEG data.
We propose two variational autoencoder models, namely vEEGNet-ver3 and hvEEGNet, to target the problem of high-fidelity EEG reconstruction.
arXiv Detail & Related papers (2023-11-20T15:36:31Z)
- Neuro-GPT: Towards A Foundation Model for EEG [0.04188114563181615]
We propose Neuro-GPT, a foundation model consisting of an EEG encoder and a GPT model.
The foundation model is pre-trained on a large-scale dataset using a self-supervised task that learns to reconstruct masked EEG segments.
Experiments demonstrate that applying a foundation model can significantly improve classification performance compared to a model trained from scratch.
arXiv Detail & Related papers (2023-11-07T07:07:18Z)
- Task-oriented Self-supervised Learning for Anomaly Detection in Electroencephalography [51.45515911920534]
A task-oriented self-supervised learning approach is proposed to train a more effective anomaly detector.
A specific two-branch convolutional neural network with larger kernels is designed as the feature extractor.
The effectively designed and trained feature extractor has been shown to extract better feature representations from EEGs.
arXiv Detail & Related papers (2022-07-04T13:15:08Z)
- Mixed Effects Neural ODE: A Variational Approximation for Analyzing the Dynamics of Panel Data [50.23363975709122]
We propose a probabilistic model called ME-NODE to incorporate (fixed + random) mixed effects for analyzing panel data.
We show that our model can be derived using smooth approximations of SDEs provided by the Wong-Zakai theorem.
We then derive Evidence Based Lower Bounds for ME-NODE, and develop (efficient) training algorithms.
arXiv Detail & Related papers (2022-02-18T22:41:51Z)
- A multi-stage machine learning model on diagnosis of esophageal manometry [50.591267188664666]
The framework includes deep-learning models at the swallow-level stage and feature-based machine learning models at the study-level stage.
This is the first artificial-intelligence-style model to automatically predict the CC diagnosis of an HRM study from raw multi-swallow data.
arXiv Detail & Related papers (2021-06-25T20:09:23Z)
- Adversarial Sample Enhanced Domain Adaptation: A Case Study on Predictive Modeling with Electronic Health Records [57.75125067744978]
We propose a data augmentation method to facilitate domain adaptation.
Adversarially generated samples are used during domain adaptation.
Results confirm the effectiveness of our method and its generality across different tasks.
arXiv Detail & Related papers (2021-01-13T03:20:20Z)
- Opportunities and Challenges of Deep Learning Methods for Electrocardiogram Data: A Systematic Review [62.490310870300746]
The electrocardiogram (ECG) is one of the most commonly used diagnostic tools in medicine and healthcare.
Deep learning methods have achieved promising results on predictive healthcare tasks using ECG signals.
This paper presents a systematic review of deep learning methods for ECG data from both modeling and application perspectives.
arXiv Detail & Related papers (2019-12-28T02:44:29Z)
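The EEGMamba entry above mentions a Mixture of Experts (MoE) on top of a bidirectional Mamba encoder. The sketch below shows only the generic structure of such an MoE classification head; the Mamba encoder itself is not reproduced, a random feature tensor stands in for its pooled output, and all names and sizes (MoEHead, d_model, n_experts) are assumptions for illustration.

```python
# Structural sketch of a mixture-of-experts classification head (illustrative
# only; EEGMamba's actual encoder and routing details are not reproduced here).
import torch
import torch.nn as nn

class MoEHead(nn.Module):
    def __init__(self, d_model=256, n_experts=4, n_classes=5, d_hidden=128):
        super().__init__()
        self.gate = nn.Linear(d_model, n_experts)         # soft routing weights
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_hidden), nn.GELU(),
                          nn.Linear(d_hidden, n_classes))
            for _ in range(n_experts)
        ])

    def forward(self, features):
        # features: (batch, d_model) pooled embedding from a sequence encoder.
        weights = torch.softmax(self.gate(features), dim=-1)              # (batch, n_experts)
        logits = torch.stack([e(features) for e in self.experts], dim=1)  # (batch, n_experts, n_classes)
        return (weights.unsqueeze(-1) * logits).sum(dim=1)                # (batch, n_classes)

features = torch.randn(8, 256)        # placeholder for the encoder's output
print(MoEHead()(features).shape)      # torch.Size([8, 5])
```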