Leveraging Generic Time Series Foundation Models for EEG Classification
- URL: http://arxiv.org/abs/2510.27522v1
- Date: Fri, 31 Oct 2025 14:49:23 GMT
- Title: Leveraging Generic Time Series Foundation Models for EEG Classification
- Authors: Théo Gnassounou, Yessin Moakher, Shifeng Xie, Vasilii Feofanov, Ievgen Redko,
- Abstract summary: We investigate the applicability of a recently proposed time series classification foundation model to different EEG tasks such as motor imagery classification and sleep stage prediction. We find that both variants yield strong performance, consistently outperforming EEGNet, a widely used convolutional baseline, and CBraMod, the most recent EEG-specific foundation model. Our findings highlight the promise of leveraging cross-domain pretrained models for brain signal analysis, suggesting that EEG may benefit from advances in the broader time series literature.
- Score: 8.938023803309097
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Foundation models for time series are emerging as powerful general-purpose backbones, yet their potential for domain-specific biomedical signals such as electroencephalography (EEG) remains largely unexplored. In this work, we investigate the applicability of a recently proposed time series classification foundation model to different EEG tasks such as motor imagery classification and sleep stage prediction. We test two pretraining regimes: (a) pretraining on heterogeneous real-world time series from multiple domains, and (b) pretraining on purely synthetic data. We find that both variants yield strong performance, consistently outperforming EEGNet, a widely used convolutional baseline, and CBraMod, the most recent EEG-specific foundation model. These results suggest that generalist time series foundation models, even when pretrained on data of non-neural origin or on synthetic signals, can transfer effectively to EEG. Our findings highlight the promise of leveraging cross-domain pretrained models for brain signal analysis, suggesting that EEG may benefit from advances in the broader time series literature.
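The abstract does not spell out the input pipeline, but the usual first step for feeding EEG to a generic time series model is to slice the continuous recording into fixed windows and standardize each channel. A minimal numpy sketch of that step (function name, window lengths, and the per-window z-scoring are illustrative assumptions, not the authors' code):

```python
import numpy as np

def epoch_and_standardize(eeg, sfreq, win_sec=2.0, step_sec=1.0):
    """Slice continuous EEG (channels x samples) into overlapping windows
    and z-score each channel within each window.

    Returns an array of shape (n_windows, n_channels, win_samples), the
    layout most multivariate time series classifiers expect.
    """
    n_ch, n_samp = eeg.shape
    win = int(win_sec * sfreq)
    step = int(step_sec * sfreq)
    starts = range(0, n_samp - win + 1, step)
    windows = np.stack([eeg[:, s:s + win] for s in starts])
    # Per-channel, per-window standardization: a common, simple baseline;
    # the paper itself may use a different normalization.
    mu = windows.mean(axis=-1, keepdims=True)
    sd = windows.std(axis=-1, keepdims=True) + 1e-8
    return (windows - mu) / sd

# Example: 4-channel EEG, 10 s at 128 Hz -> nine 2 s windows with 50% overlap
X = epoch_and_standardize(np.random.randn(4, 1280), sfreq=128)
```

The resulting `(n_windows, n_channels, win_samples)` array can then be passed to whichever pretrained backbone is being fine-tuned.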
Related papers
- REVE: A Foundation Model for EEG -- Adapting to Any Setup with Large-Scale Pretraining on 25,000 Subjects [5.368295573908306]
REVE (Representation for EEG with Versatile Embeddings) is a pretrained model explicitly designed to generalize across diverse EEG signals. We pretrain REVE on over 60,000 hours of EEG data from 92 datasets spanning 25,000 subjects, representing the largest EEG pretraining effort to date. We release code, pretrained weights, and tutorials to support standardized EEG research and accelerate progress in clinical neuroscience.
arXiv Detail & Related papers (2025-10-24T15:52:46Z) - SEMPO: Lightweight Foundation Models for Time Series Forecasting [45.456949943052116]
SEMPO is a lightweight foundation model that requires pretraining on only relatively small-scale data, yet exhibits strong general time series forecasting performance. SEMPO comprises two key modules, including an energy-aware SpEctral decomposition module that substantially improves the utilization of pre-training data. Experiments on two large-scale benchmarks covering 16 datasets demonstrate the superior performance of SEMPO in both zero-shot and few-shot forecasting scenarios.
arXiv Detail & Related papers (2025-10-22T15:58:44Z) - WaveNet's Precision in EEG Classification [1.0885910878567457]
This study introduces a WaveNet-based deep learning model designed to automate the classification of EEG signals into physiological, pathological, artifact, and noise categories. The model was trained, validated, and tested on 209,232 samples with a 70/20/10 percent split. WaveNet's architecture, originally developed for raw audio synthesis, is well suited for EEG data due to its use of dilated causal convolutions and residual connections.
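The dilated causal convolutions mentioned above are easy to illustrate: each output sample depends only on current and past inputs, spaced `dilation` steps apart, so stacking layers with doubling dilations covers long contexts cheaply. A toy numpy sketch (not WaveNet's actual implementation):

```python
import numpy as np

def dilated_causal_conv(x, kernel, dilation):
    """1-D causal convolution with dilation: y[t] depends only on
    x[t], x[t-d], x[t-2d], ... (left-padded with zeros, as in WaveNet)."""
    k = len(kernel)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(kernel[i] * xp[t + pad - i * dilation] for i in range(k))
        for t in range(len(x))
    ])

# A unit impulse at t=8 spreads only forward in time, never backward,
# which is the causality property that matters for streaming signals.
x = np.zeros(16)
x[8] = 1.0
y = dilated_causal_conv(x, np.array([0.5, 0.5]), dilation=4)
```

With dilation 4 and a 2-tap kernel, the impulse at t=8 produces responses at t=8 and t=12 only; nothing before t=8 is affected.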
arXiv Detail & Related papers (2025-10-10T09:21:21Z) - Counterfactual Probabilistic Diffusion with Expert Models [44.96279296893773]
We propose a time series diffusion-based framework that incorporates guidance from imperfect expert models. Our method, ODE-Diff, bridges mechanistic and data-driven approaches, enabling more reliable and interpretable causal inference.
arXiv Detail & Related papers (2025-08-18T20:44:32Z) - EEGDM: EEG Representation Learning via Generative Diffusion Model [17.595769291603688]
We propose an EEG representation learning framework built upon a Generative Diffusion Model (EEGDM). Specifically, we developed a structured state-space model for diffusion pretraining and trained it using the Denoising Diffusion Probabilistic Model (DDPM) framework. The resulting latent EEG representations were then used for downstream classification tasks via our proposed latent fusion transformer (LFT).
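The DDPM forward (noising) process used in such pretraining has a closed form: x_t = sqrt(ᾱ_t) x_0 + sqrt(1 − ᾱ_t) ε, where ᾱ_t is the cumulative product of (1 − β_t). A minimal numpy sketch of that one step (the linear β schedule is the common default, not necessarily this paper's choice):

```python
import numpy as np

def ddpm_forward(x0, t, betas, rng):
    """Sample x_t ~ q(x_t | x_0) in closed form:
    x_t = sqrt(alpha_bar_t) * x0 + sqrt(1 - alpha_bar_t) * eps."""
    alpha_bar = np.cumprod(1.0 - betas)[t]
    eps = rng.standard_normal(x0.shape)
    return np.sqrt(alpha_bar) * x0 + np.sqrt(1.0 - alpha_bar) * eps, eps

rng = np.random.default_rng(0)
betas = np.linspace(1e-4, 0.02, 1000)   # common linear schedule
x0 = rng.standard_normal(2048)          # stand-in for an EEG segment
# At the final step, alpha_bar is near zero, so x_t is almost pure noise.
xt, eps = ddpm_forward(x0, t=999, betas=betas, rng=rng)
```

The denoiser is then trained to predict `eps` from `xt` and `t`; the learned representations come from that denoising backbone.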
arXiv Detail & Related papers (2025-08-13T14:40:52Z) - PSDNorm: Test-Time Temporal Normalization for Deep Learning in Sleep Staging [63.05435596565677]
We propose PSDNorm, which leverages Monge mapping and temporal context to normalize feature maps in deep learning models for signals. PSDNorm achieves state-of-the-art performance on unseen left-out datasets while being four times more data-efficient than BatchNorm.
arXiv Detail & Related papers (2025-03-06T16:20:25Z) - GEFM: Graph-Enhanced EEG Foundation Model [16.335330142000657]
Foundation models offer a promising solution by leveraging large-scale unlabeled data through pre-training. We propose the Graph-Enhanced EEG Foundation Model (GEFM), a novel foundation model for EEG that integrates both temporal and inter-channel information. Our architecture combines Graph Neural Networks (GNNs), which effectively capture relational structures, with a masked autoencoder to enable efficient pre-training.
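Masked-autoencoder pretraining of the kind GEFM uses hides random spans of the signal and trains the model to reconstruct them. A hedged numpy sketch of the masking step only (patch length and mask ratio are illustrative choices, not the paper's hyperparameters):

```python
import numpy as np

def mask_time_patches(x, patch_len, mask_ratio, rng):
    """Zero out a random subset of fixed-length time patches and return
    the masked signal plus a boolean patch mask (True = hidden).
    The pretraining objective reconstructs the hidden patches."""
    n_ch, n_samp = x.shape
    n_patch = n_samp // patch_len
    n_mask = int(round(mask_ratio * n_patch))
    idx = rng.choice(n_patch, size=n_mask, replace=False)
    mask = np.zeros(n_patch, dtype=bool)
    mask[idx] = True
    xm = x.copy()
    for p in np.flatnonzero(mask):
        xm[:, p * patch_len:(p + 1) * patch_len] = 0.0
    return xm, mask

rng = np.random.default_rng(1)
x = rng.standard_normal((8, 512))              # 8 channels, 512 samples
xm, mask = mask_time_patches(x, patch_len=32, mask_ratio=0.5, rng=rng)
```

The encoder sees only `xm`; the reconstruction loss is computed on the patches where `mask` is True.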
arXiv Detail & Related papers (2024-11-29T06:57:50Z) - Tackling Data Heterogeneity in Federated Time Series Forecasting [61.021413959988216]
Time series forecasting plays a critical role in various real-world applications, including energy consumption prediction, disease transmission monitoring, and weather forecasting.
Most existing methods rely on a centralized training paradigm, where large amounts of data are collected from distributed devices and sent to a central cloud server.
We propose a novel framework, Fed-TREND, to address data heterogeneity by generating informative synthetic data as auxiliary knowledge carriers.
arXiv Detail & Related papers (2024-11-24T04:56:45Z) - BrainGPT: Unleashing the Potential of EEG Generalist Foundation Model by Autoregressive Pre-training [15.135177893151008]
EEGPT is the first generalist EEG foundation model designed to address these challenges. First, we propose an electrode-wise modeling strategy that treats each electrode as a fundamental unit. Second, we develop the first autoregressive EEG pre-trained model. Third, we introduce a multi-task transfer learning paradigm using a learnable electrode graph network.
arXiv Detail & Related papers (2024-10-14T12:17:54Z) - Geodesic Optimization for Predictive Shift Adaptation on EEG data [53.58711912565724]
Domain adaptation methods struggle when distribution shifts occur simultaneously in $X$ and $y$.
This paper proposes a novel method termed Geodesic Optimization for Predictive Shift Adaptation (GOPSA) to address test-time multi-source DA.
GOPSA has the potential to combine the advantages of mixed-effects modeling with machine learning for biomedical applications of EEG.
arXiv Detail & Related papers (2024-07-04T12:15:42Z) - Synthesizing Multimodal Electronic Health Records via Predictive Diffusion Models [69.06149482021071]
We propose a novel EHR data generation model called EHRPD.
It is a diffusion-based model designed to predict the next visit based on the current one while also incorporating time interval estimation.
We conduct experiments on two public datasets and evaluate EHRPD from fidelity, privacy, and utility perspectives.
arXiv Detail & Related papers (2024-06-20T02:20:23Z) - Generative Modeling of Regular and Irregular Time Series Data via Koopman VAEs [50.25683648762602]
We introduce Koopman VAE, a new generative framework that is based on a novel design for the model prior.
Inspired by Koopman theory, we represent the latent conditional prior dynamics using a linear map.
KoVAE outperforms state-of-the-art GAN and VAE methods across several challenging synthetic and real-world time series generation benchmarks.
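The Koopman-inspired prior can be summarized in a few lines: latent dynamics follow a single linear map, so trajectories are matrix powers of A and long-horizon stability reduces to its spectral radius. An illustrative numpy sketch (the matrix here is a toy example, not a learned prior):

```python
import numpy as np

def rollout(A, z0, steps):
    """Unroll the linear latent prior z_{t+1} = A @ z_t for `steps` steps."""
    zs = [z0]
    for _ in range(steps):
        zs.append(A @ zs[-1])
    return np.stack(zs)

# A rotation scaled by 0.9: spectral radius 0.9 < 1, so the latent
# trajectory spirals toward the origin, i.e. the dynamics are stable.
theta = 0.3
A = 0.9 * np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
traj = rollout(A, np.array([1.0, 0.0]), steps=50)
```

Because rotation preserves norm, the trajectory's norm shrinks by exactly 0.9 per step, which is the kind of analyzable behavior a linear latent prior buys.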
arXiv Detail & Related papers (2023-10-04T07:14:43Z) - DGSD: Dynamical Graph Self-Distillation for EEG-Based Auditory Spatial Attention Detection [49.196182908826565]
Auditory Attention Detection (AAD) aims to detect the target speaker from brain signals in a multi-speaker environment.
Current approaches primarily rely on traditional convolutional neural networks designed for processing Euclidean data such as images.
This paper proposes a dynamical graph self-distillation (DGSD) approach for AAD, which does not require speech stimuli as input.
arXiv Detail & Related papers (2023-09-07T13:43:46Z) - EEG-Inception: An Accurate and Robust End-to-End Neural Network for EEG-based Motor Imagery Classification [123.93460670568554]
This paper proposes a novel convolutional neural network (CNN) architecture for accurate and robust EEG-based motor imagery (MI) classification.
The proposed CNN model, namely EEG-Inception, is built on the backbone of the Inception-Time network.
The proposed network performs end-to-end classification, as it takes raw EEG signals as input and does not require complex EEG signal preprocessing.
arXiv Detail & Related papers (2021-01-24T19:03:10Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed papers (including all information) and is not responsible for any consequences.