Data-Efficient Sleep Staging with Synthetic Time Series Pretraining
- URL: http://arxiv.org/abs/2403.08592v1
- Date: Wed, 13 Mar 2024 14:57:10 GMT
- Title: Data-Efficient Sleep Staging with Synthetic Time Series Pretraining
- Authors: Niklas Grieger, Siamak Mehrkanoon, Stephan Bialonski
- Abstract summary: We propose a pretraining task termed "frequency pretraining" to pretrain a neural network for sleep staging.
Our experiments demonstrate that our method surpasses fully supervised learning in scenarios with limited data and few subjects.
- Score: 1.642094639107215
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Analyzing electroencephalographic (EEG) time series can be challenging,
especially with deep neural networks, due to the large variability among human
subjects and often small datasets. To address these challenges, various
strategies, such as self-supervised learning, have been suggested, but they
typically rely on extensive empirical datasets. Inspired by recent advances in
computer vision, we propose a pretraining task termed "frequency pretraining"
to pretrain a neural network for sleep staging by predicting the frequency
content of randomly generated synthetic time series. Our experiments
demonstrate that our method surpasses fully supervised learning in scenarios
with limited data and few subjects, and matches its performance in regimes with
many subjects. Furthermore, our results underline the relevance of frequency
information for sleep stage scoring, while also demonstrating that deep neural
networks utilize information beyond frequencies to enhance sleep staging
performance, which is consistent with previous research. We anticipate that our
approach will be advantageous across a broad spectrum of applications where EEG
data is limited or derived from a small number of subjects, including the
domain of brain-computer interfaces.
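The abstract describes the pretraining task concretely enough to sketch: generate synthetic time series with known frequency content, train a network to predict which frequency bins are present, then reuse the learned features for sleep staging. Below is a minimal, hedged PyTorch sketch of that idea; the sampling rate, epoch length, bin layout, network architecture, and training schedule are all illustrative assumptions, not the authors' configuration.

```python
# Hedged sketch of "frequency pretraining": pretrain on randomly generated
# synthetic time series labeled with their frequency content, then swap the
# head for sleep staging. All constants and layers below are assumptions.
import numpy as np
import torch
import torch.nn as nn

FS = 100          # assumed sampling rate (Hz)
N_SAMPLES = 3000  # assumed 30 s epochs, as is common in sleep staging
N_BINS = 20       # assumed number of frequency bins to predict

def make_synthetic_batch(batch_size: int):
    """Generate random sinusoid mixtures and multi-hot labels of active bins."""
    x = np.zeros((batch_size, N_SAMPLES), dtype=np.float32)
    y = np.zeros((batch_size, N_BINS), dtype=np.float32)
    t = np.arange(N_SAMPLES) / FS
    bin_edges = np.linspace(0.5, FS / 2, N_BINS + 1)  # assumed 0.5 Hz to Nyquist
    for i in range(batch_size):
        active = np.random.rand(N_BINS) < 0.3  # each bin active with prob. 0.3
        y[i] = active
        for b in np.flatnonzero(active):
            freq = np.random.uniform(bin_edges[b], bin_edges[b + 1])
            phase = np.random.uniform(0, 2 * np.pi)
            amp = np.random.uniform(0.5, 1.5)
            x[i] += amp * np.sin(2 * np.pi * freq * t + phase)
        x[i] += 0.1 * np.random.randn(N_SAMPLES)  # additive noise
    return torch.from_numpy(x).unsqueeze(1), torch.from_numpy(y)

encoder = nn.Sequential(  # small 1D CNN feature extractor (illustrative)
    nn.Conv1d(1, 16, kernel_size=9, stride=2), nn.ReLU(),
    nn.Conv1d(16, 32, kernel_size=9, stride=2), nn.ReLU(),
    nn.AdaptiveAvgPool1d(1), nn.Flatten(),
)
pretrain_head = nn.Linear(32, N_BINS)  # multi-label frequency prediction
model = nn.Sequential(encoder, pretrain_head)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.BCEWithLogitsLoss()       # one sigmoid per frequency bin

for step in range(1000):               # pretraining uses synthetic data only
    xb, yb = make_synthetic_batch(64)
    opt.zero_grad()
    loss = loss_fn(model(xb), yb)
    loss.backward()
    opt.step()

# Fine-tuning: replace the head with a 5-class sleep-stage classifier
# (W, N1, N2, N3, REM) and continue training on the small labeled EEG set.
sleep_head = nn.Linear(32, 5)
stager = nn.Sequential(encoder, sleep_head)
```

Because the pretraining data is generated on the fly, labeled EEG is needed only for the final fine-tuning step, which is what makes the approach attractive in the limited-data regimes the abstract targets.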
Related papers
- A frugal Spiking Neural Network for unsupervised classification of continuous multivariate temporal data [0.0]
Spiking Neural Networks (SNNs) are neuromorphic and use more biologically plausible neurons with evolving membrane potentials.
We introduce here a frugal single-layer SNN designed for fully unsupervised identification and classification of multivariate temporal patterns in continuous data.
arXiv Detail & Related papers (2024-08-08T08:15:51Z)
- NeuroNet: A Novel Hybrid Self-Supervised Learning Framework for Sleep Stage Classification Using Single-Channel EEG [2.3310092106321365]
Sleep stage classification is a pivotal aspect of diagnosing sleep disorders and evaluating sleep quality.
Recent advancements in deep learning have substantially propelled the automation of sleep stage classification.
This paper introduces NeuroNet, a self-supervised learning framework designed to harness unlabeled single-channel sleep electroencephalogram (EEG) signals.
arXiv Detail & Related papers (2024-04-10T18:32:22Z)
- Efficient and Effective Time-Series Forecasting with Spiking Neural Networks [47.371024581669516]
Spiking neural networks (SNNs) provide a unique pathway for capturing the intricacies of temporal data.
Applying SNNs to time-series forecasting is challenging due to difficulties in effective temporal alignment, complexities in encoding processes, and the absence of standardized guidelines for model selection.
We propose a framework for SNNs in time-series forecasting tasks, leveraging the efficiency of spiking neurons in processing temporal information.
arXiv Detail & Related papers (2024-02-02T16:23:50Z)
- Time Scale Network: A Shallow Neural Network For Time Series Data [18.46091267922322]
Time series data is often composed of information at multiple time scales.
Deep learning strategies exist to capture this information, but many make networks larger, require more data, demand more computation, and are difficult to interpret.
We present a minimal, computationally efficient Time Scale Network combining the translation and dilation sequence used in discrete wavelet transforms with traditional convolutional neural networks and back-propagation.
arXiv Detail & Related papers (2023-11-10T16:39:55Z)
- How neural networks learn to classify chaotic time series [77.34726150561087]
We study the inner workings of neural networks trained to classify regular-versus-chaotic time series.
We find that the relation between input periodicity and activation periodicity is key for the performance of large-kernel convolutional neural network (LKCNN) models.
arXiv Detail & Related papers (2023-06-04T08:53:27Z)
- Online Evolutionary Neural Architecture Search for Multivariate Non-Stationary Time Series Forecasting [72.89994745876086]
This work presents the Online Neuro-Evolution-based Neural Architecture Search (ONE-NAS) algorithm.
ONE-NAS is a novel neural architecture search method capable of automatically designing and dynamically training recurrent neural networks (RNNs) for online forecasting tasks.
Results demonstrate that ONE-NAS outperforms traditional statistical time series forecasting methods.
arXiv Detail & Related papers (2023-02-20T22:25:47Z)
- Dissecting U-net for Seismic Application: An In-Depth Study on Deep Learning Multiple Removal [3.058685580689605]
Seismic processing often requires suppressing multiples that appear when collecting data.
We present a deep learning-based alternative that provides competitive results while reducing the complexity of its use.
arXiv Detail & Related papers (2022-06-24T07:16:27Z)
- Towards an Automatic Analysis of CHO-K1 Suspension Growth in Microfluidic Single-cell Cultivation [63.94623495501023]
We propose a novel Machine Learning architecture, which allows us to infuse a deep neural network with human-powered abstraction at the level of data.
Specifically, we train a generative model simultaneously on natural and synthetic data, so that it learns a shared representation, from which a target variable, such as the cell count, can be reliably estimated.
arXiv Detail & Related papers (2020-10-20T08:36:51Z)
- Deep learning for time series classification [2.0305676256390934]
Time series analysis allows us to visualize and understand the evolution of a process over time.
Time series classification consists of constructing algorithms dedicated to automatically labeling time series data.
Deep learning has emerged as one of the most effective methods for tackling the supervised classification task.
arXiv Detail & Related papers (2020-10-01T17:38:40Z) - Uncovering the structure of clinical EEG signals with self-supervised
learning [64.4754948595556]
Supervised learning paradigms are often limited by the amount of labeled data that is available.
This phenomenon is particularly problematic in clinically relevant data, such as electroencephalography (EEG).
By extracting information from unlabeled data, it might be possible to reach competitive performance with deep neural networks.
arXiv Detail & Related papers (2020-07-31T14:34:47Z) - Rectified Linear Postsynaptic Potential Function for Backpropagation in
Deep Spiking Neural Networks [55.0627904986664]
Spiking Neural Networks (SNNs) use temporal spike patterns to represent and transmit information, which is not only biologically realistic but also suitable for ultra-low-power event-driven neuromorphic implementation.
This paper investigates the contribution of spike timing dynamics to information encoding, synaptic plasticity, and decision making, providing a new perspective on the design of future deep SNNs and neuromorphic hardware systems.
arXiv Detail & Related papers (2020-03-26T11:13:07Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.