SSVEP-DAN: A Data Alignment Network for SSVEP-based Brain-Computer
Interfaces
- URL: http://arxiv.org/abs/2311.12666v1
- Date: Tue, 21 Nov 2023 15:18:29 GMT
- Title: SSVEP-DAN: A Data Alignment Network for SSVEP-based Brain-Computer
Interfaces
- Authors: Sung-Yu Chen, Chi-Min Chang, Kuan-Jung Chiang, Chun-Shu Wei
- Abstract summary: Steady-state visual-evoked potential (SSVEP)-based brain-computer interfaces (BCIs) offer a non-invasive means of communication through high-speed speller systems.
We present SSVEP-DAN, the first dedicated neural network model designed for aligning SSVEP data across different domains.
- Score: 2.1192321523349404
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Steady-state visual-evoked potential (SSVEP)-based brain-computer interfaces
(BCIs) offer a non-invasive means of communication through high-speed speller
systems. However, their efficiency heavily relies on individual training data
obtained during time-consuming calibration sessions. To address the challenge
of data insufficiency in SSVEP-based BCIs, we present SSVEP-DAN, the first
dedicated neural network model designed for aligning SSVEP data across
different domains, which can encompass various sessions, subjects, or devices.
Our experimental results across multiple cross-domain scenarios demonstrate
SSVEP-DAN's capability to transform existing source SSVEP data into
supplementary calibration data, significantly enhancing SSVEP decoding accuracy
in scenarios with limited calibration data. We envision SSVEP-DAN as a catalyst
for practical SSVEP-based BCI applications with minimal calibration. The source
code for this work is available at: https://github.com/CECNL/SSVEP-DAN.
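As a minimal sketch of the alignment idea (the authors' released implementation is at the repository above; the shapes, the purely spatial transform, and the template-matching loss below are illustrative assumptions, not the paper's architecture), a channel-mixing transform can be fit so that source-domain trials approximate target-domain class templates, and the transformed trials then serve as supplementary calibration data:

```python
# Hypothetical sketch of cross-domain SSVEP alignment; NOT the official
# SSVEP-DAN code (see https://github.com/CECNL/SSVEP-DAN for that).
# Assumed shapes: trials are (n_trials, n_channels, n_samples), and
# target_templates holds one same-shaped template per source trial.
import torch
import torch.nn as nn

class AlignNet(nn.Module):
    """Learned spatial (channel-mixing) map from source to target domain."""
    def __init__(self, n_channels: int):
        super().__init__()
        self.spatial = nn.Linear(n_channels, n_channels)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Mix channels at every time step: (batch, channels, samples)
        return self.spatial(x.transpose(1, 2)).transpose(1, 2)

def align(source: torch.Tensor, target_templates: torch.Tensor,
          epochs: int = 200, lr: float = 1e-3) -> torch.Tensor:
    net = AlignNet(source.shape[1])
    opt = torch.optim.Adam(net.parameters(), lr=lr)
    for _ in range(epochs):
        opt.zero_grad()
        loss = nn.functional.mse_loss(net(source), target_templates)
        loss.backward()
        opt.step()
    return net(source).detach()  # supplementary calibration trials
```

The aligned trials would then be pooled with whatever genuine target-domain calibration trials exist to train the downstream SSVEP decoder.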
Related papers
- On Pretraining Data Diversity for Self-Supervised Learning [57.91495006862553]
We explore the impact of training with more diverse datasets on the performance of self-supervised learning (SSL) under a fixed computational budget.
Our findings consistently demonstrate that increasing pretraining data diversity enhances SSL performance, albeit only when the distribution distance to the downstream data is minimal.
arXiv Detail & Related papers (2024-03-20T17:59:58Z)
- Convolutional Monge Mapping Normalization for learning on sleep data [63.22081662149488]
We propose a new method called Convolutional Monge Mapping Normalization (CMMN).
CMMN consists of filtering the signals to adapt their power spectral density (PSD) to a Wasserstein barycenter estimated on training data.
Numerical experiments on sleep EEG data show that CMMN leads to significant and consistent performance gains, independent of the neural network architecture.
arXiv Detail & Related papers (2023-05-30T08:24:01Z)
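A rough illustration of the mapping step summarized above, assuming the closed form for one-dimensional Wasserstein-2 barycenters (the barycenter spectrum is the squared mean of the square-rooted spectra) and a simplified zero-phase FFT-domain filter; none of this is the authors' code:

```python
# Hypothetical sketch of CMMN-style spectral normalization.
import numpy as np
from scipy.signal import welch

def barycenter_psd(psds: np.ndarray) -> np.ndarray:
    # psds: (n_domains, n_freqs); the W2 barycenter acts on sqrt-PSDs
    return np.mean(np.sqrt(psds), axis=0) ** 2

def map_to_barycenter(sig: np.ndarray, fs: float, bary: np.ndarray,
                      nperseg: int = 256) -> np.ndarray:
    """Filter one signal so its PSD moves onto the barycenter PSD."""
    f, psd = welch(sig, fs=fs, nperseg=nperseg)
    gain = np.sqrt(bary / np.maximum(psd, 1e-12))  # magnitude response
    freqs = np.fft.rfftfreq(sig.size, d=1.0 / fs)
    # Zero-phase application in the frequency domain (illustrative only)
    return np.fft.irfft(np.fft.rfft(sig) * np.interp(freqs, f, gain),
                        n=sig.size)
```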
- A Transformer-based deep neural network model for SSVEP classification [18.766260137886054]
We propose a deep learning model for SSVEP classification based on the Transformer architecture in an inter-subject classification scenario.
Inspired by previous studies, the model adopts the frequency spectrum of SSVEP data as input and explores spectral- and spatial-domain information for classification.
The proposed models achieve better classification accuracy and information transfer rate than baseline methods.
arXiv Detail & Related papers (2022-10-09T05:28:35Z)
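A small sketch of the spectrum-as-input step described above; the band limits and shapes are assumptions, and the Transformer classifier itself is omitted:

```python
# Hypothetical preprocessing: represent each SSVEP trial by its
# per-channel magnitude spectrum before a Transformer-style classifier.
import numpy as np

def spectrum_features(trial: np.ndarray, fs: float,
                      fmin: float = 4.0, fmax: float = 64.0) -> np.ndarray:
    """trial: (channels, samples) -> (channels, selected frequency bins)."""
    spec = np.abs(np.fft.rfft(trial, axis=-1))
    freqs = np.fft.rfftfreq(trial.shape[-1], d=1.0 / fs)
    band = (freqs >= fmin) & (freqs <= fmax)  # keep SSVEP-relevant band
    return spec[:, band]
```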
- Dynamic Network-Assisted D2D-Aided Coded Distributed Learning [59.29409589861241]
We propose a novel device-to-device (D2D)-aided coded federated learning method (D2D-CFL) for load balancing across devices.
We derive an optimal compression rate for achieving minimum processing time and establish its connection with the convergence time.
Our proposed method is beneficial for real-time collaborative applications, where the users continuously generate training data.
arXiv Detail & Related papers (2021-11-26T18:44:59Z)
- NeuralDP: Differentially private neural networks by design [61.675604648670095]
We propose NeuralDP, a technique for privatising activations of some layer within a neural network.
We experimentally demonstrate on two datasets that our method offers substantially improved privacy-utility trade-offs compared to DP-SGD.
arXiv Detail & Related papers (2021-07-30T12:40:19Z)
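The entry above describes privatising a layer's activations; a generic clip-and-add-noise sketch of that mechanism follows. The clipping rule and noise scale are placeholder assumptions and do not reproduce NeuralDP's actual privacy accounting:

```python
# Hypothetical activation privatisation: bound each sample's influence
# by norm clipping, then add Gaussian noise scaled to that bound.
import torch

def privatise_activations(a: torch.Tensor, clip: float = 1.0,
                          sigma: float = 0.5) -> torch.Tensor:
    # a: (batch, ...) activations of some layer
    norms = a.flatten(1).norm(dim=1).clamp(min=1e-12)
    scale = (clip / norms).clamp(max=1.0).view(-1, *([1] * (a.dim() - 1)))
    clipped = a * scale                    # per-sample norm <= clip
    return clipped + torch.randn_like(clipped) * sigma * clip
```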
- Sensitivity analysis in differentially private machine learning using hybrid automatic differentiation [54.88777449903538]
We introduce a novel hybrid automatic differentiation (AD) system for sensitivity analysis.
This enables modelling the sensitivity of arbitrary differentiable function compositions, such as the training of neural networks on private data.
Our approach enables principled reasoning about privacy loss in the setting of data processing.
arXiv Detail & Related papers (2021-07-09T07:19:23Z)
- Boosting Template-based SSVEP Decoding by Cross-domain Transfer Learning [2.454595178503407]
We enhance state-of-the-art template-based SSVEP decoding by incorporating least-squares transformation (LST)-based transfer learning.
Study results verified the efficacy of LST in mitigating the variability of SSVEPs when transferring existing data across domains.
arXiv Detail & Related papers (2021-02-10T00:14:06Z)
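The LST mentioned above reduces to an ordinary least-squares fit of a spatial transform between domains; a minimal sketch (shapes are assumptions: templates are channels-by-samples arrays):

```python
# Hypothetical least-squares transformation (LST) for SSVEP transfer:
# find P minimizing ||Y - P X||_F, then reuse P on new source trials.
import numpy as np

def lst(X: np.ndarray, Y: np.ndarray) -> np.ndarray:
    """Return P such that P @ X approximates Y in the least-squares sense."""
    # Solve X.T @ P.T = Y.T column by column
    P_T, *_ = np.linalg.lstsq(X.T, Y.T, rcond=None)
    return P_T.T

# Usage: P = lst(source_template, target_template)
#        aligned_trial = P @ source_trial
```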
- A Deep Neural Network for SSVEP-based Brain-Computer Interfaces [3.0595138995552746]
Target identification in brain-computer interface (BCI) spellers refers to electroencephalogram (EEG) classification for predicting the target character that the subject intends to spell.
In this setting, we address target identification and propose a novel deep neural network (DNN) architecture.
The proposed DNN processes multi-channel SSVEP signals with convolutions across the sub-bands of harmonics, channels, and time, and classifies at the fully connected layer.
arXiv Detail & Related papers (2020-11-17T11:11:19Z)
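The architecture described above (convolutions staged across harmonic sub-bands, channels, and time, then a fully connected classifier) can be sketched as follows; filter counts and kernel sizes are illustrative guesses, not the paper's exact hyperparameters:

```python
# Hypothetical staged-convolution SSVEP classifier.
import torch
import torch.nn as nn

class SSVEPNet(nn.Module):
    def __init__(self, n_bands=3, n_channels=9, n_samples=250, n_classes=40):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(n_bands, 1, kernel_size=1),            # combine sub-bands
            nn.Conv2d(1, 120, kernel_size=(n_channels, 1)),  # combine channels
            nn.Conv2d(120, 120, kernel_size=(1, 2), stride=(1, 2)),  # time
            nn.ReLU(),
            nn.Flatten(),
        )
        self.classify = nn.LazyLinear(n_classes)  # fully connected output

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, n_bands, n_channels, n_samples)
        return self.classify(self.features(x))
```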
- Transfer Learning and SpecAugment applied to SSVEP Based BCI Classification [1.9336815376402716]
We use deep convolutional neural networks (DCNNs) to classify EEG signals in a single-channel brain-computer interface (BCI).
EEG signals were converted to spectrograms and served as input to train DCNNs using the transfer learning technique.
arXiv Detail & Related papers (2020-10-08T00:30:12Z)
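The spectrogram conversion mentioned above can be sketched as follows; the window and overlap values are illustrative assumptions:

```python
# Hypothetical EEG-to-spectrogram preprocessing for a DCNN.
import numpy as np
from scipy.signal import spectrogram

def eeg_to_spectrogram(sig: np.ndarray, fs: float,
                       nperseg: int = 128, noverlap: int = 112) -> np.ndarray:
    """sig: (samples,) single-channel EEG -> (freq_bins, time_bins) image."""
    _, _, Sxx = spectrogram(sig, fs=fs, nperseg=nperseg, noverlap=noverlap)
    return 10.0 * np.log10(Sxx + 1e-12)  # log-power, image-like input
```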
- A Principled Approach to Data Valuation for Federated Learning [73.19984041333599]
Federated learning (FL) is a popular technique to train machine learning (ML) models on decentralized data sources.
The Shapley value (SV) defines a unique payoff scheme that satisfies many desiderata for a data value notion.
This paper proposes a variant of the SV amenable to FL, which we call the federated Shapley value.
arXiv Detail & Related papers (2020-09-14T04:37:54Z)
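As a toy sketch of the Shapley computation underlying the federated variant above: within a round, each participant's value is its average marginal contribution over orderings of that round's participants. Exact enumeration is only feasible for a handful of clients, and the utility function here is an assumed placeholder:

```python
# Hypothetical exact Shapley value for one FL round.
import itertools

def round_shapley(clients: list, utility) -> dict:
    """utility: maps a tuple of client ids to a model-quality score."""
    perms = list(itertools.permutations(clients))
    value = {c: 0.0 for c in clients}
    for perm in perms:
        coalition = []
        for c in perm:
            before = utility(tuple(coalition))
            coalition.append(c)
            value[c] += (utility(tuple(coalition)) - before) / len(perms)
    return value  # the federated SV sums these contributions over rounds
```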
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the content (including all information) and is not responsible for any consequences.