Federated Motor Imagery Classification for Privacy-Preserving Brain-Computer Interfaces
- URL: http://arxiv.org/abs/2412.01079v1
- Date: Mon, 02 Dec 2024 03:35:27 GMT
- Title: Federated Motor Imagery Classification for Privacy-Preserving Brain-Computer Interfaces
- Authors: Tianwang Jia, Lubin Meng, Siyang Li, Jiajing Liu, Dongrui Wu
- Abstract summary: This paper proposes Federated classification with local Batch-specific batch normalization and Sharpness-aware minimization (FedBS).
FedBS protects user EEG data privacy, enabling multiple BCI users to participate in large-scale machine learning model training.
- Score: 14.251784648413276
- Abstract: Training an accurate classifier for an EEG-based brain-computer interface (BCI) requires EEG data from a large number of users, whereas protecting their data privacy is a critical consideration. Federated learning (FL) is a promising solution to this challenge. This paper proposes Federated classification with local Batch-specific batch normalization and Sharpness-aware minimization (FedBS) for privacy protection in EEG-based motor imagery (MI) classification. FedBS utilizes local batch-specific batch normalization to reduce data discrepancies among different clients, and a sharpness-aware minimization optimizer in local training to improve model generalization. Experiments on three public MI datasets using three popular deep learning models demonstrated that FedBS outperformed six state-of-the-art FL approaches. Remarkably, it also outperformed centralized training, which does not consider privacy protection at all. In summary, FedBS protects user EEG data privacy, enabling multiple BCI users to participate in large-scale machine learning model training, which in turn improves the BCI decoding accuracy.
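The abstract's two ingredients lend themselves to a short sketch. The following PyTorch code is a minimal illustration under assumed names (`aggregate_except_bn`, `sam_step`), not the authors' implementation: the server averages all weights except batch-normalization (BN) parameters, which stay client-local, and each client trains with sharpness-aware minimization (SAM) steps.

```python
# A minimal sketch (not the authors' implementation) of the two ideas FedBS
# combines: FedAvg-style aggregation that keeps BN parameters local to each
# client, and a SAM update for local training.
import torch


def aggregate_except_bn(global_model, client_models):
    """Average client weights but skip BN parameters and statistics,
    so each client keeps its own batch-specific normalization."""
    state = global_model.state_dict()
    for name in state:
        # Crude name-based BN filter, sufficient for this sketch.
        if "bn" in name or "running_" in name or "num_batches_tracked" in name:
            continue
        state[name] = torch.stack(
            [m.state_dict()[name].float() for m in client_models]
        ).mean(dim=0)
    global_model.load_state_dict(state)


def sam_step(model, loss_fn, x, y, lr=0.01, rho=0.05):
    """One SAM update: perturb weights toward higher loss, then descend
    using the gradient computed at the perturbed point."""
    params = [p for p in model.parameters() if p.requires_grad]

    loss = loss_fn(model(x), y)
    grads = torch.autograd.grad(loss, params)
    grad_norm = torch.norm(torch.stack([g.norm() for g in grads])) + 1e-12

    eps = [rho * g / grad_norm for g in grads]      # ascent direction
    with torch.no_grad():
        for p, e in zip(params, eps):
            p.add_(e)                               # climb to the "sharp" point

    grads_sharp = torch.autograd.grad(loss_fn(model(x), y), params)
    with torch.no_grad():
        for p, e, g in zip(params, eps, grads_sharp):
            p.sub_(e)                               # undo the perturbation
            p.sub_(lr * g)                          # descend with sharp gradient
    return loss.item()
```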
Related papers
- Protecting Multiple Types of Privacy Simultaneously in EEG-based Brain-Computer Interfaces [17.24882553037956]
A brain-computer interface (BCI) enables direct communication between the brain and an external device.
EEG is the preferred input signal in non-invasive BCIs, due to its convenience and low cost.
EEG signals inherently carry rich personal information, necessitating privacy protection.
arXiv Detail & Related papers (2024-11-29T06:33:31Z)
- Privacy-Preserving Federated Learning with Consistency via Knowledge Distillation Using Conditional Generator [19.00239208095762]
Federated Learning (FL) is gaining popularity as a distributed learning framework that only shares model parameters or updates and keeps private data locally.
We propose FedMD-CG, a novel FL method that combines highly competitive performance with strong privacy preservation (see the generator-based distillation sketch after this list).
Extensive experiments on various image classification tasks validate the superiority of FedMD-CG.
arXiv Detail & Related papers (2024-09-11T02:36:36Z)
- FewFedPIT: Towards Privacy-preserving and Few-shot Federated Instruction Tuning [54.26614091429253]
Federated instruction tuning (FedIT) is a promising solution that consolidates collaborative training across multiple data owners.
However, FedIT encounters limitations such as the scarcity of instruction data and the risk of training-data extraction attacks.
We propose FewFedPIT, designed to simultaneously enhance privacy protection and model performance of federated few-shot learning.
arXiv Detail & Related papers (2024-03-10T08:41:22Z)
- Binary Federated Learning with Client-Level Differential Privacy [7.854806519515342]
Federated learning (FL) is a privacy-preserving collaborative learning framework.
Existing FL systems typically adopt Federated Averaging (FedAvg) as the training algorithm.
We propose a communication-efficient FL training algorithm with a differential privacy guarantee; a generic clip-and-noise sketch appears after this list.
arXiv Detail & Related papers (2023-08-07T06:07:04Z)
- Can Public Large Language Models Help Private Cross-device Federated Learning? [58.05449579773249]
We study (differentially) private federated learning (FL) of language models.
Public data has been used to improve privacy-utility trade-offs for both large and small language models.
We propose a novel distribution matching algorithm with theoretical grounding to sample public data close to the private data distribution; a simplified mean-matching sketch appears after this list.
arXiv Detail & Related papers (2023-05-20T07:55:58Z)
- FedML-HE: An Efficient Homomorphic-Encryption-Based Privacy-Preserving Federated Learning System [24.39699808493429]
Federated Learning trains machine learning models on distributed devices by aggregating local model updates instead of local data.
Privacy concerns arise because the aggregated local models on the server may reveal sensitive personal information through inversion attacks.
We present FedML-HE, the first practical federated learning system with efficient HE-based secure model aggregation; a minimal Paillier-based sketch appears after this list.
arXiv Detail & Related papers (2023-03-20T02:44:35Z)
- FedDBL: Communication and Data Efficient Federated Deep-Broad Learning for Histopathological Tissue Classification [65.7405397206767]
We propose Federated Deep-Broad Learning (FedDBL) to achieve superior classification performance with limited training samples and a single round of communication.
FedDBL greatly outperforms competitors under this one-round, limited-sample setting, and even achieves performance comparable to approaches that use multiple communication rounds.
Since neither data nor deep models are shared across clients, privacy is preserved and there is no risk of model inversion attacks.
arXiv Detail & Related papers (2023-02-24T14:27:41Z)
- FedPDC: Federated Learning for Public Dataset Correction [1.5533842336139065]
Federated learning achieves lower classification accuracy than traditional machine learning in non-IID scenarios.
A new algorithm, FedPDC, is proposed to optimize the aggregation of local models and the loss function used in local training.
In many benchmark experiments, FedPDC effectively improves the accuracy of the global model under extremely unbalanced data distributions.
arXiv Detail & Related papers (2023-02-24T08:09:23Z)
- Mixed Differential Privacy in Computer Vision [133.68363478737058]
AdaMix is an adaptive differentially private algorithm for training deep neural network classifiers using both private and public image data.
A few-shot or even zero-shot learning baseline that ignores private data can outperform fine-tuning on a large private dataset.
arXiv Detail & Related papers (2022-03-22T06:15:43Z)
- Acceleration of Federated Learning with Alleviated Forgetting in Local Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm that accelerates FL by alleviating knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z)
- Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z)
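Sketch for the FedMD-CG entry above: an illustrative stand-in for knowledge distillation from a conditional generator, in which class-conditioned pseudo-features replace raw data and a student matches the teacher's soft predictions. The architecture and losses here are assumptions for illustration, not the paper's actual method.

```python
# Illustrative stand-in (not FedMD-CG's actual architecture or losses) for
# distillation from a conditional generator: no raw data is ever shared.
import torch
import torch.nn as nn
import torch.nn.functional as F

NUM_CLASSES, NOISE_DIM, FEAT_DIM = 10, 32, 64

generator = nn.Sequential(nn.Linear(NOISE_DIM + NUM_CLASSES, 128),
                          nn.ReLU(), nn.Linear(128, FEAT_DIM))
teacher = nn.Linear(FEAT_DIM, NUM_CLASSES)  # stand-in for a global model head
student = nn.Linear(FEAT_DIM, NUM_CLASSES)  # stand-in for a local model head
opt = torch.optim.Adam(student.parameters(), lr=1e-3)

for _ in range(100):
    y = torch.randint(0, NUM_CLASSES, (64,))
    z = torch.randn(64, NOISE_DIM)
    cond = torch.cat([z, F.one_hot(y, NUM_CLASSES).float()], dim=1)
    pseudo = generator(cond).detach()  # class-conditioned synthetic features
    with torch.no_grad():
        soft_targets = F.softmax(teacher(pseudo), dim=1)
    loss = F.kl_div(F.log_softmax(student(pseudo), dim=1),
                    soft_targets, reduction="batchmean")
    opt.zero_grad()
    loss.backward()
    opt.step()
```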
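Sketch for the client-level differential privacy entry above: the standard clip-then-noise recipe with an optional sign-based binarization for 1-bit communication. This is a generic illustration; the paper's exact mechanism and its privacy accounting are not reproduced here.

```python
# Generic client-level DP sketch (not the paper's algorithm): clip each
# client's update, sum, add calibrated Gaussian noise, then average.
import numpy as np

rng = np.random.default_rng(0)


def clip_update(update, clip_norm=1.0):
    """Bound each client's influence by clipping the update's L2 norm."""
    norm = np.linalg.norm(update)
    return update * min(1.0, clip_norm / (norm + 1e-12))


def binarize(update):
    """1-bit compression: transmit signs scaled by the mean magnitude."""
    return np.sign(update) * np.abs(update).mean()


def dp_aggregate(client_updates, clip_norm=1.0, noise_multiplier=1.0):
    """Sum clipped updates, add Gaussian noise scaled to the clip norm."""
    clipped = [clip_update(u, clip_norm) for u in client_updates]
    total = np.sum(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm, size=total.shape)
    return (total + noise) / len(client_updates)


updates = [binarize(rng.normal(size=1000)) for _ in range(10)]
print(dp_aggregate(updates).shape)  # (1000,)
```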
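Sketch for the distribution-matching entry above: a deliberately simple mean-matching heuristic that keeps the public examples whose embeddings lie nearest the mean embedding of the private data. `embed` is a placeholder for any feature extractor; the paper's theoretically grounded algorithm is more sophisticated.

```python
# Simplified mean-matching heuristic (not the paper's algorithm) for
# selecting public data close to a private data distribution.
import numpy as np

rng = np.random.default_rng(0)


def embed(x):
    return x  # placeholder: in practice, features from a pretrained encoder


private = rng.normal(loc=1.0, size=(200, 16))   # private client data (toy)
public = rng.normal(loc=0.0, size=(5000, 16))   # large public pool (toy)

private_mean = embed(private).mean(axis=0)
dists = np.linalg.norm(embed(public) - private_mean, axis=1)
selected = public[np.argsort(dists)[:500]]      # nearest public samples
print(selected.shape)  # (500, 16)
```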
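Sketch for the FedML-HE entry above: additively homomorphic secure aggregation with the python-paillier library (`pip install phe`). Clients encrypt their updates, the server sums ciphertexts without seeing any plaintext, and only the key holder decrypts the aggregate. It shows the principle only, not the system's efficiency optimizations.

```python
# Minimal additive-HE aggregation sketch in the spirit of FedML-HE (not the
# system's actual implementation), using python-paillier.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair(n_length=1024)

# Toy 4-dimensional model updates from three clients.
client_updates = [[0.10, -0.20, 0.05, 0.30],
                  [0.00, 0.10, -0.10, 0.20],
                  [0.20, 0.00, 0.15, -0.10]]

# Each client encrypts its update element-wise before upload.
encrypted = [[public_key.encrypt(w) for w in u] for u in client_updates]

# The server adds ciphertexts coordinate-wise; plaintexts never leave clients.
agg = encrypted[0]
for enc in encrypted[1:]:
    agg = [a + b for a, b in zip(agg, enc)]

# The key holder decrypts the sum and averages it.
avg = [private_key.decrypt(c) / len(client_updates) for c in agg]
print([round(v, 3) for v in avg])  # [0.1, -0.033, 0.033, 0.133]
```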
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.