WebFed: Cross-platform Federated Learning Framework Based on Web Browser with Local Differential Privacy
- URL: http://arxiv.org/abs/2110.11646v1
- Date: Fri, 22 Oct 2021 08:18:41 GMT
- Title: WebFed: Cross-platform Federated Learning Framework Based on Web Browser with Local Differential Privacy
- Authors: Zhuotao Lian, Qinglin Yang, Qingkui Zeng, Chunhua Su
- Abstract summary: WebFed is a novel browser-based federated learning framework that takes advantage of the browser's features.
We conduct experiments on heterogeneous devices to evaluate the performance of the proposed WebFed framework.
- Score: 0.7837881800517109
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Owing to isolated data islands and privacy concerns, federated
learning has attracted extensive interest, since it allows clients to
collaborate on training a global model using their local data without sharing
any of it with a third party. However, existing federated learning frameworks
typically require sophisticated environment configurations (e.g., driver setup
for discrete GPUs such as NVIDIA cards, compilation environments), which brings
considerable inconvenience to large-scale development and deployment. To
facilitate the deployment of federated learning and the implementation of
related applications, we propose WebFed, a novel browser-based federated
learning framework that takes advantage of the browser's features (e.g.,
cross-platform support, JavaScript programming) and enhances privacy protection
via a local differential privacy mechanism. Finally, we conduct experiments on
heterogeneous devices to evaluate the performance of the proposed WebFed
framework.
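The local differential privacy step in the abstract lends itself to a concrete illustration. Below is a minimal TypeScript sketch (chosen to match the paper's browser/JavaScript setting) of what a client might do before uploading its local weights: perturb each parameter with Laplace noise calibrated to a privacy budget epsilon. The Laplace mechanism, the sensitivity value, the upload endpoint, and all function names here are illustrative assumptions, not WebFed's actual implementation.

```typescript
// Hypothetical sketch of a browser-side LDP step in a WebFed-style client.
// The endpoint, sensitivity, and names are assumptions for illustration.

// Sample Laplace(0, scale) noise via the inverse-CDF method.
function laplaceNoise(scale: number): number {
  const u = Math.random() - 0.5; // uniform in [-0.5, 0.5)
  return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
}

// Perturb a flat weight vector before it leaves the browser, so the server
// only ever sees a locally differentially private view of the update.
function perturbWeights(
  weights: Float32Array,
  epsilon: number,
  sensitivity: number
): Float32Array {
  const scale = sensitivity / epsilon; // Laplace mechanism: b = Δf / ε
  const noisy = new Float32Array(weights.length);
  for (let i = 0; i < weights.length; i++) {
    noisy[i] = weights[i] + laplaceNoise(scale);
  }
  return noisy;
}

// One federated round from the client's perspective: add LDP noise to the
// locally trained weights, then ship them to an (assumed) aggregation server.
async function uploadLocalUpdate(
  serverUrl: string,
  localWeights: Float32Array,
  epsilon: number
): Promise<void> {
  const noisy = perturbWeights(localWeights, epsilon, /* sensitivity */ 1.0);
  await fetch(serverUrl, {
    method: "POST",
    headers: { "Content-Type": "application/octet-stream" },
    body: noisy, // typed arrays are valid fetch bodies in browsers
  });
}
```

Smaller epsilon means more noise and stronger privacy; the sensitivity of 1.0 assumes weights are clipped to a bounded range before perturbation.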
Related papers
- FedP3: Federated Personalized and Privacy-friendly Network Pruning under Model Heterogeneity [82.5448598805968]
We present an effective and adaptable federated framework FedP3, representing Federated Personalized and Privacy-friendly network Pruning.
We offer a theoretical interpretation of FedP3 and its locally differential-private variant, DP-FedP3, and theoretically validate their efficiencies.
arXiv Detail & Related papers (2024-04-15T14:14:05Z)
- RAIFLE: Reconstruction Attacks on Interaction-based Federated Learning with Adversarial Data Manipulation [14.394939014120451]
We show that users face an elevated risk of having their private interactions reconstructed by the central server.
We introduce RAIFLE, a novel optimization-based attack framework.
Our experiments with federated recommendation and online learning-to-rank scenarios demonstrate that RAIFLE is significantly more powerful than existing reconstruction attacks.
arXiv Detail & Related papers (2023-10-29T21:47:24Z)
- Serverless Federated Learning with flwr-serverless [0.0]
We introduce flwr-serverless, a wrapper around the Flower Python package to allow for both synchronous and asynchronous federated learning.
Our approach to federated learning allows the process to run without a central server, which increases the domains of application and accessibility of its use.
arXiv Detail & Related papers (2023-10-23T19:49:59Z)
- Rethinking Client Drift in Federated Learning: A Logit Perspective [125.35844582366441]
Federated Learning (FL) enables multiple clients to collaboratively learn in a distributed way, allowing for privacy protection.
We find that the difference in logits between the local and global models increases as the model is continuously updated.
We propose a new algorithm, FedCSD, a class-prototype similarity distillation in a federated framework that aligns the local and global models.
arXiv Detail & Related papers (2023-08-20T04:41:01Z)
- Combating Exacerbated Heterogeneity for Robust Models in Federated Learning [91.88122934924435]
Combining adversarial training with federated learning can lead to undesired robustness deterioration.
We propose a novel framework called Slack Federated Adversarial Training (SFAT).
We verify the rationality and effectiveness of SFAT on various benchmark and real-world datasets.
arXiv Detail & Related papers (2023-03-01T06:16:15Z)
- Scalable Collaborative Learning via Representation Sharing [53.047460465980144]
Federated learning (FL) and Split Learning (SL) are two frameworks that enable collaborative learning while keeping the data private (on device).
In FL, each data holder trains a model locally and releases it to a central server for aggregation (a minimal aggregation sketch appears after this list).
In SL, the clients must release individual cut-layer activations (smashed data) to the server and wait for its response (during both inference and backpropagation).
In this work, we present a novel approach for privacy-preserving machine learning, where the clients collaborate via online knowledge distillation using a contrastive loss.
arXiv Detail & Related papers (2022-11-20T10:49:22Z)
- Multimodal Federated Learning [9.081857621783811]
In many applications, such as smart homes with IoT devices, local data on clients are generated from different modalities.
Existing federated learning systems only work on local data from a single modality, which limits the scalability of the systems.
We propose a multimodal and semi-supervised federated learning framework that trains autoencoders to extract shared or correlated representations from different local data modalities on clients.
arXiv Detail & Related papers (2021-09-10T12:32:46Z)
- FedMix: Approximation of Mixup under Mean Augmented Federated Learning [60.503258658382]
Federated learning (FL) allows edge devices to collectively learn a model without directly sharing data within each device.
Current state-of-the-art algorithms suffer from performance degradation as the heterogeneity of local data across clients increases.
We propose a new augmentation algorithm, named FedMix, which is inspired by a phenomenal yet simple data augmentation method, Mixup.
arXiv Detail & Related papers (2021-07-01T06:14:51Z)
- IBM Federated Learning: an Enterprise Framework White Paper V0.1 [28.21579297214125]
Federated Learning (FL) is an approach to conduct machine learning without centralizing training data in a single place.
The framework applies to both Deep Neural Networks and "traditional" approaches for the most common machine learning libraries.
arXiv Detail & Related papers (2020-07-22T05:32:00Z)
- Federated Residual Learning [53.77128418049985]
We study a new form of federated learning where the clients train personalized local models and make predictions jointly with the server-side shared model.
Using this new federated learning framework, the complexity of the central shared model can be minimized while still gaining all the performance benefits that joint training provides.
arXiv Detail & Related papers (2020-03-28T19:55:24Z)
- PrivacyFL: A simulator for privacy-preserving and secure federated learning [2.578242050187029]
Federated learning is a technique that enables distributed clients to collaboratively learn a shared machine learning model.
PrivacyFL is a privacy-preserving and secure federated learning simulator.
arXiv Detail & Related papers (2020-02-19T20:16:13Z)
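As referenced in the Scalable Collaborative Learning entry above, here is a minimal sketch of the FL-style aggregation that entry describes: each client uploads its locally trained weights, and the server averages them, weighted by local dataset size (FedAvg-style). The types and names are illustrative assumptions, not any listed paper's actual API.

```typescript
// Hypothetical server-side federated averaging (FedAvg) sketch.

interface ClientUpdate {
  weights: Float32Array; // flattened model parameters from one client
  numExamples: number;   // size of that client's local dataset
}

function fedAvg(updates: ClientUpdate[]): Float32Array {
  const total = updates.reduce((sum, u) => sum + u.numExamples, 0);
  const dim = updates[0].weights.length;
  const global = new Float32Array(dim); // zero-initialized accumulator
  for (const u of updates) {
    const share = u.numExamples / total; // weight client by its data share
    for (let i = 0; i < dim; i++) {
      global[i] += share * u.weights[i];
    }
  }
  return global;
}
```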
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.