The Impact of Cut Layer Selection in Split Federated Learning
- URL: http://arxiv.org/abs/2412.15536v1
- Date: Fri, 20 Dec 2024 03:52:54 GMT
- Title: The Impact of Cut Layer Selection in Split Federated Learning
- Authors: Justin Dachille, Chao Huang, Xin Liu
- Abstract summary: Split Federated Learning (SFL) is a distributed machine learning paradigm that combines federated learning and split learning.
In SFL, a neural network is partitioned at a cut layer, with the initial layers deployed on clients and remaining layers on a training server.
- Score: 6.481423646861632
- Abstract: Split Federated Learning (SFL) is a distributed machine learning paradigm that combines federated learning and split learning. In SFL, a neural network is partitioned at a cut layer, with the initial layers deployed on clients and remaining layers on a training server. There are two main variants of SFL: SFL-V1 where the training server maintains separate server-side models for each client, and SFL-V2 where the training server maintains a single shared model for all clients. While existing studies have focused on algorithm development for SFL, a comprehensive quantitative analysis of how the cut layer selection affects model performance remains unexplored. This paper addresses this gap by providing numerical and theoretical analysis of SFL performance and convergence relative to cut layer selection. We find that SFL-V1 is relatively invariant to the choice of cut layer, which is consistent with our theoretical results. Numerical experiments on four datasets and two neural networks show that the cut layer selection significantly affects the performance of SFL-V2. Moreover, SFL-V2 with an appropriate cut layer selection outperforms FedAvg on heterogeneous data.
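To make the split concrete, here is a minimal PyTorch-style sketch of partitioning a network at a cut layer; the layer stack and the `cut_layer` index are illustrative, not the paper's actual models:
```python
import torch
import torch.nn as nn

# Illustrative layer stack; the paper uses larger networks, but the
# split logic is the same for any sequential architecture.
layers = nn.Sequential(
    nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 32 * 32, 10),
)

cut_layer = 2  # hypothetical choice; the paper studies how this index matters

# Client holds layers [0, cut_layer); the training server holds the rest.
# In SFL-V1 the server keeps one server-side copy per client; in SFL-V2
# all clients share a single server-side model.
client_model = nn.Sequential(*list(layers)[:cut_layer])
server_model = nn.Sequential(*list(layers)[cut_layer:])

x = torch.randn(8, 3, 32, 32)     # a client's local mini-batch
smashed = client_model(x)         # "smashed data" sent to the server
logits = server_model(smashed)    # server finishes the forward pass
loss = nn.functional.cross_entropy(logits, torch.randint(0, 10, (8,)))
loss.backward()                   # gradients flow back through the cut
```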
Related papers
- How Can Incentives and Cut Layer Selection Influence Data Contribution in Split Federated Learning? [49.16923922018379]
Split Federated Learning (SFL) has emerged as a promising approach by combining the advantages of federated and split learning.
We model the problem using a hierarchical decision-making approach, formulated as a single-leader multi-follower Stackelberg game.
Our findings show that the Stackelberg equilibrium solution maximizes the utility for both the clients and the SFL model owner.
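As a hedged illustration of the single-leader, multi-follower structure, a Stackelberg equilibrium can be computed by backward induction: for each candidate leader action, substitute every follower's best response, then maximize the leader's utility. The quadratic utilities below are assumptions for the sketch, not the paper's model:
```python
# Backward-induction sketch of a single-leader, multi-follower Stackelberg
# game. Utilities are illustrative assumptions, not the paper's model.

def follower_best_response(reward_rate: float, cost: float) -> float:
    # Follower contributes d to maximize reward_rate * d - cost * d**2,
    # giving the closed-form best response d* = reward_rate / (2 * cost).
    return reward_rate / (2 * cost)

def leader_utility(reward_rate: float, costs: list[float]) -> float:
    # Leader values the total contribution but pays reward_rate per unit.
    total = sum(follower_best_response(reward_rate, c) for c in costs)
    return total - reward_rate * total

costs = [1.0, 2.0, 4.0]  # hypothetical per-client data-contribution costs
grid = [i / 100 for i in range(1, 100)]
best_rate = max(grid, key=lambda r: leader_utility(r, costs))
print(best_rate)  # 0.5 here: the leader's equilibrium incentive rate
```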
arXiv Detail & Related papers (2024-12-10T06:24:08Z)
- Sequential Federated Learning in Hierarchical Architecture on Non-IID Datasets [25.010661914466354]
In a real federated learning (FL) system, the communication overhead of passing model parameters between the clients and the parameter server (PS) is often a bottleneck.
We propose sequential FL (SFL) in HFL for the first time, which removes the central PS and completes model training solely by passing the model between adjacent edge servers (ESs).
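A minimal sketch of this sequential pattern (the edge-server data and the local update rule are toy placeholders): the model visits edge servers one after another instead of being aggregated at a central PS:
```python
# The model is trained at one edge server (ES), then handed to the next;
# no central parameter server is involved. `local_train` is a toy
# placeholder for an ES's local update on its clients' data.

def local_train(model: dict, es_data: list[float]) -> dict:
    mean = sum(es_data) / len(es_data)
    return {"w": model["w"] + 0.5 * (mean - model["w"])}

edge_servers = [          # each ES holds its own (non-IID) data
    [1.0, 1.2, 0.8],
    [5.0, 4.8],
    [9.0, 9.5, 8.7],
]

model = {"w": 0.0}
for _ in range(3):                    # training rounds
    for es_data in edge_servers:      # pass the model ES -> adjacent ES
        model = local_train(model, es_data)
print(model["w"])
```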
arXiv Detail & Related papers (2024-08-19T07:43:35Z)
- Communication Efficient ConFederated Learning: An Event-Triggered SAGA Approach [67.27031215756121]
Federated learning (FL) is a machine learning paradigm that targets model training without gathering the local data over various data sources.
Standard FL, which employs a single server, can only support a limited number of users, leading to degraded learning capability.
In this work, we consider a multi-server FL framework, referred to as Confederated Learning (CFL), in order to accommodate a larger number of users.
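The event-triggered idea, sketched below with an assumed threshold rule, is that a user transmits its local update only when it has drifted enough from the last transmitted copy; the paper couples this with a SAGA-style variance-reduced update, which is omitted here:
```python
# Event-triggered communication sketch: upload the local iterate only
# when it deviates from the last transmitted copy by more than a
# threshold. Names and the rule's form are illustrative assumptions.

def should_transmit(current: list[float], last_sent: list[float],
                    threshold: float) -> bool:
    drift = sum((c - l) ** 2 for c, l in zip(current, last_sent)) ** 0.5
    return drift > threshold

last_sent = [0.0, 0.0]
current = [0.05, -0.02]
if should_transmit(current, last_sent, threshold=0.1):
    last_sent = list(current)  # upload to the server; reset the reference
print(last_sent)  # unchanged: drift ~= 0.054 < 0.1, so no upload this round
```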
arXiv Detail & Related papers (2024-02-28T03:27:10Z)
- Convergence Analysis of Split Federated Learning on Heterogeneous Data [10.61370409320618]
Split federated learning (SFL) is a recent distributed approach for collaborative model training among multiple clients.
In SFL, a global model is typically split into two parts: clients train one part in a parallel, federated manner, and the training server trains the other.
We provide convergence analysis of SFL for strongly convex and general objectives on heterogeneous data.
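For orientation, these are the shapes such guarantees typically take for SGD-style methods; the paper's exact theorem statements and constants may differ:
```latex
% Illustrative convergence rates, not the paper's exact results.
\begin{align*}
  \text{strongly convex:} \quad
    & \mathbb{E}\big[F(\bar{x}_T)\big] - F^\star = O\!\left(\tfrac{1}{T}\right), \\
  \text{general (nonconvex):} \quad
    & \min_{t \le T} \mathbb{E}\big[\|\nabla F(x_t)\|^2\big] = O\!\left(\tfrac{1}{\sqrt{T}}\right),
\end{align*}
% where $T$ is the number of training rounds.
```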
arXiv Detail & Related papers (2024-02-23T07:59:23Z)
- Exploring the Privacy-Energy Consumption Tradeoff for Split Federated Learning [51.02352381270177]
Split Federated Learning (SFL) has recently emerged as a promising distributed learning technology.
The choice of the cut layer in SFL can have a substantial impact on the energy consumption of clients and their privacy.
This article provides a comprehensive overview of the SFL process and thoroughly analyzes its energy consumption and privacy.
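One way to see the tradeoff is to enumerate candidate cuts: client-side energy grows with the layers before the cut, while earlier cuts transmit larger, more input-like activations. The per-layer numbers below are made up for the sketch, not the article's measurements:
```python
# Enumerate cut layers: the client pays for computing the layers before
# the cut plus transmitting the cut-layer activations. All numbers are
# illustrative, not measurements from the article.

compute_cost = [1.0, 2.0, 4.0, 4.0, 0.5]       # per-layer client cost
activation_size = [32.0, 16.0, 8.0, 4.0, 1.0]  # output size at each layer

for cut in range(1, len(compute_cost) + 1):
    client_compute = sum(compute_cost[:cut])
    uplink = activation_size[cut - 1]          # smashed data per sample
    print(f"cut={cut}: client compute={client_compute}, uplink={uplink}")
# Earlier cuts save client energy but expose larger, more input-like
# activations, which is where the privacy concern enters.
```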
arXiv Detail & Related papers (2023-11-15T23:23:42Z)
- SplitFed resilience to packet loss: Where to split, that is the question [27.29876880765472]
Split Federated Learning (SFL) aims to reduce the computational power required by each client in FL and parallelize SL while maintaining privacy.
This paper investigates the robustness of SFL against packet loss on communication links.
Experiments are carried out on a segmentation model for human embryo images and indicate the statistically significant advantage of a deeper split point.
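A hedged sketch of the experimental idea: drop a random subset of the smashed-data "packets" at the cut and let training proceed on what arrives. Treating each feature-map channel as one packet is a simplification of the real packetization:
```python
import torch

def drop_packets(smashed: torch.Tensor, loss_rate: float) -> torch.Tensor:
    # smashed: (batch, channels, H, W); zero out randomly "lost" channels,
    # each channel standing in for one transmitted packet.
    keep = (torch.rand(smashed.shape[1]) > loss_rate).float()
    return smashed * keep.view(1, -1, 1, 1)

smashed = torch.randn(4, 16, 8, 8)   # activations at a hypothetical cut
received = drop_packets(smashed, loss_rate=0.3)
lost = (received.abs().sum(dim=(0, 2, 3)) == 0).sum().item()
print(lost, "of 16 channels lost")   # a deeper cut sends fewer, smaller maps
```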
arXiv Detail & Related papers (2023-07-25T22:54:47Z)
- Hierarchical Personalized Federated Learning Over Massive Mobile Edge Computing Networks [95.39148209543175]
We propose hierarchical PFL (HPFL), an algorithm for deploying PFL over massive MEC networks.
HPFL combines the objectives of training loss minimization and round latency minimization while jointly determining the optimal bandwidth allocation.
arXiv Detail & Related papers (2023-03-19T06:00:05Z)
- Time-sensitive Learning for Heterogeneous Federated Edge Intelligence [52.83633954857744]
We investigate real-time machine learning in a federated edge intelligence (FEI) system.
FEI systems exhibit heterogeneous communication and computational resource distributions.
We propose a time-sensitive federated learning (TS-FL) framework to minimize the overall run-time for collaboratively training a shared ML model.
arXiv Detail & Related papers (2023-01-26T08:13:22Z)
- Multi-Edge Server-Assisted Dynamic Federated Learning with an Optimized Floating Aggregation Point [51.47520726446029]
Cooperative edge learning (CE-FL) is a distributed machine learning architecture.
We model the processes involved in CE-FL and conduct an analytical study of its training.
We show the effectiveness of our framework with the data collected from a real-world testbed.
arXiv Detail & Related papers (2022-03-26T00:41:57Z)
- Splitfed learning without client-side synchronization: Analyzing client-side split network portion size to overall performance [4.689140226545214]
Federated Learning (FL), Split Learning (SL), and SplitFed Learning (SFL) are three recent developments in distributed machine learning.
This paper studies SFL without client-side model synchronization.
SFL with client-side synchronization provides only 1%-2% better accuracy than Multi-head Split Learning (SFL without synchronization) on the MNIST test set.
arXiv Detail & Related papers (2021-09-19T22:57:23Z)
- SplitFed: When Federated Learning Meets Split Learning [16.212941272007285]
Federated learning (FL) and split learning (SL) are two popular distributed machine learning approaches.
This paper presents a novel approach, named splitfed learning (SFL), that amalgamates the two approaches.
SFL provides test accuracy and communication efficiency similar to SL while significantly reducing the computation time per global epoch relative to SL when there are multiple clients.
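For concreteness, a hedged sketch of one SFL round as described here, with toy models and data standing in for the paper's setup; the client-side parts are federated-averaged at the end:
```python
import torch
import torch.nn as nn

# One simplified SplitFed round: each client runs the front layers, the
# shared server part finishes the forward pass, gradients flow back
# through the cut, and client-side models are averaged (FedAvg step).
clients = [nn.Linear(10, 6) for _ in range(3)]  # client-side parts
server = nn.Linear(6, 2)                        # shared server-side part

for client in clients:
    x = torch.randn(4, 10)                      # client's local batch
    y = torch.randint(0, 2, (4,))
    smashed = client(x)                         # sent to the server
    loss = nn.functional.cross_entropy(server(smashed), y)
    client.zero_grad()
    server.zero_grad()
    loss.backward()                 # server returns the cut-layer gradients
    with torch.no_grad():           # plain SGD step on both parts
        for p in list(client.parameters()) + list(server.parameters()):
            p -= 0.1 * p.grad

with torch.no_grad():               # fed server averages client-side weights
    avg = {k: sum(c.state_dict()[k] for c in clients) / len(clients)
           for k in clients[0].state_dict()}
    for c in clients:
        c.load_state_dict(avg)
```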
arXiv Detail & Related papers (2020-04-25T08:52:50Z)
This list is automatically generated from the titles and abstracts of the papers in this site.