Fault-Tolerant Vertical Federated Learning on Dynamic Networks
- URL: http://arxiv.org/abs/2312.16638v1
- Date: Wed, 27 Dec 2023 17:00:09 GMT
- Title: Fault-Tolerant Vertical Federated Learning on Dynamic Networks
- Authors: Surojit Ganguli, Zeyu Zhou, Christopher G. Brinton, David I. Inouye
- Abstract summary: Vertical federated learning (VFL) is a class of FL in which each client shares the same sample space but holds only a subset of the features.
This paper defines Internet Learning (IL), including its data splitting and network context.
We propose VFL as a naive baseline and develop several extensions to handle the IL learning paradigm.
- Score: 17.214112087657206
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Vertical federated learning (VFL) is a class of FL where each client shares
the same sample space but holds only a subset of the features. While VFL
tackles key privacy challenges of distributed learning, it often assumes
perfect hardware and communication capabilities. This assumption hinders the
broad deployment of VFL, particularly on edge devices, which are heterogeneous
in their in-situ capabilities and will connect to and disconnect from the network over
time. To address this gap, we define Internet Learning (IL), including its data
splitting and network context, which sets good performance under extremely
dynamic client conditions as the primary goal. We propose VFL as a naive
baseline and develop several extensions to handle the IL learning paradigm.
Furthermore, we implement new methods, propose metrics, and extensively analyze
results based on simulating a sensor network. The results show that the
developed methods are more robust to changes in the network than the VFL baseline.
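To make the vertical data split concrete, here is a minimal sketch (our illustration, not the paper's implementation) of a VFL forward pass: clients hold disjoint feature columns of the same samples, and a disconnected client naively contributes zeros, the kind of fault the paper's extensions aim to handle more gracefully.

```python
import numpy as np

rng = np.random.default_rng(0)

# Shared sample space: 6 samples, 9 features, split column-wise
# across 3 clients (vertical partition: each client holds a
# disjoint subset of the features for the *same* samples).
X = rng.normal(size=(6, 9))
client_features = np.array_split(np.arange(X.shape[1]), 3)

# Each client maps its local features to a small embedding.
client_weights = [rng.normal(size=(len(cols), 2)) for cols in client_features]

def client_forward(client_id, active):
    """Return the client's embedding, or zeros if it is disconnected."""
    if not active:
        # Naive fault handling: a dropped client contributes zeros.
        return np.zeros((X.shape[0], 2))
    cols = client_features[client_id]
    return X[:, cols] @ client_weights[client_id]

# The server concatenates embeddings; client 1 has dropped off the network.
active = [True, False, True]
H = np.concatenate([client_forward(i, a) for i, a in enumerate(active)], axis=1)
print(H.shape)  # (6, 6): 3 clients x 2-dim embeddings
```

Zero-filling is only the naive baseline behavior; any downstream model consuming `H` still sees a fixed-width input even as clients come and go.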
Related papers
- Communication Efficient ConFederated Learning: An Event-Triggered SAGA Approach [67.27031215756121]
Federated learning (FL) is a machine learning paradigm that targets model training without gathering the local data over various data sources.
Standard FL, which employs a single server, can only support a limited number of users, leading to degraded learning capability.
In this work, we consider a multi-server FL framework, referred to as Confederated Learning (CFL), in order to accommodate a larger number of users.
arXiv Detail & Related papers (2024-02-28T03:27:10Z)
- Online Vertical Federated Learning for Cooperative Spectrum Sensing [8.081617656116139]
Online vertical federated learning (OVFL) is designed to address the challenges of ongoing data streams and shifting learning goals.
OVFL achieves a sublinear regret bound, thereby evidencing its efficiency.
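The sublinear regret claim can be made precise. Under the standard online-learning definition (notation ours, not necessarily the paper's), with per-round losses $f_t$ and iterates $x_t$:

```latex
R_T \;=\; \sum_{t=1}^{T} f_t(x_t) \;-\; \min_{x} \sum_{t=1}^{T} f_t(x),
\qquad R_T = o(T),
```

so the time-averaged regret $R_T / T \to 0$ as $T \to \infty$, i.e., the online learner asymptotically matches the best fixed model in hindsight.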
arXiv Detail & Related papers (2023-12-18T17:19:53Z)
- FedConv: Enhancing Convolutional Neural Networks for Handling Data Heterogeneity in Federated Learning [34.37155882617201]
Federated learning (FL) is an emerging paradigm in machine learning, where a shared model is collaboratively learned using data from multiple devices.
We systematically investigate the impact of different architectural elements, such as activation functions and normalization layers, on the performance within heterogeneous FL.
Our findings indicate that with strategic architectural modifications, pure CNNs can achieve a level of robustness that either matches or even exceeds that of ViTs.
arXiv Detail & Related papers (2023-10-06T17:57:50Z)
- FLCC: Efficient Distributed Federated Learning on IoMT over CSMA/CA [0.0]
Federated Learning (FL) has emerged as a promising approach for privacy preservation.
This article investigates the performance of FL on an application that might be used to improve a remote healthcare system over ad hoc networks.
We present two metrics to evaluate the network performance: 1) probability of successful transmission while minimizing the interference, and 2) performance of distributed FL model in terms of accuracy and loss.
arXiv Detail & Related papers (2023-03-29T16:36:42Z)
- Federated Learning and Meta Learning: Approaches, Applications, and Directions [94.68423258028285]
In this tutorial, we present a comprehensive review of FL, meta learning, and federated meta learning (FedMeta).
Unlike other tutorial papers, our objective is to explore how FL, meta learning, and FedMeta methodologies can be designed, optimized, and evolved, and their applications over wireless networks.
arXiv Detail & Related papers (2022-10-24T10:59:29Z)
- Low-Latency Cooperative Spectrum Sensing via Truncated Vertical Federated Learning [51.51440623636274]
We propose a vertical federated learning (VFL) framework to exploit the distributed features across multiple secondary users (SUs) without compromising data privacy.
To accelerate the training process, we propose a truncated vertical federated learning (T-VFL) algorithm.
The convergence performance of T-VFL is provided via mathematical analysis and justified by simulation results.
arXiv Detail & Related papers (2022-08-07T10:39:27Z)
- Multi-Edge Server-Assisted Dynamic Federated Learning with an Optimized Floating Aggregation Point [51.47520726446029]
Cooperative edge learning (CE-FL) is a distributed machine learning architecture.
We model the processes involved in CE-FL and analyze its training.
We show the effectiveness of our framework with the data collected from a real-world testbed.
arXiv Detail & Related papers (2022-03-26T00:41:57Z)
- DVFL: A Vertical Federated Learning Method for Dynamic Data [2.406222636382325]
This paper studies vertical federated learning (VFL), which tackles the scenarios where collaborating organizations share the same set of users but disjoint features.
We propose a new vertical federated learning method, DVFL, which adapts to dynamic data distribution changes through knowledge distillation.
Our extensive experimental results show that DVFL can not only obtain results close to existing VFL methods in static scenes, but also adapt to changes in data distribution in dynamic scenarios.
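DVFL's exact architecture is not given in this summary; a generic knowledge-distillation loss, the mechanism the method reportedly builds on, can be sketched as follows (our illustration, with hypothetical temperature and logit values):

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax along the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on temperature-softened distributions.

    In a KD setup, a model trained on the old data distribution acts
    as the teacher, so the student can retain prior knowledge while
    adapting to the new distribution.
    """
    p = softmax(teacher_logits, T)          # soft teacher targets
    q = softmax(student_logits, T)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float((T ** 2) * kl.mean())      # T^2 rescales gradient magnitude

# Identical logits give (near-)zero loss; diverging logits a positive one.
same = np.array([[2.0, 0.5, -1.0]])
diff = np.array([[-1.0, 0.5, 2.0]])
print(distillation_loss(same, same))  # ~0.0
print(distillation_loss(same, diff))  # > 0
```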
arXiv Detail & Related papers (2021-11-05T09:26:09Z)
- Mobility-Aware Cluster Federated Learning in Hierarchical Wireless Networks [81.83990083088345]
We develop a theoretical model to characterize the hierarchical federated learning (HFL) algorithm in wireless networks.
Our analysis proves that the learning performance of HFL deteriorates drastically with highly-mobile users.
To circumvent these issues, we propose a mobility-aware cluster federated learning (MACFL) algorithm.
arXiv Detail & Related papers (2021-08-20T10:46:58Z)
- Wireless Communications for Collaborative Federated Learning [160.82696473996566]
Internet of Things (IoT) devices may not be able to transmit their collected data to a central controller for training machine learning models.
Google's seminal FL algorithm requires all devices to be directly connected with a central controller.
This paper introduces a novel FL framework, called collaborative FL (CFL), which enables edge devices to implement FL with less reliance on a central controller.
arXiv Detail & Related papers (2020-06-03T20:00:02Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences.