Decoupled Vertical Federated Learning for Practical Training on
Vertically Partitioned Data
- URL: http://arxiv.org/abs/2403.03871v1
- Date: Wed, 6 Mar 2024 17:23:28 GMT
- Title: Decoupled Vertical Federated Learning for Practical Training on
Vertically Partitioned Data
- Authors: Avi Amalanshu, Yash Sirvi, David I. Inouye
- Abstract summary: We propose a blockwise learning approach to Vertical Federated Learning (VFL).
In VFL, a host client owns data labels for each entity and learns a final representation based on intermediate local representations from all guest clients.
We implement DVFL to train split neural networks and show that model performance is comparable to VFL on a variety of classification datasets.
- Score: 9.84489449520821
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Vertical Federated Learning (VFL) is an emergent distributed machine learning
paradigm wherein owners of disjoint features of a common set of entities
collaborate to learn a global model without sharing data. In VFL, a host client
owns data labels for each entity and learns a final representation based on
intermediate local representations from all guest clients. Therefore, the host
is a single point of failure and label feedback can be used by malicious guest
clients to infer private features. Requiring all participants to remain active
and trustworthy throughout the entire training process is generally impractical
and altogether infeasible outside of controlled environments. We propose
Decoupled VFL (DVFL), a blockwise learning approach to VFL. By training each
model on its own objective, DVFL allows for decentralized aggregation and
isolation between feature learning and label supervision. With these
properties, DVFL is fault tolerant and secure. We implement DVFL to train split
neural networks and show that model performance is comparable to VFL on a
variety of classification datasets.
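To make the split-network setup concrete, here is a minimal sketch contrasting a standard VFL step with a DVFL-style blockwise step. The module sizes, the toy reconstruction objective used as each guest's local loss, and the synthetic data are illustrative assumptions, not the paper's exact design.

```python
# Toy contrast between a standard VFL step and a DVFL-style blockwise step.
# All sizes, the reconstruction objective, and the data are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

torch.manual_seed(0)
x1, x2 = torch.randn(32, 5), torch.randn(32, 7)   # two guests' feature slices
y = torch.randint(0, 2, (32,))                    # labels held only by the host

guest1, guest2 = nn.Linear(5, 8), nn.Linear(7, 8)
host = nn.Linear(16, 2)

# Standard VFL: the host's supervised loss backpropagates into every guest,
# so guests receive label feedback and all parties must stay online.
h = torch.cat([guest1(x1), guest2(x2)], dim=1)
F.cross_entropy(host(h), y).backward()

for m in (guest1, guest2, host):
    m.zero_grad()

# DVFL-style step: each guest trains on its own label-free objective
# (a toy reconstruction loss here -- an assumption, not the paper's exact one).
dec1, dec2 = nn.Linear(8, 5), nn.Linear(8, 7)
for g, d, x in ((guest1, dec1, x1), (guest2, dec2, x2)):
    F.mse_loss(d(g(x)), x).backward()

# The host learns from detached representations, so no label gradient
# (and hence no label-leakage channel) ever reaches a guest.
h = torch.cat([guest1(x1), guest2(x2)], dim=1).detach()
F.cross_entropy(host(h), y).backward()
```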
Related papers
- Cooperative Decentralized Backdoor Attacks on Vertical Federated Learning [22.076364118223324]
We propose a novel backdoor attack on Vertical Federated Learning (VFL).
Our label inference model augments variational autoencoders with metric learning, which adversaries can train locally.
Our convergence analysis reveals the impact of backdoor perturbations on VFL indicated by a stationarity gap for the trained model.
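The summary names the adversary's model (a variational autoencoder augmented with metric learning) but not its exact form; a generic sketch under that description might combine the usual VAE terms with a triplet loss on the latent means, as below. All shapes, loss weights, and the choice of triplet loss are assumptions.

```python
# Generic "VAE + metric learning" sketch: the usual ELBO terms plus a
# triplet loss on latent means. Shapes, weights, and the triplet choice
# are assumptions, not the adversary's published architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F

enc = nn.Linear(10, 2 * 4)   # emits mean and log-variance of a 4-d latent
dec = nn.Linear(4, 10)

def vae_metric_loss(x, x_pos, x_neg, margin=1.0, beta=1e-3):
    mu, logvar = enc(x).chunk(2, dim=1)
    z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()       # reparameterize
    recon = F.mse_loss(dec(z), x)
    kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(1).mean()
    # Metric term: pull inferred same-class latents together, push others apart.
    mu_pos = enc(x_pos).chunk(2, dim=1)[0]
    mu_neg = enc(x_neg).chunk(2, dim=1)[0]
    return recon + beta * kl + F.triplet_margin_loss(mu, mu_pos, mu_neg, margin=margin)

x, x_pos, x_neg = (torch.randn(16, 10) for _ in range(3))
vae_metric_loss(x, x_pos, x_neg).backward()
```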
arXiv Detail & Related papers (2025-01-16T06:22:35Z)
- FuseFL: One-Shot Federated Learning through the Lens of Causality with Progressive Model Fusion [48.90879664138855]
One-shot Federated Learning (OFL) significantly reduces communication costs in FL by aggregating trained models only once.
However, the performance of advanced OFL methods lags far behind that of standard FL.
We propose FuseFL, a novel learning approach that endows OFL with strong performance at low communication and storage cost.
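FuseFL's progressive, causality-guided fusion is not spelled out in this summary; the sketch below shows only the plain one-shot baseline it improves on, in which locally trained models are aggregated exactly once by parameter averaging.

```python
# Minimal one-shot baseline: clients train locally, then the server
# aggregates the finished models exactly once by parameter averaging.
# FuseFL's progressive fusion is deliberately not modeled here.
import copy
import torch
import torch.nn as nn

clients = [nn.Linear(4, 2) for _ in range(3)]   # stand-ins for locally trained models

global_model = copy.deepcopy(clients[0])
with torch.no_grad():
    for name, param in global_model.named_parameters():
        stacked = torch.stack([dict(c.named_parameters())[name] for c in clients])
        param.copy_(stacked.mean(dim=0))        # the single round of communication
```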
arXiv Detail & Related papers (2024-10-27T09:07:10Z)
- SpaFL: Communication-Efficient Federated Learning with Sparse Models and Low computational Overhead [75.87007729801304]
SpaFL, a communication-efficient FL framework, is proposed to optimize sparse model structures with low computational overhead.
To optimize the pruning process itself, only thresholds are communicated between a server and clients instead of parameters.
Global thresholds are used to update model parameters by extracting aggregated parameter importance.
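A minimal sketch of the threshold-only communication idea described above, assuming per-layer scalar thresholds and |weight| as the importance score (both assumptions for illustration):

```python
# Threshold-only communication: clients upload one scalar per layer, never
# their parameters. Using |weight| as the importance score is an assumption.
import numpy as np

rng = np.random.default_rng(0)
client_weights = [rng.normal(size=(8, 8)) for _ in range(4)]

# Each client derives a pruning threshold locally (here: keep its top 50%).
local_thresholds = [np.median(np.abs(w)) for w in client_weights]

# The server aggregates scalars only -- O(1) communication per layer.
global_threshold = float(np.mean(local_thresholds))

# Clients prune parameters whose importance falls below the global threshold.
pruned = [np.where(np.abs(w) >= global_threshold, w, 0.0) for w in client_weights]
print("kept fraction:", np.mean([np.mean(p != 0.0) for p in pruned]))
```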
arXiv Detail & Related papers (2024-06-01T13:10:35Z)
- Communication Efficient ConFederated Learning: An Event-Triggered SAGA Approach [67.27031215756121]
Federated learning (FL) is a machine learning paradigm that trains models without gathering local data from the various data sources.
Standard FL, which employs a single server, can only support a limited number of users, leading to degraded learning capability.
In this work, we consider a multi-server FL framework, referred to as Confederated Learning (CFL), in order to accommodate a larger number of users.
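A generic sketch of the event-triggered communication rule implied by the title: a user transmits only when its update deviates enough from the last one it sent. The SAGA variance-reduction component is omitted, and the particular trigger rule shown is an assumption.

```python
# Event-triggered upload rule: a user communicates only when its update
# deviates enough from the last one it sent; otherwise the server reuses
# the stale copy. The trigger rule is an assumption; SAGA is omitted.
import numpy as np

def maybe_transmit(update, last_sent, threshold=0.1):
    """Return (payload or None, value the server now holds)."""
    if last_sent is None or np.linalg.norm(update - last_sent) > threshold:
        return update, update          # event fired: transmit a fresh update
    return None, last_sent             # stay silent; bandwidth saved

rng = np.random.default_rng(1)
last = None
for t in range(5):
    grad = rng.normal(scale=0.05, size=3)
    payload, last = maybe_transmit(grad, last)
    print(f"round {t}: {'sent' if payload is not None else 'skipped'}")
```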
arXiv Detail & Related papers (2024-02-28T03:27:10Z)
- Fault Tolerant Serverless VFL Over Dynamic Device Environment [15.757660512833006]
We study the test-time performance of Vertical Federated Learning (VFL) under dynamic network conditions, a setting we call DN-VFL.
We develop a novel DN-VFL approach called Multiple Aggregation with Gossip Rounds and Simulated Faults (MAGS) that synthesizes replication, gossiping, and selective feature omission to improve performance significantly over baselines.
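Of the three ingredients named above, gossiping is the easiest to sketch; below is one randomized gossip-averaging round among devices, with replication and selective feature omission left out. The pairing scheme is an assumption.

```python
# One randomized gossip-averaging round: each device averages its state with
# a random partner, so aggregation needs no central server. Replication and
# selective feature omission from MAGS are not modeled; pairing is assumed.
import numpy as np

rng = np.random.default_rng(2)
states = [rng.normal(size=4) for _ in range(6)]   # one vector per device

def gossip_round(states, rng):
    order = rng.permutation(len(states))
    for i, j in zip(order[0::2], order[1::2]):    # random disjoint pairs
        avg = (states[i] + states[j]) / 2.0
        states[i], states[j] = avg.copy(), avg.copy()

for _ in range(10):
    gossip_round(states, rng)
print("disagreement after gossip:", np.ptp([s[0] for s in states]))
```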
arXiv Detail & Related papers (2023-12-27T17:00:09Z)
- BadVFL: Backdoor Attacks in Vertical Federated Learning [22.71527711053385]
Federated learning (FL) enables multiple parties to collaboratively train a machine learning model without sharing their data.
In this paper, we focus on robustness in VFL, in particular, on backdoor attacks.
We present a first-of-its-kind clean-label backdoor attack in VFL, which consists of two phases: a label inference phase and a backdoor phase.
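A toy illustration of the backdoor phase only, assuming label inference has already produced a per-sample guess: the guest stamps a fixed trigger onto its own feature slice of the inferred target-class samples while labels stay untouched (hence clean-label). The trigger pattern and the inference stand-in are assumptions.

```python
# Toy illustration of the backdoor phase only: given per-sample guesses from
# a (not implemented) label inference phase, the guest stamps a fixed trigger
# onto its feature slice of inferred target-class samples. Labels are never
# touched, which is what makes the attack clean-label. Pattern is assumed.
import numpy as np

rng = np.random.default_rng(3)
guest_features = rng.normal(size=(100, 6))        # this guest's feature slice
inferred_target = rng.random(100) < 0.1           # stand-in for inference output

trigger = np.zeros(6)
trigger[:2] = 3.0                                  # fixed pattern on two features

poisoned = guest_features.copy()
poisoned[inferred_target] += trigger               # labels (held elsewhere) unchanged
```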
arXiv Detail & Related papers (2023-04-18T09:22:32Z)
- A Fast Blockchain-based Federated Learning Framework with Compressed Communications [14.344080339573278]
Recently, blockchain-based federated learning (BFL) has attracted intensive research attention.
In this paper, we propose a fast BFL framework, called BCFL, to improve the training efficiency of BFL in practice.
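The summary does not specify BCFL's compressor; a standard choice for compressed FL uploads is top-k sparsification, sketched below as an assumption.

```python
# Top-k sparsification, a standard compressor for FL uploads: only the k
# largest-magnitude gradient entries (plus their indices) go on the wire.
import numpy as np

def top_k_compress(grad, k):
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    return idx, grad[idx]

def decompress(idx, vals, size):
    out = np.zeros(size)
    out[idx] = vals
    return out

g = np.random.default_rng(4).normal(size=1000)
idx, vals = top_k_compress(g, k=50)               # 20x fewer floats transmitted
g_hat = decompress(idx, vals, g.size)
```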
arXiv Detail & Related papers (2022-08-12T03:04:55Z)
- Low-Latency Cooperative Spectrum Sensing via Truncated Vertical Federated Learning [51.51440623636274]
We propose a vertical federated learning (VFL) framework to exploit the distributed features across multiple secondary users (SUs) without compromising data privacy.
To accelerate the training process, we propose a truncated vertical federated learning (T-VFL) algorithm.
The convergence performance of T-VFL is provided via mathematical analysis and justified by simulation results.
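One plausible reading of the truncation idea, sketched under assumptions: per round only a subset of the secondary users upload fresh features, and the fusion center reuses cached values for the rest. This is a generic illustration, not T-VFL's exact rule.

```python
# Illustrative truncation: per round only m of the n secondary users upload
# fresh features; the fusion center reuses cached values for the rest.
import numpy as np

rng = np.random.default_rng(5)
n_users, dim = 8, 4
cache = np.zeros((n_users, dim))                  # last features received per SU

def truncated_round(fresh_features, m):
    chosen = rng.choice(n_users, size=m, replace=False)
    cache[chosen] = fresh_features[chosen]        # only m uplink transmissions
    return cache.reshape(-1)                      # fused input for the model

fused = truncated_round(rng.normal(size=(n_users, dim)), m=3)
```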
arXiv Detail & Related papers (2022-08-07T10:39:27Z)
- Towards Communication-efficient Vertical Federated Learning Training via Cache-enabled Local Updates [25.85564668511386]
We introduce CELU-VFL, a novel and efficient Vertical Federated Learning framework.
CELU-VFL exploits the local update technique to reduce the cross-party communication rounds.
We show that CELU-VFL can be up to six times faster than the existing works.
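A minimal sketch of the cache-enabled local-update idea: one party fetches the other party's partial scores once, then amortizes that exchange over several local gradient steps against the cached (stale) value. The logistic-regression model is an illustrative assumption.

```python
# Cache-enabled local updates: party A fetches party B's partial scores once,
# then runs several local gradient steps against that cached (stale) value
# before the next exchange. Logistic regression here is an assumption.
import numpy as np

rng = np.random.default_rng(6)
xa, xb = rng.normal(size=(64, 3)), rng.normal(size=(64, 5))
y = rng.integers(0, 2, 64)
wa, wb = np.zeros(3), rng.normal(size=5)

cached_sb = xb @ wb                 # one cross-party communication round ...
for _ in range(10):                 # ... amortized over ten local steps
    p = 1.0 / (1.0 + np.exp(-(xa @ wa + cached_sb)))
    wa -= 0.1 * xa.T @ (p - y) / len(y)
```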
arXiv Detail & Related papers (2022-07-29T12:10:36Z)
- Desirable Companion for Vertical Federated Learning: New Zeroth-Order Gradient Based Algorithm [140.25480610981504]
A complete list of metrics to evaluate VFL algorithms should include model applicability, privacy, communication, and computation efficiency.
We propose a novel VFL framework with black-box scalability.
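The classic two-point zeroth-order estimator behind such gradient-free, black-box training looks as follows; whether the paper uses exactly this form is an assumption.

```python
# The classic two-point zeroth-order gradient estimator: two black-box
# function evaluations per random direction, no backpropagation required.
import numpy as np

rng = np.random.default_rng(7)

def zo_gradient(f, x, mu=1e-4):
    u = rng.normal(size=x.shape)
    return (f(x + mu * u) - f(x)) / mu * u   # estimates the smoothed gradient

f = lambda x: float(np.sum(x ** 2))          # any black-box objective
x = np.array([1.0, -2.0])
print(zo_gradient(f, x), "vs true gradient", 2 * x)
```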
arXiv Detail & Related papers (2022-03-19T13:55:47Z)
- Achieving Model Fairness in Vertical Federated Learning [47.8598060954355]
Vertical federated learning (VFL) enables multiple enterprises possessing non-overlapped features to strengthen their machine learning models without disclosing their private data and model parameters.
VFL suffers from fairness issues, i.e., the learned model may be unfairly discriminatory over the group with sensitive attributes.
We propose a fair VFL framework to tackle this problem.
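One common way to encode such a fairness goal is to penalize the task loss with a demographic-parity gap, sketched below; whether the paper uses this particular relaxation is an assumption.

```python
# Fairness as a penalty: the task loss plus a demographic-parity gap between
# the sensitive groups. Whether the paper uses this relaxation is assumed.
import numpy as np

def fairness_penalized_loss(scores, y, sensitive, lam=1.0):
    p = 1.0 / (1.0 + np.exp(-scores))
    bce = -np.mean(y * np.log(p + 1e-9) + (1 - y) * np.log(1 - p + 1e-9))
    gap = abs(p[sensitive == 1].mean() - p[sensitive == 0].mean())
    return bce + lam * gap                    # trade accuracy for parity via lam

rng = np.random.default_rng(8)
y, s = rng.integers(0, 2, 128), rng.integers(0, 2, 128)
loss = fairness_penalized_loss(rng.normal(size=128), y, s)
```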
arXiv Detail & Related papers (2021-09-17T04:40:11Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences arising from its use.