Multi-Participant Multi-Class Vertical Federated Learning
- URL: http://arxiv.org/abs/2001.11154v1
- Date: Thu, 30 Jan 2020 02:39:50 GMT
- Title: Multi-Participant Multi-Class Vertical Federated Learning
- Authors: Siwei Feng and Han Yu
- Abstract summary: We propose the Multi-participant Multi-class Vertical Federated Learning (MMVFL) framework for multi-class VFL problems involving multiple parties.
MMVFL enables label sharing from its owner to other VFL participants in a privacy-preserving manner.
Experiment results on real-world datasets show that MMVFL can effectively share label information among multiple VFL participants and match multi-class classification performance of existing approaches.
- Score: 16.75182305714081
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Federated learning (FL) is a privacy-preserving paradigm for training
collective machine learning models with locally stored data from multiple
participants. Vertical federated learning (VFL) deals with the case where participants share the same sample ID space but have different feature spaces, while label information is owned by one participant. Current studies of VFL only support two participants, and mostly focus on binary-class logistic
regression problems. In this paper, we propose the Multi-participant
Multi-class Vertical Federated Learning (MMVFL) framework for multi-class VFL
problems involving multiple parties. Extending the idea of multi-view learning
(MVL), MMVFL enables label sharing from its owner to other VFL participants in
a privacy-preserving manner. To demonstrate the effectiveness of MMVFL, a
feature selection scheme is incorporated into MMVFL to compare its performance
against supervised feature selection and MVL-based approaches. Experiment
results on real-world datasets show that MMVFL can effectively share label
information among multiple VFL participants and match multi-class
classification performance of existing approaches.
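To make the setting concrete, below is a minimal sketch of a vertically partitioned, multi-participant setup in which the label owner shares only pseudo-labels produced by a local model rather than raw labels. This is an illustration only, not the authors' MMVFL implementation: the dataset, the feature split, and the pseudo-label mechanism are assumptions made for this example.

```python
# Minimal sketch of multi-participant vertical FL with privacy-preserving
# label sharing. Illustrative assumptions only; not the MMVFL code.
import numpy as np
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# All participants share the same sample ID space (rows) but hold
# disjoint feature subsets (columns) -- the vertical partition.
X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

n_participants = 3
feature_blocks = np.array_split(rng.permutation(X.shape[1]), n_participants)

# Participant 0 is the label owner. It never sends raw labels; instead it
# trains a local model on its own feature block and shares only the
# resulting pseudo-labels with the other participants.
owner_cols = feature_blocks[0]
owner_model = LogisticRegression(max_iter=1000).fit(X_train[:, owner_cols], y_train)
pseudo_labels = owner_model.predict(X_train[:, owner_cols])

# Each remaining participant trains a local multi-class model on its own
# feature block, supervised only by the shared pseudo-labels.
for pid in range(1, n_participants):
    cols = feature_blocks[pid]
    model = LogisticRegression(max_iter=1000).fit(X_train[:, cols], pseudo_labels)
    acc = model.score(X_test[:, cols], y_test)
    print(f"participant {pid}: test accuracy with shared labels = {acc:.3f}")
```

In MMVFL itself the label transfer is formulated within a multi-view learning framework with privacy protection; this toy version only conveys the data layout and the direction of label flow from the label owner to the other participants.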
Related papers
- VFL-RPS: Relevant Participant Selection in Vertical Federated Learning [0.06181089784338582]
Federated Learning (FL) allows collaboration between different parties, while ensuring that the data across these parties is not shared.
We propose VFL-RPS, a novel method for participant selection in VFL, applied as a pre-training step.
We show that our method outperforms existing methods for participant selection in VFL.
arXiv Detail & Related papers (2025-02-20T09:05:55Z)
- Vertical Federated Learning in Practice: The Good, the Bad, and the Ugly [42.31182713177944]
This survey analyzes the real-world data distributions in potential Vertical Federated Learning (VFL) applications.
We propose a novel data-oriented taxonomy of VFL algorithms based on real VFL data distributions.
Based on these observations, we outline key research directions aimed at bridging the gap between current VFL research and real-world applications.
arXiv Detail & Related papers (2025-02-12T07:03:32Z)
- Federated Transformer: Multi-Party Vertical Federated Learning on Practical Fuzzily Linked Data [27.073959939557362]
We introduce the Federated Transformer (FeT), a novel framework that supports multi-party fuzzy VFL with fuzzy identifiers.
Our experiments demonstrate that the FeT surpasses the baseline models by up to 46% in terms of accuracy when scaled to 50 parties.
In two-party fuzzy VFL settings, FeT also shows improved performance and privacy over cutting-edge VFL models.
arXiv Detail & Related papers (2024-10-23T16:00:14Z)
- Towards Active Participant Centric Vertical Federated Learning: Some Representations May Be All You Need [0.4711628883579317]
This work introduces a novel approach to VFL, Active Participant Centric VFL (APC-VFL).
APC-VFL excels in scenarios where data samples among participants are only partially aligned during training.
It consistently outperforms other VFL methods across three popular VFL datasets in terms of F1, accuracy and communication costs.
arXiv Detail & Related papers (2024-10-23T08:07:00Z)
- Vertical Federated Learning for Effectiveness, Security, Applicability: A Survey [67.48187503803847]
Vertical Federated Learning (VFL) is a privacy-preserving distributed learning paradigm.
Recent research has shown promising results addressing various challenges in VFL.
This survey offers a systematic overview of recent developments.
arXiv Detail & Related papers (2024-05-25T16:05:06Z)
- Multi-View Class Incremental Learning [57.14644913531313]
Multi-view learning (MVL) has gained great success in integrating information from multiple perspectives of a dataset to improve downstream task performance.
This paper investigates a novel paradigm called multi-view class incremental learning (MVCIL), where a single model incrementally classifies new classes from a continual stream of views.
arXiv Detail & Related papers (2023-06-16T08:13:41Z)
- A Survey on Vertical Federated Learning: From a Layered Perspective [21.639062199459925]
In this paper, we investigate the current work of vertical federated learning (VFL) from a layered perspective.
We design a novel MOSP tree taxonomy to analyze the core component of VFL, i.e., secure vertical federated machine learning algorithm.
Our taxonomy considers four dimensions, i.e., machine learning model (M), protection object (O), security model (S), and privacy-preserving protocol (P).
arXiv Detail & Related papers (2023-04-04T14:33:30Z)
- Vertical Semi-Federated Learning for Efficient Online Advertising [50.18284051956359]
Semi-VFL (Vertical Semi-Federated Learning) is proposed as a practical way to apply VFL in industrial settings.
We build an inference-efficient single-party student model applicable to the whole sample space.
New representation distillation methods are designed to extract cross-party feature correlations for both the overlapped and non-overlapped data.
arXiv Detail & Related papers (2022-09-30T17:59:27Z)
- A Framework of Meta Functional Learning for Regularising Knowledge Transfer [89.74127682599898]
This work proposes a novel framework of Meta Functional Learning (MFL) by meta-learning a generalisable functional model from data-rich tasks.
MFL computes meta-knowledge on functional regularisation that generalises across learning tasks; under this regularisation, functional training on limited labelled data promotes learning more discriminative functions.
arXiv Detail & Related papers (2022-03-28T15:24:09Z)
- Efficient Split-Mix Federated Learning for On-Demand and In-Situ Customization [107.72786199113183]
Federated learning (FL) provides a distributed learning framework for multiple participants to collaborate learning without sharing raw data.
In this paper, we propose a novel Split-Mix FL strategy for heterogeneous participants that, once training is done, provides in-situ customization of model sizes and robustness.
arXiv Detail & Related papers (2022-03-18T04:58:34Z)
- Mobility-Aware Cluster Federated Learning in Hierarchical Wireless Networks [81.83990083088345]
We develop a theoretical model to characterize the hierarchical federated learning (HFL) algorithm in wireless networks.
Our analysis proves that the learning performance of HFL deteriorates drastically with highly mobile users.
To circumvent these issues, we propose a mobility-aware cluster federated learning (MACFL) algorithm.
arXiv Detail & Related papers (2021-08-20T10:46:58Z)
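For the hierarchical federated learning entry above, a rough sketch of the generic client-edge-cloud aggregation pattern is given below. This is only an illustration of hierarchical averaging under assumed names, shapes, and update rules; it is not the MACFL algorithm proposed in that paper.

```python
# Toy sketch of two-level hierarchical federated averaging (client -> edge -> cloud).
# Illustrative assumptions only; not the MACFL algorithm from the paper.
import numpy as np

rng = np.random.default_rng(1)
DIM = 8                       # model parameter dimension (assumed)
clients_per_edge = [3, 4, 2]  # three edge servers with different client counts

def local_update(global_model):
    # Stand-in for local training: each client perturbs the global model.
    return global_model + 0.1 * rng.standard_normal(DIM)

cloud_model = np.zeros(DIM)
for rnd in range(5):
    edge_models = []
    for n_clients in clients_per_edge:
        # Each edge server averages the updates of its own clients.
        client_models = [local_update(cloud_model) for _ in range(n_clients)]
        edge_models.append(np.mean(client_models, axis=0))
    # The cloud averages edge models, weighted by how many clients each serves.
    weights = np.array(clients_per_edge) / sum(clients_per_edge)
    cloud_model = np.average(edge_models, axis=0, weights=weights)
    print(f"round {rnd}: cloud model norm = {np.linalg.norm(cloud_model):.3f}")
```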