Data-Free Diversity-Based Ensemble Selection For One-Shot Federated
Learning in Machine Learning Model Market
- URL: http://arxiv.org/abs/2302.11751v1
- Date: Thu, 23 Feb 2023 02:36:27 GMT
- Authors: Naibo Wang, Wenjie Feng, Fusheng Liu, Moming Duan, See-Kiong Ng
- Abstract summary: We present a novel Data-Free Diversity-Based method called DeDES to address the ensemble selection problem for models generated by one-shot federated learning.
Our method achieves both better performance and higher efficiency across 5 datasets and 4 different model structures.
- Score: 2.9046424358155236
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The emerging availability of trained machine learning models has put forward
the novel concept of Machine Learning Model Market in which one can harness the
collective intelligence of multiple well-trained models to improve the
performance of the resultant model through one-shot federated learning and
ensemble learning in a data-free manner. However, picking the models available
in the market for ensemble learning is time-consuming, as using all the models
is not always the best approach. It is thus crucial to have an effective
ensemble selection strategy that can find a good subset of the base models for
the ensemble. Conventional ensemble selection techniques are not applicable, as
we do not have access to the local datasets of the parties in the federated
learning setting. In this paper, we present a novel Data-Free Diversity-Based
method called DeDES to address the ensemble selection problem for models
generated by one-shot federated learning in practical applications such as
model markets. Experiments showed that our method achieves both better
performance and higher efficiency across 5 datasets and 4 different model
structures under different data-partition strategies.
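The data-free selection idea can be illustrated with a minimal, hypothetical sketch (not the paper's actual DeDES algorithm; the function name, the k-means choice, and all parameters here are assumptions for illustration): represent each uploaded model by its flattened weight vector, cluster the vectors, and keep one representative per cluster, so the selected subset stays diverse without touching any party's local data.

```python
# Hypothetical sketch: data-free diversity-based ensemble selection.
# Each model is represented only by its flattened weight vector, so no
# local training data is needed. We cluster the vectors with a few
# Lloyd (k-means) iterations and keep the model closest to each centroid.
import numpy as np

def select_diverse_ensemble(weight_vectors, k, seed=0):
    """Pick up to k mutually diverse models via k-means on weight vectors."""
    rng = np.random.default_rng(seed)
    X = np.stack(weight_vectors).astype(float)
    # Initialize centroids from k distinct models.
    centroids = X[rng.choice(len(X), size=k, replace=False)]
    labels = np.zeros(len(X), dtype=int)
    for _ in range(20):  # a few Lloyd iterations suffice for a sketch
        dists = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
        labels = dists.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                centroids[j] = X[labels == j].mean(axis=0)
    # Representative of each cluster = member closest to its centroid.
    selected = []
    for j in range(k):
        members = np.where(labels == j)[0]
        if members.size:
            best = members[np.argmin(
                np.linalg.norm(X[members] - centroids[j], axis=1))]
            selected.append(int(best))
    return sorted(set(selected))
```

With four toy "models" forming two well-separated weight clusters, the function returns one representative from each cluster, which is the diversity property the selection strategy is after.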
Related papers
- A Two-Phase Recall-and-Select Framework for Fast Model Selection [13.385915962994806]
We propose a two-phase (coarse-recall and fine-selection) model selection framework.
It aims to enhance the efficiency of selecting a robust model by leveraging the models' training performances on benchmark datasets.
It has been demonstrated that the proposed methodology selects a high-performing model about 3 times faster than conventional baseline methods.
arXiv Detail & Related papers (2024-03-28T14:44:44Z)
- Personalized Federated Learning with Contextual Modulation and Meta-Learning [2.7716102039510564]
Federated learning has emerged as a promising approach for training machine learning models on decentralized data sources.
We propose a novel framework that combines federated learning with meta-learning techniques to enhance both efficiency and generalization capabilities.
arXiv Detail & Related papers (2023-12-23T08:18:22Z)
- Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is to handle non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) for tackling the data issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z)
- Fantastic Gains and Where to Find Them: On the Existence and Prospect of General Knowledge Transfer between Any Pretrained Model [74.62272538148245]
We show that for arbitrary pairings of pretrained models, one model extracts significant data context unavailable in the other.
We investigate if it is possible to transfer such "complementary" knowledge from one model to another without performance degradation.
arXiv Detail & Related papers (2023-10-26T17:59:46Z)
- Universal Semi-supervised Model Adaptation via Collaborative Consistency Training [92.52892510093037]
We introduce a realistic and challenging domain adaptation problem called Universal Semi-supervised Model Adaptation (USMA).
We propose a collaborative consistency training framework that regularizes the prediction consistency between two models.
Experimental results demonstrate the effectiveness of our method on several benchmark datasets.
arXiv Detail & Related papers (2023-07-07T08:19:40Z)
- Dataless Knowledge Fusion by Merging Weights of Language Models [51.8162883997512]
Fine-tuning pre-trained language models has become the prevalent paradigm for building downstream NLP models.
This creates a barrier to fusing knowledge across individual models to yield a better single model.
We propose a dataless knowledge fusion method that merges models in their parameter space.
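As a rough illustration of merging models in parameter space, the following sketch uses a plain weighted average of per-parameter tensors, assuming all models share one architecture. This is a hypothetical stand-in: the paper's actual fusion method may differ substantially, and the function name and arguments are assumptions.

```python
# Hypothetical sketch: dataless knowledge fusion by weight merging.
# Assumes every model shares the same architecture, so their state
# dictionaries have identical keys and tensor shapes. The merged model
# is a per-parameter weighted average; no training data is touched.
import numpy as np

def merge_weights(state_dicts, coeffs=None):
    """Average each named parameter tensor across models in parameter space."""
    n = len(state_dicts)
    if coeffs is None:
        coeffs = [1.0 / n] * n  # uniform averaging by default
    merged = {}
    for name in state_dicts[0]:
        merged[name] = sum(c * sd[name] for c, sd in zip(coeffs, state_dicts))
    return merged
```

For example, merging two models whose sole parameter tensors are `[0, 2]` and `[2, 0]` yields `[1, 1]` under uniform coefficients.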
arXiv Detail & Related papers (2022-12-19T20:46:43Z)
- Differentiable Model Selection for Ensemble Learning [37.99501959301896]
This paper proposes a novel framework for differentiable model selection integrating machine learning and optimization.
The framework is tailored for ensemble learning, a strategy that combines the outputs of individually pre-trained models, and learns to select appropriate ensemble members for a particular input sample.
arXiv Detail & Related papers (2022-11-01T03:37:49Z)
- Synthetic Model Combination: An Instance-wise Approach to Unsupervised Ensemble Learning [92.89846887298852]
Consider making a prediction over new test data without any opportunity to learn from a training set of labelled data.
You are given access to a set of expert models and their predictions, alongside some limited information about the dataset used to train them.
arXiv Detail & Related papers (2022-10-11T10:20:31Z)
- Model-Contrastive Federated Learning [92.9075661456444]
Federated learning enables multiple parties to collaboratively train a machine learning model without communicating their local data.
We propose MOON: model-contrastive federated learning.
Our experiments show that MOON significantly outperforms the other state-of-the-art federated learning algorithms on various image classification tasks.
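The model-contrastive idea can be sketched as a contrastive loss over representations: pull a local model's representation toward the global model's and push it away from the previous local model's. This is a hypothetical simplification, not MOON's exact implementation; the function name and the temperature value are assumptions.

```python
# Hypothetical sketch of a model-contrastive loss: for one input, z is the
# current local model's representation, z_glob the global model's, and
# z_prev the previous local model's. Cross-entropy over temperature-scaled
# cosine similarities treats z_glob as the positive and z_prev as the negative.
import numpy as np

def model_contrastive_loss(z, z_glob, z_prev, tau=0.5):
    """Contrastive loss encouraging z to align with z_glob, not z_prev."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    pos = np.exp(cos(z, z_glob) / tau)
    neg = np.exp(cos(z, z_prev) / tau)
    return -np.log(pos / (pos + neg))
```

The loss is smaller when `z` aligns with the global representation than when it aligns with the stale previous-round one, which is the corrective pressure the contrastive term supplies.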
arXiv Detail & Related papers (2021-03-30T11:16:57Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.