Federated Whole Prostate Segmentation in MRI with Personalized Neural
Architectures
- URL: http://arxiv.org/abs/2107.08111v1
- Date: Fri, 16 Jul 2021 20:35:29 GMT
- Title: Federated Whole Prostate Segmentation in MRI with Personalized Neural
Architectures
- Authors: Holger R. Roth, Dong Yang, Wenqi Li, Andriy Myronenko, Wentao Zhu,
Ziyue Xu, Xiaosong Wang, Daguang Xu
- Abstract summary: Federated learning (FL) is a way to train machine learning models without the need for centralized datasets.
In this work, we combine FL with an AutoML technique based on local neural architecture search by training a "supernet".
The proposed method is evaluated on four different datasets from 3D prostate MRI and shown to improve the local models' performance after adaptation.
- Score: 11.563695244722613
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Building robust deep learning-based models requires diverse training data,
ideally from several sources. However, these datasets cannot be combined easily
because of patient privacy concerns or regulatory hurdles, especially if
medical data is involved. Federated learning (FL) is a way to train machine
learning models without the need for centralized datasets. Each FL client
trains on their local data while only sharing model parameters with a global
server that aggregates the parameters from all clients. At the same time, each
client's data can exhibit differences and inconsistencies due to the local
variation in the patient population, imaging equipment, and acquisition
protocols. Hence, the federated learned models should be able to adapt to the
local particularities of a client's data. In this work, we combine FL with an
AutoML technique based on local neural architecture search by training a
"supernet". Furthermore, we propose an adaptation scheme to allow for
personalized model architectures at each FL client's site. The proposed method
is evaluated on four different datasets from 3D prostate MRI and is shown to
improve the local models' performance after adaptation by selecting an optimal
path through the AutoML supernet.
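The client-side adaptation step can be pictured as a search over candidate paths through the trained supernet, scored on each client's local validation data. The sketch below is illustrative only: it assumes a hypothetical supernet whose forward pass accepts a `path` argument (one operation choice per block) and uses an exhaustive search with a Dice criterion; the paper's actual search strategy and interface may differ.

```python
import itertools
import torch

def dice_score(pred, target, eps=1e-6):
    """Dice overlap between a binary prediction and a target mask."""
    inter = (pred * target).sum()
    return (2 * inter + eps) / (pred.sum() + target.sum() + eps)

@torch.no_grad()
def select_best_path(supernet, val_loader, num_blocks, ops_per_block):
    """Score every candidate path through the supernet on a client's
    local validation set and return the best one (an illustrative
    exhaustive search; the paper's search may be smarter)."""
    best_path, best_dice = None, -1.0
    for path in itertools.product(range(ops_per_block), repeat=num_blocks):
        scores = []
        for image, mask in val_loader:
            # `path=` is a hypothetical interface: one op choice per block.
            pred = (supernet(image, path=path).sigmoid() > 0.5).float()
            scores.append(dice_score(pred, mask))
        mean_dice = torch.stack(scores).mean().item()
        if mean_dice > best_dice:
            best_path, best_dice = path, mean_dice
    return best_path, best_dice
```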
Related papers
- Multi-Level Additive Modeling for Structured Non-IID Federated Learning [54.53672323071204]
We train models organized in a multi-level structure, called Multi-level Additive Models (MAM), for better knowledge-sharing across heterogeneous clients.
In federated MAM (FeMAM), each client is assigned to at most one model per level and its personalized prediction sums up the outputs of models assigned to it across all levels.
Experiments show that FeMAM surpasses existing clustered FL and personalized FL methods in various non-IID settings.
arXiv Detail & Related papers (2024-05-26T07:54:53Z)
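As a reading aid, here is a minimal sketch of the additive prediction rule described in the FeMAM summary above; `assigned_models` (the per-level models assigned to one client) is an illustrative name, not from the paper.

```python
def femam_predict(x, assigned_models):
    """FeMAM-style additive prediction: the personalized output is the
    sum of the outputs of the models (at most one per level) assigned
    to this client."""
    out = None
    for model in assigned_models:  # ordered by level
        y = model(x)
        out = y if out is None else out + y
    return out
```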
- FAM: fast adaptive federated meta-learning [10.980548731600116]
We propose a fast adaptive federated meta-learning (FAM) framework for collaboratively learning a single global model.
A skeleton network is grown on each client to train a personalized model by learning additional client-specific parameters from local data.
The personalized client models outperformed the locally trained models, demonstrating the efficacy of the FAM mechanism.
arXiv Detail & Related papers (2023-08-26T22:54:45Z)
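One plausible reading of the FAM summary above is a shared skeleton combined with client-specific parameters trained only on local data; the sketch below assumes the skeleton stays fixed during local training, which the summary does not confirm, and all names (`FAMClientModel`, `feat_dim`) are hypothetical.

```python
import torch.nn as nn

class FAMClientModel(nn.Module):
    """Illustrative FAM-style personalization: a shared skeleton from
    the server is kept fixed while a small set of client-specific
    parameters is learned from local data."""
    def __init__(self, skeleton: nn.Module, feat_dim: int, num_classes: int):
        super().__init__()
        self.skeleton = skeleton
        for p in self.skeleton.parameters():
            p.requires_grad_(False)  # shared part, not updated locally
        # client-specific parameters trained only on local data
        self.local_head = nn.Linear(feat_dim, num_classes)

    def forward(self, x):
        return self.local_head(self.skeleton(x))
```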
- Personalized Federated Learning with Multi-branch Architecture [0.0]
Federated learning (FL) enables multiple clients to collaboratively train models without requiring clients to reveal their raw data to each other.
We propose a new PFL method (pFedMB) using a multi-branch architecture, which achieves personalization by splitting each layer of a neural network into multiple branches and assigning client-specific weights to each branch.
We experimentally show that pFedMB performs better than state-of-the-art PFL methods on the CIFAR10 and CIFAR100 datasets.
arXiv Detail & Related papers (2022-11-15T06:30:57Z)
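The branch-splitting idea in the pFedMB summary above might look roughly like the following sketch. It is an assumption-laden illustration: branch parameters would be aggregated by the server while the mixing weights stay local, and all names (`MultiBranchLayer`, `branch_logits`) are hypothetical.

```python
import torch
import torch.nn as nn

class MultiBranchLayer(nn.Module):
    """Sketch of a pFedMB-style layer: the layer is split into parallel
    branches whose outputs are mixed by client-specific weights."""
    def __init__(self, in_dim, out_dim, num_branches=3):
        super().__init__()
        self.branches = nn.ModuleList(
            [nn.Linear(in_dim, out_dim) for _ in range(num_branches)])
        # client-specific mixing weights, kept out of server aggregation
        self.branch_logits = nn.Parameter(torch.zeros(num_branches))

    def forward(self, x):
        weights = torch.softmax(self.branch_logits, dim=0)
        return sum(w * branch(x) for w, branch in zip(weights, self.branches))
```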
- IOP-FL: Inside-Outside Personalization for Federated Medical Image Segmentation [18.65229252289727]
Federated learning allows multiple medical institutions to collaboratively learn a global model without centralizing client data.
We propose a novel unified framework for both Inside and Outside model Personalization in FL (IOP-FL).
Our experimental results on two medical image segmentation tasks present significant improvements over SOTA methods on both inside and outside personalization.
arXiv Detail & Related papers (2022-04-16T08:26:19Z)
- Acceleration of Federated Learning with Alleviated Forgetting in Local Training [61.231021417674235]
Federated learning (FL) enables distributed optimization of machine learning models while protecting privacy.
We propose FedReg, an algorithm to accelerate FL with alleviated knowledge forgetting in the local training stage.
Our experiments demonstrate that FedReg significantly improves the convergence rate of FL, especially when the neural network architecture is deep.
arXiv Detail & Related papers (2022-03-05T02:31:32Z)
- Personalized Federated Learning through Local Memorization [10.925242558525683]
Federated learning allows clients to collaboratively learn statistical models while keeping their data local.
Recent personalized federated learning methods train a separate model for each client while still leveraging the knowledge available at other clients.
We show on a suite of federated datasets that this approach achieves significantly higher accuracy and fairness than state-of-the-art methods.
arXiv Detail & Related papers (2021-11-17T19:40:07Z)
- Personalized Retrogress-Resilient Framework for Real-World Medical Federated Learning [8.240098954377794]
We propose a personalized retrogress-resilient framework to produce a superior personalized model for each client.
Our experiments on a real-world dermoscopic FL dataset show that our personalized retrogress-resilient framework outperforms state-of-the-art FL methods.
arXiv Detail & Related papers (2021-10-01T13:24:29Z)
- A Bayesian Federated Learning Framework with Online Laplace Approximation [144.7345013348257]
Federated learning allows multiple clients to collaboratively learn a globally shared model.
We propose a novel FL framework that uses online Laplace approximation to approximate posteriors on both the client and server side.
We achieve state-of-the-art results on several benchmarks, clearly demonstrating the advantages of the proposed method.
arXiv Detail & Related papers (2021-02-03T08:36:58Z)
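With Laplace approximation, each client's posterior over the weights is a Gaussian, and a standard way to fuse Gaussians on the server is the product-of-Gaussians rule sketched below. The summary does not confirm that the paper uses exactly this rule; diagonal precisions and all names are assumptions for illustration.

```python
import numpy as np

def aggregate_gaussian_posteriors(means, precisions):
    """Fuse per-client Gaussian (Laplace-approximated) posteriors over
    the model weights with the product-of-Gaussians rule: precisions
    add, and the mean is the precision-weighted average. Diagonal
    precisions are assumed for simplicity."""
    server_precision = np.sum(precisions, axis=0)
    server_mean = np.sum(
        [p * m for p, m in zip(precisions, means)], axis=0) / server_precision
    return server_mean, server_precision
```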
- Personalized Federated Learning with First Order Model Optimization [76.81546598985159]
We propose an alternative to federated learning, where each client federates with other relevant clients to obtain a stronger model for its client-specific objectives.
We do not assume knowledge of underlying data distributions or client similarities, and allow each client to optimize for arbitrary target distributions of interest.
Our method outperforms existing alternatives, while also enabling new features for personalized FL such as transfer outside of local data distributions.
arXiv Detail & Related papers (2020-12-15T19:30:29Z)
- Federated Mutual Learning [65.46254760557073]
Federated Mutual Learning (FML) allows clients to train a generalized model collaboratively and a personalized model independently.
The experiments show that FML can achieve better performance than alternatives in the typical federated learning setting.
arXiv Detail & Related papers (2020-06-27T09:35:03Z)
- Ensemble Distillation for Robust Model Fusion in Federated Learning [72.61259487233214]
Federated Learning (FL) is a machine learning setting where many devices collaboratively train a machine learning model.
In most current training schemes, the central model is refined by averaging the parameters of the server model and the updated parameters from the client side.
We propose ensemble distillation for model fusion, i.e., training the central classifier on unlabeled data against the outputs of the client models.
arXiv Detail & Related papers (2020-06-12T14:49:47Z)
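The fusion step described in the summary above can be sketched as a distillation update: the central model matches the averaged predictions of the client models on unlabeled data. The temperature `T`, the KL objective, and the function signature are illustrative assumptions, not the paper's exact recipe.

```python
import torch
import torch.nn.functional as F

def distill_step(central, client_models, unlabeled_x, optimizer, T=1.0):
    """One ensemble-distillation fusion step: train the central model
    to match the averaged (softened) predictions of the client models
    on a batch of unlabeled data."""
    with torch.no_grad():
        teacher = torch.stack(
            [F.softmax(m(unlabeled_x) / T, dim=-1) for m in client_models]
        ).mean(dim=0)
    student_log_probs = F.log_softmax(central(unlabeled_x) / T, dim=-1)
    loss = F.kl_div(student_log_probs, teacher, reduction="batchmean")
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```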
This list is automatically generated from the titles and abstracts of the papers on this site.