Federated Active Learning Framework for Efficient Annotation Strategy in Skin-lesion Classification
- URL: http://arxiv.org/abs/2406.11310v1
- Date: Mon, 17 Jun 2024 08:16:28 GMT
- Title: Federated Active Learning Framework for Efficient Annotation Strategy in Skin-lesion Classification
- Authors: Zhipeng Deng, Yuqiao Yang, Kenji Suzuki
- Abstract summary: Federated Learning (FL) enables multiple institutes to train models collaboratively without sharing private data.
Active learning (AL) has shown promising performance in reducing the number of data annotations in medical image analysis.
We propose a federated AL (FedAL) framework in which AL is executed periodically and interactively under FL.
- Score: 1.8149633401257899
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated Learning (FL) enables multiple institutes to train models collaboratively without sharing private data. Current FL research focuses on communication efficiency, privacy protection, and personalization, and assumes that the data of FL have already been ideally collected. In medical scenarios, however, data annotation demands both expertise and intensive labor, which is a critical problem in FL. Active learning (AL) has shown promising performance in reducing the number of data annotations in medical image analysis. We propose a federated AL (FedAL) framework in which AL is executed periodically and interactively under FL. We exploit a local model in each hospital and a global model acquired from FL to construct an ensemble. We use ensemble-entropy-based AL as an efficient data-annotation strategy in FL. Therefore, our FedAL framework can decrease the amount of annotated data and preserve patient privacy while maintaining the performance of FL. To our knowledge, this is the first FedAL framework applied to medical images. We validated our framework on real-world dermoscopic datasets. Using only 50% of samples, our framework was able to achieve state-of-the-art performance on a skin-lesion classification task. Our framework performed better than several state-of-the-art AL methods under FL and achieved comparable performance to full-data FL.
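The ensemble-entropy-based selection step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes each hospital holds softmax outputs from its local model and the FL global model over its unlabeled pool, averages the two distributions into an ensemble, and sends the highest-entropy samples for annotation. The function name and the toy probability arrays are hypothetical.

```python
import numpy as np

def ensemble_entropy_select(local_probs, global_probs, budget):
    """Pick the `budget` most uncertain unlabeled samples to annotate.

    local_probs, global_probs: (n_samples, n_classes) softmax outputs
    from the hospital's local model and the FL global model.
    """
    # Ensemble prediction: average the two models' class distributions.
    ens = (local_probs + global_probs) / 2.0
    # Shannon entropy of the ensemble prediction for each sample.
    entropy = -np.sum(ens * np.log(ens + 1e-12), axis=1)
    # Highest-entropy (most uncertain) samples are selected for labeling.
    return np.argsort(entropy)[::-1][:budget]

# Toy example: 4 unlabeled samples, 3 lesion classes.
local_p = np.array([[0.90, 0.05, 0.05],
                    [0.40, 0.30, 0.30],
                    [0.10, 0.80, 0.10],
                    [0.34, 0.33, 0.33]])
global_p = np.array([[0.80, 0.10, 0.10],
                     [0.30, 0.40, 0.30],
                     [0.20, 0.70, 0.10],
                     [0.30, 0.40, 0.30]])
picked = ensemble_entropy_select(local_p, global_p, budget=2)
# Samples where the two models disagree or are unsure rank highest.
```

Because only sample indices (and, later, labels) are produced locally, the raw images never leave the hospital, which is what lets the strategy preserve patient privacy.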
Related papers
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on the effectiveness of utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
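The aggregate-then-adapt pattern this entry contrasts against is typified by FedAvg-style server aggregation, which can be sketched as below. This is an illustrative sketch of generic federated averaging, not of FedAF itself; the function name and toy arrays are hypothetical.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    """Server-side step: average client parameters, weighted by data size.

    client_weights: list of 1-D arrays (flattened model parameters)
    client_sizes:   number of local training samples held by each client
    """
    total = sum(client_sizes)
    agg = np.zeros_like(client_weights[0], dtype=float)
    for w, n in zip(client_weights, client_sizes):
        agg += (n / total) * w  # larger clients pull the average harder
    return agg

# Two clients; the second holds 3x the data of the first.
clients = [np.array([1.0, 2.0]), np.array([3.0, 4.0])]
sizes = [1, 3]
global_w = fedavg(clients, sizes)
```

In the aggregate-then-adapt cycle, this aggregated `global_w` is broadcast back and each client resumes local training from it; an aggregation-free method such as FedAF replaces this server-side averaging step.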
arXiv Detail & Related papers (2024-04-29T05:55:23Z) - A Survey on Efficient Federated Learning Methods for Foundation Model Training [66.19763977571114]
Federated Learning (FL) has become an established technique to facilitate privacy-preserving collaborative training across a multitude of clients.
In the wake of Foundation Models (FM), the reality is different for many deep learning applications.
We discuss the benefits and drawbacks of parameter-efficient fine-tuning (PEFT) for FL applications.
arXiv Detail & Related papers (2024-01-09T10:22:23Z) - Where to Begin? From Random to Foundation Model Instructed Initialization in Federated Learning for Medical Image Segmentation [11.412151951949102]
In medical image analysis, Federated Learning (FL) is a key technology that enables privacy-preserved, decentralized data processing.
We propose a novel perspective: exploring the impact of using the foundation model with enormous pre-trained knowledge.
arXiv Detail & Related papers (2023-11-27T00:29:10Z) - A Comprehensive View of Personalized Federated Learning on Heterogeneous Clinical Datasets [0.4926316920996346]
Federated learning (FL) is a key approach to overcoming the data silos that so frequently obstruct the training and deployment of machine-learning models in clinical settings.
This work contributes to a growing body of FL research specifically focused on clinical applications along three important directions.
arXiv Detail & Related papers (2023-09-28T20:12:17Z) - ConDistFL: Conditional Distillation for Federated Learning from Partially Annotated Data [5.210280120905009]
"ConDistFL" is a framework to combine Federated Learning (FL) with knowledge distillation.
We validate our framework on four distinct partially annotated abdominal CT datasets from the MSD and KiTS19 challenges.
Our ablation study suggests that ConDistFL can perform well without frequent aggregation, reducing the communication cost of FL.
arXiv Detail & Related papers (2023-08-08T06:07:49Z) - FL Games: A Federated Learning Framework for Distribution Shifts [71.98708418753786]
Federated learning aims to train predictive models for data that is distributed across clients, under the orchestration of a server.
We propose FL GAMES, a game-theoretic framework for federated learning that learns causal features that are invariant across clients.
arXiv Detail & Related papers (2022-10-31T22:59:03Z) - Federated Learning with Privacy-Preserving Ensemble Attention Distillation [63.39442596910485]
Federated Learning (FL) is a machine learning paradigm where many local nodes collaboratively train a central model while keeping the training data decentralized.
We propose a privacy-preserving FL framework leveraging unlabeled public data for one-way offline knowledge distillation.
Our technique uses decentralized and heterogeneous local data like existing FL approaches, but more importantly, it significantly reduces the risk of privacy leakage.
arXiv Detail & Related papers (2022-10-16T06:44:46Z) - FLamby: Datasets and Benchmarks for Cross-Silo Federated Learning in Realistic Healthcare Settings [51.09574369310246]
Federated Learning (FL) is a novel approach enabling several clients holding sensitive data to collaboratively train machine learning models.
We propose a novel cross-silo dataset suite focused on healthcare, FLamby, to bridge the gap between theory and practice of cross-silo FL.
Our flexible and modular suite allows researchers to easily download datasets, reproduce results and re-use the different components for their research.
arXiv Detail & Related papers (2022-10-10T12:17:30Z) - Federated Active Learning (F-AL): an Efficient Annotation Strategy for Federated Learning [8.060606972572451]
Federated learning (FL) has been intensively investigated in terms of communication efficiency, privacy, and fairness.
We propose to apply active learning (AL) and sampling strategy into the FL framework to reduce the annotation workload.
We empirically demonstrate that the F-AL outperforms baseline methods in image classification tasks.
arXiv Detail & Related papers (2022-02-01T03:17:29Z) - Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the information it provides and is not responsible for any consequences of its use.