Federated Learning Across Decentralized and Unshared Archives for Remote Sensing Image Classification
- URL: http://arxiv.org/abs/2311.06141v3
- Date: Fri, 14 Jun 2024 15:52:13 GMT
- Title: Federated Learning Across Decentralized and Unshared Archives for Remote Sensing Image Classification
- Authors: Barış Büyüktaş, Gencer Sumbul, Begüm Demir
- Abstract summary: Federated learning (FL) enables multiple deep learning models to collaboratively learn from decentralized data archives (i.e., clients) without accessing the data stored on those clients.
Although FL offers ample opportunities for knowledge discovery from distributed image archives, it is seldom considered in remote sensing (RS).
We present a comparative study of state-of-the-art FL algorithms for RS image classification problems.
- Score: 2.725507329935916
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Federated learning (FL) enables multiple deep learning models to collaboratively learn from decentralized data archives (i.e., clients) without accessing the data stored on those clients. Although FL offers ample opportunities for knowledge discovery from distributed image archives, it is seldom considered in remote sensing (RS). In this paper, for the first time in RS, we present a comparative study of state-of-the-art FL algorithms for RS image classification problems. To this end, we initially provide a systematic review of the FL algorithms presented in the computer vision and machine learning communities. Then, we select several state-of-the-art FL algorithms based on their effectiveness with respect to training data heterogeneity across clients (known as non-IID data). After presenting an extensive overview of the selected algorithms, a theoretical comparison of the algorithms is conducted based on their: 1) local training complexity; 2) aggregation complexity; 3) learning efficiency; 4) communication cost; and 5) scalability in terms of the number of clients. After the theoretical comparison, experimental analyses are presented to compare them under different decentralization scenarios. For the experimental analyses, we focus our attention on multi-label image classification problems in RS. Based on our comprehensive analyses, we finally derive a guideline for selecting suitable FL algorithms in RS. The code of this work is publicly available at https://git.tu-berlin.de/rsim/FL-RS.
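To make the FL setup described in the abstract concrete, the following is a minimal FedAvg-style sketch of one federated round for multi-label image classification: each client trains a copy of the global model on its own (unshared) data, and the server aggregates the updates by weighted averaging. This is an illustrative example only, not the paper's released code at https://git.tu-berlin.de/rsim/FL-RS; all names and hyperparameters (SimpleClassifier, make_client_loaders, NUM_CLIENTS, etc.) are assumptions.

```python
# Minimal FedAvg-style sketch for multi-label image classification (illustrative only).
import copy
import torch
import torch.nn as nn
from torch.utils.data import DataLoader, TensorDataset

NUM_CLIENTS = 4
LOCAL_EPOCHS = 1
NUM_CLASSES = 10  # multi-label: each image can carry several labels

class SimpleClassifier(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(16, NUM_CLASSES),
        )
    def forward(self, x):
        return self.net(x)  # logits; paired with BCEWithLogitsLoss for multi-label targets

def make_client_loaders():
    """Synthetic, unequally sized client datasets standing in for unshared archives."""
    loaders = []
    for c in range(NUM_CLIENTS):
        n = 32 * (c + 1)  # unequal client sizes
        x = torch.randn(n, 3, 32, 32)
        y = (torch.rand(n, NUM_CLASSES) > 0.7).float()  # random multi-label targets
        loaders.append(DataLoader(TensorDataset(x, y), batch_size=16, shuffle=True))
    return loaders

def local_train(global_model, loader):
    """Client-side step: train a copy of the global model on local data only."""
    model = copy.deepcopy(global_model)
    opt = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.BCEWithLogitsLoss()
    model.train()
    for _ in range(LOCAL_EPOCHS):
        for x, y in loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
    return model.state_dict(), len(loader.dataset)

def aggregate(updates):
    """Server-side step: dataset-size-weighted average of client parameters (FedAvg)."""
    total = sum(n for _, n in updates)
    avg = copy.deepcopy(updates[0][0])
    for key in avg:
        avg[key] = sum(sd[key].float() * (n / total) for sd, n in updates)
    return avg

global_model = SimpleClassifier()
client_loaders = make_client_loaders()
for round_idx in range(3):  # a few communication rounds
    updates = [local_train(global_model, dl) for dl in client_loaders]
    global_model.load_state_dict(aggregate(updates))
    print(f"round {round_idx}: aggregated {len(updates)} client updates")
```

The communication cost, aggregation complexity, and scalability criteria compared in the paper all attach to variations of these two steps (what each client sends, and how the server combines it).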
Related papers
- A Framework for testing Federated Learning algorithms using an edge-like environment [0.0]
Federated Learning (FL) is a machine learning paradigm in which many clients cooperatively train a single centralized model while keeping their data private and decentralized.
It is non-trivial to accurately evaluate the contributions of local models in global centralized model aggregation.
This is an example of a major challenge in FL, commonly known as data imbalance or class imbalance.
In this work, a framework is proposed and implemented to assess FL algorithms in an easier and more scalable way.
arXiv Detail & Related papers (2024-07-17T19:52:53Z) - Not All Federated Learning Algorithms Are Created Equal: A Performance Evaluation Study [1.9265466185360185]
Federated Learning (FL) emerged as a practical approach to training a model from decentralized data.
To bridge this gap, we conduct extensive performance evaluation on several canonical FL algorithms.
Our comprehensive measurement study reveals that no single algorithm works best across different performance metrics.
arXiv Detail & Related papers (2024-03-26T00:33:49Z) - Communication Efficient ConFederated Learning: An Event-Triggered SAGA Approach [67.27031215756121]
Federated learning (FL) is a machine learning paradigm that targets model training without gathering local data from various data sources.
Standard FL, which employs a single server, can only support a limited number of users, leading to degraded learning capability.
In this work, we consider a multi-server FL framework, referred to as Confederated Learning (CFL), in order to accommodate a larger number of users.
arXiv Detail & Related papers (2024-02-28T03:27:10Z) - Faster Adaptive Federated Learning [84.38913517122619]
Federated learning has attracted increasing attention with the emergence of distributed data.
In this paper, we propose an efficient adaptive algorithm (i.e., FAFED) based on a momentum-based variance reduction technique in cross-silo FL.
arXiv Detail & Related papers (2022-12-02T05:07:50Z) - Efficient Image Representation Learning with Federated Sampled Softmax [2.5557803548119464]
Federated sampled softmax (FedSS) is a resource-efficient approach for learning image representation with Federated Learning.
We show that our method significantly reduces the number of parameters transferred to and optimized by the client devices.
arXiv Detail & Related papers (2022-03-09T17:00:32Z) - Local Learning Matters: Rethinking Data Heterogeneity in Federated Learning [61.488646649045215]
Federated learning (FL) is a promising strategy for performing privacy-preserving, distributed learning with a network of clients (i.e., edge devices).
arXiv Detail & Related papers (2021-11-28T19:03:39Z) - FedDropoutAvg: Generalizable federated learning for histopathology image classification [11.509801043891837]
Federated learning (FL) enables collaborative learning of a deep learning model without sharing the data of participating sites.
We propose FedDropoutAvg, a new federated learning approach for training a generalizable model.
We show that the proposed approach is more generalizable than other state-of-the-art federated training approaches.
arXiv Detail & Related papers (2021-11-25T19:30:37Z) - FedCV: A Federated Learning Framework for Diverse Computer Vision Tasks [38.012182901565616]
Federated Learning (FL) is a distributed learning paradigm that can learn a global or personalized model from decentralized datasets on edge devices.
FL has rarely been demonstrated effectively in advanced computer vision tasks such as object detection and image segmentation.
We provide non-I.I.D. benchmarking datasets, models, and various reference FL algorithms.
arXiv Detail & Related papers (2021-11-22T09:26:08Z) - Exploiting Shared Representations for Personalized Federated Learning [54.65133770989836]
We propose a novel federated learning framework and algorithm for learning a shared data representation across clients and unique local heads for each client.
Our algorithm harnesses the distributed computational power across clients to perform many local updates with respect to the low-dimensional local parameters for every update of the representation.
This result is of interest beyond federated learning to a broad class of problems in which we aim to learn a shared low-dimensional representation among data distributions.
arXiv Detail & Related papers (2021-02-14T05:36:25Z) - FedML: A Research Library and Benchmark for Federated Machine Learning [55.09054608875831]
Federated learning (FL) is a rapidly growing research field in machine learning.
Existing FL libraries cannot adequately support diverse algorithmic development.
We introduce FedML, an open research library and benchmark to facilitate FL algorithm development and fair performance comparison.
arXiv Detail & Related papers (2020-07-27T13:02:08Z) - FedPD: A Federated Learning Framework with Optimal Rates and Adaptivity to Non-IID Data [59.50904660420082]
Federated Learning (FL) has become a popular paradigm for learning from distributed data.
To effectively utilize data at different devices without moving them to the cloud, algorithms such as the Federated Averaging (FedAvg) have adopted a "computation then aggregation" (CTA) model.
arXiv Detail & Related papers (2020-05-22T23:07:42Z)
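Several of the works above, like the main paper, evaluate algorithms under non-IID training data, i.e., heterogeneous label distributions across clients. A common way to simulate such heterogeneity in single-label benchmarks is a Dirichlet split of class labels over clients; the helper below is an illustrative sketch under that assumption (the function name partition_non_iid and the parameter values are hypothetical, not code from any of the listed papers).

```python
# Simulating non-IID client data with a Dirichlet label split (illustrative sketch).
import numpy as np

def partition_non_iid(labels, num_clients, alpha=0.5, seed=0):
    """Assign sample indices to clients using per-class Dirichlet(alpha) proportions.

    Smaller alpha -> more skewed (more heterogeneous) client label distributions.
    """
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels)
    client_indices = [[] for _ in range(num_clients)]
    for cls in np.unique(labels):
        cls_idx = rng.permutation(np.where(labels == cls)[0])
        # Fraction of this class that each client receives.
        proportions = rng.dirichlet(alpha * np.ones(num_clients))
        splits = (np.cumsum(proportions)[:-1] * len(cls_idx)).astype(int)
        for client_id, part in enumerate(np.split(cls_idx, splits)):
            client_indices[client_id].extend(part.tolist())
    return client_indices

# Example: 1000 samples with 10 classes, split across 4 clients.
labels = np.random.randint(0, 10, size=1000)
parts = partition_non_iid(labels, num_clients=4, alpha=0.3)
print([len(p) for p in parts])
```

Lower values of alpha concentrate each class on a few clients, which is the regime where the robustness of FL algorithms to data heterogeneity is typically stressed.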
This list is automatically generated from the titles and abstracts of the papers in this site.