The FeatureCloud AI Store for Federated Learning in Biomedicine and
Beyond
- URL: http://arxiv.org/abs/2105.05734v1
- Date: Wed, 12 May 2021 15:31:46 GMT
- Title: The FeatureCloud AI Store for Federated Learning in Biomedicine and
Beyond
- Authors: Julian Matschinske, Julian Späth, Reza Nasirigerdeh, Reihaneh
Torkzadehmahani, Anne Hartebrodt, Balázs Orbán, Sándor Fejér, Olga
Zolotareva, Mohammad Bakhtiari, Béla Bihari, Marcus Bloice, Nina C Donner,
Walid Fdhila, Tobias Frisch, Anne-Christin Hauschild, Dominik Heider, Andreas
Holzinger, Walter Hötzendorfer, Jan Hospes, Tim Kacprowski, Markus
Kastelitz, Markus List, Rudolf Mayer, Mónika Moga, Heimo Müller,
Anastasia Pustozerova, Richard Röttger, Anna Saranti, Harald HHW Schmidt,
Christof Tschohl, Nina K Wenke, Jan Baumbach
- Abstract summary: Privacy-preserving methods, such as Federated Learning (FL), allow for training ML models without sharing sensitive data.
We present the FeatureCloud AI Store for FL as an all-in-one platform for biomedical research and other applications.
- Score: 0.7517525791460022
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Machine Learning (ML) and Artificial Intelligence (AI) have shown promising
results in many areas and are driven by the increasing amount of available
data. However, this data is often distributed across different institutions and
cannot be shared due to privacy concerns. Privacy-preserving methods, such as
Federated Learning (FL), allow for training ML models without sharing sensitive
data, but their implementation is time-consuming and requires advanced
programming skills. Here, we present the FeatureCloud AI Store for FL as an
all-in-one platform for biomedical research and other applications. It removes
large parts of this complexity for developers and end-users by providing an
extensible AI Store with a collection of ready-to-use apps. We show that the
federated apps produce similar results to centralized ML, scale well for a
typical number of collaborators and can be combined with Secure Multiparty
Computation (SMPC), thereby making FL algorithms safely and easily applicable
in biomedical and clinical environments.
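The combination of federated averaging with SMPC-based secure aggregation described in the abstract can be illustrated with a minimal sketch. This is not FeatureCloud's actual API or implementation; the toy least-squares task, the chain-masking scheme, and all names are assumptions chosen only to show how additive masks cancel during aggregation, so the coordinator recovers the exact average without seeing any single client's update.

```python
import numpy as np

rng = np.random.default_rng(0)

# Three clients each hold private data that never leaves their site.
true_w = np.array([1.0, -2.0, 0.5])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 3))
    clients.append((X, X @ true_w))

def local_update(w, X, y, lr=0.1):
    # One local gradient step of least-squares regression.
    return w - lr * X.T @ (X @ w - y) / len(y)

def secure_aggregate(updates):
    # SMPC-style additive masking: each adjacent pair of clients shares
    # a random mask that cancels in the sum, so the coordinator sees only
    # masked updates yet recovers the exact average.
    masked = [u.copy() for u in updates]
    for i in range(len(updates) - 1):
        m = rng.normal(size=updates[0].shape)
        masked[i] += m
        masked[i + 1] -= m
    return sum(masked) / len(masked)

w = np.zeros(3)
for _ in range(200):
    w = secure_aggregate([local_update(w, X, y) for X, y in clients])
```

Because the masks cancel exactly, the federated result matches what centralized averaging would produce, mirroring the paper's claim that federated apps yield results similar to centralized ML.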
Related papers
- Privacy-Preserving Edge Federated Learning for Intelligent Mobile-Health Systems [4.082799056366928]
We propose a privacy-preserving edge FL framework for resource-constrained mobile-health and wearable technologies over the IoT infrastructure.
We evaluate our proposed framework extensively and provide the implementation of our technique on Amazon's AWS cloud platform.
arXiv Detail & Related papers (2024-05-09T08:15:31Z)
- FedMM: Federated Multi-Modal Learning with Modality Heterogeneity in Computational Pathology [3.802258033231335]
Federated Multi-Modal (FedMM) is a learning framework that trains multiple single-modal feature extractors to enhance subsequent classification performance.
FedMM notably outperforms two baselines in accuracy and AUC metrics.
arXiv Detail & Related papers (2024-02-24T16:58:42Z)
- OmniForce: On Human-Centered, Large Model Empowered and Cloud-Edge Collaborative AutoML System [85.8338446357469]
We introduce OmniForce, a human-centered AutoML system that yields both human-assisted ML and ML-assisted human techniques.
We show how OmniForce can put an AutoML system into practice and build adaptive AI in open-environment scenarios.
arXiv Detail & Related papers (2023-03-01T13:35:22Z)
- TemporAI: Facilitating Machine Learning Innovation in Time Domain Tasks for Medicine [91.3755431537592]
TemporAI is an open source Python software library for machine learning (ML) tasks involving data with a time component.
It supports time series, static, and event data modalities and provides an interface for prediction, causal inference, and time-to-event analysis.
arXiv Detail & Related papers (2023-01-28T17:57:53Z)
- Scalable Collaborative Learning via Representation Sharing [53.047460465980144]
Federated learning (FL) and Split Learning (SL) are two frameworks that enable collaborative learning while keeping the data private (on device).
In FL, each data holder trains a model locally and releases it to a central server for aggregation.
In SL, the clients must release individual cut-layer activations (smashed data) to the server and wait for its response (during both inference and backpropagation).
In this work, we present a novel approach for privacy-preserving machine learning, where the clients collaborate via online knowledge distillation using a contrastive loss.
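The split-learning exchange summarized above can be sketched with a toy round-trip: the client computes activations up to a cut layer (the "smashed data"), the server finishes the forward pass, and the gradient at the cut is returned for the client to continue backpropagation locally. The layer sizes, the ReLU regression network, and the server-held labels are illustrative assumptions; this sketches plain SL, not the paper's actual contrastive-distillation method.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.normal(size=(8, 4))               # client-side private inputs
y = rng.normal(size=(8, 1))               # labels (here held by the server)

W_client = rng.normal(size=(4, 5)) * 0.1  # client layers (before the cut)
W_server = rng.normal(size=(5, 1)) * 0.1  # server layers (after the cut)
lr, losses = 0.05, []

for _ in range(100):
    # Client forward pass: only the smashed data crosses the network.
    smashed = np.maximum(X @ W_client, 0.0)        # ReLU cut-layer output
    # Server forward pass and loss.
    pred = smashed @ W_server
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Server backprop: the gradient at the cut is returned to the client.
    grad_smashed = err @ W_server.T / len(y)
    W_server -= lr * smashed.T @ err / len(y)
    # Client backprop continues locally from the returned gradient.
    W_client -= lr * X.T @ (grad_smashed * (smashed > 0))
```

Note that in every round the server sees the smashed activations themselves, which is exactly the privacy leakage the representation-sharing approach above aims to reduce.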
arXiv Detail & Related papers (2022-11-20T10:49:22Z)
- NVIDIA FLARE: Federated Learning from Simulation to Real-World [11.490933081543787]
We created NVIDIA FLARE as an open-source software development kit (SDK) to make it easier for data scientists to use FL in their research and real-world applications.
The SDK includes solutions for state-of-the-art FL algorithms and federated machine learning approaches.
arXiv Detail & Related papers (2022-10-24T14:30:50Z)
- Federated Learning and Meta Learning: Approaches, Applications, and Directions [94.68423258028285]
In this tutorial, we present a comprehensive review of FL, meta learning, and federated meta learning (FedMeta).
Unlike other tutorial papers, our objective is to explore how FL, meta learning, and FedMeta methodologies can be designed, optimized, and evolved, and their applications over wireless networks.
arXiv Detail & Related papers (2022-10-24T10:59:29Z)
- APPFL: Open-Source Software Framework for Privacy-Preserving Federated Learning [0.0]
Federated learning (FL) enables models to be trained at different sites, exchanging only the weight updates from training, instead of transferring data to a central location for training as in classical machine learning.
We introduce APPFL, the Argonne Privacy-Preserving Federated Learning framework.
APPFL allows users to leverage implemented privacy-preserving algorithms, implement new algorithms, and simulate and deploy various FL algorithms with privacy-preserving techniques.
arXiv Detail & Related papers (2022-02-08T06:23:05Z)
- Federated Learning: Issues in Medical Application [0.7598921989525735]
Since federated learning, which makes it possible to train AI models without moving local data, was introduced by Google in 2017, it has been actively studied, particularly in the field of medicine.
The idea of machine learning without collecting data from local clients is very attractive because the data remain at local sites.
However, federated learning still has various open issues arising from its own characteristics, such as non-identical data distributions, client participation management, and vulnerable environments.
arXiv Detail & Related papers (2021-09-01T06:04:08Z)
- FedML: A Research Library and Benchmark for Federated Machine Learning [55.09054608875831]
Federated learning (FL) is a rapidly growing research field in machine learning.
Existing FL libraries cannot adequately support diverse algorithmic development.
We introduce FedML, an open research library and benchmark to facilitate FL algorithm development and fair performance comparison.
arXiv Detail & Related papers (2020-07-27T13:02:08Z)
- Federated and continual learning for classification tasks in a society of devices [59.45414406974091]
Light Federated and Continual Consensus (LFedCon2) is a new federated and continual architecture that uses light, traditional learners.
Our method allows powerless devices (such as smartphones or robots) to learn in real time, locally, continuously, autonomously and from users.
In order to test our proposal, we have applied it in a heterogeneous community of smartphone users to solve the problem of walking recognition.
arXiv Detail & Related papers (2020-06-12T12:37:03Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.