Towards Fleet-wide Sharing of Wind Turbine Condition Information through
Privacy-preserving Federated Learning
- URL: http://arxiv.org/abs/2212.03529v3
- Date: Wed, 12 Jul 2023 13:07:37 GMT
- Title: Towards Fleet-wide Sharing of Wind Turbine Condition Information through
Privacy-preserving Federated Learning
- Authors: Lorin Jenkel, Stefan Jonas, Angela Meyer
- Abstract summary: We present a distributed machine learning approach that preserves data privacy by keeping the data on the wind turbines while still enabling fleet-wide learning on those local data.
We show that through federated fleet-wide learning, turbines with little or no representative training data can benefit from more accurate normal behavior models.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Terabytes of data are collected by wind turbine manufacturers from their
fleets every day. Yet a lack of data access and sharing impedes exploiting
the full potential of these data. We present a distributed machine learning
approach that preserves data privacy by keeping the data on the wind
turbines while still enabling fleet-wide learning on those local data. We show
that through federated fleet-wide learning, turbines with little or no
representative training data can benefit from more accurate normal behavior
models. Customizing the global federated model to individual turbines yields
the highest fault detection accuracy in cases where the monitored target
variable is distributed heterogeneously across the fleet. We demonstrate this
for bearing temperatures, a target variable whose normal behavior can vary
widely depending on the turbine. We show that no turbine experiences a loss in
model performance from participating in the federated learning process,
resulting in superior performance of the federated learning strategy in our
case studies. The distributed learning increases the normal behavior model
training times by about a factor of ten due to increased communication overhead
and slower model convergence.
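To make the training setup concrete, below is a minimal federated averaging loop in the style of FedAvg, sketched in Python/PyTorch. The architecture, data loaders, and hyperparameters are hypothetical placeholders, not the configuration used in the paper.
```python
# Minimal FedAvg-style sketch for fleet-wide normal behavior models.
# Illustrative only: the model, loaders, and hyperparameters are placeholders.
import copy
import torch
import torch.nn as nn

class NormalBehaviorModel(nn.Module):
    """Toy regressor mapping SCADA features to a target such as bearing temperature."""
    def __init__(self, n_features: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(n_features, 64), nn.ReLU(),
                                 nn.Linear(64, 1))

    def forward(self, x):
        return self.net(x)

def local_update(model, loader, epochs=1, lr=1e-3):
    """One turbine trains on its own data; raw data never leaves the turbine."""
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    for _ in range(epochs):
        for x, y in loader:  # y: (batch, 1) target, e.g. bearing temperature
            opt.zero_grad()
            nn.functional.mse_loss(model(x), y).backward()
            opt.step()
    n = sum(len(x) for x, _ in loader)  # local sample count for weighting
    return model.state_dict(), n

def fedavg(global_model, turbine_loaders, rounds=10):
    """Each round: broadcast the global weights, train locally, average."""
    for _ in range(rounds):
        states, counts = [], []
        for loader in turbine_loaders:
            local = copy.deepcopy(global_model)  # start from global weights
            state, n = local_update(local, loader)
            states.append(state)
            counts.append(n)
        total = sum(counts)
        avg = {k: sum((n / total) * s[k] for s, n in zip(states, counts))
               for k in states[0]}  # sample-weighted parameter average
        global_model.load_state_dict(avg)
    return global_model
```
Customizing the global model to an individual turbine, which the abstract reports as most accurate under heterogeneous target distributions, would then correspond to a few additional `local_update` epochs on that turbine's own data, starting from the federated weights.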
Related papers
- Wind turbine condition monitoring based on intra- and inter-farm federated learning [0.0]
Many AI applications in wind energy may benefit from using operational data not only from individual wind turbines but from multiple turbines and multiple wind farms.
Federated learning has emerged as a privacy-preserving distributed machine learning approach in this context.
We investigate various federated learning strategies, including collaboration across different wind farms and turbine models, as well as collaboration restricted to the same wind farm and turbine model.
arXiv Detail & Related papers (2024-09-05T16:25:30Z)
- An Aggregation-Free Federated Learning for Tackling Data Heterogeneity [50.44021981013037]
Federated Learning (FL) relies on the effectiveness of utilizing knowledge from distributed datasets.
Traditional FL methods adopt an aggregate-then-adapt framework, where clients update local models based on a global model aggregated by the server from the previous training round.
We introduce FedAF, a novel aggregation-free FL algorithm.
arXiv Detail & Related papers (2024-04-29T05:55:23Z)
- FedDistill: Global Model Distillation for Local Model De-Biasing in Non-IID Federated Learning [10.641875933652647]
Federated Learning (FL) enables collaborative machine learning across decentralized clients.
FL faces challenges due to non-uniformly distributed (non-iid) data across clients.
This paper introduces FedDistill, a framework enhancing the knowledge transfer from the global model to local models.
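As a rough illustration of transferring global knowledge into a local model, the sketch below adds a standard distillation term toward the frozen global model to the local task loss. This is generic knowledge distillation under assumed names, not necessarily the exact FedDistill objective.
```python
# Generic global-to-local distillation step (illustrative, not the exact
# FedDistill loss): local task loss plus a KL term toward the global model.
import torch
import torch.nn.functional as F

def distill_step(local_model, global_model, x, y, T=2.0, alpha=0.5):
    local_logits = local_model(x)
    with torch.no_grad():                      # the global model stays frozen
        global_logits = global_model(x)
    task_loss = F.cross_entropy(local_logits, y)
    kd_loss = F.kl_div(F.log_softmax(local_logits / T, dim=1),
                       F.softmax(global_logits / T, dim=1),
                       reduction="batchmean") * (T * T)
    return alpha * task_loss + (1 - alpha) * kd_loss
```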
arXiv Detail & Related papers (2024-04-14T10:23:30Z)
- Transfer learning applications for anomaly detection in wind turbines [0.0]
Anomaly detection in wind turbines typically involves using normal behaviour models to detect faults early.
This study examines how cross-turbine transfer learning can be applied to autoencoder-based anomaly detection.
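A minimal sketch of the idea, assuming a dense autoencoder over SCADA feature vectors: pretrain on a data-rich source turbine, adapt to the target turbine, and score anomalies by reconstruction error. Freezing the encoder during adaptation is an illustrative choice, not necessarily the one made in the study.
```python
# Autoencoder-based anomaly detection with cross-turbine transfer (sketch).
import torch
import torch.nn as nn
import torch.nn.functional as F

class SCADAAutoencoder(nn.Module):
    def __init__(self, n_features: int, latent: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(n_features, 32), nn.ReLU(),
                                     nn.Linear(32, latent))
        self.decoder = nn.Sequential(nn.Linear(latent, 32), nn.ReLU(),
                                     nn.Linear(32, n_features))

    def forward(self, x):
        return self.decoder(self.encoder(x))

def fit(model, loader, epochs=5, lr=1e-3):
    """Train on normal-operation data only; loader yields feature batches."""
    opt = torch.optim.Adam((p for p in model.parameters() if p.requires_grad),
                           lr=lr)
    for _ in range(epochs):
        for x in loader:
            opt.zero_grad()
            F.mse_loss(model(x), x).backward()
            opt.step()

def transfer(model, source_loader, target_loader):
    """Pretrain on the source turbine, then adapt the decoder to the target."""
    fit(model, source_loader, epochs=20)
    for p in model.encoder.parameters():  # freeze the encoder (assumption)
        p.requires_grad = False
    fit(model, target_loader, epochs=5)

def anomaly_scores(model, x):
    """Per-sample reconstruction error; large values indicate anomalies."""
    with torch.no_grad():
        return ((model(x) - x) ** 2).mean(dim=1)
```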
arXiv Detail & Related papers (2024-04-03T18:48:45Z)
- Federated Learning with Projected Trajectory Regularization [65.6266768678291]
Federated learning enables joint training of machine learning models from distributed clients without sharing their local data.
One key challenge in federated learning is to handle non-identically distributed data across the clients.
We propose a novel federated learning framework with projected trajectory regularization (FedPTR) for tackling the data issue.
arXiv Detail & Related papers (2023-12-22T02:12:08Z)
- DFRD: Data-Free Robustness Distillation for Heterogeneous Federated Learning [20.135235291912185]
Federated Learning (FL) is a privacy-constrained decentralized machine learning paradigm.
We propose a new FL method (namely DFRD) to learn a robust global model in the data-heterogeneous and model-heterogeneous FL scenarios.
arXiv Detail & Related papers (2023-09-24T04:29:22Z)
- FedDM: Iterative Distribution Matching for Communication-Efficient Federated Learning [87.08902493524556]
Federated learning(FL) has recently attracted increasing attention from academia and industry.
We propose FedDM to build the global training objective from multiple local surrogate functions.
In detail, we construct synthetic sets of data on each client to locally match the loss landscape from original data.
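The flavor of this idea can be sketched with gradient matching, a close relative of the paper's distribution-matching objective (the exact FedDM loss is not reproduced here): each client learns a small synthetic set whose gradients on the current model approximate those of its real data, and uploads only the synthetic set. `model` is assumed to be a small classifier; all other names are placeholders.
```python
# Client-side sketch: learn a synthetic set that mimics the real data's
# gradients on the current model (gradient matching; a relative of FedDM's
# distribution matching, not its exact objective).
import torch
import torch.nn.functional as F

def match_synthetic(model, real_x, real_y, n_syn=10, steps=100, lr=0.1):
    params = list(model.parameters())
    # Reference gradients on the real local data; the model stays fixed.
    g_real = torch.autograd.grad(
        F.cross_entropy(model(real_x), real_y), params)
    syn_x = torch.randn(n_syn, real_x.shape[1], requires_grad=True)
    syn_y = real_y[:n_syn].clone()  # reusing real labels is a simplification
    opt = torch.optim.Adam([syn_x], lr=lr)
    for _ in range(steps):
        g_syn = torch.autograd.grad(
            F.cross_entropy(model(syn_x), syn_y), params, create_graph=True)
        loss = sum(((a - b) ** 2).sum() for a, b in zip(g_syn, g_real))
        opt.zero_grad()
        loss.backward()  # gradients flow back to syn_x through g_syn
        opt.step()
    return syn_x.detach(), syn_y  # only the synthetic set leaves the client
```
The server would then train the global model on the union of the clients' synthetic sets, trading repeated gradient uploads for a small synthetic payload.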
arXiv Detail & Related papers (2022-07-20T04:55:18Z)
- Beyond Transfer Learning: Co-finetuning for Action Localisation [64.07196901012153]
We propose co-finetuning -- simultaneously training a single model on multiple "upstream" and "downstream" tasks.
We demonstrate that co-finetuning outperforms traditional transfer learning when using the same total amount of data.
We also show how we can easily extend our approach to multiple "upstream" datasets to further improve performance.
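A minimal sketch of co-finetuning, assuming a shared backbone with one output head per task and simple round-robin batch sampling (both illustrative choices):
```python
# Co-finetuning sketch: one shared backbone trained simultaneously on
# several tasks by alternating batches, rather than sequential fine-tuning.
import itertools
import torch.nn.functional as F

def cofinetune(backbone, heads, loaders, optimizer, steps=1000):
    """`heads` maps task name -> output layer; `loaders` maps task -> DataLoader.
    The optimizer is assumed to cover backbone and head parameters."""
    iters = {task: itertools.cycle(loader) for task, loader in loaders.items()}
    tasks = list(loaders)
    for step in range(steps):
        task = tasks[step % len(tasks)]  # round-robin over upstream/downstream
        x, y = next(iters[task])
        optimizer.zero_grad()
        loss = F.cross_entropy(heads[task](backbone(x)), y)
        loss.backward()
        optimizer.step()
```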
arXiv Detail & Related papers (2022-07-08T10:25:47Z)
- Fine-tuning Global Model via Data-Free Knowledge Distillation for Non-IID Federated Learning [86.59588262014456]
Federated Learning (FL) is an emerging distributed learning paradigm under privacy constraints.
We propose a data-free knowledge distillation method to fine-tune the global model on the server (FedFTG).
Our FedFTG significantly outperforms the state-of-the-art (SOTA) FL algorithms and can serve as a strong plugin for enhancing FedAvg, FedProx, FedDyn, and SCAFFOLD.
arXiv Detail & Related papers (2022-03-17T11:18:17Z)
- Preservation of the Global Knowledge by Not-True Self Knowledge Distillation in Federated Learning [8.474470736998136]
In Federated Learning (FL), a strong global model is collaboratively learned by aggregating the clients' locally trained models.
We observe that fitting on biased local distribution shifts the feature on global distribution and results in forgetting of global knowledge.
We propose a simple yet effective framework Federated Local Self-Distillation (FedLSD), which utilizes the global knowledge on locally available data.
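One way to read "not-true" distillation is to distill the global model's predictions only over the classes other than the ground-truth label, so the global view is preserved without reinforcing the local label bias. The sketch below implements that masking as an illustration; it is not claimed to be the paper's exact loss.
```python
# "Not-true" distillation sketch: KL between local and global predictions
# restricted to the non-ground-truth classes (illustrative masking only).
import torch
import torch.nn.functional as F

def not_true_distillation(local_logits, global_logits, y, T=1.0):
    mask = torch.ones_like(local_logits, dtype=torch.bool)
    mask[torch.arange(len(y)), y] = False        # drop the true class
    n_cls = local_logits.shape[1]
    l = local_logits[mask].view(-1, n_cls - 1)   # (batch, C-1)
    g = global_logits[mask].view(-1, n_cls - 1)
    return F.kl_div(F.log_softmax(l / T, dim=1),
                    F.softmax(g / T, dim=1),
                    reduction="batchmean") * (T * T)

# A local objective could then be cross_entropy(local_logits, y) plus a
# weighted not_true_distillation term toward the global model.
```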
arXiv Detail & Related papers (2021-06-06T11:51:47Z)
- Towards Fair Federated Learning with Zero-Shot Data Augmentation [123.37082242750866]
Federated learning has emerged as an important distributed learning paradigm, where a server aggregates a global model from many client-trained models while having no access to the client data.
We propose a novel federated learning system that employs zero-shot data augmentation on under-represented data to mitigate statistical heterogeneity and encourage more uniform accuracy performance across clients in federated networks.
We study two variants of this scheme, Fed-ZDAC (federated learning with zero-shot data augmentation at the clients) and Fed-ZDAS (federated learning with zero-shot data augmentation at the server).
arXiv Detail & Related papers (2021-04-27T18:23:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.