FedST: Secure Federated Shapelet Transformation for Time Series Classification
- URL: http://arxiv.org/abs/2302.10631v3
- Date: Wed, 31 May 2023 13:35:17 GMT
- Authors: Zhiyu Liang, Hongzhi Wang
- Abstract summary: This paper explores how to customize time series classification (TSC) methods with the help of external data in a privacy-preserving federated learning (FL) scenario.
We propose FedST, a novel FL-enabled TSC framework based on a shapelet transformation method.
We conduct extensive experiments using both synthetic and real-world datasets.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper explores how to customize time series classification
(TSC) methods with the help of external data in a privacy-preserving federated
learning (FL) scenario. To the best of our knowledge, we are the first to study
this essential topic. Achieving this goal requires seamlessly integrating
techniques from multiple fields, including Data Mining, Machine Learning, and
Security. In this paper, we systematically investigate existing TSC solutions
for the centralized scenario and propose FedST, a novel FL-enabled TSC
framework based on a shapelet transformation method. We identify the federated
shapelet search step as the kernel of FedST and therefore design a basic
protocol for it that we prove to be secure and accurate. However, the basic
protocol suffers from efficiency bottlenecks, and the centralized acceleration
techniques lose their efficacy due to security constraints. To speed up the
federated protocol while preserving the security guarantees, we propose
several optimizations tailored to the FL setting. Our theoretical analysis
shows that the proposed methods are secure and more efficient. We conduct
extensive experiments using both synthetic and real-world datasets. Empirical
results show that our FedST solution is effective in terms of TSC accuracy,
and the proposed optimizations can achieve a speedup of three orders of
magnitude.
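The shapelet transformation at the core of FedST can be illustrated with a minimal, non-federated sketch (hypothetical code; the actual FedST protocol performs this computation securely across parties): each time series is re-encoded as its minimum distance to each candidate shapelet, reducing TSC to ordinary feature-based classification.

```python
# Minimal sketch of a shapelet transformation (illustrative, centralized;
# the series, shapelet, and squared-distance choice are assumptions).

def sq_dist(a, b):
    """Squared Euclidean distance between two equal-length sequences."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def min_shapelet_dist(series, shapelet):
    """Minimum squared distance between a shapelet and any same-length
    subsequence of the series (the core primitive of shapelet search)."""
    m = len(shapelet)
    return min(sq_dist(series[i:i + m], shapelet)
               for i in range(len(series) - m + 1))

def shapelet_transform(dataset, shapelets):
    """Map each series to a feature vector of distances, one per shapelet."""
    return [[min_shapelet_dist(s, sh) for sh in shapelets] for s in dataset]

series_a = [0.0, 0.0, 1.0, 2.0, 1.0, 0.0]   # contains a "bump" pattern
series_b = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]   # flat series
bump = [1.0, 2.0, 1.0]                      # candidate shapelet
features = shapelet_transform([series_a, series_b], [bump])
# series_a matches the bump exactly (distance 0); series_b does not
```

In the federated setting, evaluating these minimum distances over data held by different parties is exactly the step that FedST's secure shapelet-search protocol protects.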
Related papers
- FedHQ: Hybrid Runtime Quantization for Federated Learning [18.039278211314205]
Federated Learning (FL) is a decentralized model training approach that preserves data privacy but struggles with low efficiency. This paper proposes a hybrid quantization approach combining PTQ and QAT for FL systems. Experiments show that FedHQ achieves up to 2.47x training acceleration and up to 11.15% accuracy improvement with negligible extra overhead.
arXiv Detail & Related papers (2025-05-17T12:30:27Z) - Byzantine-Resilient Over-the-Air Federated Learning under Zero-Trust Architecture [68.83934802584899]
We propose a novel Byzantine-robust FL paradigm for over-the-air transmissions, referred to as federated learning with secure adaptive clustering (FedSAC).
FedSAC aims to protect a portion of the devices from attacks through zero trust architecture (ZTA) based Byzantine identification and adaptive device clustering.
Numerical results substantiate the superiority of the proposed FedSAC over existing methods in terms of both test accuracy and convergence rate.
arXiv Detail & Related papers (2025-03-24T01:56:30Z) - Digital Twin-Assisted Federated Learning with Blockchain in Multi-tier Computing Systems [67.14406100332671]
In Industry 4.0 systems, resource-constrained edge devices engage in frequent data interactions.
This paper proposes a digital twin (DT)-assisted federated learning (FL) scheme.
The efficacy of our proposed cooperative interference-based FL process has been verified through numerical analysis.
arXiv Detail & Related papers (2024-11-04T17:48:02Z) - FADAS: Towards Federated Adaptive Asynchronous Optimization [56.09666452175333]
Federated learning (FL) has emerged as a widely adopted training paradigm for privacy-preserving machine learning.
This paper introduces federated adaptive asynchronous optimization, named FADAS, a novel method that incorporates asynchronous updates into adaptive federated optimization with provable guarantees.
We rigorously establish the convergence rate of the proposed algorithms and empirical results demonstrate the superior performance of FADAS over other asynchronous FL baselines.
arXiv Detail & Related papers (2024-07-25T20:02:57Z) - Secure Combination of Untrusted Time information Based on Optimized Dempster-Shafer Theory [24.333157091055327]
The multiple-paths scheme is regarded as an effective security countermeasure to mitigate the influence of the Time Delay Attack (TDA).
In this paper, a secure combination algorithm based on Dempster-Shafer theory is proposed for multiple paths method.
Theoretical simulation shows that the proposed algorithm performs significantly better than the Fault Tolerant Algorithm (FTA) and the single-path attack detection method.
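The Dempster-Shafer combination underlying this approach can be sketched as follows (an illustrative, simplified implementation; the two-hypothesis frame and the mass values are assumptions for demonstration, not taken from the paper):

```python
# Dempster's rule of combination over mass functions keyed by frozensets
# of hypotheses (illustrative sketch; two paths reporting evidence about
# whether a time source is under attack).

def combine(m1, m2):
    """Fuse two mass functions with Dempster's rule: multiply masses of
    intersecting hypothesis sets and renormalize by 1 - conflict."""
    combined = {}
    conflict = 0.0
    for b, mb in m1.items():
        for c, mc in m2.items():
            inter = b & c
            if inter:
                combined[inter] = combined.get(inter, 0.0) + mb * mc
            else:
                conflict += mb * mc  # mass assigned to contradictory sets
    norm = 1.0 - conflict
    return {h: v / norm for h, v in combined.items()}

A = frozenset({"attacked"})
N = frozenset({"normal"})
path1 = {A: 0.8, N: 0.2}   # evidence gathered over path 1 (assumed values)
path2 = {A: 0.6, N: 0.4}   # evidence gathered over path 2 (assumed values)
fused = combine(path1, path2)
# agreeing evidence reinforces the "attacked" hypothesis after fusion
```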
arXiv Detail & Related papers (2024-06-19T13:15:12Z) - Enhancing Security in Federated Learning through Adaptive Consensus-Based Model Update Validation [2.28438857884398]
This paper introduces an advanced approach for fortifying Federated Learning (FL) systems against label-flipping attacks.
We propose a consensus-based verification process integrated with an adaptive thresholding mechanism.
Our results indicate a significant mitigation of label-flipping attacks, bolstering the FL system's resilience.
arXiv Detail & Related papers (2024-03-05T20:54:56Z) - CyclicFL: A Cyclic Model Pre-Training Approach to Efficient Federated Learning [33.250038477336425]
Federated learning (FL) has been proposed to enable distributed learning on Artificial Intelligence Internet of Things (AIoT) devices with guarantees of high-level data privacy.
Existing FL methods suffer from both slow convergence and poor accuracy, especially in non-IID scenarios.
We propose a novel method named CyclicFL, which can quickly derive effective initial models to guide the SGD processes.
arXiv Detail & Related papers (2023-01-28T13:28:34Z) - FedSkip: Combatting Statistical Heterogeneity with Federated Skip Aggregation [95.85026305874824]
We introduce a data-driven approach called FedSkip that improves the client optima by periodically skipping federated averaging and scattering local models across devices.
We conduct extensive experiments on a range of datasets to demonstrate that FedSkip achieves much higher accuracy, better aggregation efficiency and competing communication efficiency.
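The core idea of FedSkip, periodically replacing the averaging step with a scatter of local models across clients, can be sketched as follows (illustrative only; the skip period, permutation-based scatter, and list-of-weights model format are assumptions, not taken from the paper):

```python
import random

def average(models):
    """Standard FedAvg aggregation: coordinate-wise mean of client models."""
    n = len(models)
    return [sum(ws) / n for ws in zip(*models)]

def fedskip_round(models, round_idx, skip_every=3, rng=random):
    """One server round in a FedSkip-like scheme: every `skip_every`-th
    round, skip averaging and scatter the local models to clients via a
    random permutation; otherwise broadcast the FedAvg mean to everyone."""
    if round_idx % skip_every == 0:
        scattered = models[:]
        rng.shuffle(scattered)          # each client receives some client's model
        return scattered
    avg = average(models)
    return [avg[:] for _ in models]     # every client gets the averaged model

clients = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
avg_round = fedskip_round(clients, round_idx=1)   # ordinary averaging round
skip_round = fedskip_round(clients, round_idx=3)  # skip-and-scatter round
```

Scattering un-averaged models exposes each client to other clients' optima, which is the mechanism the paper credits for combatting statistical heterogeneity.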
arXiv Detail & Related papers (2022-12-14T13:57:01Z) - Faster Adaptive Federated Learning [84.38913517122619]
Federated learning has attracted increasing attention with the emergence of distributed data.
In this paper, we propose an efficient adaptive algorithm (i.e., FAFED) based on a momentum-based variance reduction technique in cross-silo FL.
arXiv Detail & Related papers (2022-12-02T05:07:50Z) - ScionFL: Efficient and Robust Secure Quantized Aggregation [36.668162197302365]
We introduce ScionFL, the first secure aggregation framework for federated learning that operates efficiently on quantized inputs and simultaneously provides robustness against malicious clients.
We show that with no overhead for clients and moderate overhead for the server, we obtain comparable accuracy for standard FL benchmarks.
arXiv Detail & Related papers (2022-10-13T21:46:55Z) - Byzantine-Robust Federated Learning with Optimal Statistical Rates and
Privacy Guarantees [123.0401978870009]
We propose Byzantine-robust federated learning protocols with nearly optimal statistical rates.
We benchmark against competing protocols and show the empirical superiority of the proposed protocols.
Our protocols with bucketing can be naturally combined with privacy-guaranteeing procedures to introduce security against a semi-honest server.
arXiv Detail & Related papers (2022-05-24T04:03:07Z) - Efficient Few-Shot Object Detection via Knowledge Inheritance [62.36414544915032]
Few-shot object detection (FSOD) aims at learning a generic detector that can adapt to unseen tasks with scarce training samples.
We present an efficient pretrain-transfer framework (PTF) baseline with no computational increment.
We also propose an adaptive length re-scaling (ALR) strategy to alleviate the vector length inconsistency between the predicted novel weights and the pretrained base weights.
arXiv Detail & Related papers (2022-03-23T06:24:31Z) - Data-driven Optimal Power Flow: A Physics-Informed Machine Learning
Approach [6.5382276424254995]
This paper proposes a data-driven approach for optimal power flow (OPF) based on the stacked extreme learning machine (SELM) framework.
A data-driven OPF regression framework is developed that decomposes the OPF model features into three stages.
Numerical results carried out on IEEE and Polish benchmark systems demonstrate that the proposed method outperforms other alternatives.
arXiv Detail & Related papers (2020-05-31T15:41:24Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of this information and is not responsible for any consequences of its use.