Privacy-Preserving Data Aggregation Techniques for Enhanced Efficiency and Security in Wireless Sensor Networks: A Comprehensive Analysis and Evaluation
- URL: http://arxiv.org/abs/2403.20120v1
- Date: Fri, 29 Mar 2024 11:09:22 GMT
- Title: Privacy-Preserving Data Aggregation Techniques for Enhanced Efficiency and Security in Wireless Sensor Networks: A Comprehensive Analysis and Evaluation
- Authors: Ayush Rastogi, Harsh Rastogi, Yash Rastogi, Divyansh Dubey
- Abstract summary: We present a multidimensional, highly effective method for aggregating data in wireless sensor networks while preserving privacy.
The proposed scheme is resilient to data loss and secure against both active and passive privacy-compromising attacks.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we present a multidimensional, highly effective method for aggregating data in wireless sensor networks while preserving privacy. The proposed scheme is resilient to data loss and secure against both active and passive privacy-compromising attacks, such as a coalition attack mounted by a rogue base station together with captured sensor nodes. Its communication overhead is constant with respect to cluster size, which is beneficial in large-scale WSNs. Owing to this constant-size communication overhead, the proposed scheme outperforms the previous privacy-preserving data aggregation scheme not only in privacy preservation but also in communication complexity and energy cost.
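For orientation, the sketch below shows one common way to realize constant-size, privacy-preserving additive aggregation in a cluster; it assumes a pairwise-masking construction and does not reproduce the paper's actual multidimensional scheme or its loss-resilience mechanism.

```python
import random

# Minimal sketch of privacy-preserving additive aggregation with constant
# per-node message size (a hypothetical pairwise-masking construction, not the
# paper's exact scheme). Every pair of cluster nodes shares a random mask that
# one node adds and the other subtracts, so all masks cancel in the cluster
# sum while each node transmits a single masked value regardless of cluster size.

MODULUS = 2 ** 32  # work modulo a large constant so masked values leak nothing


def pairwise_masks(node_ids, seed=0):
    """Derive one shared random mask per unordered pair of nodes."""
    rng = random.Random(seed)
    return {frozenset((a, b)): rng.randrange(MODULUS)
            for a in node_ids for b in node_ids if a < b}


def masked_reading(node_id, reading, node_ids, masks):
    """Blind one node's reading: add masks toward 'larger' peers, subtract otherwise."""
    value = reading
    for other in node_ids:
        if other != node_id:
            m = masks[frozenset((node_id, other))]
            value += m if node_id < other else -m
    return value % MODULUS


node_ids = [1, 2, 3, 4]
readings = {1: 21, 2: 19, 3: 25, 4: 18}  # e.g., temperature samples

masks = pairwise_masks(node_ids)
blinded = [masked_reading(n, readings[n], node_ids, masks) for n in node_ids]

aggregate = sum(blinded) % MODULUS  # masks cancel pairwise at the aggregator
assert aggregate == sum(readings.values())
print("cluster sum recovered by the aggregator:", aggregate)
```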
Related papers
- Collaborative Inference over Wireless Channels with Feature Differential Privacy [57.68286389879283]
Collaborative inference among multiple wireless edge devices has the potential to significantly enhance Artificial Intelligence (AI) applications.
However, transmitting extracted features poses a significant privacy risk, as sensitive personal data can be exposed during the process.
We propose a novel privacy-preserving collaborative inference mechanism, wherein each edge device in the network secures the privacy of extracted features before transmitting them to a central server for inference.
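A rough sketch of the general idea, assuming a simple clip-and-noise mechanism rather than the paper's concrete construction:

```python
import numpy as np

# Illustrative sketch of feature-level privacy before transmission: the edge
# device clips its extracted feature vector and adds Gaussian noise.
# Function and parameter names are assumptions, not the paper's mechanism.

def privatize_features(features, clip_norm=1.0, noise_std=0.5, rng=None):
    """Clip the feature vector to a fixed L2 norm, then add Gaussian noise."""
    rng = rng or np.random.default_rng()
    norm = np.linalg.norm(features)
    clipped = features * min(1.0, clip_norm / (norm + 1e-12))
    return clipped + rng.normal(0.0, noise_std, size=clipped.shape)


features = np.array([0.8, -1.3, 0.4, 2.1])   # output of a local feature encoder
print(privatize_features(features))          # noisy features sent to the server
```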
arXiv Detail & Related papers (2024-10-25T18:11:02Z)
- Marking the Pace: A Blockchain-Enhanced Privacy-Traceable Strategy for Federated Recommender Systems [11.544642210389894]
Federated recommender systems have been enhanced through data sharing and continuous model updates.
Given the sensitivity of IoT data, transparent data processing in data sharing and model updates is paramount.
Existing methods fall short in tracing the flow of shared data and the evolution of model updates.
We present LIBERATE, a privacy-traceable federated recommender system.
arXiv Detail & Related papers (2024-06-07T07:21:21Z)
- Improving Privacy-Preserving Techniques for Smart Grid using Lattice-based Cryptography [1.4856472820492366]
SPDBlock is a blockchain-based solution ensuring privacy, integrity, and resistance to attacks.
It detects and prosecutes malicious entities while efficiently handling multi-dimensional data transmission.
Performance tests reveal SPDBlock's superiority in communication and computational efficiency over traditional schemes.
arXiv Detail & Related papers (2024-04-17T19:51:52Z)
- TernaryVote: Differentially Private, Communication Efficient, and Byzantine Resilient Distributed Optimization on Heterogeneous Data [50.797729676285876]
We propose TernaryVote, which combines a ternary compressor and the majority vote mechanism to realize differential privacy, gradient compression, and Byzantine resilience simultaneously.
We theoretically quantify the privacy guarantee through the lens of the emerging f-differential privacy (DP) and the Byzantine resilience of the proposed algorithm.
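A minimal sketch of those two ingredients, assuming a deterministic compressor and omitting the DP noise of the actual algorithm:

```python
import numpy as np

# Illustrative sketch of a ternary compressor plus a coordinate-wise majority
# vote across workers. The stochastic compressor and DP noise of the actual
# TernaryVote algorithm are omitted here.

def ternarize(grad, threshold=0.1):
    """Map each gradient coordinate to -1, 0, or +1 using a fixed threshold."""
    t = np.zeros_like(grad)
    t[grad > threshold] = 1.0
    t[grad < -threshold] = -1.0
    return t


def majority_vote(ternary_grads):
    """Aggregate workers' ternary gradients by the sign of the coordinate-wise sum."""
    return np.sign(np.sum(ternary_grads, axis=0))


worker_grads = [np.array([0.5, -0.2, 0.05]),
                np.array([0.3, -0.4, -0.15]),
                np.array([-0.2, -0.1, 0.3])]
votes = [ternarize(g) for g in worker_grads]
print(majority_vote(votes))  # update direction chosen by majority vote
```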
arXiv Detail & Related papers (2024-02-16T16:41:14Z)
- FedDiSC: A Computation-efficient Federated Learning Framework for Power Systems Disturbance and Cyber Attack Discrimination [1.0621485365427565]
This paper proposes a novel Federated Learning-based privacy-preserving and communication-efficient attack detection framework, known as FedDiSC.
We put forward a representation learning-based Deep Auto-Encoder network to accurately detect power system and cybersecurity anomalies.
To adapt our proposed framework to the timeliness of real-world cyberattack detection in SGs, we leverage the use of a gradient privacy-preserving quantization scheme known as DP-SIGNSGD.
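A hedged sketch of what a sign-based, noise-perturbed quantizer of this kind can look like (an assumption for illustration, not FedDiSC's exact scheme):

```python
import numpy as np

# Sign-based, noise-perturbed gradient quantizer in the spirit of DP-SIGNSGD:
# clip the gradient, add Gaussian noise, and upload only the signs.
# Parameter values are illustrative.

def dp_sign(grad, clip=1.0, noise_std=1.0, rng=None):
    rng = rng or np.random.default_rng()
    clipped = grad * min(1.0, clip / (np.linalg.norm(grad) + 1e-12))  # L2 clipping
    return np.sign(clipped + rng.normal(0.0, noise_std, size=grad.shape))  # 1 bit per coordinate


print(dp_sign(np.array([0.7, -2.4, 0.1])))
```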
arXiv Detail & Related papers (2023-04-07T13:43:57Z)
- Breaking the Communication-Privacy-Accuracy Tradeoff with $f$-Differential Privacy [51.11280118806893]
We consider a federated data analytics problem in which a server coordinates the collaborative data analysis of multiple users with privacy concerns and limited communication capability.
We study the local differential privacy guarantees of discrete-valued mechanisms with finite output space through the lens of $f$-differential privacy (DP).
More specifically, we advance the existing literature by deriving tight $f$-DP guarantees for a variety of discrete-valued mechanisms.
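Binary randomized response is the canonical example of such a discrete-valued mechanism; the snippet below is purely illustrative and not one of the paper's constructions:

```python
import random

# Binary randomized response, a discrete-valued local DP mechanism with a
# finite output space: report the true bit with probability p, otherwise flip
# it. This satisfies epsilon-local DP with epsilon = ln(p / (1 - p)).

def randomized_response(bit, p=0.75, rng=None):
    rng = rng or random.Random()
    return bit if rng.random() < p else 1 - bit


print([randomized_response(1) for _ in range(10)])
```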
arXiv Detail & Related papers (2023-02-19T16:58:53Z)
- Privacy-Preserving Joint Edge Association and Power Optimization for the Internet of Vehicles via Federated Multi-Agent Reinforcement Learning [74.53077322713548]
We investigate the privacy-preserving joint edge association and power allocation problem.
The proposed solution strikes a compelling trade-off while preserving a higher privacy level than state-of-the-art solutions.
arXiv Detail & Related papers (2023-01-26T10:09:23Z)
- Large-Scale Privacy-Preserving Network Embedding against Private Link Inference Attacks [12.434976161956401]
We address a novel problem of privacy-preserving network embedding against private link inference attacks.
We propose to perturb the original network by adding or removing links, and expect the embedding generated on the perturbed network can leak little information about private links but hold high utility for various downstream tasks.
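For illustration only, a random add/remove perturbation of the edge set might look like the sketch below; the paper optimizes which links to perturb rather than choosing them at random:

```python
import random

# Illustrative link perturbation before embedding: randomly drop some existing
# edges and add some non-edges. The purely random policy and the rates used
# here are assumptions, not the paper's method.

def perturb_edges(nodes, edges, remove_rate=0.1, add_rate=0.05, seed=0):
    rng = random.Random(seed)
    edge_set = {frozenset(e) for e in edges}
    kept = {e for e in edge_set if rng.random() >= remove_rate}
    non_edges = [frozenset((u, v)) for u in nodes for v in nodes
                 if u < v and frozenset((u, v)) not in edge_set]
    added = {e for e in non_edges if rng.random() < add_rate}
    return kept | added  # perturbed edge set used to train the embedding


nodes = [0, 1, 2, 3]
edges = [(0, 1), (1, 2), (2, 3), (0, 3)]
print(perturb_edges(nodes, edges))
```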
arXiv Detail & Related papers (2022-05-28T13:59:39Z)
- Decentralized Stochastic Optimization with Inherent Privacy Protection [103.62463469366557]
Decentralized optimization is the basic building block of modern collaborative machine learning, distributed estimation and control, and large-scale sensing.
Since the data involved often contain sensitive information, privacy protection has become an increasingly pressing need in the implementation of decentralized optimization algorithms.
arXiv Detail & Related papers (2022-05-08T14:38:23Z)
- Mitigating Leakage from Data Dependent Communications in Decentralized Computing using Differential Privacy [1.911678487931003]
We propose a general execution model to control the data-dependence of communications in user-side decentralized computations.
Our formal privacy guarantees leverage and extend recent results on privacy amplification by shuffling.
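A minimal sketch of the shuffle model behind privacy amplification by shuffling, under simplifying assumptions (the paper's execution model for controlling data-dependent communications is more general):

```python
import random

# Shuffle-model sketch: each user locally randomizes a bit, and a shuffler
# permutes the reports so the server cannot link a report back to its sender.

def local_randomizer(bit, p=0.75, rng=None):
    rng = rng or random.Random()
    return bit if rng.random() < p else 1 - bit


def shuffle_and_collect(user_bits, seed=None):
    rng = random.Random(seed)
    reports = [local_randomizer(b, rng=rng) for b in user_bits]
    rng.shuffle(reports)  # the shuffler breaks the sender-report linkage
    return reports


print(shuffle_and_collect([1, 0, 1, 1, 0]))
```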
arXiv Detail & Related papers (2021-12-23T08:30:17Z)
- Robustness Threats of Differential Privacy [70.818129585404]
We experimentally demonstrate that networks trained with differential privacy can, in some settings, be even more vulnerable than their non-private counterparts.
We study how the main ingredients of differentially private neural networks training, such as gradient clipping and noise addition, affect the robustness of the model.
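Those ingredients appear in a standard DP-SGD-style update such as the sketch below, which is illustrative and not the paper's experimental configuration:

```python
import numpy as np

# Minimal DP-SGD-style update showing per-example gradient clipping and
# Gaussian noise addition. Parameter values are illustrative.

def dp_sgd_step(params, per_example_grads, lr=0.1, clip=1.0, noise_std=1.0,
                rng=None):
    rng = rng or np.random.default_rng()
    clipped = [g * min(1.0, clip / (np.linalg.norm(g) + 1e-12))
               for g in per_example_grads]
    noisy_sum = np.sum(clipped, axis=0) + rng.normal(
        0.0, noise_std * clip, size=params.shape)
    return params - lr * noisy_sum / len(per_example_grads)


params = np.zeros(3)
grads = [np.array([0.2, -0.5, 1.4]), np.array([-0.3, 0.8, 0.6])]
print(dp_sgd_step(params, grads))
```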
arXiv Detail & Related papers (2020-12-14T18:59:24Z)