Privacy-Preserving Billing for Local Energy Markets
- URL: http://arxiv.org/abs/2404.15886v2
- Date: Tue, 17 Sep 2024 12:42:43 GMT
- Title: Privacy-Preserving Billing for Local Energy Markets
- Authors: Eman Alqahtani, Mustafa A. Mustafa
- Abstract summary: We propose a privacy-preserving billing protocol for local energy markets (PBP-LEM) that takes into account market participants' energy volume deviations from their bids.
PBP-LEM enables a group of market entities to jointly compute participants' bills in a decentralized and privacy-preserving manner without sacrificing correctness.
- Score: 1.1823918493146686
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We propose a privacy-preserving billing protocol for local energy markets (PBP-LEM) that takes into account market participants' energy volume deviations from their bids. PBP-LEM enables a group of market entities to jointly compute participants' bills in a decentralized and privacy-preserving manner without sacrificing correctness. It also mitigates risks on individuals' privacy arising from any potential internal collusion. We first propose an efficient and privacy-preserving individual billing scheme, achieving information-theoretic security, which serves as a building block. PBP-LEM utilizes this scheme, along with other techniques such as multiparty computation, inner product functional encryption and Pedersen commitments to ensure data confidentiality and accuracy. Additionally, we present three approaches, resulting in different levels of privacy protection and performance. We prove that the protocol meets its security and privacy requirements and is feasible for deployment in real LEMs: bills can be computed in less than five minutes for 4,000 users using the most computationally intensive approach, and in just 0.18 seconds using the least intensive one.
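As a rough illustration of the individual billing building block, the sketch below (Python) computes a bill from a participant's cleared bid and its metered deviation, then splits it into additive shares over a prime field so that no single market entity sees the bill in the clear. The price names, the deviation rule, and the fixed-point scaling are illustrative assumptions, not the paper's exact billing model.

```python
# Minimal sketch (not the PBP-LEM protocol): additive secret sharing of per-user bills.
# Assumed, simplified billing rule: cleared energy is settled at the clearing price,
# and any metered deviation at a less favourable retail / feed-in tariff.
import secrets

PRIME = 2**61 - 1          # field modulus for additive sharing
SCALE = 100                # fixed-point scale: prices/volumes with 2 decimals

def to_field(x):           # encode a (possibly negative) fixed-point value
    return round(x * SCALE) % PRIME

def bill(cleared_kwh, metered_kwh, p_clear, p_retail, p_feedin):
    """Illustrative bill: cleared energy at p_clear, shortfalls bought at p_retail,
    surpluses sold at p_feedin (assumptions, not the paper's billing rule)."""
    deviation = metered_kwh - cleared_kwh
    extra = deviation * p_retail if deviation > 0 else deviation * p_feedin
    return cleared_kwh * p_clear + extra

def share(value, n_entities):
    """Split a field element into n additive shares; any n-1 shares reveal nothing."""
    parts = [secrets.randbelow(PRIME) for _ in range(n_entities - 1)]
    parts.append((value - sum(parts)) % PRIME)
    return parts

def reconstruct(parts):
    return sum(parts) % PRIME

# A user's bill is shared among 3 market entities; only the sum is ever opened.
b = bill(cleared_kwh=5.0, metered_kwh=4.2, p_clear=0.12, p_retail=0.30, p_feedin=0.05)
shares = share(to_field(b), n_entities=3)
assert reconstruct(shares) == to_field(b)
```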
Related papers
- PP-LEM: Efficient and Privacy-Preserving Clearance Mechanism for Local Energy Markets [0.0]
PP-LEM incorporates a novel competitive game-theoretical clearance mechanism, modelled as a Stackelberg Game.
Based on this mechanism, a privacy-preserving market model is developed using a partially homomorphic cryptosystem.
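As a rough illustration of the partially homomorphic building block, the toy textbook Paillier scheme below aggregates encrypted bid volumes without decrypting any individual bid; the tiny primes are for readability only, and the clearing mechanism itself is not shown.

```python
# Toy textbook Paillier (additively homomorphic): tiny, insecure parameters for illustration.
import random
from math import gcd

def keygen(p=293, q=433):                 # real deployments use ~1536-bit primes
    n = p * q
    lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)   # lcm(p-1, q-1)
    mu = pow(lam, -1, n)                  # valid because we fix g = n + 1
    return (n,), (lam, mu)

def encrypt(pk, m):
    (n,) = pk
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return (pow(n + 1, m, n * n) * pow(r, n, n * n)) % (n * n)

def decrypt(pk, sk, c):
    (n,), (lam, mu) = pk, sk
    return (((pow(c, lam, n * n) - 1) // n) * mu) % n

def add_enc(pk, c1, c2):                  # E(a) * E(b) mod n^2 = E(a + b)
    (n,) = pk
    return (c1 * c2) % (n * n)

# Aggregate two encrypted bid volumes (in Wh) without decrypting either one.
pk, sk = keygen()
total = add_enc(pk, encrypt(pk, 1200), encrypt(pk, 850))
assert decrypt(pk, sk, total) == 2050
```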
arXiv Detail & Related papers (2024-11-26T00:22:31Z)
- The Communication-Friendly Privacy-Preserving Machine Learning against Malicious Adversaries [14.232901861974819]
Privacy-preserving machine learning (PPML) is an innovative approach that allows for secure data analysis while safeguarding sensitive information.
We introduce an efficient protocol for secure linear function evaluation.
We extend the protocol to handle linear and non-linear layers, ensuring compatibility with a wide range of machine-learning models.
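As a minimal illustration of secure linear function evaluation, the sketch below computes y = Wx + b over additive secret shares of x held by two semi-honest parties, assuming the weights are public; the paper's protocol additionally hides the model and tolerates malicious adversaries, which this toy does not.

```python
# Minimal sketch: evaluating y = W @ x + b when x is additively secret-shared
# between two semi-honest parties and (W, b) are public.
import numpy as np

rng = np.random.default_rng(0)

def share(x):
    """Split x into two additive shares (a real protocol would work over a ring)."""
    r = rng.normal(size=x.shape)
    return x - r, r

W = rng.normal(size=(2, 3))        # public weights
b = rng.normal(size=2)             # public bias
x = np.array([0.5, -1.0, 2.0])     # private input

x0, x1 = share(x)                  # party 0 holds x0, party 1 holds x1
y0 = W @ x0 + b                    # each party works only on its own share
y1 = W @ x1
y = y0 + y1                        # opening the output shares yields W @ x + b

assert np.allclose(y, W @ x + b)
```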
arXiv Detail & Related papers (2024-11-14T08:55:14Z)
- Enhancing Feature-Specific Data Protection via Bayesian Coordinate Differential Privacy [55.357715095623554]
Local Differential Privacy (LDP) offers strong privacy guarantees without requiring users to trust external parties.
We propose a Bayesian framework, Bayesian Coordinate Differential Privacy (BCDP), that enables feature-specific privacy quantification.
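The Bayesian machinery of BCDP is not reproduced here; the sketch below only shows the setting it refines: a local, per-coordinate Laplace randomizer where each feature gets its own budget, so more sensitive features can receive stronger protection. The domains and budgets are illustrative assumptions.

```python
# Sketch of a per-feature local randomizer: coordinate j is released with its own
# budget eps_j via the Laplace mechanism (sensitivity = width of feature j's domain).
# This is generic LDP machinery, not the BCDP mechanism from the paper.
import numpy as np

rng = np.random.default_rng(42)

def randomize(x, ranges, epsilons):
    """x: one user's feature vector; ranges[j]: domain width of feature j;
    epsilons[j]: privacy budget spent on feature j."""
    scales = np.asarray(ranges) / np.asarray(epsilons)   # Laplace scale = sensitivity / eps
    return x + rng.laplace(scale=scales)

x = np.array([37.0, 120.0, 1.0])          # e.g. age, weekly kWh, has_solar flag (assumed)
ranges = np.array([100.0, 500.0, 1.0])    # assumed domain widths (sensitivities)
epsilons = np.array([1.0, 0.5, 2.0])      # tighter budget for the more sensitive feature
print(randomize(x, ranges, epsilons))
```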
arXiv Detail & Related papers (2024-10-24T03:39:55Z)
- Privacy Preserving Multi-Agent Reinforcement Learning in Supply Chains [5.436598805836688]
This paper addresses privacy concerns in multiagent reinforcement learning (MARL) within the context of supply chains.
We propose a game-theoretic, privacy-related mechanism, utilizing a secure multi-party computation framework in MARL settings.
We present a learning mechanism that carries out floating point operations in a privacy-preserving manner.
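A common building block for privacy-preserving arithmetic on real numbers, sketched below, is to encode values as fixed-point integers in a ring and operate on additive shares; the scaling factor and two-share setting are illustrative choices, not the paper's full mechanism.

```python
# Sketch: fixed-point encoding so that real-valued quantities can be added under
# additive secret sharing (a standard MPC building block, not the paper's scheme).
import secrets

RING = 2**64
SCALE = 2**16                       # 16 fractional bits

def encode(x):
    return round(x * SCALE) % RING

def decode(v):                      # interpret the ring element as signed fixed-point
    v = v if v < RING // 2 else v - RING
    return v / SCALE

def share(v):
    r = secrets.randbelow(RING)
    return (v - r) % RING, r

def add_shares(a, b):               # share-wise addition = addition of the secrets
    return tuple((x + y) % RING for x, y in zip(a, b))

# Two agents privately add their (real-valued) order quantities.
a = share(encode(12.75))
b = share(encode(-3.5))
s = add_shares(a, b)
assert abs(decode(sum(s) % RING) - 9.25) < 1e-4
```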
arXiv Detail & Related papers (2023-12-09T21:25:21Z)
- Libertas: Privacy-Preserving Computation for Decentralised Personal Data Stores [19.54818218429241]
We propose a modular design for integrating Secure Multi-Party Computation with Solid.
Our architecture, Libertas, requires no protocol level changes in the underlying design of Solid.
We show how this can be combined with existing differential privacy techniques to also ensure output privacy.
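As a minimal illustration of combining secure computation with output privacy, the sketch below reconstructs only an additively shared sum and perturbs it with Laplace noise before release; the bound and budget are assumptions, and a real deployment would add the noise inside the computation rather than after opening.

```python
# Sketch: secure (additively shared) sum whose released output is also protected
# with Laplace noise, i.e. MPC combined with output differential privacy.
import secrets
import numpy as np

PRIME = 2**61 - 1
SENSITIVITY = 10        # assumed cap on any single user's value
EPSILON = 1.0

def share(v, n=3):
    parts = [secrets.randbelow(PRIME) for _ in range(n - 1)]
    return parts + [(v - sum(parts)) % PRIME]

values = [3, 7, 2, 9]                                 # private per-user integers <= SENSITIVITY
columns = zip(*[share(v) for v in values])            # node i receives one share per user
node_totals = [sum(col) % PRIME for col in columns]
exact_sum = sum(node_totals) % PRIME                  # only the aggregate is reconstructed

rng = np.random.default_rng(7)
released = exact_sum + rng.laplace(scale=SENSITIVITY / EPSILON)
print(f"exact: {exact_sum}, released (DP): {released:.2f}")
```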
arXiv Detail & Related papers (2023-09-28T12:07:40Z)
- A Randomized Approach for Tight Privacy Accounting [63.67296945525791]
We propose a new differential privacy paradigm called estimate-verify-release (EVR).
The EVR paradigm first estimates the privacy parameter of a mechanism, then verifies whether it meets this guarantee, and finally releases the query output.
Our empirical evaluation shows the newly proposed EVR paradigm improves the utility-privacy tradeoff for privacy-preserving machine learning.
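The paper's Monte Carlo verifier is not reproduced here; the toy below only mirrors the estimate-verify-release control flow on a mechanism whose true guarantee is known (randomized response): estimate the privacy parameter from samples, verify it against a target, and release the output only if the check passes. Sample sizes and the slack term are illustrative.

```python
# Toy estimate-verify-release (EVR) loop on randomized response, whose true epsilon
# is ln((1-q)/q). The estimator, slack, and abort rule are illustrative only.
import math, random

def randomized_response(bit, q=0.25):
    return bit if random.random() > q else 1 - bit

def estimate_epsilon(mechanism, trials=200_000):
    """Estimate max_y |log P(y|x=1)/P(y|x=0)| from samples (a crude stand-in
    for the paper's Monte Carlo verifier)."""
    p1 = sum(mechanism(1) for _ in range(trials)) / trials   # P(output=1 | input=1)
    p0 = sum(mechanism(0) for _ in range(trials)) / trials   # P(output=1 | input=0)
    return max(abs(math.log(p1 / p0)), abs(math.log((1 - p1) / (1 - p0))))

TARGET_EPS, SLACK = 1.2, 0.05
est = estimate_epsilon(randomized_response)               # estimate
if est <= TARGET_EPS + SLACK:                              # verify
    print("release:", randomized_response(1), f"(estimated eps={est:.3f})")
else:                                                      # otherwise refuse to release
    print("abort: estimated eps", round(est, 3), "exceeds target", TARGET_EPS)
```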
arXiv Detail & Related papers (2023-04-17T00:38:01Z)
- Breaking the Communication-Privacy-Accuracy Tradeoff with $f$-Differential Privacy [51.11280118806893]
We consider a federated data analytics problem in which a server coordinates the collaborative data analysis of multiple users with privacy concerns and limited communication capability.
We study the local differential privacy guarantees of discrete-valued mechanisms with finite output space through the lens of $f$-differential privacy ($f$-DP).
More specifically, we advance the existing literature by deriving tight $f$-DP guarantees for a variety of discrete-valued mechanisms.
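As a concrete discrete-valued local mechanism, the sketch below implements binary randomized response and evaluates the generic trade-off function of any pure $\varepsilon$-DP mechanism, $f_\varepsilon(\alpha) = \max(0,\, 1 - e^{\varepsilon}\alpha,\, e^{-\varepsilon}(1 - \alpha))$; this is the textbook correspondence, not the tighter mechanism-specific $f$-DP bounds derived in the paper.

```python
# Binary randomized response: an eps-LDP discrete mechanism with eps = ln((1-q)/q),
# plus the generic trade-off function of any pure eps-DP mechanism (textbook bound).
import math, random

def randomized_response(bit, q):
    """Report the true bit with prob. 1-q, the flipped bit with prob. q."""
    return bit if random.random() > q else 1 - bit

def rr_epsilon(q):
    return math.log((1 - q) / q)

def tradeoff_pure_dp(alpha, eps):
    """f_eps(alpha) = max(0, 1 - e^eps * alpha, e^{-eps} * (1 - alpha))."""
    return max(0.0, 1 - math.exp(eps) * alpha, math.exp(-eps) * (1 - alpha))

q = 0.25
eps = rr_epsilon(q)                       # ln(3) ~ 1.0986
for alpha in (0.0, 0.1, 0.25, 0.5):
    print(f"alpha={alpha:.2f}  f(alpha)={tradeoff_pure_dp(alpha, eps):.3f}")
```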
arXiv Detail & Related papers (2023-02-19T16:58:53Z)
- Privacy Amplification via Shuffling for Linear Contextual Bandits [51.94904361874446]
We study the contextual linear bandit problem with differential privacy (DP).
We show that a privacy/utility trade-off between JDP and LDP can be achieved by leveraging the shuffle model of privacy, while still preserving local privacy.
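The bandit algorithm itself is not sketched here; the snippet below only illustrates the shuffle model the result relies on: each user applies a local randomizer, and an intermediary shuffler forwards the reports in random order, removing the link between user and message.

```python
# Sketch of the shuffle model of privacy: local randomization per user, then a
# shuffler permutes the anonymized reports before the analyst sees them.
# (The bandit algorithm and the amplification bound are not reproduced here.)
import random

def local_randomizer(bit, q=0.25):          # eps_local = ln((1-q)/q)
    return bit if random.random() > q else 1 - bit

def shuffler(reports):
    reports = list(reports)
    random.shuffle(reports)                 # breaks the user-to-report linkage
    return reports

user_bits = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]  # private per-user values
shuffled = shuffler(local_randomizer(b) for b in user_bits)

# The analyst sees only the shuffled, randomized multiset and can debias its mean:
q = 0.25
est_mean = (sum(shuffled) / len(shuffled) - q) / (1 - 2 * q)
print(f"debiased estimate of the true mean: {est_mean:.2f}")
```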
arXiv Detail & Related papers (2021-12-11T15:23:28Z)
- Secure Federated Learning for Residential Short Term Load Forecasting [0.34123736336071864]
This paper examines a collaborative machine learning method for short-term demand forecasting using smart meter data.
The evaluated methods consider several scenarios that explore how traditional centralized approaches could be moved toward a decentralized, collaborative, and private system.
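The paper evaluates several configurations; the snippet below shows only the plain federated-averaging step common to such setups, where each household trains locally on its own smart-meter data and the server averages the parameters weighted by local sample counts. Shapes and counts are illustrative.

```python
# Sketch of one federated averaging (FedAvg) round for a load-forecasting model:
# clients never upload raw smart-meter data, only locally trained parameters.
import numpy as np

def fed_avg(client_weights, client_sizes):
    """Weighted average of per-client parameter arrays (one array per client)."""
    total = sum(client_sizes)
    return sum(w * (n / total) for w, n in zip(client_weights, client_sizes))

# Illustrative parameters from three households' local training runs.
rng = np.random.default_rng(0)
clients = [rng.normal(size=(4,)) for _ in range(3)]   # e.g. a tiny linear forecaster
sizes = [336, 672, 504]                               # local training samples (assumed)

global_model = fed_avg(clients, sizes)
print(global_model)
```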
arXiv Detail & Related papers (2021-11-17T17:27:59Z)
- PCAL: A Privacy-preserving Intelligent Credit Risk Modeling Framework Based on Adversarial Learning [111.19576084222345]
This paper proposes a framework for Privacy-preserving Credit risk modeling based on Adversarial Learning (PCAL).
PCAL aims to mask the private information inside the original dataset, while maintaining the important utility information for the target prediction task performance.
Results indicate that PCAL can learn an effective, privacy-free representation from user data, providing a solid foundation towards privacy-preserving machine learning for credit risk analysis.
arXiv Detail & Related papers (2020-10-06T07:04:59Z)
- Private Reinforcement Learning with PAC and Regret Guarantees [69.4202374491817]
We design privacy-preserving exploration policies for episodic reinforcement learning (RL).
We first provide a meaningful privacy formulation using the notion of joint differential privacy (JDP).
We then develop a private optimism-based learning algorithm that simultaneously achieves strong PAC and regret bounds, and enjoys a JDP guarantee.
arXiv Detail & Related papers (2020-09-18T20:18:35Z)