Why are Developers Struggling to Put GDPR into Practice when Developing
Privacy-Preserving Software Systems?
- URL: http://arxiv.org/abs/2008.02987v1
- Date: Fri, 7 Aug 2020 04:34:08 GMT
- Title: Why are Developers Struggling to Put GDPR into Practice when Developing
Privacy-Preserving Software Systems?
- Authors: Abdulrahman Alhazmi and Nalin Asanka Gamagedara Arachchilage
- Abstract summary: The General Data Protection Regulation (GDPR) provides guidelines for developers on how to protect user data.
Previous research has attempted to investigate what hinders developers from embedding privacy into software systems.
This paper investigates the issues that hinder software developers from building software applications that take the law on board.
- Score: 3.04585143845864
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The use of software applications is inevitable as they provide
different services to users. Software applications collect and store users'
data, and sometimes share it with third parties, even without user consent.
One can argue that software developers do not implement privacy into the
software applications they develop, or do not take the GDPR (General Data
Protection Regulation) into account. Failing to do so may lead to software
applications that open up privacy breaches (e.g. data breaches). The GDPR
provides a set of guidelines
for developers and organizations on how to protect user data when they are
interacting with software applications. Previous research has attempted to
investigate what hinders developers from embedding privacy into software
systems. However, there has been no detailed investigation on why they cannot
develop privacy-preserving systems taking GDPR into consideration, which is
imperative to develop software applications that preserve privacy. Therefore,
this paper investigates the issues that hinder software developers from
implementing software applications that take the GDPR into account. Our study
findings revealed that developers are not familiar with GDPR principles. Even
those who are lack knowledge of the GDPR principles and of the techniques to
use when developing privacy-preserving software systems.
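The abstract notes that developers lack knowledge of the techniques the GDPR calls for. As one concrete illustration, here is a minimal Python sketch of pseudonymisation, a safeguard the regulation itself names (GDPR Art. 4(5) and Art. 32). The key value, field names, and record are hypothetical; a real system would fetch the key from a key-management service, never from source code:

```python
import hashlib
import hmac

# Hypothetical secret held separately from the data store; in practice this
# would come from a key-management service, not a source file.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(identifier: str) -> str:
    """Replace a direct identifier with a keyed hash (HMAC-SHA256).

    The mapping cannot be reversed without the key, which the GDPR requires
    to be kept separately under technical and organisational safeguards.
    """
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"user": "alice@example.com", "purchase": "book"}
record["user"] = pseudonymize(record["user"])
print(record)
```

Because the hash is keyed and deterministic, records of the same user stay linkable for analytics, while re-identification requires the separately held key.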
Related papers
- Interactive GDPR-Compliant Privacy Policy Generation for Software Applications [6.189770781546807]
To use software applications users are sometimes requested to provide their personal information.
As privacy has become a significant concern many protection regulations exist worldwide.
We propose an approach that generates comprehensive and compliant privacy policies.
arXiv Detail & Related papers (2024-10-04T01:22:16Z)
- Integrating PETs into Software Applications: A Game-Based Learning Approach [2.7186493234782527]
"PETs-101" is a novel game-based learning framework that motivates developers to integrate PETs into software.
It aims to improve developers' privacy-preserving software development behaviour.
arXiv Detail & Related papers (2024-10-01T13:15:46Z) - An Exploratory Mixed-Methods Study on General Data Protection Regulation (GDPR) Compliance in Open-Source Software [4.2610816955137]
The European Union's General Data Protection Regulation requires software developers to meet privacy requirements when interacting with users' data.
Prior research describes the impact of such laws on development, but only for commercial software.
arXiv Detail & Related papers (2024-06-20T20:38:33Z) - A First Look at the General Data Protection Regulation (GDPR) in
Open-Source Software [4.844017045823075]
This poster describes work on regulated data protection in open-source software.
We surveyed open-source developers to understand their experiences.
We call for improved policy-related compliance resources.
arXiv Detail & Related papers (2024-01-26T03:49:13Z) - SoK: Demystifying Privacy Enhancing Technologies Through the Lens of
Software Developers [4.171555557592296]
In the absence of data protection measures, software applications lead to privacy breaches.
This review analyses 39 empirical studies on developers' privacy practices.
It reports the usage of six PETs in software application scenarios.
It discusses challenges developers face when integrating PETs into software.
arXiv Detail & Related papers (2023-12-30T12:24:40Z) - Embedded Software Development with Digital Twins: Specific Requirements
for Small and Medium-Sized Enterprises [55.57032418885258]
Digital twins have the potential for cost-effective software development and maintenance strategies.
We interviewed SMEs about their current development processes.
First results show that real-time requirements prevent, to date, a Software-in-the-Loop development approach.
arXiv Detail & Related papers (2023-09-17T08:56:36Z) - Tight Auditing of Differentially Private Machine Learning [77.38590306275877]
For private machine learning, existing auditing mechanisms give tight privacy estimates only under implausible worst-case assumptions.
We design an improved auditing scheme that yields tight privacy estimates for natural (not adversarially crafted) datasets.
arXiv Detail & Related papers (2023-02-15T21:40:33Z) - Privacy Explanations - A Means to End-User Trust [64.7066037969487]
We looked into how explainability might help to address the problem of end-user trust.
We created privacy explanations that aim to help to clarify to end users why and for what purposes specific data is required.
Our findings reveal that privacy explanations can be an important step towards increasing trust in software systems.
arXiv Detail & Related papers (2022-10-18T09:30:37Z) - Analysis of Longitudinal Changes in Privacy Behavior of Android
Applications [79.71330613821037]
In this paper, we examine the trends in how Android apps have changed over time with respect to privacy.
We examine the adoption of HTTPS, whether apps scan the device for other installed apps, the use of permissions for privacy-sensitive data, and the use of unique identifiers.
We find that privacy-related behavior has improved with time as apps continue to receive updates, and that the third-party libraries used by apps are responsible for more issues with privacy.
arXiv Detail & Related papers (2021-12-28T16:21:31Z) - Distributed Machine Learning and the Semblance of Trust [66.1227776348216]
Federated Learning (FL) allows the data owner to maintain data governance and perform model training locally without having to share their data.
FL and related techniques are often described as privacy-preserving.
We explain why this term is not appropriate and outline the risks associated with over-reliance on protocols that were not designed with formal definitions of privacy in mind.
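The local-training idea summarised above can be sketched with federated averaging (FedAvg), the standard FL aggregation rule: each client trains on its own data, and the server only averages the resulting weights. This is a minimal illustration assuming linear-regression clients and synthetic data, not the protocol of any specific paper:

```python
import numpy as np

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local gradient-descent steps on its own data
    (linear regression); the raw data never leaves the client."""
    w = weights.copy()
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

def federated_averaging(clients, rounds=20, dim=2):
    """Server collects client weights and averages them, weighted by
    client dataset size (the FedAvg rule); no data is shared."""
    global_w = np.zeros(dim)
    for _ in range(rounds):
        updates = [local_update(global_w, X, y) for X, y in clients]
        sizes = np.array([len(y) for _, y in clients], dtype=float)
        global_w = np.average(updates, axis=0, weights=sizes)
    return global_w

# Three clients, each holding private samples of the same linear relation.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 2))
    clients.append((X, X @ true_w))

w = federated_averaging(clients)
print(np.round(w, 2))
```

As the summary cautions, sharing weights instead of data is not a formal privacy guarantee: model updates can still leak information about the training set.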
arXiv Detail & Related papers (2021-12-21T08:44:05Z)
- PCAL: A Privacy-preserving Intelligent Credit Risk Modeling Framework Based on Adversarial Learning [111.19576084222345]
This paper proposes a framework for Privacy-preserving Credit risk modeling based on Adversarial Learning (PCAL).
PCAL aims to mask the private information inside the original dataset, while maintaining the important utility information for the target prediction task performance.
Results indicate that PCAL can learn an effective, privacy-free representation from user data, providing a solid foundation towards privacy-preserving machine learning for credit risk analysis.
arXiv Detail & Related papers (2020-10-06T07:04:59Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
The site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.