Examining Software Developers' Needs for Privacy Enforcing Techniques: A survey
- URL: http://arxiv.org/abs/2512.14756v1
- Date: Mon, 15 Dec 2025 13:20:14 GMT
- Title: Examining Software Developers' Needs for Privacy Enforcing Techniques: A survey
- Authors: Ioanna Theophilou, Georgia M. Kapitsaki
- Abstract summary: Data privacy legislation has rendered data privacy law compliance a requirement of all software systems. As data compliance is tightly coupled with legal knowledge, it is not always easy to perform such integrations in software systems. Emerging developer needs that can assist in privacy law compliance have not been examined.
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: Data privacy legislation, such as GDPR and CCPA/CPRA, has rendered data privacy law compliance a requirement of all software systems. Developers need to implement various kinds of functionalities to cover law needs, including user rights and law principles. As data compliance is tightly coupled with legal knowledge, it is not always easy to perform such integrations in software systems. Prior studies have focused on developers' understanding of privacy principles, such as Privacy by Design, and have examined privacy techniques used in the software industry. Nevertheless, emerging developer needs that can assist in privacy law compliance have not been examined but are useful in understanding what development automation tools, such as Generative AI, need to cover to make the compliance process more straightforward and seamless within the development process. In this work, we present a survey that examines the above needs with the participation of 68 developers, while we have examined which factors affect practitioners' needs. Most developers express a need for more automated tools, while privacy experience increases practitioners' concerns for privacy tools. Our results can assist practitioners in better positioning their development activities within privacy law compliance and point to an urgent need for privacy facilitators.
Related papers
- How to DP-fy Your Data: A Practical Guide to Generating Synthetic Data With Differential Privacy [52.00934156883483]
Differential Privacy (DP) is a framework for reasoning about and limiting information leakage. Differentially private synthetic data refers to synthetic data that preserves the overall trends of source data.
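The DP framework the paper builds on is commonly instantiated with the Laplace mechanism: add noise calibrated to a query's sensitivity so the released value reveals little about any one individual. A minimal sketch, using only the standard library; the dataset and query are illustrative, not from the paper:

```python
import math
import random

def laplace_noise(scale: float) -> float:
    # Inverse-CDF sampling from a zero-mean Laplace distribution.
    u = random.random() - 0.5
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def dp_count(values, predicate, epsilon: float) -> float:
    """Release a count under epsilon-DP via the Laplace mechanism.

    A counting query has sensitivity 1 (adding or removing one person
    changes the count by at most 1), so the noise scale is 1/epsilon.
    """
    true_count = sum(1 for v in values if predicate(v))
    return true_count + laplace_noise(1.0 / epsilon)

ages = [23, 35, 41, 29, 52, 61, 38]
noisy = dp_count(ages, lambda a: a >= 40, epsilon=1.0)  # true count 3, plus noise
```

Smaller `epsilon` values give stronger privacy but noisier answers; DP synthetic-data generators apply the same accounting across many such noisy measurements.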
arXiv Detail & Related papers (2025-12-02T21:14:39Z) - "I need to learn better searching tactics for privacy policy laws." Investigating Software Developers' Behavior When Using Sources on Privacy Issues [8.662963983664223]
Our study highlights major shortcomings in existing support for privacy-related development tasks. Based on our findings, we discuss the need for more accessible, understandable, and actionable privacy resources for developers.
arXiv Detail & Related papers (2025-11-11T09:58:06Z) - Generating Privacy Stories From Software Documentation [1.2094859111770522]
We develop a novel approach based on chain-of-thought prompting (CoT), in-context learning (ICL), and Large Language Models (LLMs). Our results show that most commonly used LLMs, such as GPT-4o and Llama 3, can identify privacy behaviors and generate privacy user stories with F1 scores exceeding 0.8.
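A chain-of-thought prompt for this kind of extraction task asks the model to reason through intermediate steps before emitting the final privacy user stories. A hypothetical sketch of how such a prompt might be assembled; the template wording and function name are illustrative, not the paper's actual prompts:

```python
# Illustrative CoT prompt builder for extracting privacy behaviors
# from software documentation. The template below is an assumption,
# not the prompt used in the cited paper.
COT_TEMPLATE = """You are analyzing software documentation for privacy behaviors.

Documentation:
{doc}

Think step by step:
1. List every piece of personal data the text mentions.
2. For each item, state how it is collected, used, or shared.
3. Write one privacy user story per behavior in the form:
   "As a user, I want <goal>, so that <privacy benefit>."
"""

def build_privacy_story_prompt(doc_text: str) -> str:
    # Insert the (stripped) documentation excerpt into the template.
    return COT_TEMPLATE.format(doc=doc_text.strip())

prompt = build_privacy_story_prompt(
    "The app uploads the user's contact list to our servers."
)
```

In-context learning would additionally prepend a few worked examples (documentation excerpt plus expected user stories) before the target excerpt.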
arXiv Detail & Related papers (2025-06-28T20:55:21Z) - SoK: Enhancing Privacy-Preserving Software Development from a Developers' Perspective [1.2016264781280588]
This review aims to identify and analyze empirically validated solutions to help developers in privacy-preserving software development. Findings will provide valuable insights for researchers to improve current privacy-preserving solutions and for practitioners looking for effective and validated solutions to embed privacy into software development.
arXiv Detail & Related papers (2025-04-29T01:38:01Z) - How Privacy-Savvy Are Large Language Models? A Case Study on Compliance and Privacy Technical Review [15.15468770348023]
We evaluate large language models' performance in privacy-related tasks such as privacy information extraction (PIE), legal and regulatory key point detection (KPD), and question answering (QA). Through an empirical assessment, we investigate the capacity of several prominent LLMs, including BERT, GPT-3.5, GPT-4, and custom models, in executing privacy compliance checks and technical privacy reviews. While LLMs show promise in automating privacy reviews and identifying regulatory discrepancies, significant gaps persist in their ability to fully comply with evolving legal standards.
arXiv Detail & Related papers (2024-09-04T01:51:37Z) - Privacy Checklist: Privacy Violation Detection Grounding on Contextual Integrity Theory [43.12744258781724]
We formulate the privacy issue as a reasoning problem rather than simple pattern matching. We develop the first comprehensive checklist that covers social identities, private attributes, and existing privacy regulations.
arXiv Detail & Related papers (2024-08-19T14:48:04Z) - Collection, usage and privacy of mobility data in the enterprise and public administrations [55.2480439325792]
Security measures such as anonymization are needed to protect individuals' privacy.
Within our study, we conducted expert interviews to gain insights into practices in the field.
We survey privacy-enhancing methods in use, which generally do not comply with state-of-the-art standards of differential privacy.
arXiv Detail & Related papers (2024-07-04T08:29:27Z) - Privacy Risks of General-Purpose AI Systems: A Foundation for Investigating Practitioner Perspectives [47.17703009473386]
Powerful AI models have led to impressive leaps in performance across a wide range of tasks.
Privacy concerns have led to a wealth of literature covering various privacy risks and vulnerabilities of AI models.
We conduct a systematic review of these survey papers to provide a concise and usable overview of privacy risks in GPAIS.
arXiv Detail & Related papers (2024-07-02T07:49:48Z) - Evaluating Privacy Perceptions, Experience, and Behavior of Software Development Teams [2.818645620433775]
Our survey includes 362 participants from 23 countries, encompassing roles such as product managers, developers, and testers.
Our results show diverse definitions of privacy across SDLC roles, emphasizing the need for a holistic privacy approach throughout SDLC.
Most participants are more familiar with HIPAA and other regulations, with multi-jurisdictional compliance being their primary concern.
arXiv Detail & Related papers (2024-04-01T17:55:10Z) - Privacy Explanations - A Means to End-User Trust [64.7066037969487]
We looked into how explainability might help to tackle this problem.
We created privacy explanations that aim to help to clarify to end users why and for what purposes specific data is required.
Our findings reveal that privacy explanations can be an important step towards increasing trust in software systems.
arXiv Detail & Related papers (2022-10-18T09:30:37Z) - Distributed Machine Learning and the Semblance of Trust [66.1227776348216]
Federated Learning (FL) allows the data owner to maintain data governance and perform model training locally without having to share their data.
FL and related techniques are often described as privacy-preserving.
We explain why this term is not appropriate and outline the risks associated with over-reliance on protocols that were not designed with formal definitions of privacy in mind.
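The gap the paper points to can be seen in a minimal FedAvg-style sketch: raw data stays on each client and only model updates reach the server, yet those updates are a function of the data and can still leak information. A toy 1-D least-squares model with illustrative data, not the paper's setup:

```python
# Minimal FedAvg-style aggregation. Clients keep their data local and
# share only updated weights; the server averages them. Note that the
# shared updates themselves are derived from the data and can leak
# information -- sharing updates is not a formal privacy guarantee.

def local_update(w: float, data, lr: float = 0.1) -> float:
    # One gradient-descent step for the model y = w * x (squared loss).
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    return w - lr * grad

def fed_avg(client_weights) -> float:
    # The server sees only weights, never the clients' (x, y) pairs.
    return sum(client_weights) / len(client_weights)

# Two clients whose private data follows slopes ~2.0 and ~2.1.
clients = [[(1.0, 2.0), (2.0, 4.0)], [(1.0, 2.1), (3.0, 6.3)]]
global_w = 0.0
for _ in range(50):
    updates = [local_update(global_w, d) for d in clients]
    global_w = fed_avg(updates)
# global_w converges near the shared slope; an observer of the updates
# already learns each client's slope, illustrating the leakage risk.
```

Formal guarantees require composing FL with mechanisms that are designed against a privacy definition, such as differentially private noise on the updates or secure aggregation.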
arXiv Detail & Related papers (2021-12-21T08:44:05Z) - Why are Developers Struggling to Put GDPR into Practice when Developing Privacy-Preserving Software Systems? [3.04585143845864]
The General Data Protection Regulation (GDPR) provides guidelines for developers on how to protect user data.
Previous research has attempted to investigate what hinders developers from embedding privacy into software systems.
This paper investigates the issues that hinder software developers from implementing software applications that take privacy law on board.
arXiv Detail & Related papers (2020-08-07T04:34:08Z) - A vision for global privacy bridges: Technical and legal measures for international data markets [77.34726150561087]
Despite data protection laws and an acknowledged right to privacy, trading personal information has become a business equated with "trading oil".
An open conflict is arising between business demands for data and a desire for privacy.
We propose and test a vision of a personal information market with privacy.
arXiv Detail & Related papers (2020-05-13T13:55:50Z) - Beyond privacy regulations: an ethical approach to data usage in transportation [64.86110095869176]
We describe how Federated Machine Learning can be applied to the transportation sector.
We see Federated Learning as a method that enables us to process privacy-sensitive data while respecting customers' privacy.
arXiv Detail & Related papers (2020-04-01T15:10:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.