The Processing goes far beyond "the app" -- Privacy issues of decentralized Digital Contact Tracing using the example of the German Corona-Warn-App (CWA)
- URL: http://arxiv.org/abs/2503.23444v1
- Date: Sun, 30 Mar 2025 13:48:15 GMT
- Title: The Processing goes far beyond "the app" -- Privacy issues of decentralized Digital Contact Tracing using the example of the German Corona-Warn-App (CWA)
- Authors: Rainer Rehak, Christian R. Kuehne
- Abstract summary: We present the results of a scientific and methodologically clear DPIA of the German Corona-Warn-App (CWA). It shows that even a decentralized architecture involves numerous serious weaknesses and risks. It also finds that none of the proposed designs operates on anonymous data or ensures proper anonymisation.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Since SARS-CoV-2 started spreading in Europe in early 2020, there has been a strong call for technical solutions to combat or contain the pandemic, with contact tracing apps at the heart of the debates. The EU's General Data Protection Regulation (GDPR) requires controllers to carry out a data protection impact assessment (DPIA) where their data processing is likely to result in a high risk to the rights and freedoms (Art. 35 GDPR). A DPIA is a structured risk analysis that identifies and evaluates possible consequences of data processing relevant to fundamental rights in advance and describes the measures envisaged to address these risks or expresses the inability to do so. Based on the Standard Data Protection Model (SDM), we present the results of a scientific and methodologically clear DPIA of the German Corona-Warn-App (CWA). It shows that even a decentralized architecture involves numerous serious weaknesses and risks, including larger ones still left unaddressed in current implementations. It also finds that none of the proposed designs operates on anonymous data or ensures proper anonymisation, and that informed consent would not be a legitimate legal ground for the processing. For all points where data subjects' rights are still not sufficiently safeguarded, we briefly outline solutions.
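The decentralized architecture assessed in the paper can be illustrated with a highly simplified sketch: each device derives short-lived rolling identifiers from a daily key it keeps locally, broadcasts them, records identifiers it observes nearby, and, once the daily keys of users who tested positive are published, performs the exposure matching on the device itself. The key derivation, identifier lengths, and class names below are illustrative assumptions, not the actual Exposure Notification cryptography used by the CWA.

```python
import os
import hashlib
from typing import List, Set

def rolling_ids(daily_key: bytes, intervals: int = 144) -> List[bytes]:
    """Derive short-lived rolling identifiers from a daily key.

    Illustrative stand-in for the real key schedule: one identifier per
    10-minute interval of a day (144 intervals), derived by hashing the
    daily key together with the interval index.
    """
    return [
        hashlib.sha256(daily_key + interval.to_bytes(4, "big")).digest()[:16]
        for interval in range(intervals)
    ]

class Device:
    """A phone that broadcasts its own rolling IDs and records observed ones."""

    def __init__(self) -> None:
        self.daily_keys: List[bytes] = []
        self.observed: Set[bytes] = set()

    def new_day(self) -> bytes:
        key = os.urandom(16)           # fresh random daily key, kept on-device
        self.daily_keys.append(key)
        return key

    def observe(self, rolling_id: bytes) -> None:
        self.observed.add(rolling_id)  # stored locally only

    def check_exposure(self, published_keys: List[bytes]) -> bool:
        """Local matching: re-derive IDs from published keys of infected users."""
        for key in published_keys:
            if self.observed & set(rolling_ids(key)):
                return True
        return False

# Usage: Alice and Bob meet; Alice later tests positive and uploads her daily keys.
alice, bob = Device(), Device()
alice_key = alice.new_day()
bob.new_day()
bob.observe(rolling_ids(alice_key)[42])   # Bob's phone hears one of Alice's IDs
published = [alice_key]                   # the server only sees daily keys of positives
print(bob.check_exposure(published))      # True -- matching happens on Bob's device
```

Even in this stripped-down form, the published daily keys let any observer link a positive user's rolling identifiers after the fact, which hints at why the DPIA concludes the data is not anonymous.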
Related papers
- Generative AI for Secure and Privacy-Preserving Mobile Crowdsensing [74.58071278710896]
Generative AI has attracted much attention from both academia and industry.
Secure and privacy-preserving mobile crowdsensing (SPPMCS) has been widely applied in data collection and acquisition.
arXiv Detail & Related papers (2024-05-17T04:00:58Z) - Modelling Technique for GDPR-compliance: Toward a Comprehensive Solution [0.0]
New data protection legislation in the EU/UK has come into force.
Existing threat modelling techniques are not designed to model compliance.
We propose a new data-flow modelling technique integrated with a knowledge base of non-compliance threats.
arXiv Detail & Related papers (2024-04-22T08:41:43Z) - Visual Privacy Auditing with Diffusion Models [47.0700328585184]
We introduce a reconstruction attack based on diffusion models (DMs) that only assumes adversary access to real-world image priors. We find that (1) real-world data priors significantly influence reconstruction success, (2) current reconstruction bounds do not model the risk posed by data priors well, and (3) DMs can serve as auditing tools for visualizing privacy leakage.
arXiv Detail & Related papers (2024-03-12T12:18:55Z) - EU law and emotion data [0.0]
This article sheds light on the legal implications and challenges surrounding emotion data processing within the EU's legal framework.
We discuss the nuances of different approaches to affective computing and their relevance to the processing of special categories of data.
We highlight some of the consequences, including harm, that the processing of emotion data may have for the individuals concerned.
arXiv Detail & Related papers (2023-09-19T17:25:02Z) - SILO Language Models: Isolating Legal Risk In a Nonparametric Datastore [159.21914121143885]
We present SILO, a new language model that manages this risk-performance tradeoff during inference.
SILO is built by (1) training a parametric LM on the Open License Corpus (OLC), a new corpus we curate with 228B tokens of public domain and permissively licensed text, and (2) augmenting it with a nonparametric datastore that is only queried during inference.
Access to the datastore greatly improves out-of-domain performance, closing 90% of the performance gap with an LM trained on the Pile.
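As a rough illustration of pairing a parametric LM with a nonparametric datastore queried at inference, the sketch below interpolates the model's next-token distribution with one derived from retrieved nearest neighbours, in the spirit of kNN-augmented language models rather than SILO's actual implementation; the function names, distance metric, and interpolation weight are assumptions.

```python
import numpy as np

def knn_interpolate(p_lm: np.ndarray,
                    query: np.ndarray,
                    keys: np.ndarray,
                    next_tokens: np.ndarray,
                    vocab_size: int,
                    k: int = 8,
                    lam: float = 0.25) -> np.ndarray:
    """Blend a parametric LM distribution with a nonparametric kNN estimate.

    p_lm        : parametric next-token distribution, shape (vocab_size,)
    query       : embedding of the current prefix, shape (d,)
    keys        : datastore context embeddings, shape (n, d)
    next_tokens : token id that followed each stored context, shape (n,)
    """
    # Distances from the query to every stored context embedding.
    dists = np.linalg.norm(keys - query, axis=1)
    nearest = np.argsort(dists)[:k]

    # Softmax over negative distances -> weights for the retrieved neighbours.
    weights = np.exp(-dists[nearest])
    weights /= weights.sum()

    # Accumulate neighbour weights onto their recorded next tokens.
    p_knn = np.zeros(vocab_size)
    np.add.at(p_knn, next_tokens[nearest], weights)

    # Interpolate the nonparametric and parametric distributions.
    return lam * p_knn + (1.0 - lam) * p_lm

# Usage: a toy datastore of 5 stored contexts over a 4-token vocabulary.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    keys = rng.normal(size=(5, 3))
    next_tokens = np.array([0, 1, 1, 2, 3])
    p_lm = np.full(4, 0.25)
    print(knn_interpolate(p_lm, keys[1], keys, next_tokens, vocab_size=4))
```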
arXiv Detail & Related papers (2023-08-08T17:58:15Z) - Priorities for more effective tech regulation [3.8073142980733]
The report proposes a range of priorities for regulators, academia and the interested public in order to move beyond the status quo.
Current legal practice will not be enough to meaningfully tame egregious data practices.
arXiv Detail & Related papers (2023-02-27T16:53:05Z) - Is Vertical Logistic Regression Privacy-Preserving? A Comprehensive
Privacy Analysis and Beyond [57.10914865054868]
We consider vertical logistic regression (VLR) trained with mini-batch gradient descent.
We provide a comprehensive and rigorous privacy analysis of VLR in a class of open-source Federated Learning frameworks.
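The setting analysed above, vertical logistic regression trained with mini-batch gradient descent, splits the feature columns of the same samples across parties. The sketch below shows that split in plain NumPy with no cryptographic protection (which real VLR frameworks add on top); it is meant only to make visible which intermediate values, partial scores and residuals, the parties exchange. All data and hyperparameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Vertically partitioned data: both parties hold the same 1,000 samples,
# but party A holds 3 feature columns and party B holds 2 (plus the labels).
n = 1000
X_a = rng.normal(size=(n, 3))        # party A's features
X_b = rng.normal(size=(n, 2))        # party B's features
true_w = np.array([1.0, -2.0, 0.5, 1.5, -1.0])
logits = np.hstack([X_a, X_b]) @ true_w
y = (rng.random(n) < 1 / (1 + np.exp(-logits))).astype(float)  # held by party B

w_a = np.zeros(3)                    # party A's local weights
w_b = np.zeros(2)                    # party B's local weights
lr, batch_size = 0.1, 64

for step in range(2000):
    idx = rng.choice(n, batch_size, replace=False)   # shared mini-batch indices

    # Each party computes a partial score on its own features ...
    z_a = X_a[idx] @ w_a
    z_b = X_b[idx] @ w_b

    # ... and the partial scores are combined (in plaintext here) into predictions.
    p = 1 / (1 + np.exp(-(z_a + z_b)))
    err = p - y[idx]                 # residual, computed by the label holder

    # The residual is shared so each party can update its own weights locally.
    w_a -= lr * X_a[idx].T @ err / batch_size
    w_b -= lr * X_b[idx].T @ err / batch_size

print("party A weights:", np.round(w_a, 2))
print("party B weights:", np.round(w_b, 2))
```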
arXiv Detail & Related papers (2022-07-19T05:47:30Z) - Distributed Machine Learning and the Semblance of Trust [66.1227776348216]
Federated Learning (FL) allows the data owner to maintain data governance and perform model training locally without having to share their data.
FL and related techniques are often described as privacy-preserving.
We explain why this term is not appropriate and outline the risks associated with over-reliance on protocols that were not designed with formal definitions of privacy in mind.
arXiv Detail & Related papers (2021-12-21T08:44:05Z) - Data Protection Impact Assessment for the Corona App [0.0]
SARS-CoV-2 started spreading in Europe in early 2020, and there has been a strong call for technical solutions to combat or contain the pandemic, with contact tracing apps at the heart of the debates.
The EU's General Data Protection Regulation (GDPR) requires controllers to carry out a data protection impact assessment (DPIA).
We present a scientific DPIA which thoroughly examines three published contact tracing app designs that are considered the most "privacy-friendly".
arXiv Detail & Related papers (2021-01-18T19:23:30Z) - Privacy Preservation in Federated Learning: An insightful survey from
the GDPR Perspective [10.901568085406753]
This article surveys state-of-the-art privacy techniques that can be employed in federated learning.
Recent research has demonstrated that keeping data and computation on-device in FL is not enough to guarantee privacy.
This is because the ML model parameters exchanged between parties in an FL system can be exploited in some privacy attacks.
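To make concrete what "model parameters exchanged between parties" means, below is a minimal federated-averaging round in plain NumPy with no secure aggregation or differential privacy; the dataset, model, and hyperparameters are illustrative assumptions, not any specific framework's API.

```python
import numpy as np

rng = np.random.default_rng(1)

def local_update(w: np.ndarray, X: np.ndarray, y: np.ndarray,
                 lr: float = 0.1, epochs: int = 5) -> np.ndarray:
    """Client-side training: a few gradient steps of linear regression on local data."""
    w = w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w          # <-- these parameters are what actually leaves the device

# Three clients with private local datasets (never shared with the server).
clients = []
for _ in range(3):
    X = rng.normal(size=(50, 4))
    y = X @ np.array([2.0, -1.0, 0.5, 3.0]) + rng.normal(scale=0.1, size=50)
    clients.append((X, y))

global_w = np.zeros(4)
for round_ in range(10):
    # Each client trains locally and uploads only its updated parameters.
    updates = [local_update(global_w, X, y) for X, y in clients]
    # The server averages the uploaded parameters (FedAvg with equal weights).
    global_w = np.mean(updates, axis=0)

print("global model after 10 rounds:", np.round(global_w, 2))
```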
arXiv Detail & Related papers (2020-11-10T21:41:25Z) - Second layer data governance for permissioned blockchains: the privacy
management challenge [58.720142291102135]
In pandemic situations, such as the COVID-19 and Ebola outbreaks, sharing health data is crucial to avoid massive infection and decrease the number of deaths.
In this sense, permissioned blockchain technology emerges to empower users to exercise their rights, providing data ownership, transparency, and security through an immutable, unified, and distributed database governed by smart contracts.
arXiv Detail & Related papers (2020-10-22T13:19:38Z) - COVI White Paper [67.04578448931741]
Contact tracing is an essential tool to change the course of the Covid-19 pandemic.
We present an overview of the rationale, design, ethical considerations and privacy strategy of COVI, a Covid-19 public peer-to-peer contact tracing and risk awareness mobile application developed in Canada.
arXiv Detail & Related papers (2020-05-18T07:40:49Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.