Privacy-preserving Object Detection
- URL: http://arxiv.org/abs/2103.06587v1
- Date: Thu, 11 Mar 2021 10:34:54 GMT
- Title: Privacy-preserving Object Detection
- Authors: Peiyang He, Charlie Griffin, Krzysztof Kacprzyk, Artjom Joosen,
Michael Collyer, Aleksandar Shtedritski, Yuki M. Asano
- Abstract summary: We show that for object detection on COCO, both anonymizing the dataset by blurring faces and swapping faces in a balanced manner along the gender and skin-tone dimensions can retain object detection performance while preserving privacy and partially balancing bias.
- Score: 52.77024349608834
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Privacy considerations and bias in datasets are quickly becoming
high-priority issues that the computer vision community needs to face. So far,
little attention has been given to practical solutions that do not involve
collection of new datasets. In this work, we show that for object detection on
COCO, both anonymizing the dataset by blurring faces and swapping faces in a
balanced manner along the gender and skin-tone dimensions can retain object
detection performance while preserving privacy and partially balancing bias.
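The paper's core idea, blurring face regions before training a detector, can be sketched minimally as follows. The image, the face bounding box, and the box-blur kernel below are illustrative assumptions, not the paper's exact pipeline; a real setup would run a face detector over RGB COCO images and use a stronger blur.

```python
# Minimal sketch of dataset anonymization by blurring a face region.
# The image is a 2D grayscale array (list of lists) and `box` is a
# hypothetical face bounding box; a real pipeline would obtain boxes
# from a face detector and operate on RGB images.

def box_blur_region(image, box, radius=1):
    """Return a copy of `image` with a box blur applied inside `box`.

    box = (x0, y0, x1, y1), half-open pixel coordinates.
    """
    h, w = len(image), len(image[0])
    x0, y0, x1, y1 = box
    out = [row[:] for row in image]
    for y in range(y0, y1):
        for x in range(x0, x1):
            # Average over a (2*radius+1)^2 neighbourhood, clipped to the image.
            vals = [
                image[yy][xx]
                for yy in range(max(0, y - radius), min(h, y + radius + 1))
                for xx in range(max(0, x - radius), min(w, x + radius + 1))
            ]
            out[y][x] = sum(vals) / len(vals)
    return out

# Example: blur a 2x2 "face" inside a 4x4 image; pixels outside the box
# are left untouched, so detection-relevant context is preserved.
img = [[0, 0, 0, 0],
       [0, 255, 255, 0],
       [0, 255, 255, 0],
       [0, 0, 0, 0]]
blurred = box_blur_region(img, (1, 1, 3, 3))
```

The intuition the paper tests is exactly this trade-off: the blur destroys identity information inside the box while leaving the rest of the image, and hence most detection cues, intact.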
Related papers
- Secure Visual Data Processing via Federated Learning [2.4374097382908477]
This paper addresses the need for privacy-preserving solutions in large-scale visual data processing.
We propose a new approach that combines object detection, federated learning and anonymization.
Our solution is evaluated against traditional centralized models, showing that while there is a slight trade-off in accuracy, the privacy benefits are substantial.
arXiv Detail & Related papers (2025-02-09T09:44:18Z)
- Exploring Pose-Based Anomaly Detection for Retail Security: A Real-World Shoplifting Dataset and Benchmark [1.8802008255570537]
Shoplifting poses a significant challenge for retailers, resulting in billions of dollars in annual losses.
This paper frames shoplifting detection as an anomaly detection problem, focusing on the identification of deviations from typical shopping patterns.
We introduce PoseLift, a privacy-preserving dataset specifically designed for shoplifting detection.
arXiv Detail & Related papers (2025-01-11T17:19:53Z)
- OOSTraj: Out-of-Sight Trajectory Prediction With Vision-Positioning Denoising [49.86409475232849]
Trajectory prediction is fundamental in computer vision and autonomous driving.
Existing approaches in this field often assume precise and complete observational data.
We present a novel method for out-of-sight trajectory prediction that leverages a vision-positioning technique.
arXiv Detail & Related papers (2024-04-02T18:30:29Z)
- SHAN: Object-Level Privacy Detection via Inference on Scene Heterogeneous Graph [5.050631286347773]
Privacy object detection aims to accurately locate private objects in images.
Existing methods suffer from serious deficiencies in accuracy, generalization, and interpretability.
We propose SHAN, Scene Heterogeneous graph Attention Network, a model that constructs a scene heterogeneous graph from an image.
arXiv Detail & Related papers (2024-03-14T08:32:14Z)
- Diff-Privacy: Diffusion-based Face Privacy Protection [58.1021066224765]
In this paper, we propose a novel face privacy protection method based on diffusion models, dubbed Diff-Privacy.
Specifically, we train our proposed multi-scale image inversion module (MSI) to obtain a set of SDM format conditional embeddings of the original image.
Based on the conditional embeddings, we design corresponding embedding scheduling strategies and construct different energy functions during the denoising process to achieve anonymization and visual identity information hiding.
arXiv Detail & Related papers (2023-09-11T09:26:07Z)
- A Survey on Privacy in Graph Neural Networks: Attacks, Preservation, and Applications [76.88662943995641]
Graph Neural Networks (GNNs) have gained significant attention owing to their ability to handle graph-structured data.
However, GNNs can leak sensitive information about the graph data they are trained on. To address this issue, researchers have started to develop privacy-preserving GNNs.
Despite this progress, there is a lack of a comprehensive overview of the attacks and the techniques for preserving privacy in the graph domain.
arXiv Detail & Related papers (2023-08-31T00:31:08Z)
- Attribute-preserving Face Dataset Anonymization via Latent Code Optimization [64.4569739006591]
We present a task-agnostic anonymization procedure that directly optimizes the images' latent representations in the latent space of a pre-trained GAN.
We demonstrate through a series of experiments that our method is capable of anonymizing the identity of the images while, crucially, better preserving the facial attributes.
arXiv Detail & Related papers (2023-03-20T17:34:05Z)
- Anonymization for Skeleton Action Recognition [6.772319578308409]
We propose two variants of anonymization algorithms to protect the potential privacy leakage from the skeleton dataset.
Experimental results show that the anonymized dataset can reduce the risk of privacy leakage while having marginal effects on the action recognition performance.
arXiv Detail & Related papers (2021-11-30T05:13:20Z)
- Graph-Homomorphic Perturbations for Private Decentralized Learning [64.26238893241322]
Local exchange of estimates allows the inference of private data.
Adding perturbations chosen independently at every agent protects privacy, but results in a significant performance loss.
We propose an alternative scheme, which constructs perturbations according to a particular nullspace condition, allowing them to be invisible.
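The nullspace idea can be sketched in a simplified form: if the per-agent noise vectors are constrained to lie in the nullspace of the aggregation step (here, summing to zero across agents), they mask individual estimates without perturbing the network-wide average. The agent values and the zero-sum construction below are illustrative assumptions, not the paper's exact scheme.

```python
# Hedged sketch: noise that individually masks each agent's estimate
# but cancels under aggregation, so the averaged quantity is unchanged.
import random

def zero_sum_noise(n_agents, scale=1.0, rng=None):
    """Draw n_agents Gaussian noise values constrained to sum to zero."""
    rng = rng or random.Random()
    noise = [rng.gauss(0.0, scale) for _ in range(n_agents - 1)]
    noise.append(-sum(noise))  # last value cancels the rest exactly
    return noise

estimates = [1.0, 2.0, 3.0, 4.0]  # hypothetical private local estimates
noise = zero_sum_noise(len(estimates), rng=random.Random(0))
perturbed = [e + z for e, z in zip(estimates, noise)]

# Individual values are masked, yet the average is preserved,
# since the noise lies in the nullspace of the averaging operator.
avg_true = sum(estimates) / len(estimates)
avg_perturbed = sum(perturbed) / len(perturbed)
```

The design choice mirrors the abstract's contrast: independent noise at every agent degrades the aggregate, whereas correlated noise satisfying a nullspace condition leaves the aggregated computation untouched.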
arXiv Detail & Related papers (2020-10-23T10:35:35Z)
- Subverting Privacy-Preserving GANs: Hiding Secrets in Sanitized Images [13.690485523871855]
State-of-the-art approaches use privacy-preserving generative adversarial networks (PP-GANs) to enable reliable facial expression recognition without leaking users' identity.
We show that it is possible to hide the sensitive identification data in the sanitized output images of such PP-GANs for later extraction.
arXiv Detail & Related papers (2020-09-19T19:02:17Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.