ObjectSeeker: Certifiably Robust Object Detection against Patch Hiding
Attacks via Patch-agnostic Masking
- URL: http://arxiv.org/abs/2202.01811v1
- Date: Thu, 3 Feb 2022 19:34:25 GMT
- Title: ObjectSeeker: Certifiably Robust Object Detection against Patch Hiding
Attacks via Patch-agnostic Masking
- Authors: Chong Xiang, Alexander Valtchanov, Saeed Mahloujifar, Prateek Mittal
- Abstract summary: Object detectors are found to be vulnerable to physical-world patch hiding attacks.
We propose ObjectSeeker as a framework for building certifiably robust object detectors.
- Score: 95.6347501381882
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Object detectors, which are widely deployed in security-critical systems such
as autonomous vehicles, have been found vulnerable to physical-world patch
hiding attacks. The attacker can use a single physically-realizable adversarial
patch to make the object detector miss victim objects, completely undermining
the functionality of object detection applications. In
this paper, we propose ObjectSeeker as a defense framework for building
certifiably robust object detectors against patch hiding attacks. The core
operation of ObjectSeeker is patch-agnostic masking: we aim to mask out the
entire adversarial patch without any prior knowledge of the shape, size, and
location of the patch. This masking operation neutralizes the adversarial
effect and allows any vanilla object detector to safely detect objects on the
masked images. Remarkably, we develop a certification procedure to determine if
ObjectSeeker can detect certain objects with a provable guarantee against any
adaptive attacker within the threat model. Our evaluation with two object
detectors and three datasets demonstrates a significant (~10%-40% absolute and
~2-6x relative) improvement in certified robustness over the prior work, as
well as high clean performance (~1% performance drop compared with vanilla
undefended models).
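The masking idea described in the abstract can be sketched as follows. This is a minimal, illustrative implementation, not the paper's actual algorithm: the names `detect`, `candidate_masks`, and `objectseeker_detect` are hypothetical, the mask placement here is a simple grid of horizontal/vertical split lines, and the real ObjectSeeker additionally includes box pruning and a certification procedure.

```python
import numpy as np

def candidate_masks(h, w, k=4):
    """Build masks that each zero out one side of a horizontal or vertical
    split line. Patch-agnostic: no assumption about the patch's shape,
    size, or location -- only that some mask fully covers it."""
    masks = []
    for i in range(1, k + 1):
        r, c = i * h // (k + 1), i * w // (k + 1)
        for top in (np.s_[r:], np.s_[:r]):       # mask bottom / mask top
            m = np.ones((h, w), dtype=bool)
            m[top, :] = False
            masks.append(m)
        for left in (np.s_[c:], np.s_[:c]):      # mask right / mask left
            m = np.ones((h, w), dtype=bool)
            m[:, left] = False
            masks.append(m)
    return masks

def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    union = ((a[2] - a[0]) * (a[3] - a[1])
             + (b[2] - b[0]) * (b[3] - b[1]) - inter)
    return inter / union if union else 0.0

def objectseeker_detect(image, detect, k=4, iou_thresh=0.6):
    """Run a vanilla detector on the original image and on every masked
    copy; keep masked-image detections that are not duplicates."""
    h, w = image.shape[:2]
    boxes = list(detect(image))
    for m in candidate_masks(h, w, k):
        masked = image * m[..., None]            # zero out the masked region
        for box in detect(masked):
            if all(iou(box, b) < iou_thresh for b in boxes):
                boxes.append(box)
    return boxes
```

The intuition is that for any patch within the threat model, at least one candidate mask removes the patch entirely, so an object hidden by the patch reappears in some masked image and survives the duplicate filter.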
Related papers
- Detector Collapse: Physical-World Backdooring Object Detection to Catastrophic Overload or Blindness in Autonomous Driving [17.637155085620634]
Detector Collapse (DC) is a brand-new backdoor attack paradigm tailored for object detection.
DC is designed to instantly incapacitate detectors (i.e., severely impairing the detector's performance and culminating in a denial of service).
We introduce a novel poisoning strategy exploiting natural objects, enabling DC to act as a practical backdoor in real-world environments.
arXiv Detail & Related papers (2024-04-17T13:12:14Z)
- Mask-based Invisible Backdoor Attacks on Object Detection [0.0]
Deep learning models are vulnerable to backdoor attacks.
In this study, we propose an effective invisible backdoor attack on object detection utilizing a mask-based approach.
arXiv Detail & Related papers (2024-03-20T12:27:30Z)
- Segment and Complete: Defending Object Detectors against Adversarial Patch Attacks with Robust Patch Detection [142.24869736769432]
Adversarial patch attacks pose a serious threat to state-of-the-art object detectors.
We propose Segment and Complete defense (SAC), a framework for defending object detectors against patch attacks.
We show SAC can significantly reduce the targeted attack success rate of physical patch attacks.
arXiv Detail & Related papers (2021-12-08T19:18:48Z)
- DetectorGuard: Provably Securing Object Detectors against Localized Patch Hiding Attacks [28.94435153159868]
State-of-the-art object detectors are vulnerable to localized patch hiding attacks.
We propose the first general framework for building provably robust detectors against the localized patch hiding attack called DetectorGuard.
arXiv Detail & Related papers (2021-02-05T02:02:21Z)
- The Translucent Patch: A Physical and Universal Attack on Object Detectors [48.31712758860241]
We propose a contactless physical patch to fool state-of-the-art object detectors.
The primary goal of our patch is to hide all instances of a selected target class.
We show that our patch was able to prevent the detection of 42.27% of all stop sign instances.
arXiv Detail & Related papers (2020-12-23T07:47:13Z)
- Slender Object Detection: Diagnoses and Improvements [74.40792217534]
In this paper, we are concerned with the detection of a particular type of objects with extreme aspect ratios, namely slender objects.
For a classical object detection method, a drastic drop of 18.9% mAP on COCO is observed if it is evaluated solely on slender objects.
arXiv Detail & Related papers (2020-11-17T09:39:42Z)
- Dynamic Adversarial Patch for Evading Object Detection Models [47.32228513808444]
We present an innovative attack method against object detectors applied in a real-world setup.
Our method uses dynamic adversarial patches which are placed at multiple predetermined locations on a target object.
We improved the attack by generating patches that consider the semantic distance between the target object and its classification.
arXiv Detail & Related papers (2020-10-25T08:55:40Z)
- Detection as Regression: Certified Object Detection by Median Smoothing [50.89591634725045]
This work is motivated by recent progress on certified classification by randomized smoothing.
We obtain the first model-agnostic, training-free, and certified defense for object detection against $\ell$-bounded attacks.
arXiv Detail & Related papers (2020-07-07T18:40:19Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.