They See Me Rollin': Inherent Vulnerability of the Rolling Shutter in
CMOS Image Sensors
- URL: http://arxiv.org/abs/2101.10011v1
- Date: Mon, 25 Jan 2021 11:14:25 GMT
- Title: They See Me Rollin': Inherent Vulnerability of the Rolling Shutter in
CMOS Image Sensors
- Authors: Sebastian Köhler, Giulio Lovisotto, Simon Birnbach, Richard Baker,
Ivan Martinovic
- Abstract summary: A camera's electronic rolling shutter can be exploited to inject fine-grained image disruptions.
We show how an adversary can modulate a laser to hide up to 75% of objects perceived by state-of-the-art detectors.
Our results indicate that rolling shutter attacks can substantially reduce the performance and reliability of vision-based intelligent systems.
- Score: 21.5487020124302
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Cameras have become a fundamental component of vision-based intelligent
systems. As a balance between production costs and image quality, most modern
cameras use Complementary Metal-Oxide-Semiconductor (CMOS) image sensors that
implement an electronic rolling shutter mechanism, where image rows are
captured consecutively rather than all at once.
In this paper, we describe how the electronic rolling shutter can be
exploited using a bright, modulated light source (e.g., an inexpensive,
off-the-shelf laser) to inject fine-grained image disruptions. These
disruptions substantially affect camera-based computer vision systems, where
high-frequency data is crucial in extracting informative features from objects.
We study the fundamental factors affecting a rolling shutter attack, such as
environmental conditions, angle of the incident light, laser-to-camera
distance, and aiming precision. We demonstrate how these factors affect the
intensity of the injected distortion and how an adversary can take them into
account by modeling the properties of the camera. We introduce a general
pipeline of a practical attack, which consists of: (i) profiling several
properties of the target camera and (ii) partially simulating the attack to
find distortions that satisfy the adversary's goal. Then, we instantiate the
attack in the scenario of object detection, where the adversary's goal is to
maximally disrupt the detection of objects in the image. We show that the
adversary can modulate the laser to hide up to 75% of objects perceived by
state-of-the-art detectors while controlling the amount of perturbation to keep
the attack inconspicuous. Our results indicate that rolling shutter attacks can
substantially reduce the performance and reliability of vision-based
intelligent systems.
Related papers
- Understanding Impacts of Electromagnetic Signal Injection Attacks on Object Detection [33.819549876354515]
This paper quantifies and analyzes the impacts of cyber-physical attacks on object detection models in practice.
Images captured by image sensors may be affected by different factors in real applications, including cyber-physical attacks.
arXiv Detail & Related papers (2024-07-23T09:22:06Z)
- Microsaccade-inspired Event Camera for Robotics [42.27082276343167]
We design an event-based perception system capable of simultaneously maintaining low reaction time and stable texture.
The geometrical optics of the rotating wedge prism allows for algorithmic compensation of the additional rotational motion.
Various real-world experiments demonstrate the potential of the system to facilitate robotics perception both for low-level and high-level vision tasks.
arXiv Detail & Related papers (2024-05-28T02:49:46Z)
- Deep Learning for Event-based Vision: A Comprehensive Survey and Benchmarks [55.81577205593956]
Event cameras are bio-inspired sensors that capture per-pixel intensity changes asynchronously.
Deep learning (DL) has been brought to this emerging field, inspiring active research into its potential.
arXiv Detail & Related papers (2023-02-17T14:19:28Z)
- Adversarially-Aware Robust Object Detector [85.10894272034135]
We propose a Robust Detector (RobustDet) based on adversarially-aware convolution to disentangle gradients for model learning on clean and adversarial images.
Our model effectively disentangles gradients and significantly enhances detection robustness while maintaining detection ability on clean images.
arXiv Detail & Related papers (2022-07-13T13:59:59Z)
- Lasers to Events: Automatic Extrinsic Calibration of Lidars and Event Cameras [67.84498757689776]
This paper presents the first direct calibration method between event cameras and lidars.
It removes dependencies on frame-based camera intermediaries and/or highly accurate hand measurements.
arXiv Detail & Related papers (2022-07-03T11:05:45Z)
- Shadows can be Dangerous: Stealthy and Effective Physical-world Adversarial Attack by Natural Phenomenon [79.33449311057088]
We study a new type of optical adversarial examples, in which the perturbations are generated by a very common natural phenomenon, shadow.
We extensively evaluate the effectiveness of this new attack on both simulated and real-world environments.
arXiv Detail & Related papers (2022-03-08T02:40:18Z)
- Signal Injection Attacks against CCD Image Sensors [20.892354746682223]
We show how electromagnetic emanation can be used to manipulate the image information captured by a CCD image sensor.
Our results indicate that the injected distortion can disrupt automated vision-based intelligent systems.
arXiv Detail & Related papers (2021-08-19T19:05:28Z)
- Exploring Adversarial Robustness of Multi-Sensor Perception Systems in Self Driving [87.3492357041748]
In this paper, we showcase practical susceptibilities of multi-sensor detection by placing an adversarial object on top of a host vehicle.
Our experiments demonstrate that successful attacks are primarily caused by easily corrupted image features.
Towards more robust multi-modal perception systems, we show that adversarial training with feature denoising can boost robustness to such attacks significantly.
arXiv Detail & Related papers (2021-01-17T21:15:34Z)
- Invisible Perturbations: Physical Adversarial Examples Exploiting the Rolling Shutter Effect [16.876798038844445]
We generate, for the first time, physical adversarial examples that are invisible to human eyes.
We demonstrate how an attacker can craft a modulated light signal that adversarially illuminates a scene and causes targeted misclassifications.
We conduct a range of simulation and physical experiments with LEDs, demonstrating targeted attack rates up to 84%.
arXiv Detail & Related papers (2020-11-26T16:34:47Z)
- From two rolling shutters to one global shutter [57.431998188805665]
We explore a surprisingly simple camera configuration that makes it possible to undo the rolling shutter distortion.
Such a setup is easy and cheap to build and it possesses the geometric constraints needed to correct rolling shutter distortion.
We derive equations that describe the underlying geometry for general and special motions and present an efficient method for finding their solutions.
arXiv Detail & Related papers (2020-06-02T22:18:43Z)
- GhostImage: Remote Perception Attacks against Camera-based Image Classification Systems [6.637193297008101]
In vision-based object classification systems, imaging sensors perceive the environment, and machine learning is then used to detect and classify objects for decision-making purposes.
We demonstrate how the perception domain can be remotely and unobtrusively exploited to enable an attacker to create spurious objects or alter an existing object.
arXiv Detail & Related papers (2020-01-21T21:58:45Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.