Adversarial Rain Attack and Defensive Deraining for DNN Perception
- URL: http://arxiv.org/abs/2009.09205v2
- Date: Thu, 3 Feb 2022 06:32:48 GMT
- Title: Adversarial Rain Attack and Defensive Deraining for DNN Perception
- Authors: Liming Zhai, Felix Juefei-Xu, Qing Guo, Xiaofei Xie, Lei Ma, Wei Feng,
Shengchao Qin, Yang Liu
- Abstract summary: We propose to combine two entirely different lines of study, i.e., rainy image synthesis and adversarial attack.
We first present an adversarial rain attack, with which we can simulate various rain situations under the guidance of deployed DNNs.
In particular, we design a factor-aware rain generation that synthesizes rain streaks according to the camera exposure process.
We also present a defensive deraining strategy, for which we design an adversarial rain augmentation that uses mixed adversarial rain layers.
- Score: 29.49757380041375
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Rain poses an unavoidable threat to deep neural network (DNN) based
perception systems, so a comprehensive investigation of the risks that rain poses
to DNNs is of great importance. However, it is difficult to collect or synthesize
rainy images that represent all rain situations that could occur in the real
world. To this end, in this paper, we start from a new perspective and propose to
combine two entirely different lines of study, i.e., rainy image synthesis and
adversarial attack. We first present an adversarial rain attack, with which we
can simulate various rain situations under the guidance of deployed DNNs and
reveal the threat factors that rain can introduce. In particular, we design a
factor-aware rain generation that synthesizes rain streaks according to the
camera exposure process and models learnable rain factors for adversarial attack.
With this generator, we perform the adversarial rain attack against image
classification and object detection. To defend DNNs against the negative effects
of rain, we also present a defensive deraining strategy, for which we design an
adversarial rain augmentation that uses mixed adversarial rain layers to enhance
deraining models for downstream DNN perception. Our large-scale evaluation on
various datasets demonstrates that our synthesized rainy images, which have
realistic appearances, not only exhibit strong adversarial capability against
DNNs but also strengthen deraining models for defensive purposes, laying the
foundation for further studies of rain-robust perception.
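The attack described above can be pictured as optimizing a small set of rain factors against a frozen classifier. The sketch below is a minimal, hypothetical illustration in PyTorch, not the authors' released code: it approximates the factor-aware rain generation with a soft line-shaped blur kernel applied to a fixed map of streak seeds, exposes direction, streak length, and opacity as learnable factors, and runs plain gradient ascent on the classification loss. All function names (rain_kernel, render_rain, adversarial_rain_attack) and parameter choices are assumptions made for illustration.

```python
# Minimal, hypothetical sketch of an adversarial rain attack in PyTorch.
# NOT the authors' implementation: the factor-aware rain generation is
# approximated by a soft line-shaped blur kernel applied to a fixed map of
# streak seeds, with direction, streak length, and opacity as learnable
# rain factors optimised by gradient ascent on the classification loss.
import torch
import torch.nn.functional as F

def rain_kernel(theta, length, ksize=31):
    """Soft line-shaped kernel: pixels close to a line through the kernel
    centre with direction `theta` and half-length `length` get high weight."""
    ax = torch.arange(ksize, dtype=torch.float32) - ksize // 2
    yy, xx = torch.meshgrid(ax, ax, indexing="ij")
    d_perp = xx * torch.sin(theta) - yy * torch.cos(theta)  # distance to the line
    d_par = xx * torch.cos(theta) + yy * torch.sin(theta)   # position along it
    k = torch.exp(-d_perp ** 2 / 0.5) * torch.sigmoid(4.0 * (length - d_par.abs()))
    return (k / k.sum()).view(1, 1, ksize, ksize)

def render_rain(img, seeds, theta, length, opacity, ksize=31):
    """Blend a rain-streak layer (blurred seed map) into the image."""
    streaks = F.conv2d(seeds, rain_kernel(theta, length, ksize), padding=ksize // 2)
    streaks = streaks / (streaks.max() + 1e-8)
    return torch.clamp(img + opacity * streaks, 0.0, 1.0)

def adversarial_rain_attack(model, img, label, steps=50, lr=0.05):
    """Optimise the rain factors so that the rainy image fools `model`.
    `img`: (1, 3, H, W) tensor in [0, 1]; `label`: (1,) long tensor."""
    seeds = (torch.rand(1, 1, *img.shape[-2:]) > 0.995).float()  # fixed streak positions
    theta = torch.tensor(1.0, requires_grad=True)    # streak direction (radians)
    length = torch.tensor(8.0, requires_grad=True)   # streak half-length (pixels)
    opacity = torch.tensor(0.3, requires_grad=True)  # rain-layer transparency
    opt = torch.optim.Adam([theta, length, opacity], lr=lr)
    for _ in range(steps):
        rainy = render_rain(img, seeds, theta, length, opacity.clamp(0.0, 0.8))
        loss = -F.cross_entropy(model(rainy), label)  # maximise classification loss
        opt.zero_grad()
        loss.backward()
        opt.step()
    return render_rain(img, seeds, theta, length, opacity.clamp(0.0, 0.8)).detach()
```

Rain layers produced this way could, in principle, also be mixed back into the training data of a deraining model, which is the idea behind the adversarial rain augmentation used in the defensive deraining strategy.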
Related papers
- TRG-Net: An Interpretable and Controllable Rain Generator [61.2760968459789]
This study proposes a novel deep learning based rain generator that fully takes the physical generation mechanism underlying rain into consideration.
Its significance lies in that the generator not only elaborately designs the essential elements of rain to simulate expected rain patterns, but also finely adapts to complicated and diverse practical rainy images.
Our unpaired generation experiments demonstrate that the rain generated by the proposed rain generator is not only of higher quality, but also more effective for deraining and downstream tasks.
arXiv Detail & Related papers (2024-03-15T03:27:39Z)
- Why current rain denoising models fail on CycleGAN created rain images in autonomous driving [1.4831974871130875]
Rain is artificially added to a set of clear-weather condition images using a Generative Adversarial Network (GAN).
This artificial rain generation is sufficiently realistic: in 7 out of 10 cases, human test subjects believed the generated rain images to be real.
In a second step, this paired good/bad weather image data is used to train two rain denoising models, one based primarily on a Convolutional Neural Network (CNN) and the other using a Vision Transformer.
arXiv Detail & Related papers (2023-05-22T12:42:32Z)
- Adversarial Attack with Raindrops [7.361748886445515]
Deep neural networks (DNNs) are known to be vulnerable to adversarial examples, which, however, rarely exist in real-world scenarios.
In this paper, we study adversarial examples caused by raindrops to demonstrate that many natural phenomena can act as adversarial attackers against DNNs.
We present a new approach to generating adversarial raindrops, denoted AdvRD, which uses the generative adversarial network (GAN) technique to simulate natural raindrops.
arXiv Detail & Related papers (2023-02-28T03:01:58Z)
- Not Just Streaks: Towards Ground Truth for Single Image Deraining [42.15398478201746]
We propose a large-scale dataset of real-world rainy and clean image pairs.
We propose a deep neural network that reconstructs the underlying scene by minimizing a rain-robust loss between rainy and clean images.
arXiv Detail & Related papers (2022-06-22T00:10:06Z)
- Towards Robust Rain Removal Against Adversarial Attacks: A Comprehensive Benchmark Analysis and Beyond [85.06231315901505]
Rain removal aims to remove rain streaks from images/videos and reduce the disruptive effects caused by rain.
This paper makes the first attempt to conduct a comprehensive study on the robustness of deep learning-based rain removal methods against adversarial attacks.
arXiv Detail & Related papers (2022-03-31T10:22:24Z)
- Deep Single Image Deraining using An Asymetric Cycle Generative and Adversarial Framework [16.59494337699748]
We propose a novel Asymetric Cycle Generative and Adversarial Framework (ACGF) for single image deraining.
ACGF trains on both synthetic and real rainy images while simultaneously capturing both rain streaks and fog features.
Experiments on benchmark rain-fog and rain datasets show that ACGF outperforms state-of-the-art deraining methods.
arXiv Detail & Related papers (2022-02-19T16:14:10Z)
- UnfairGAN: An Enhanced Generative Adversarial Network for Raindrop Removal from A Single Image [8.642603456626391]
UnfairGAN is an enhanced generative adversarial network that can utilize prior high-level information, such as edges and rain estimation, to boost deraining performance.
We show that our proposed method is superior to other state-of-the-art raindrop removal approaches in terms of quantitative metrics and visual quality.
arXiv Detail & Related papers (2021-10-11T18:02:43Z)
- RCDNet: An Interpretable Rain Convolutional Dictionary Network for Single Image Deraining [49.99207211126791]
We specifically build a novel deep architecture, called the rain convolutional dictionary network (RCDNet).
RCDNet embeds the intrinsic priors of rain streaks and has clear interpretability.
By end-to-end training such an interpretable network, all involved rain kernels and proximal operators can be automatically extracted.
arXiv Detail & Related papers (2021-07-14T16:08:11Z)
- Dual Attention-in-Attention Model for Joint Rain Streak and Raindrop Removal [103.4067418083549]
We propose a Dual Attention-in-Attention Model (DAiAM) that includes two dual attention modules (DAMs) for removing rain streaks and raindrops jointly.
The proposed method is not only capable of removing rain streaks and raindrops simultaneously, but also achieves state-of-the-art performance on both tasks.
arXiv Detail & Related papers (2021-03-12T03:00:33Z)
- From Rain Generation to Rain Removal [67.71728610434698]
We build a full Bayesian generative model for rainy images in which the rain layer is parameterized as a generator.
We employ the variational inference framework to approximate the expected statistical distribution of rainy images.
Comprehensive experiments substantiate that the proposed model can faithfully extract the complex rain distribution.
arXiv Detail & Related papers (2020-08-08T18:56:51Z)
- Structural Residual Learning for Single Image Rain Removal [48.87977695398587]
This study proposes a new network architecture that enforces the output residual of the network to possess intrinsic rain structures.
Such a structural residual setting guarantees that the rain layer extracted by the network finely complies with the prior knowledge of general rain streaks.
arXiv Detail & Related papers (2020-05-19T05:52:13Z)