Ensemble Defense with Data Diversity: Weak Correlation Implies Strong
Robustness
- URL: http://arxiv.org/abs/2106.02867v1
- Date: Sat, 5 Jun 2021 10:56:48 GMT
- Title: Ensemble Defense with Data Diversity: Weak Correlation Implies Strong
Robustness
- Authors: Renjue Li, Hanwei Zhang, Pengfei Yang, Cheng-Chao Huang, Aimin Zhou,
Bai Xue, Lijun Zhang
- Abstract summary: We propose a framework of filter-based ensemble of deep neural networks (DNNs) to defend against adversarial attacks.
Our ensemble models are more robust than those constructed by previous defense methods like adversarial training.
- Score: 15.185132265916106
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In this paper, we propose a framework of filter-based ensemble of deep
neural networks (DNNs) to defend against adversarial attacks. The framework
builds an ensemble of sub-models -- DNNs with differentiated preprocessing
filters. From the theoretical perspective of DNN robustness, we argue that,
under the assumption that the filters are of high quality, the weaker the
correlations among the filters' sensitivities, the more robust the ensemble
model tends to be; this is corroborated by experiments with transfer-based
attacks. Correspondingly, we propose a selection principle that chooses
filters with smaller pairwise Pearson correlation coefficients, which ensures
the diversity of the inputs received by the DNNs as well as the effectiveness
of the entire framework against attacks. Our ensemble models are more robust
than those constructed by previous defense methods such as adversarial
training, and are even competitive with the classical ensemble of
adversarially trained DNNs under adversarial attacks when the attack radius is large.
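The selection principle can be illustrated with a small sketch. This is not the authors' implementation: the preprocessing filters (blur, sharpen, quantize), the perturbation-based sensitivity proxy, and the 0.5 correlation threshold are illustrative assumptions; only the idea of keeping filters whose sensitivities have small pairwise Pearson correlation comes from the abstract.

```python
# Minimal sketch (not the paper's code) of the filter-selection principle:
# keep preprocessing filters whose sensitivities are weakly correlated.
import numpy as np


def blur(x):
    # Simple 1D box blur along the last axis (illustrative filter).
    return (x + np.roll(x, 1, axis=-1) + np.roll(x, -1, axis=-1)) / 3.0


def sharpen(x):
    # Unsharp-mask style sharpening (illustrative filter).
    return 2.0 * x - blur(x)


def quantize(x):
    # Coarse quantization to 8 levels (illustrative filter).
    return np.round(x * 8.0) / 8.0


CANDIDATE_FILTERS = {"blur": blur, "sharpen": sharpen, "quantize": quantize}


def sensitivity(filter_fn, images, eps=1e-2, seed=0):
    # Per-image sensitivity proxy: how far the filter output moves under a
    # small random input perturbation (stand-in for a gradient-based measure).
    noise = eps * np.random.default_rng(seed).standard_normal(images.shape)
    diff = filter_fn(images + noise) - filter_fn(images)
    return np.linalg.norm(diff.reshape(len(images), -1), axis=1)


def select_filters(filters, images, max_corr=0.5):
    # Greedily keep filters whose pairwise Pearson correlation of sensitivities
    # stays below max_corr (the threshold value is an assumption).
    sens = {name: sensitivity(fn, images) for name, fn in filters.items()}
    chosen = []
    for name in filters:
        if all(abs(np.corrcoef(sens[name], sens[c])[0, 1]) < max_corr for c in chosen):
            chosen.append(name)
    return chosen


if __name__ == "__main__":
    imgs = np.random.default_rng(1).random((32, 28, 28))  # toy image batch
    print(select_filters(CANDIDATE_FILTERS, imgs))
```

In the paper's setting, sensitivity would presumably be measured with respect to the downstream DNNs (for example via gradients or attack transfer rates) rather than the random-noise proxy used here.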