Paper summary: Towards Certifying $\ell_\infty$ Robustness using Neural Networks with $\ell_\infty$-dist Neurons

• arxiv url: http://arxiv.org/abs/2102.05363v1
• Date: Wed, 10 Feb 2021 10:03:58 GMT
• Status: processing complete
• Last updated in system: 2021-02-11 14:53:21.494899
• Title: Towards Certifying $\ell_\infty$ Robustness using Neural Networks with $\ell_\infty$-dist Neurons
• Authors: Bohang Zhang, Tianle Cai, Zhou Lu, Di He, Liwei Wang
• Abstract summary: We develop a theoretically principled neural network that inherently resists $\ell_\infty$ perturbations and consistently achieves state-of-the-art certified accuracy on commonly used datasets.
• Reference score (independently computed attention metric): 27.815886593870076
• Abstract: It is well-known that standard neural networks, even with high classification accuracy, are vulnerable to small $\ell_\infty$-norm bounded adversarial perturbations. Although many attempts have been made, most previous works either can only provide empirical verification of the defense against a particular attack method, or can only develop a certified guarantee of model robustness in limited scenarios. In this paper, we seek a new approach to develop a theoretically principled neural network that inherently resists $\ell_\infty$ perturbations. In particular, we design a novel neuron that uses the $\ell_\infty$-distance as its basic operation (which we call the $\ell_\infty$-dist neuron), and show that any neural network constructed with $\ell_\infty$-dist neurons (called an $\ell_\infty$-dist net) is naturally a 1-Lipschitz function with respect to the $\ell_\infty$-norm. This directly provides a rigorous guarantee of certified robustness based on the margin of the prediction outputs. We also prove that such networks have enough expressive power to approximate any 1-Lipschitz function, with a robust generalization guarantee. Our experimental results show that the proposed network is promising. Using $\ell_\infty$-dist nets as the basic building blocks, we consistently achieve state-of-the-art performance on commonly used datasets: 93.09% certified accuracy on MNIST ($\epsilon=0.3$), 79.23% on Fashion-MNIST ($\epsilon=0.1$), and 35.10% on CIFAR-10 ($\epsilon=8/255$).
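The two core ideas in the abstract — a neuron that computes an $\ell_\infty$-distance, and a margin-based robustness certificate that follows from 1-Lipschitzness — can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; all function names and parameter shapes here are illustrative assumptions, and the Lipschitz check uses the reverse triangle inequality $\bigl|\,\|a\|_\infty - \|b\|_\infty\,\bigr| \le \|a-b\|_\infty$.

```python
import numpy as np

def linf_dist_neuron(x, w):
    # An l_inf-dist neuron (illustrative): its output is the Chebyshev
    # (l_inf) distance between the input x and the parameter vector w.
    return np.max(np.abs(x - w))

def linf_dist_layer(x, W):
    # A layer of such neurons, one per row of W. Each output is
    # 1-Lipschitz in x w.r.t. the l_inf norm, so the whole layer is too.
    return np.array([linf_dist_neuron(x, w) for w in W])

def certified_robust(logits, eps):
    # For a network whose every output is 1-Lipschitz w.r.t. l_inf,
    # an eps-bounded perturbation moves each logit by at most eps.
    # A margin > 2*eps between the top and runner-up logits therefore
    # certifies that the predicted class cannot change.
    s = np.sort(logits)
    return (s[-1] - s[-2]) > 2 * eps

# Numerical check of the 1-Lipschitz property.
rng = np.random.default_rng(0)
W = rng.normal(size=(4, 8))        # 4 neurons, 8-dimensional inputs
x = rng.normal(size=8)
delta = rng.uniform(-0.1, 0.1, size=8)
y1 = linf_dist_layer(x, W)
y2 = linf_dist_layer(x + delta, W)
# |f(x) - f(x+delta)| <= ||delta||_inf for every neuron:
assert np.all(np.abs(y1 - y2) <= np.max(np.abs(delta)) + 1e-12)
```

For example, `certified_robust(np.array([3.0, 0.0, 0.5]), eps=1.0)` returns `True` (margin 2.5 exceeds 2.0), while a margin of 0.1 fails to certify at `eps=0.1`. Note that in an actual $\ell_\infty$-dist net the class scores are distances (smaller means closer), so classification and the margin are taken over negated distances; the certificate logic is the same.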