Beta-CROWN: Efficient Bound Propagation with Per-neuron Split
Constraints for Complete and Incomplete Neural Network Verification
- URL: http://arxiv.org/abs/2103.06624v1
- Date: Thu, 11 Mar 2021 11:56:54 GMT
- Title: Beta-CROWN: Efficient Bound Propagation with Per-neuron Split
Constraints for Complete and Incomplete Neural Network Verification
- Authors: Shiqi Wang, Huan Zhang, Kaidi Xu, Xue Lin, Suman Jana, Cho-Jui Hsieh,
J. Zico Kolter
- Abstract summary: We develop $\beta$-CROWN, a bound propagation based verifier that can fully encode per-neuron splits.
$\beta$-CROWN is close to three orders of magnitude faster than LP-based BaB methods for robustness verification.
By terminating BaB early, our method can also be used for incomplete verification.
- Score: 151.62491805851107
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Recent works in neural network verification show that cheap incomplete
verifiers such as CROWN, based upon bound propagations, can effectively be used
in Branch-and-Bound (BaB) methods to accelerate complete verification,
achieving significant speedups compared to expensive linear programming (LP)
based techniques. However, they cannot fully handle the per-neuron split
constraints introduced by BaB like LP verifiers do, leading to looser bounds
and hurting their verification efficiency. In this work, we develop
$\beta$-CROWN, a new bound propagation based method that can fully encode
per-neuron splits via optimizable parameters $\beta$. When the optimizable
parameters are jointly optimized in intermediate layers, $\beta$-CROWN has the
potential of producing better bounds than typical LP verifiers with neuron
split constraints, while being efficiently parallelizable on GPUs. Applied to
the complete verification setting, $\beta$-CROWN is close to three orders of
magnitude faster than LP-based BaB methods for robustness verification, and
also more than twice as fast as state-of-the-art GPU-based complete verifiers with
similar timeout rates. By terminating BaB early, our method can also be used
for incomplete verification. Compared to the state-of-the-art
semidefinite-programming (SDP) based verifier, we show a substantial leap
forward by greatly reducing the gap between verified accuracy and empirical
adversarial attack accuracy, from 35% (SDP) to 12% on an adversarially trained
MNIST network ($\epsilon=0.3$), while being 47 times faster. Our code is
available at https://github.com/KaidiXu/Beta-CROWN
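
As a rough illustration of the idea in the abstract, the sketch below computes an optimizable lower bound for a toy one-hidden-layer ReLU network: per-neuron split constraints from branch and bound are encoded with Lagrange multipliers beta (together with relaxation slopes alpha), and both are tightened by projected gradient ascent. The network weights, the split pattern, and the optimizer settings are invented for this example; it is a minimal simplification of the approach described above, not the authors' implementation (see the linked repository for that).

```python
import torch

torch.manual_seed(0)

# Toy one-hidden-layer network: f(x) = w2 @ relu(W1 @ x + b1) + b2 (scalar output).
d_in, d_hid = 4, 6
W1, b1 = torch.randn(d_hid, d_in), torch.randn(d_hid)
w2, b2 = torch.randn(d_hid), torch.randn(())

# Input box [xl, xu] around a point x0.
x0, eps = torch.zeros(d_in), 0.5
xl, xu = x0 - eps, x0 + eps

# Interval bounds on the pre-activations z = W1 @ x + b1.
W1_pos, W1_neg = W1.clamp(min=0), W1.clamp(max=0)
l = W1_pos @ xl + W1_neg @ xu + b1
u = W1_pos @ xu + W1_neg @ xl + b1

# Per-neuron split decisions from branch and bound (chosen arbitrarily here):
#  +1: neuron forced active (z_j >= 0), -1: forced inactive (z_j <= 0), 0: not split.
split = torch.tensor([1.0, -1.0, 0.0, 0.0, 0.0, 0.0])

# Optimizable parameters: lower-relaxation slopes alpha in [0, 1] and
# Lagrange multipliers beta >= 0 for the split constraints.
alpha = torch.full((d_hid,), 0.5, requires_grad=True)
beta = torch.zeros(d_hid, requires_grad=True)

def lower_bound(alpha, beta):
    """One backward bound-propagation pass with beta terms; returns a sound
    lower bound on min f(x) over the input box intersected with the splits."""
    coef, const = [], b2
    for j in range(d_hid):
        if split[j] > 0 or l[j] >= 0:      # forced or provably active: relu(z) = z
            coef.append(w2[j])
        elif split[j] < 0 or u[j] <= 0:    # forced or provably inactive: relu(z) = 0
            coef.append(torch.zeros(()))
        elif w2[j] >= 0:                   # unstable, lower relaxation relu(z) >= alpha*z
            coef.append(w2[j] * alpha[j])
        else:                              # unstable, upper relaxation relu(z) <= s*(z - l)
            s = u[j] / (u[j] - l[j])
            coef.append(w2[j] * s)
            const = const - w2[j] * s * l[j]
    # Lagrangian term for the splits: subtracting beta_j * split_j * z_j can only
    # decrease the objective on the feasible subdomain, so the bound stays sound.
    lin = torch.stack(coef) - beta * split
    v = W1.t() @ lin                        # push the linear form through z = W1 x + b1
    const = const + lin @ b1
    return v.clamp(min=0) @ xl + v.clamp(max=0) @ xu + const

# Maximize the lower bound over alpha and beta with projected gradient ascent.
opt = torch.optim.Adam([alpha, beta], lr=0.1)
for _ in range(200):
    opt.zero_grad()
    (-lower_bound(alpha, beta)).backward()
    opt.step()
    with torch.no_grad():
        alpha.clamp_(0.0, 1.0)             # keep the relaxation slopes valid
        beta.clamp_(min=0.0)               # multipliers must stay non-negative

print("lower bound on f over this subdomain:", lower_bound(alpha, beta).item())
```

Because the resulting bound is valid for any beta >= 0 and alpha in [0, 1], gradient ascent can only tighten it, which is what makes the per-neuron split constraints usable without an LP solver; in the full method this optimization is batched over many subdomains on a GPU.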
Related papers
- Using a Nearest-Neighbour, BERT-Based Approach for Scalable Clone
Detection [0.0]
SSCD is a BERT-based clone detection approach that targets high recall of Type 3 and Type 4 clones at scale.
It does so by computing a representative embedding for each code fragment and finding similar fragments using a nearest neighbour search.
This paper details the approach and an empirical assessment of configuring and evaluating it in an industrial setting.
arXiv Detail & Related papers (2023-09-05T12:38:55Z)
- Sound and Complete Verification of Polynomial Networks [55.9260539566555]
Polynomial Networks (PNs) have recently demonstrated promising performance on face and image recognition.
Existing verification algorithms on ReLU neural networks (NNs) based on branch and bound (BaB) techniques cannot be trivially applied to PN verification.
We devise a new bounding method, equipped with BaB for global convergence guarantees, called VPN.
arXiv Detail & Related papers (2022-09-15T11:50:43Z)
- General Cutting Planes for Bound-Propagation-Based Neural Network Verification [144.7290035694459]
We generalize the bound propagation procedure to allow the addition of arbitrary cutting plane constraints.
We find that MIP solvers can generate high-quality cutting planes for strengthening bound-propagation-based verifiers.
Our method is the first verifier that can completely solve the oval20 benchmark and verify twice as many instances on the oval21 benchmark.
arXiv Detail & Related papers (2022-08-11T10:31:28Z)
- Improved Branch and Bound for Neural Network Verification via Lagrangian Decomposition [161.09660864941603]
We improve the scalability of Branch and Bound (BaB) algorithms for formally proving input-output properties of neural networks.
We present a novel activation-based branching strategy and a BaB framework, named Branch and Dual Network Bound (BaDNB).
BaDNB outperforms previous complete verification systems by a large margin, cutting average verification times by factors up to 50 on adversarial properties.
arXiv Detail & Related papers (2021-04-14T09:22:42Z)
- Fast and Complete: Enabling Complete Neural Network Verification with Rapid and Massively Parallel Incomplete Verifiers [112.23981192818721]
We propose to use backward mode linear relaxation based analysis (LiRPA) to replace Linear Programming (LP) during the BaB process.
Unlike LP, LiRPA applied naively can produce much weaker bounds and cannot even check certain conflicts between sub-domains during splitting.
We demonstrate an order of magnitude speedup compared to existing LP-based approaches; a generic BaB loop of this kind is sketched after this list.
arXiv Detail & Related papers (2020-11-27T16:42:12Z)
- Second-Order Provable Defenses against Adversarial Attacks [63.34032156196848]
We show that if the eigenvalues of the Hessian of the network are bounded, we can compute a certificate in the $l_2$ norm efficiently using convex optimization.
We achieve certified accuracy of 5.78%, and 44.96%, and 43.19% on 2,59% and 4BP-based methods respectively.
arXiv Detail & Related papers (2020-06-01T05:55:18Z)
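
For context on how such bounds are used, here is a generic branch-and-bound loop of the kind referenced by the abstract above and by the "Fast and Complete" entry: a cheap incomplete bounding routine (rather than an LP solver) bounds each subdomain, and stopping at a timeout still yields a sound but incomplete answer. The `bound_fn` and `split_fn` callbacks and the timeout handling are placeholder assumptions for illustration, not APIs from the papers' code.

```python
import heapq
import time

def branch_and_bound(root, bound_fn, split_fn, timeout=60.0):
    """Try to prove min_x f(x) > 0 over the input region `root`.

    bound_fn(domain) -> (lb, ub): lb is a sound lower bound on f over the
        domain (e.g. from bound propagation with per-neuron split constraints),
        ub is f evaluated at some concrete point inside the domain.
    split_fn(domain) -> iterable of subdomains, obtained by splitting one
        unstable ReLU neuron into its active and inactive cases.
    Returns "verified", "falsified", or "unknown" (on early termination).
    """
    start = time.time()
    lb, ub = bound_fn(root)
    if ub <= 0:
        return "falsified"                  # concrete counterexample found
    heap, tie = [(lb, 0, root)], 1          # min-heap keyed on the lower bound
    while heap:
        if time.time() - start > timeout:
            return "unknown"                # terminating BaB early: the bounds
                                            # proved so far remain valid, which
                                            # is the incomplete-verification mode
        lb, _, dom = heapq.heappop(heap)
        if lb > 0:
            return "verified"               # the worst remaining subdomain is safe
        for sub in split_fn(dom):           # split one neuron: active / inactive
            sub_lb, sub_ub = bound_fn(sub)
            if sub_ub <= 0:
                return "falsified"          # concrete counterexample found
            if sub_lb <= 0:                 # keep only unresolved subdomains
                heapq.heappush(heap, (sub_lb, tie, sub))
                tie += 1
    return "verified"                       # every subdomain was proved safe
```

In the method described above, `bound_fn` would be the optimized bound propagation sketched earlier, evaluated for a large batch of subdomains in parallel on a GPU, which is what makes this loop much cheaper than calling an LP solver per subdomain.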