Improved Branch and Bound for Neural Network Verification via Lagrangian
Decomposition
- URL: http://arxiv.org/abs/2104.06718v1
- Date: Wed, 14 Apr 2021 09:22:42 GMT
- Title: Improved Branch and Bound for Neural Network Verification via Lagrangian
Decomposition
- Authors: Alessandro De Palma, Rudy Bunel, Alban Desmaison, Krishnamurthy
Dvijotham, Pushmeet Kohli, Philip H.S. Torr, M. Pawan Kumar
- Abstract summary: We improve the scalability of Branch and Bound (BaB) algorithms for formally proving input-output properties of neural networks.
We present a novel activation-based branching strategy and a BaB framework, named Branch and Dual Network Bound (BaDNB).
BaDNB outperforms previous complete verification systems by a large margin, cutting average verification times by factors up to 50 on adversarial robustness properties.
- Score: 161.09660864941603
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: We improve the scalability of Branch and Bound (BaB) algorithms for formally
proving input-output properties of neural networks. First, we propose novel
bounding algorithms based on Lagrangian Decomposition. Previous works have used
off-the-shelf solvers to solve relaxations at each node of the BaB tree, or
constructed weaker relaxations that can be solved efficiently, but lead to
unnecessarily weak bounds. Our formulation restricts the optimization to a
subspace of the dual domain that is guaranteed to contain the optimum,
resulting in accelerated convergence. Furthermore, it allows for a massively
parallel implementation, which is amenable to GPU acceleration via modern deep
learning frameworks. Second, we present a novel activation-based branching
strategy. By coupling an inexpensive heuristic with fast dual bounding, our
branching scheme greatly reduces the size of the BaB tree compared to previous
heuristic methods. Moreover, it performs competitively with a recent strategy
based on learning algorithms, without its large offline training cost. Finally,
we design a BaB framework, named Branch and Dual Network Bound (BaDNB), based
on our novel bounding and branching algorithms. We show that BaDNB outperforms
previous complete verification systems by a large margin, cutting average
verification times by factors up to 50 on adversarial robustness properties.
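To make the pipeline concrete, below is a minimal sketch of the BaB skeleton the abstract describes. Two pieces are loudly simplified stand-ins: interval arithmetic replaces the paper's Lagrangian-decomposition bounding, and a width-based ambiguity score replaces its branching heuristic; the network, weights, and function names are invented for illustration only.

```python
import numpy as np

# Toy 2-layer ReLU network f(x) = w2 @ relu(W1 @ x + b1) + b2 (scalar output).
# Weights are random placeholders, purely for illustration.
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
w2, b2 = rng.normal(size=8), 0.5

def interval_affine(W, b, l, u):
    """Per-coordinate interval bounds of W @ x + b for x in the box [l, u]."""
    c, r = (l + u) / 2, (u - l) / 2
    mid, rad = W @ c + b, np.abs(W) @ r
    return mid - rad, mid + rad

def lower_bound(l, u, decisions):
    """Cheap lower bound on f over [l, u] honoring ReLU split decisions
    (decisions[j] = +1 forces pre-activation j >= 0, -1 forces <= 0).
    The paper's Lagrangian-decomposition duals would replace this routine."""
    lo, hi = interval_affine(W1, b1, l, u)
    for j, s in decisions.items():
        if s > 0:
            lo[j] = max(lo[j], 0.0)
        else:
            hi[j] = min(hi[j], 0.0)
        if lo[j] > hi[j]:
            return np.inf, lo, hi          # contradictory splits: prune node
    hlo, hhi = np.maximum(lo, 0.0), np.maximum(hi, 0.0)
    return float(w2 @ np.where(w2 >= 0, hlo, hhi) + b2), lo, hi

def verify_positive(l, u, max_nodes=10_000):
    """Try to prove min_x f(x) > 0 on [l, u] by branching on unstable ReLUs.
    Returns (True, nodes) if verified, (False, nodes) if inconclusive."""
    stack, nodes = [dict()], 0
    while stack and nodes < max_nodes:
        decisions = stack.pop()
        nodes += 1
        bound, lo, hi = lower_bound(l, u, decisions)
        if bound > 0:                      # subdomain verified: prune
            continue
        unstable = [j for j in range(len(b1))
                    if j not in decisions and lo[j] < 0.0 < hi[j]]
        if not unstable:                   # cheap bound cannot be refined further
            return False, nodes
        j = max(unstable, key=lambda k: min(-lo[k], hi[k]))  # crude ambiguity score
        stack += [{**decisions, j: +1}, {**decisions, j: -1}]
    return not stack, nodes                # empty stack: every subdomain verified

print(verify_positive(-0.1 * np.ones(4), 0.1 * np.ones(4)))
```

In BaDNB, the `lower_bound` call is where the decomposition enters: the dual is optimized by supergradient-style ascent restricted to a subspace guaranteed to contain the optimum, and bounds for many subdomains are computed in one batched GPU pass rather than one stack node at a time.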
Related papers
- Neural Network Verification with Branch-and-Bound for General Nonlinearities [63.39918329535165]
Branch-and-bound (BaB) is among the most effective methods for neural network (NN) verification.
We develop a general framework, named GenBaB, to conduct BaB for general nonlinearities in general computational graphs.
We demonstrate the effectiveness of our GenBaB on verifying a wide range of NNs, including networks with activation functions such as Sigmoid, Tanh, Sine and GeLU.
arXiv Detail & Related papers (2024-05-31T17:51:07Z)
- FOBNN: Fast Oblivious Binarized Neural Network Inference [12.587981899648419]
We develop a fast oblivious binarized neural network inference framework, FOBNN.
Specifically, we customize binarized convolutional neural networks to enhance oblivious inference, design two fast algorithms for binarized convolutions, and optimize network structures experimentally under constrained costs (a generic sketch of the binarized dot-product building block appears after this list).
arXiv Detail & Related papers (2024-05-06T03:12:36Z)
- The Cascaded Forward Algorithm for Neural Network Training [61.06444586991505]
We propose a new learning framework for neural networks, namely the Cascaded Forward (CaFo) algorithm, which, like FF, does not rely on BP optimization.
Unlike FF, our framework directly outputs label distributions at each cascaded block, which does not require generation of additional negative samples.
In our framework, each block can be trained independently, so it can be easily deployed into parallel acceleration systems (see the sketch after this list).
arXiv Detail & Related papers (2023-03-17T02:01:11Z)
- On Model Compression for Neural Networks: Framework, Algorithm, and Convergence Guarantee [21.818773423324235]
This paper focuses on two model compression techniques: low-rank approximation and weight approximation.
In this paper, a holistic framework is proposed for model compression from a novel perspective of nonconvex optimization (a minimal low-rank example appears after this list).
arXiv Detail & Related papers (2023-03-13T02:14:42Z)
- Neural Network Branch-and-Bound for Neural Network Verification [26.609606492971967]
We propose a novel machine learning framework that can be used for designing an effective branching strategy.
We learn two graph neural networks (GNNs) that both directly treat the network we want to verify as a graph input.
We show that our GNN models generalize well to harder properties on larger unseen networks.
arXiv Detail & Related papers (2021-07-27T14:42:57Z)
- Lower Bounds and Optimal Algorithms for Smooth and Strongly Convex Decentralized Optimization Over Time-Varying Networks [79.16773494166644]
We consider the task of minimizing the sum of smooth and strongly convex functions stored in a decentralized manner across the nodes of a communication network.
We design two optimal algorithms that attain these lower bounds.
We corroborate the theoretical efficiency of these algorithms by performing an experimental comparison with existing state-of-the-art methods.
arXiv Detail & Related papers (2021-06-08T15:54:44Z)
- An Asymptotically Optimal Primal-Dual Incremental Algorithm for Contextual Linear Bandits [129.1029690825929]
We introduce a novel algorithm improving over the state-of-the-art along multiple dimensions.
We establish minimax optimality for any learning horizon in the special case of non-contextual linear bandits.
arXiv Detail & Related papers (2020-10-23T09:12:47Z)
- Lagrangian Decomposition for Neural Network Verification [148.0448557991349]
A fundamental component of neural network verification is the computation of bounds on the values their outputs can take.
We propose a novel approach based on Lagrangian Decomposition.
We show that we obtain bounds comparable with off-the-shelf solvers in a fraction of their running time (a toy dual-decomposition example appears after this list).
arXiv Detail & Related papers (2020-02-24T17:55:10Z)
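For the FOBNN entry above: its summary does not spell out the two convolution algorithms, but the usual building block that binarized-inference frameworks optimize is the XNOR/popcount dot product. The sketch below is that generic trick, not FOBNN's own algorithms.

```python
def bin_dot(a_bits: int, w_bits: int, n: int) -> int:
    """Dot product of two {-1,+1} vectors packed into n-bit ints (bit 1 <-> +1).
    Equal bits contribute +1, unequal bits -1, so dot = n - 2*popcount(a XOR w)."""
    return n - 2 * (a_bits ^ w_bits).bit_count()   # bit_count: Python >= 3.10

# (+1,-1,+1,+1) -> 0b1011 ; (-1,-1,+1,+1) -> 0b0011 ; dot = -1+1+1+1 = 2
assert bin_dot(0b1011, 0b0011, 4) == 2
```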
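For the CaFo entry above: the summary describes blocks that each emit a label distribution and train independently. Below is a minimal PyTorch rendering of that idea; layer sizes, names, and the local objective are assumptions, not the authors' architecture.

```python
import torch
import torch.nn as nn

class CaFoBlock(nn.Module):
    """One cascaded block: transforms features AND emits its own label logits."""
    def __init__(self, dim, n_classes):
        super().__init__()
        self.body = nn.Sequential(nn.Linear(dim, dim), nn.ReLU())
        self.head = nn.Linear(dim, n_classes)   # per-block label predictor

    def forward(self, x):
        h = self.body(x)
        return h, self.head(h)

blocks = [CaFoBlock(32, 10) for _ in range(3)]
opts = [torch.optim.Adam(b.parameters(), lr=1e-3) for b in blocks]
loss_fn = nn.CrossEntropyLoss()

x = torch.randn(64, 32)                 # dummy batch of features
y = torch.randint(0, 10, (64,))         # dummy labels
for block, opt in zip(blocks, opts):    # one local training step per block
    h, logits = block(x)
    loss = loss_fn(logits, y)           # local objective: this block's prediction
    opt.zero_grad()
    loss.backward()
    opt.step()
    x = h.detach()                      # detach: no gradient crosses blocks
```

The `detach` between blocks is what makes training block-local: no gradient crosses the cascade, which is why the blocks can be trained in parallel.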
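For the model-compression entry above: a minimal instance of the first named technique, low-rank approximation via truncated SVD. The matrix and rank are illustrative; the paper's contribution is the surrounding optimization framework and its convergence guarantee, not this step itself.

```python
import numpy as np

rng = np.random.default_rng(2)
W = rng.normal(size=(256, 128))            # stand-in weight matrix
U, s, Vt = np.linalg.svd(W, full_matrices=False)
r = 16                                     # target rank, chosen arbitrarily
A, B = U[:, :r] * s[:r], Vt[:r]            # W ~= A @ B (truncated SVD)
# The layer W @ x becomes A @ (B @ x): (256 + 128)*16 params vs 256*128.
err = np.linalg.norm(W - A @ B) / np.linalg.norm(W)
print(f"rank-{r} relative error: {err:.3f}")
```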
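For the Lagrangian Decomposition entry above (and the bounding method of the main paper): the core mechanism can be seen on a toy coupled problem. Split the variables, dualize the consistency constraint x == y, and run supergradient ascent on the multipliers; by weak duality, every multiplier value yields a valid lower bound. This is a generic dual-decomposition sketch, not the network-specific formulation of either paper.

```python
import numpy as np

def box_argmin(c, l, u):
    """Closed-form argmin of c @ z over the box l <= z <= u."""
    return np.where(c >= 0, l, u)

rng = np.random.default_rng(1)
n = 5
c1, c2 = rng.normal(size=n), rng.normal(size=n)
lx, ux = -np.ones(n), np.ones(n)
ly, uy = -0.5 * np.ones(n), 1.5 * np.ones(n)

# Primal: min c1@x + c2@y  s.t.  x in [lx,ux], y in [ly,uy], x == y.
# Reference optimum: x = y ranges over the intersection of the two boxes.
li, ui = np.maximum(lx, ly), np.minimum(ux, uy)
primal_opt = (c1 + c2) @ box_argmin(c1 + c2, li, ui)

# Dualize the coupling x == y with multipliers rho. For any rho, weak duality
# gives g(rho) = min_x (c1+rho)@x + min_y (c2-rho)@y  <=  primal optimum.
rho = np.zeros(n)
for t in range(500):
    x = box_argmin(c1 + rho, lx, ux)   # subproblems solved independently,
    y = box_argmin(c2 - rho, ly, uy)   # in closed form (hence batchable)
    rho += (x - y) / (t + 1)           # supergradient ascent on the dual

x, y = box_argmin(c1 + rho, lx, ux), box_argmin(c2 - rho, ly, uy)
print(f"dual bound {(c1 + rho) @ x + (c2 - rho) @ y:.4f} <= optimum {primal_opt:.4f}")
```

The subproblems decouple once the multipliers are fixed, which is exactly what makes the network version amenable to massively parallel GPU evaluation.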