Exactly Computing the Local Lipschitz Constant of ReLU Networks
- URL: http://arxiv.org/abs/2003.01219v2
- Date: Mon, 11 Jan 2021 01:46:26 GMT
- Title: Exactly Computing the Local Lipschitz Constant of ReLU Networks
- Authors: Matt Jordan, Alexandros G. Dimakis
- Abstract summary: The local Lipschitz constant of a neural network is a useful metric for robustness, generalization, and fairness evaluation.
We show strong inapproximability results for estimating Lipschitz constants of ReLU networks, then formulate an algorithm that computes these quantities exactly.
We leverage this algorithm to evaluate the tightness of competing Lipschitz estimators and the effects of regularized training on the Lipschitz constant.
- Score: 98.43114280459271
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The local Lipschitz constant of a neural network is a useful metric with
applications in robustness, generalization, and fairness evaluation. We provide
novel analytic results relating the local Lipschitz constant of nonsmooth
vector-valued functions to a maximization over the norm of the generalized
Jacobian. We present a sufficient condition for which backpropagation always
returns an element of the generalized Jacobian, and reframe the problem over
this broad class of functions. We show strong inapproximability results for
estimating Lipschitz constants of ReLU networks, and then formulate an
algorithm to compute these quantities exactly. We leverage this algorithm to
evaluate the tightness of competing Lipschitz estimators and the effects of
regularized training on the Lipschitz constant.
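In generic notation (ours, not necessarily the paper's), the two central objects of the abstract are the local Lipschitz constant of f : R^n -> R^m over an open set X and its characterization through the generalized (Clarke) Jacobian, with the norm on G being the operator norm induced by the chosen input/output norms:

```latex
L(f,\mathcal{X}) \;=\; \sup_{\substack{x,y \in \mathcal{X} \\ x \neq y}}
  \frac{\lVert f(y) - f(x) \rVert}{\lVert y - x \rVert}
\qquad\text{and}\qquad
L(f,\mathcal{X}) \;=\; \sup_{x \in \mathcal{X}} \; \sup_{G \in \partial f(x)} \lVert G \rVert .
```

Because backpropagation returns an element of the generalized Jacobian under the paper's sufficient condition, sampled gradient norms are valid lower bounds on L(f, X). Below is a minimal PyTorch sketch of that lower-bounding heuristic; the network, ball radius, and sample count are hypothetical, and this is only the cheap baseline that the paper's exact algorithm improves on:

```python
import torch
import torch.nn as nn

# Hypothetical scalar-output ReLU network; any such nn.Module works here.
net = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))

def sampled_lipschitz_lower_bound(f, center, radius, n_samples=1000):
    """Lower-bound the local Lipschitz constant of a scalar-valued f over an
    l-inf ball by sampling gradient norms: backprop returns an element of the
    generalized Jacobian, so each sampled norm is a valid lower bound."""
    best = 0.0
    for _ in range(n_samples):
        x = center + radius * (2 * torch.rand_like(center) - 1)
        x.requires_grad_(True)
        (grad,) = torch.autograd.grad(f(x).sum(), x)
        best = max(best, grad.norm().item())
    return best

print(sampled_lipschitz_lower_bound(net, torch.zeros(10), radius=0.1))
```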
Related papers
- MIQCQP reformulation of the ReLU neural networks Lipschitz constant estimation problem [0.0]
We propose new quadratically constrained MIP formulations for the neural network Lipschitz estimation problem.
The solutions of these problems give lower bounds and upper bounds of the Lipschitz constant.
We detail conditions under which they coincide with the exact Lipschitz constant.
arXiv Detail & Related papers (2024-02-02T07:55:42Z)
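For reference, the standard mixed-integer way to make a ReLU expressible to such solvers is a big-M-style encoding with one binary variable per neuron; this is a generic sketch, not necessarily the quadratically constrained formulation proposed in the entry above:

```latex
% Encode y = \max(0, x) given pre-activation bounds l \le x \le u,
% with a binary indicator z \in \{0,1\} (z = 1 means the ReLU is active):
y \ge 0, \qquad y \ge x, \qquad y \le u\,z, \qquad y \le x - l\,(1 - z).
```

Maximizing the norm of the network's input-output map subject to one such constraint set per neuron is what makes exact Lipschitz estimation combinatorial.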
- Lipschitz constant estimation for 1D convolutional neural networks [0.0]
We propose a dissipativity-based method for Lipschitz constant estimation of 1D convolutional neural networks (CNNs).
In particular, we analyze the dissipativity properties of convolutional, pooling, and fully connected layers.
arXiv Detail & Related papers (2022-11-28T12:09:06Z)
- Lipschitz Continuity Retained Binary Neural Network [52.17734681659175]
We introduce Lipschitz continuity as a rigorous criterion for defining the robustness of BNNs.
We then propose to retain Lipschitz continuity as a regularization term to improve model robustness.
Our experiments show that this BNN-specific regularization method effectively strengthens the robustness of BNNs.
arXiv Detail & Related papers (2022-07-13T22:55:04Z)
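As a generic illustration of a Lipschitz regularization term (a sketch of the general idea, not the BNN-specific term in the entry above): penalizing the product of per-layer spectral norms penalizes an upper bound on a feedforward ReLU network's Lipschitz constant, since ReLU itself is 1-Lipschitz.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def lipschitz_penalty(model: nn.Module) -> torch.Tensor:
    """Product of the linear layers' spectral norms: a differentiable upper
    bound on the Lipschitz constant of a feedforward ReLU network."""
    bound = torch.ones(())
    for m in model.modules():
        if isinstance(m, nn.Linear):
            bound = bound * torch.linalg.matrix_norm(m.weight, ord=2)
    return bound

# Hypothetical model, data, and regularization weight.
model = nn.Sequential(nn.Linear(10, 64), nn.ReLU(), nn.Linear(64, 1))
x, y = torch.randn(8, 10), torch.randn(8, 1)
loss = F.mse_loss(model(x), y) + 1e-3 * lipschitz_penalty(model)
loss.backward()
```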
- Chordal Sparsity for Lipschitz Constant Estimation of Deep Neural Networks [77.82638674792292]
Lipschitz constants of neural networks allow for guarantees of robustness in image classification, safety in controller design, and generalizability beyond the training data.
As calculating Lipschitz constants is NP-hard, techniques for estimating Lipschitz constants must navigate the trade-off between scalability and accuracy.
In this work, we significantly push the scalability frontier of a semidefinite programming technique known as LipSDP while achieving zero accuracy loss.
arXiv Detail & Related papers (2022-04-02T11:57:52Z)
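For context, LipSDP certifies an l2 Lipschitz bound through a linear matrix inequality. In the one-hidden-layer case f(x) = W_1 phi(W_0 x), with phi slope-restricted in [0, 1], one common form of the condition is the following (a sketch of the LipSDP family, not the chordal decomposition itself):

```latex
% f is \sqrt{\rho}-Lipschitz in \ell_2 if there exists a diagonal T \succeq 0 with
\begin{bmatrix} -\rho I & W_0^{\top} T \\ T W_0 & -2T + W_1^{\top} W_1 \end{bmatrix} \preceq 0 .
```

The LMI grows with network width and depth; exploiting chordal sparsity splits it into smaller coupled blocks, which is where the scalability gain above comes from.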
- Training Certifiably Robust Neural Networks with Efficient Local Lipschitz Bounds [99.23098204458336]
Certified robustness is a desirable property for deep neural networks in safety-critical applications.
We show that our method consistently outperforms state-of-the-art methods on the MNIST and TinyImageNet datasets.
arXiv Detail & Related papers (2021-11-02T06:44:10Z)
- LipBaB: Computing exact Lipschitz constant of ReLU networks [0.0]
LipBaB is a framework to compute certified bounds of the local Lipschitz constant of deep neural networks.
Our algorithm provides provably exact computation of the Lipschitz constant for any p-norm.
arXiv Detail & Related papers (2021-05-12T08:06:11Z)
- Analytical bounds on the local Lipschitz constants of ReLU networks [0.0]
We derive Lipschitz constants and bounds for ReLU, affine-ReLU, and max pooling functions.
Our method produces the largest known bounds on minimum adversarial perturbations for large networks such as AlexNet and VGG-16.
arXiv Detail & Related papers (2021-04-29T21:57:47Z)
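The connection between Lipschitz bounds and minimum adversarial perturbations is the standard certification argument, sketched here in our notation rather than the paper's:

```latex
% For logits f with predicted class c, let g_j(x) = f_c(x) - f_j(x) and let
% L_j bound the local Lipschitz constant of g_j near x. The prediction cannot
% change under any perturbation \delta with
\lVert \delta \rVert \;<\; \min_{j \neq c} \frac{f_c(x) - f_j(x)}{L_j},
% so tighter local Lipschitz bounds certify larger perturbation radii.
```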
- On Lipschitz Regularization of Convolutional Layers using Toeplitz Matrix Theory [77.18089185140767]
Lipschitz regularity is established as a key property of modern deep learning, yet computing the exact value of the Lipschitz constant of a neural network is known to be NP-hard.
We introduce a new upper bound for convolutional layers that is both tight and easy to compute.
arXiv Detail & Related papers (2020-06-15T13:23:34Z)
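The Fourier-analytic idea behind such convolutional bounds is easiest to see in the single-channel, circularly padded case, where the exact operator norm is cheap to compute; this is a simplified sketch, not the paper's Toeplitz-based bound (which handles non-circular convolutions):

```python
import numpy as np

n = 32                           # spatial size of the (square) input
kernel = np.random.randn(3, 3)   # hypothetical 3x3 kernel, one in/out channel

# A circular convolution is diagonalized by the 2D DFT, so its singular values
# are the magnitudes of the zero-padded kernel's 2D Fourier transform.
padded = np.zeros((n, n))
padded[:3, :3] = kernel
spectral_norm = np.abs(np.fft.fft2(padded)).max()
print(f"spectral norm of the circular convolution: {spectral_norm:.4f}")
```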
- Lipschitz constant estimation of Neural Networks via sparse polynomial optimization [47.596834444042685]
LiPopt is a framework for computing increasingly tighter upper bounds on the Lipschitz constant of neural networks.
We show how to use the sparse connectivity of a network to significantly reduce the complexity.
We conduct experiments on networks with random weights as well as networks trained on MNIST.
arXiv Detail & Related papers (2020-04-18T18:55:02Z)
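The baseline that such polynomial-optimization hierarchies tighten is the trivial product-of-norms bound: since ReLU is 1-Lipschitz, a feedforward network's l2 Lipschitz constant is at most the product of its weight matrices' spectral norms. A minimal sketch with hypothetical weights:

```python
import numpy as np

weights = [np.random.randn(64, 10), np.random.randn(1, 64)]  # hypothetical layers
naive_bound = np.prod([np.linalg.svd(W, compute_uv=False)[0] for W in weights])
print(f"product-of-spectral-norms upper bound: {naive_bound:.4f}")
```

Tighter methods like LiPopt exploit coupling between layers (and, per the entry above, sparse connectivity) to certify bounds below this product.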