MIQCQP reformulation of the ReLU neural networks Lipschitz constant
estimation problem
- URL: http://arxiv.org/abs/2402.01199v1
- Date: Fri, 2 Feb 2024 07:55:42 GMT
- Title: MIQCQP reformulation of the ReLU neural networks Lipschitz constant
estimation problem
- Authors: Mohammed Sbihi (ENAC), Sophie Jan (IMT), Nicolas Couellan (IMT, ENAC)
- Abstract summary: We propose new quadratically constrained MIP formulations for the neural network Lipschitz estimation problem.
The solutions of these problems give lower bounds and upper bounds of the Lipschitz constant.
We detail conditions under which they coincide with the exact Lipschitz constant.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: It is well established that to ensure or certify the robustness of a neural
network, its Lipschitz constant plays a prominent role. However, its
calculation is NP-hard. In this note, by taking into account activation regions
at each layer as new constraints, we propose new quadratically constrained MIP
formulations for the neural network Lipschitz estimation problem. The solutions
of these problems give lower bounds and upper bounds of the Lipschitz constant
and we detail conditions under which they coincide with the exact Lipschitz constant.
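The MIQCQP formulations themselves are not reproduced here. As a rough illustration of the activation-region idea and of the gap between lower and upper bounds, the following sketch brute-forces a hypothetical one-hidden-layer ReLU network; all names and sizes are illustrative, and the enumeration stands in for a MIP solver.

```python
import itertools
import numpy as np

# Hypothetical tiny ReLU network f(x) = W2 @ relu(W1 @ x + b1) + b2.
rng = np.random.default_rng(0)
n_in, n_hid = 2, 4
W1, b1 = rng.standard_normal((n_hid, n_in)), rng.standard_normal(n_hid)
W2 = rng.standard_normal((2, n_hid))

def region_grad_norm(s):
    # On the activation region with ReLU pattern s, the Jacobian is W2 diag(s) W1.
    return np.linalg.norm(W2 @ np.diag(s) @ W1, ord=2)

# Upper bound: maximize over *all* 0/1 patterns, attainable or not.
upper = max(region_grad_norm(np.array(s))
            for s in itertools.product([0.0, 1.0], repeat=n_hid))

# Lower bound: only patterns actually attained at sampled inputs.
xs = rng.standard_normal((1000, n_in))
lower = max(region_grad_norm((x @ W1.T + b1 > 0).astype(float)) for x in xs)

# The exact ell_2 Lipschitz constant lies in [lower, upper]; the paper's
# MIQCQP formulations aim to produce such bounds (and conditions under which
# they coincide) without brute-force enumeration.
print(lower, upper)
```

For larger networks the 2^h enumeration over hidden-unit patterns is infeasible, which is where MIP/MIQCQP solvers, and the NP-hardness of the exact problem, come in.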
Related papers
- Novel Quadratic Constraints for Extending LipSDP beyond Slope-Restricted
Activations [52.031701581294804]
Lipschitz upper bounds for neural networks can be computed with polynomial-time guarantees using semidefinite programming (LipSDP).
Our paper bridges the gap and extends LipSDP beyond slope-restricted activation functions.
Our proposed analysis is general and provides a unified approach for estimating $\ell_2$ and $\ell_\infty$ Lipschitz bounds (a minimal sketch of an SDP certificate of this kind appears after this list).
arXiv Detail & Related papers (2024-01-25T09:23:31Z) - Chordal Sparsity for Lipschitz Constant Estimation of Deep Neural
Networks [77.82638674792292]
Lipschitz constants of neural networks allow for guarantees of robustness in image classification, safety in controller design, and generalizability beyond the training data.
As calculating Lipschitz constants is NP-hard, techniques for estimating Lipschitz constants must navigate the trade-off between scalability and accuracy.
In this work, we significantly push the scalability frontier of a semidefinite programming technique known as LipSDP while achieving zero accuracy loss.
arXiv Detail & Related papers (2022-04-02T11:57:52Z) - Training Certifiably Robust Neural Networks with Efficient Local
Lipschitz Bounds [99.23098204458336]
Certified robustness is a desirable property for deep neural networks in safety-critical applications.
We show that our method consistently outperforms state-of-the-art methods on the MNIST and TinyImageNet datasets.
arXiv Detail & Related papers (2021-11-02T06:44:10Z) - Robust Implicit Networks via Non-Euclidean Contractions [63.91638306025768]
Implicit neural networks show improved accuracy and a significant reduction in memory consumption.
However, they can suffer from ill-posedness and convergence instability.
This paper provides a new framework to design well-posed and robust implicit neural networks.
arXiv Detail & Related papers (2021-06-06T18:05:02Z) - LipBaB: Computing exact Lipschitz constant of ReLU networks [0.0]
LipBaB is a framework to compute certified bounds of the local Lipschitz constant of deep neural networks.
Our algorithm can provide provably exact computation of the Lipschitz constant for any p-norm.
arXiv Detail & Related papers (2021-05-12T08:06:11Z) - CLIP: Cheap Lipschitz Training of Neural Networks [0.0]
We investigate a variational regularization method named CLIP for controlling the Lipschitz constant of a neural network.
We mathematically analyze the proposed model, in particular discussing the impact of the chosen regularization parameter on the output of the network.
arXiv Detail & Related papers (2021-03-23T13:29:24Z) - Lipschitz Bounded Equilibrium Networks [3.2872586139884623]
This paper introduces new parameterizations of equilibrium neural networks, i.e. networks defined by implicit equations.
The new parameterization admits a Lipschitz bound during training via unconstrained optimization.
In image classification experiments we show that the Lipschitz bounds are very accurate and improve robustness to adversarial attacks.
arXiv Detail & Related papers (2020-10-05T01:00:40Z) - On Lipschitz Regularization of Convolutional Layers using Toeplitz
Matrix Theory [77.18089185140767]
Lipschitz regularity is established as a key property of modern deep learning.
However, computing the exact value of the Lipschitz constant of a neural network is known to be NP-hard.
We introduce a new upper bound for convolutional layers that is both tight and easy to compute.
arXiv Detail & Related papers (2020-06-15T13:23:34Z) - Exactly Computing the Local Lipschitz Constant of ReLU Networks [98.43114280459271]
The local Lipschitz constant of a neural network is a useful metric for robustness, generalization, and fairness evaluation.
We show strong inapproximability results for estimating Lipschitz constants of ReLU networks.
Despite this, we develop a mixed-integer programming algorithm that computes the local Lipschitz constant exactly.
We leverage this algorithm to evaluate the tightness of competing Lipschitz estimators and the effects of regularized training on the Lipschitz constant.
arXiv Detail & Related papers (2020-03-02T22:15:54Z)
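As referenced in the LipSDP-related entry above, the following is a minimal sketch of the kind of semidefinite-programming certificate used by LipSDP-style methods, written for a hypothetical one-hidden-layer ReLU network with activation slope restricted to [0, 1]. The variable names, network sizes, and the use of CVXPY with the SCS solver are illustrative assumptions, not code from the cited papers.

```python
import numpy as np
import cvxpy as cp

# Hypothetical one-hidden-layer ReLU network f(x) = W2 @ relu(W1 @ x + b1).
rng = np.random.default_rng(0)
n_in, n_hid, n_out = 3, 5, 2
W1 = rng.standard_normal((n_hid, n_in))
W2 = rng.standard_normal((n_out, n_hid))

# LipSDP-style LMI for slope-[0, 1] activations: find rho and diagonal T >= 0
# such that
#   [[ -rho * I , W1^T T         ],
#    [  T W1    , -2 T + W2^T W2 ]]  is negative semidefinite,
# then sqrt(rho) upper-bounds the ell_2 Lipschitz constant of f.
rho = cp.Variable(nonneg=True)
t = cp.Variable(n_hid, nonneg=True)
T = cp.diag(t)
M = cp.bmat([[-rho * np.eye(n_in), W1.T @ T],
             [T @ W1, -2 * T + W2.T @ W2]])  # symmetric since T is diagonal
prob = cp.Problem(cp.Minimize(rho), [M << 0])
prob.solve(solver=cp.SCS)
print("certified ell_2 Lipschitz upper bound:", float(np.sqrt(rho.value)))
```

Scaling this LMI to deep networks, exploiting chordal sparsity, and extending it beyond slope-restricted activations are the directions taken by the related papers listed above.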
This list is automatically generated from the titles and abstracts of the papers on this site.