Lipschitz Bounded Equilibrium Networks
- URL: http://arxiv.org/abs/2010.01732v1
- Date: Mon, 5 Oct 2020 01:00:40 GMT
- Title: Lipschitz Bounded Equilibrium Networks
- Authors: Max Revay, Ruigang Wang, Ian R. Manchester
- Abstract summary: This paper introduces new parameterizations of equilibrium neural networks, i.e. networks defined by implicit equations.
The new parameterization admits a Lipschitz bound during training via unconstrained optimization.
In image classification experiments we show that the Lipschitz bounds are very accurate and improve robustness to adversarial attacks.
- Score: 3.2872586139884623
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This paper introduces new parameterizations of equilibrium neural networks,
i.e. networks defined by implicit equations. This model class includes standard
multilayer and residual networks as special cases. The new parameterization
admits a Lipschitz bound during training via unconstrained optimization: no
projections or barrier functions are required. Lipschitz bounds are a common
proxy for robustness and appear in many generalization bounds. Furthermore,
compared to previous works we show well-posedness (existence of solutions)
under less restrictive conditions on the network weights and more natural
assumptions on the activation functions: that they are monotone and slope
restricted. These results are proved by establishing novel connections with
convex optimization, operator splitting on non-Euclidean spaces, and
contracting neural ODEs. In image classification experiments we show that the
Lipschitz bounds are very accurate and improve robustness to adversarial
attacks.
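As a concrete illustration of a network defined by an implicit equation, the forward pass below solves z = relu(W z + U x + b) by fixed-point iteration. This is a minimal sketch, not the paper's parameterization: the dimensions, random weights, and the contraction-by-rescaling trick are illustrative assumptions that merely guarantee a unique equilibrium (via the Banach fixed-point theorem, using that relu is monotone and slope-restricted).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions; not taken from the paper.
n_x, n_z = 4, 8

# Rescale W so the fixed-point map is a contraction (spectral norm < 1),
# which guarantees existence and uniqueness of the equilibrium.
W = rng.standard_normal((n_z, n_z))
W *= 0.9 / np.linalg.norm(W, 2)
U = rng.standard_normal((n_z, n_x))
b = rng.standard_normal(n_z)

def relu(v):
    # Monotone and slope-restricted in [0, 1], as the paper assumes.
    return np.maximum(v, 0.0)

def equilibrium_forward(x, tol=1e-10, max_iter=1000):
    """Solve z = relu(W z + U x + b) by fixed-point iteration."""
    z = np.zeros(n_z)
    for _ in range(max_iter):
        z_next = relu(W @ z + U @ x + b)
        if np.linalg.norm(z_next - z) < tol:
            return z_next
        z = z_next
    return z

x = rng.standard_normal(n_x)
z_star = equilibrium_forward(x)
# The returned equilibrium satisfies the implicit equation up to tolerance.
print(np.linalg.norm(z_star - relu(W @ z_star + U @ x + b)))
```

With the 1-Lipschitz activation and spectral norm 0.9, the iteration converges geometrically; the paper's contribution is precisely a parameterization under which such well-posedness and a Lipschitz bound hold by construction, without this ad-hoc rescaling.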
Related papers
- Novel Quadratic Constraints for Extending LipSDP beyond Slope-Restricted Activations [52.031701581294804]
Lipschitz bounds for neural networks can be computed with semidefinite programming (LipSDP), but existing formulations are restricted to slope-restricted activation functions.
Our paper bridges this gap and extends LipSDP beyond slope-restricted activation functions.
Our proposed analysis is general and provides a unified approach for estimating $\ell_2$ and $\ell_\infty$ Lipschitz bounds.
arXiv Detail & Related papers (2024-01-25T09:23:31Z)
- Efficient Bound of Lipschitz Constant for Convolutional Layers by Gram Iteration [122.51142131506639]
We introduce a precise, fast, and differentiable upper bound for the spectral norm of convolutional layers using circulant matrix theory.
We show through a comprehensive set of experiments that our approach outperforms other state-of-the-art methods in terms of precision, computational cost, and scalability.
It proves highly effective for the Lipschitz regularization of convolutional neural networks, with competitive results against concurrent approaches.
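The Gram iteration idea can be sketched for a plain weight matrix (the paper's contribution applies it to convolutional layers via circulant matrix theory). The sketch below is illustrative: the function name, iteration count, and log-space rescaling scheme are assumptions of this example, not the paper's implementation.

```python
import numpy as np

def gram_spectral_bound(W, n_iter=6):
    """Upper-bound the spectral norm of W by iterating the Gram map
    G <- G^T G, which squares the singular values at each step."""
    G = np.asarray(W, dtype=np.float64)
    log_scale = 0.0              # track rescalings in log-space for stability
    for _ in range(n_iter):
        G = G.T @ G              # singular values get squared
        s = np.linalg.norm(G)    # Frobenius norm
        log_scale = 2.0 * log_scale + np.log(s)
        G /= s                   # rescale to avoid overflow
    # After k steps the Frobenius norm dominates sigma_max ** (2 ** k),
    # so taking the 2**k-th root yields a valid upper bound.
    return np.exp(log_scale / 2 ** n_iter)

rng = np.random.default_rng(0)
W = rng.standard_normal((16, 16))
true_sigma = np.linalg.norm(W, 2)   # exact spectral norm, for comparison
bound = gram_spectral_bound(W)
print(true_sigma, bound)            # the bound tightly tracks sigma from above
```

Because the Frobenius-to-spectral gap shrinks as the singular values are repeatedly squared, the bound converges very quickly to the true spectral norm while remaining a certified upper bound, and it is differentiable.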
arXiv Detail & Related papers (2023-05-25T15:32:21Z)
- Efficiently Computing Local Lipschitz Constants of Neural Networks via Bound Propagation [79.13041340708395]
Lipschitz constants are connected to many properties of neural networks, such as robustness, fairness, and generalization.
Existing methods for computing Lipschitz constants either produce relatively loose upper bounds or are limited to small networks.
We develop an efficient framework for computing the $\ell_\infty$ local Lipschitz constant of a neural network by tightly upper bounding the norm of the Clarke Jacobian.
arXiv Detail & Related papers (2022-10-13T22:23:22Z)
- Rethinking Lipschitz Neural Networks for Certified L-infinity Robustness [33.72713778392896]
We study certified $\ell_\infty$ robustness from a novel perspective of representing Boolean functions.
We develop a unified Lipschitz network that generalizes prior works, and design a practical version that can be efficiently trained.
arXiv Detail & Related papers (2022-10-04T17:55:27Z)
- Sparsest Univariate Learning Models Under Lipschitz Constraint [31.28451181040038]
We propose continuous-domain formulations for one-dimensional regression problems.
We control the Lipschitz constant explicitly via a user-defined upper bound.
We show that both problems admit global minimizers that are continuous and piecewise-linear.
arXiv Detail & Related papers (2021-12-27T07:03:43Z)
- Training Certifiably Robust Neural Networks with Efficient Local Lipschitz Bounds [99.23098204458336]
Certified robustness is a desirable property for deep neural networks in safety-critical applications.
We show that our method consistently outperforms state-of-the-art methods on the MNIST and TinyImageNet datasets.
arXiv Detail & Related papers (2021-11-02T06:44:10Z)
- Robust Implicit Networks via Non-Euclidean Contractions [63.91638306025768]
Implicit neural networks show improved accuracy and a significant reduction in memory consumption.
They can suffer from ill-posedness and convergence instability.
This paper provides a new framework to design well-posed and robust implicit neural networks.
arXiv Detail & Related papers (2021-06-06T18:05:02Z)
- CLIP: Cheap Lipschitz Training of Neural Networks [0.0]
We investigate a variational regularization method named CLIP for controlling the Lipschitz constant of a neural network.
We mathematically analyze the proposed model, in particular discussing the impact of the chosen regularization parameter on the output of the network.
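A variational Lipschitz regularizer of this flavor can be sketched as a penalty on empirical difference quotients, added to the training loss with a regularization weight. This is an illustrative surrogate, not CLIP's exact method; the toy network, the sampled point pairs, and the weight `lam` are all assumptions of this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy one-hidden-layer ReLU network; weights are placeholders.
W1, b1 = rng.standard_normal((32, 2)), np.zeros(32)
W2, b2 = rng.standard_normal((1, 32)), np.zeros(1)

def f(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def lipschitz_penalty(xs, ys):
    """Empirical difference-quotient estimate of the Lipschitz constant,
    evaluated on pairs of nearby points. In training, this term would be
    differentiated through and the points chosen adversarially."""
    quotients = [
        np.linalg.norm(f(x) - f(y)) / np.linalg.norm(x - y)
        for x, y in zip(xs, ys)
    ]
    return max(quotients)

xs = rng.standard_normal((8, 2))
ys = xs + 0.01 * rng.standard_normal((8, 2))
lam = 0.1   # regularization weight; its choice shapes the trained network
penalty = lam * lipschitz_penalty(xs, ys)
print(penalty)
```

The empirical quotient always lower-bounds the true Lipschitz constant (here itself bounded by the product of the layer spectral norms), which is why the choice of evaluation points, and hence of the regularization parameter, matters for the resulting network.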
arXiv Detail & Related papers (2021-03-23T13:29:24Z)
- On Lipschitz Regularization of Convolutional Layers using Toeplitz Matrix Theory [77.18089185140767]
Lipschitz regularity is established as a key property of modern deep learning.
Computing the exact value of the Lipschitz constant of a neural network is known to be NP-hard.
We introduce a new upper bound for convolutional layers that is both tight and easy to compute.
arXiv Detail & Related papers (2020-06-15T13:23:34Z)
- Approximating Lipschitz continuous functions with GroupSort neural networks [3.416170716497814]
Recent advances in adversarial attacks and Wasserstein GANs have advocated for the use of neural networks with restricted Lipschitz constants.
We show in particular how these networks can represent any Lipschitz continuous piecewise-linear function.
We also prove that they are well-suited for approximating Lipschitz continuous functions and exhibit upper bounds on both the depth and size.
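The GroupSort activation itself is easy to sketch: it sorts entries within consecutive groups, so it is a permutation of its input and therefore 1-Lipschitz and norm-preserving. A minimal illustrative implementation (the group size and example values are arbitrary choices of this sketch):

```python
import numpy as np

def groupsort(z, group_size=2):
    """Sort entries within consecutive groups of size group_size.
    With group_size=2 this reduces to the MaxMin activation."""
    z = np.asarray(z, dtype=float)
    assert z.size % group_size == 0, "input length must divide into groups"
    return np.sort(z.reshape(-1, group_size), axis=1).ravel()

z = np.array([3.0, -1.0, 0.5, 2.0])
print(groupsort(z))   # groups (3, -1) and (0.5, 2) each sorted in place
```

Because the output is a rearrangement of the input, GroupSort preserves gradient norms exactly, which is what makes it attractive for networks with restricted Lipschitz constants.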
arXiv Detail & Related papers (2020-06-09T13:37:43Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences arising from its use.