On the Expected Complexity of Maxout Networks
- URL: http://arxiv.org/abs/2107.00379v1
- Date: Thu, 1 Jul 2021 11:36:32 GMT
- Title: On the Expected Complexity of Maxout Networks
- Authors: Hanna Tseran, Guido Montúfar
- Abstract summary: Recent works have shown that the practical complexity of deep ReLU networks is often far from the theoretical maximum.
In this work we show that this phenomenon also occurs in networks with maxout (multi-argument) activation functions.
We also show that the parameter space has a multitude of full-dimensional regions with widely different complexity, and obtain nontrivial lower bounds on the expected complexity.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Learning with neural networks relies on the complexity of the representable
functions, but more importantly, the particular assignment of typical
parameters to functions of different complexity. Taking the number of
activation regions as a complexity measure, recent works have shown that the
practical complexity of deep ReLU networks is often far from the theoretical
maximum. In this work we show that this phenomenon also occurs in networks with
maxout (multi-argument) activation functions and when considering the decision
boundaries in classification tasks. We also show that the parameter space has a
multitude of full-dimensional regions with widely different complexity, and
obtain nontrivial lower bounds on the expected complexity. Finally, we
investigate different parameter initialization procedures and show that they
can increase the speed of convergence in training.
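Since the abstract takes the number of activation regions as its complexity measure, the sketch below illustrates one simple way to probe that quantity empirically: count how often the activation pattern of a small, randomly initialized maxout network changes along a line through input space. This is a minimal illustration, not the authors' code; the architecture sizes, maxout rank, initialization scale, and sampling resolution are assumptions, and sampling along a line only lower-bounds the true number of linear regions.

```python
# Minimal sketch (not the paper's experimental setup): estimate how many linear
# regions a randomly initialized maxout network creates along a 1D line through
# input space, by counting changes of the activation pattern (the argmax index
# of each maxout unit) over a dense grid of points. All sizes are illustrative.
import numpy as np

rng = np.random.default_rng(0)
d_in, widths, K = 2, [20, 20], 5   # input dim, layer widths, maxout rank (assumed)

# He-like initialization; the scaling is a plausible choice, not the paper's scheme
params = []
fan_in = d_in
for w in widths:
    W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(K, w, fan_in))
    b = np.zeros((K, w))
    params.append((W, b))
    fan_in = w

def activation_pattern(x):
    """Return, for every maxout unit, the index of the argument attaining the max."""
    pattern = []
    h = x
    for W, b in params:
        pre = np.einsum('kij,j->ki', W, h) + b   # pre-activations, shape (K, width)
        pattern.append(pre.argmax(axis=0))        # winning argument per unit
        h = pre.max(axis=0)                       # maxout output
    return np.concatenate(pattern)

# Count activation-pattern changes along a random line segment; each change marks
# a crossing into a new linear region, so the count is a lower bound.
p, v = rng.normal(size=d_in), rng.normal(size=d_in)
ts = np.linspace(-3.0, 3.0, 5000)
patterns = [tuple(activation_pattern(p + t * v)) for t in ts]
regions = 1 + sum(a != b for a, b in zip(patterns, patterns[1:]))
print(f"linear regions crossed along the line (lower bound): {regions}")
```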
Related papers
- Spectral complexity of deep neural networks [2.099922236065961]
We use the angular power spectrum of the limiting field to characterize the complexity of the network architecture.
On this basis, we classify neural networks as low-disorder, sparse, or high-disorder.
We show how this classification highlights a number of distinct features for standard activation functions, and in particular, sparsity properties of ReLU networks.
arXiv Detail & Related papers (2024-05-15T17:55:05Z) - Exploring the Complexity of Deep Neural Networks through Functional Equivalence [1.3597551064547502]
We present a novel bound on the covering number for deep neural networks, which reveals that the complexity of neural networks can be reduced.
We demonstrate that functional equivalence benefits optimization, as overparameterized networks tend to be easier to train since increasing network width leads to a diminishing volume of the effective parameter space.
arXiv Detail & Related papers (2023-05-19T04:01:27Z) - Bayes Complexity of Learners vs Overfitting [4.873362301533825]
We show that a new notion of complexity of functions governs a PAC-Bayes-like generalization bound.
In contrast to previous works, our notion naturally generalizes to neural networks with several layers.
An upper bound we derive allows us to show a separation in the number of samples needed for good generalization between 2- and 4-layer neural networks.
arXiv Detail & Related papers (2023-03-13T13:07:02Z) - On Rademacher Complexity-based Generalization Bounds for Deep Learning [18.601449856300984]
We show that the Rademacher complexity-based approach can generate non-vacuous generalization bounds on Convolutional Neural Networks (CNNs).
Our results show that the Rademacher complexity does not depend on the network length for CNNs with some special types of activation functions such as ReLU, Leaky ReLU, Parametric Rectified Linear Unit, Sigmoid, and Tanh.
arXiv Detail & Related papers (2022-08-08T17:24:04Z) - Deep Architecture Connectivity Matters for Its Convergence: A Fine-Grained Analysis [94.64007376939735]
We theoretically characterize the impact of connectivity patterns on the convergence of deep neural networks (DNNs) under gradient descent training.
We show that by a simple filtration on "unpromising" connectivity patterns, we can trim down the number of models to evaluate.
arXiv Detail & Related papers (2022-05-11T17:43:54Z) - Towards Understanding Theoretical Advantages of Complex-Reaction Networks [77.34726150561087]
We show that a class of functions can be approximated by a complex-reaction network using a polynomial number of parameters.
For empirical risk minimization, our theoretical result shows that the critical point set of complex-reaction networks is a proper subset of that of real-valued networks.
arXiv Detail & Related papers (2021-08-15T10:13:49Z) - What training reveals about neural network complexity [80.87515604428346]
This work explores the hypothesis that the complexity of the function a deep neural network (NN) is learning can be deduced by how fast its weights change during training.
Our results support the hypothesis that good training behavior can be a useful bias towards good generalization.
arXiv Detail & Related papers (2021-06-08T08:58:00Z) - On Function Approximation in Reinforcement Learning: Optimism in the Face of Large State Spaces [208.67848059021915]
We study the exploration-exploitation tradeoff at the core of reinforcement learning.
In particular, we prove that the complexity of the function class $\mathcal{F}$ characterizes the complexity of the learning problem.
Our regret bounds are independent of the number of episodes.
arXiv Detail & Related papers (2020-11-09T18:32:22Z) - UNIPoint: Universally Approximating Point Processes Intensities [125.08205865536577]
We provide a proof that a class of learnable functions can universally approximate any valid intensity function.
We implement UNIPoint, a novel neural point process model, using recurrent neural networks to parameterise sums of basis functions upon each event.
arXiv Detail & Related papers (2020-07-28T09:31:56Z) - Measuring Model Complexity of Neural Networks with Curve Activation Functions [100.98319505253797]
We propose the linear approximation neural network (LANN) to approximate a given deep model with curve activation functions.
We experimentally explore the training process of neural networks and detect overfitting.
We find that the $L_1$ and $L_2$ regularizations suppress the increase of model complexity.
arXiv Detail & Related papers (2020-06-16T07:38:06Z)