Local and global topological complexity measures of ReLU neural network functions
- URL: http://arxiv.org/abs/2204.06062v2
- Date: Mon, 1 Apr 2024 20:19:49 GMT
- Title: Local and global topological complexity measures of ReLU neural network functions
- Authors: J. Elisenda Grigsby, Kathryn Lindsey, Marissa Masden
- Abstract summary: We apply a piecewise-linear (PL) version of Morse theory due to Grunert-Kühnel-Rote to define and study new local and global notions of topological complexity.
We show how to construct, for each such F, a canonical polytopal complex K(F) and a deformation retract of the domain onto K(F), yielding a convenient compact model for performing calculations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: We apply a generalized piecewise-linear (PL) version of Morse theory due to Grunert-Kühnel-Rote to define and study new local and global notions of topological complexity for fully-connected feedforward ReLU neural network functions, F: R^n -> R. Along the way, we show how to construct, for each such F, a canonical polytopal complex K(F) and a deformation retract of the domain onto K(F), yielding a convenient compact model for performing calculations. We also give a construction showing that local complexity can be arbitrarily high.
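To make the piecewise-linear structure behind K(F) concrete, here is a minimal NumPy sketch (an illustration only, not the paper's construction): it evaluates a small, randomly initialized fully-connected ReLU network F: R^2 -> R and counts the distinct ReLU activation patterns it realizes on sampled inputs. Each pattern indexes a region of the domain on which F restricts to an affine map; the architecture, sampling range, and sample size are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small fully-connected feedforward ReLU network F: R^2 -> R
# (the architecture is an arbitrary choice for illustration).
layer_dims = [2, 8, 8, 1]
weights = [rng.standard_normal((m, n)) for n, m in zip(layer_dims, layer_dims[1:])]
biases = [rng.standard_normal(m) for m in layer_dims[1:]]

def forward(x):
    """Evaluate F(x) and record the ReLU activation pattern along the way."""
    pattern = []
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        pre = W @ h + b
        pattern.append(pre > 0)          # which neurons are "on" at x
        h = np.maximum(pre, 0.0)         # ReLU
    out = weights[-1] @ h + biases[-1]   # final affine layer, no activation
    return out.item(), np.concatenate(pattern)

# Each distinct activation pattern corresponds to a region of the input
# space on which F is affine; sampling only lower-bounds how many there are.
samples = rng.uniform(-3.0, 3.0, size=(20000, 2))
patterns = {tuple(forward(x)[1]) for x in samples}
print(f"distinct activation patterns among samples: {len(patterns)}")
```

Sampling only lower-bounds the number of such regions; by contrast, the paper constructs K(F) canonically and retracts the domain onto it, giving an exact compact model for calculations.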
Related papers
- Geometric Neural Process Fields [58.77241763774756]
Geometric Neural Process Fields (G-NPF) is a probabilistic framework for neural radiance fields that explicitly captures uncertainty.
Building on these bases, we design a hierarchical latent variable model, allowing G-NPF to integrate structural information across multiple spatial levels.
Experiments on novel-view synthesis for 3D scenes, as well as 2D image and 1D signal regression, demonstrate the effectiveness of our method.
arXiv Detail & Related papers (2025-02-04T14:17:18Z) - On the Local Complexity of Linear Regions in Deep ReLU Networks [15.335716956682203]
We show theoretically that ReLU networks that learn low-dimensional feature representations have a lower local complexity.
In particular, we show that the local complexity serves as an upper bound on the total variation of the function over the input data distribution.
arXiv Detail & Related papers (2024-12-24T08:42:39Z) - Combinatorial Regularity for Relatively Perfect Discrete Morse Gradient Vector Fields of ReLU Neural Networks [0.0]
ReLU neural networks induce a piecewise linear decomposition of their input space called the canonical polyhedral complex.
It has previously been established that it is decidable whether a ReLU neural network is piecewise linear Morse.
arXiv Detail & Related papers (2024-12-23T21:58:51Z) - Data Topology-Dependent Upper Bounds of Neural Network Widths [52.58441144171022]
We first show that a three-layer neural network can be designed to approximate an indicator function over a compact set (a toy one-dimensional sketch of this idea appears after this list).
This is then extended to a simplicial complex, deriving width upper bounds based on its topological structure.
We prove the universal approximation property of three-layer ReLU networks using our topological approach.
arXiv Detail & Related papers (2023-05-25T14:17:15Z) - Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z) - Towards Understanding Theoretical Advantages of Complex-Reaction
Networks [77.34726150561087]
We show that a class of functions can be approximated by a complex-reaction network using a polynomial number of parameters.
For empirical risk minimization, our theoretical result shows that the critical point set of complex-reaction networks is a proper subset of that of real-valued networks.
arXiv Detail & Related papers (2021-08-15T10:13:49Z) - On the Expected Complexity of Maxout Networks [0.0]
Recent works have shown that the practical complexity of deep ReLU networks is often far from the theoretical maximum.
In this work we show that this phenomenon also occurs in networks with maxout (multi-argument) activation functions.
We also show that the parameter space has a multitude of full-dimensional regions with widely different complexity, and obtain nontrivial lower bounds on the expected complexity.
arXiv Detail & Related papers (2021-07-01T11:36:32Z) - On transversality of bent hyperplane arrangements and the topological expressiveness of ReLU neural networks [0.0]
We investigate how the architecture of F impacts the geometry and topology of its possible decision regions for binary classification tasks.
We use this obstruction to prove that a decision region of a generic, transversal ReLU network F: R^n -> R with a single hidden layer of dimension (n+1) can have no more than one bounded connected component.
arXiv Detail & Related papers (2020-08-20T16:06:39Z) - Measuring Model Complexity of Neural Networks with Curve Activation Functions [100.98319505253797]
We propose the linear approximation neural network (LANN) to approximate a given deep model with curve activation function.
We experimentally explore the training process of neural networks and detect overfitting.
We find that the $L^1$ and $L^2$ regularizations suppress the increase of model complexity.
arXiv Detail & Related papers (2020-06-16T07:38:06Z) - Local Propagation in Constraint-based Neural Network [77.37829055999238]
We study a constraint-based representation of neural network architectures.
We investigate a simple optimization procedure that is well suited to fulfil the so-called architectural constraints.
arXiv Detail & Related papers (2020-02-18T16:47:38Z)
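The "Data Topology-Dependent Upper Bounds of Neural Network Widths" entry above notes that a three-layer network can approximate an indicator function over a compact set. The sketch below is a toy one-dimensional simplification, not the paper's three-layer construction; the interval [a, b] and the transition width eps are arbitrary choices. It builds the indicator of an interval from four ReLU units in a single hidden layer:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def indicator_net(x, a=0.0, b=1.0, eps=0.05):
    """One-hidden-layer ReLU approximation of the indicator of [a, b]:
    rises from 0 to 1 over [a - eps, a], falls back to 0 over [b, b + eps],
    and is exact on [a, b] and outside [a - eps, b + eps]."""
    return (relu((x - a + eps) / eps) - relu((x - a) / eps)
            - relu((x - b) / eps) + relu((x - b - eps) / eps))

xs = np.linspace(-0.5, 1.5, 9)
print(np.round(indicator_net(xs), 3))  # ~1.0 on [0, 1], ~0.0 well outside
```

Shrinking eps tightens the approximation at the cost of larger weights; approximating indicators of higher-dimensional compact sets is the subject of the paper's three-layer construction.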
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.