A Relative Homology Theory of Representation in Neural Networks
- URL: http://arxiv.org/abs/2502.01360v2
- Date: Fri, 14 Feb 2025 10:28:36 GMT
- Title: A Relative Homology Theory of Representation in Neural Networks
- Authors: Kosio Beshkov
- Abstract summary: Previous research has proven that the set of maps implemented by neural networks with a ReLU activation function is identical to the set of piecewise linear continuous maps.
We leverage these properties to define the equivalence class of inputs $\sim_\Phi$, which can be split into two sets related to the local rank of $\Phi_J$ and the intersections $\cap \text{Im}\Phi_{J_i}$.
We refer to the latter as the overlap decomposition $O_\Phi$ and prove that if the intersections between each polyhedron and the input manifold are convex, the homology groups of neural representations are isomorphic to relative homology groups $H_k(\Phi(M)) \simeq H_k(M,O_\Phi)$.
- Abstract: Previous research has proven that the set of maps implemented by neural networks with a ReLU activation function is identical to the set of piecewise linear continuous maps. Furthermore, such networks induce a hyperplane arrangement splitting the input domain into convex polyhedra $G_J$ over which the network $\Phi$ operates in an affine manner. In this work, we leverage these properties to define the equivalence class of inputs $\sim_\Phi$, which can be split into two sets related to the local rank of $\Phi_J$ and the intersections $\cap \text{Im}\Phi_{J_i}$. We refer to the latter as the overlap decomposition $O_\Phi$ and prove that if the intersections between each polyhedron and the input manifold are convex, the homology groups of neural representations are isomorphic to relative homology groups $H_k(\Phi(M)) \simeq H_k(M,O_\Phi)$. This lets us compute Betti numbers without the choice of an external metric. We develop methods to numerically compute the overlap decomposition through linear programming and a union-find algorithm. Using this framework, we perform several experiments on toy datasets showing that, compared to standard persistent homology, our relative homology-based computation of Betti numbers tracks purely topological rather than geometric features. Finally, we study the evolution of the overlap decomposition during training on various classification problems while varying network width and depth and discuss some shortcomings of our method.
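The abstract's starting point is that a ReLU network acts affinely on each polyhedron $G_J$ of its hyperplane arrangement. A minimal sketch of that fact, not the paper's code (the toy two-layer weights and the helper `region_affine_map` are illustrative assumptions): given an input $x$, the activation pattern $J$ picks out the polyhedron, and the affine map $\Phi_J$, whose rank is the local rank the paper refers to, can be read off directly.

```python
# A minimal sketch (assumed toy weights, not the paper's implementation):
# for a two-layer ReLU network, recover the activation pattern J of an
# input x and the affine map (A, c) the network applies on the whole
# polyhedron G_J containing x. The local rank of Phi_J is rank(A).
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)   # hidden layer
W2, b2 = rng.normal(size=(3, 8)), rng.normal(size=3)   # output layer

def region_affine_map(x):
    """Return the activation pattern J and (A, c) with Phi(y) = A @ y + c
    for every y in the polyhedron G_J that contains x."""
    pre = W1 @ x + b1
    J = pre > 0                       # one bit per ReLU unit
    D = np.diag(J.astype(float))      # on G_J, ReLU is this diagonal matrix
    A = W2 @ D @ W1                   # composed affine map on G_J
    c = W2 @ D @ b1 + b2
    return J, A, c

x = np.array([0.3, -1.2])
J, A, c = region_affine_map(x)
print("pattern:", J.astype(int), "local rank:", np.linalg.matrix_rank(A))
```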
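The abstract states that the overlap decomposition is computed numerically "through linear programming and a union-find algorithm". The sketch below shows only the union-find grouping step; the linear-programming feasibility test (deciding whether two images $\text{Im}\Phi_{J_i}$ and $\text{Im}\Phi_{J_j}$ intersect) is replaced here by an assumed list of overlapping pairs, so this is a schematic of the bookkeeping, not the paper's method.

```python
# A minimal union-find sketch of grouping regions into the classes of the
# overlap decomposition O_Phi. The `overlaps` pairs stand in for the paper's
# LP feasibility test and are assumed inputs here.

def find(parent, i):
    while parent[i] != i:                 # path halving
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

def overlap_classes(n_regions, overlaps):
    """Merge region indices 0..n_regions-1 whenever their images overlap,
    returning the resulting equivalence classes."""
    parent = list(range(n_regions))
    for i, j in overlaps:
        ri, rj = find(parent, i), find(parent, j)
        if ri != rj:
            parent[rj] = ri
    classes = {}
    for i in range(n_regions):
        classes.setdefault(find(parent, i), []).append(i)
    return list(classes.values())

# e.g. five regions where (0,1) and (1,2) overlap -> [[0, 1, 2], [3], [4]]
print(overlap_classes(5, [(0, 1), (1, 2)]))
```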
Related papers
- The learned range test method for the inverse inclusion problem [1.7341747123281803]
We show that the reconstruction algorithm based on the range test can be written as a neural network with a specific architecture.
We propose to learn the weights of this network in the framework of supervised learning.
The numerical simulations show that this learned range test method provides accurate and stable reconstructions of polygonal inclusions.
arXiv Detail & Related papers (2024-11-01T09:24:05Z) - Differential Equation Scaling Limits of Shaped and Unshaped Neural Networks [8.716913598251386]
We find a similar differential equation-based characterization for two types of unshaped networks.
We derive the first order correction to the layerwise correlation.
These results together provide a connection between shaped and unshaped network architectures.
arXiv Detail & Related papers (2023-10-18T16:15:10Z) - Detection-Recovery Gap for Planted Dense Cycles [72.4451045270967]
We consider a model where a dense cycle with expected bandwidth $n\tau$ and edge density $p$ is planted in an Erdős-Rényi graph $G(n,q)$.
We characterize the computational thresholds for the associated detection and recovery problems for the class of low-degree algorithms.
arXiv Detail & Related papers (2023-02-13T22:51:07Z) - Deep Invertible Approximation of Topologically Rich Maps between Manifolds [17.60434807901964]
We show how to design neural networks that allow for stable universal approximation of maps between topologically interesting manifolds.
By exploiting the topological parallels between locally bilipschitz maps, covering spaces, and local homeomorphisms, we find that a novel network of the form $\mathcal{T} \circ p \circ \mathcal{E}$ is a universal approximator of local diffeomorphisms.
We also outline possible extensions of our architecture to address molecular imaging of molecules with symmetries.
arXiv Detail & Related papers (2022-10-02T17:14:43Z) - A singular Riemannian geometry approach to Deep Neural Networks II. Reconstruction of 1-D equivalence classes [78.120734120667]
We build the preimage of a point in the output manifold in the input space.
We focus for simplicity on the case of neural network maps from $n$-dimensional real spaces to $(n-1)$-dimensional real spaces.
arXiv Detail & Related papers (2021-12-17T11:47:45Z) - Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z) - The decomposition of the higher-order homology embedding constructed from the $k$-Laplacian [5.076419064097734]
The null space of the $k$-th order Laplacian $\mathbf{\mathcal{L}}_k$ encodes the non-trivial topology of a manifold or a network.
We propose an algorithm to factorize the homology embedding into subspaces corresponding to a manifold's simplest topological components.
arXiv Detail & Related papers (2021-07-23T00:40:01Z) - Distributed Saddle-Point Problems Under Similarity [173.19083235638104]
We show that a given suboptimality $\epsilon > 0$ is achieved over master/workers networks in $\Omega(\cdot)$ rounds of communication.
We then propose algorithms matching the lower bounds for either type of network (up to log factors).
We assess the effectiveness of the proposed algorithms on a robust logistic regression problem.
arXiv Detail & Related papers (2021-07-22T14:25:16Z) - ResNet-LDDMM: Advancing the LDDMM Framework Using Deep Residual Networks [86.37110868126548]
In this work, we make use of deep residual neural networks to solve the non-stationary ODE (flow equation) based on Euler's discretization scheme.
We illustrate these ideas on diverse registration problems of 3D shapes under complex topology-preserving transformations.
arXiv Detail & Related papers (2021-02-16T04:07:13Z) - Stochastic Flows and Geometric Optimization on the Orthogonal Group [52.50121190744979]
We present a new class of geometrically-driven optimization algorithms on the orthogonal group $O(d)$.
We show that our methods can be applied in various fields of machine learning including deep, convolutional and recurrent neural networks, reinforcement learning, flows and metric learning.
arXiv Detail & Related papers (2020-03-30T15:37:50Z) - Reduced Dilation-Erosion Perceptron for Binary Classification [1.3706331473063877]
The dilation-erosion perceptron (DEP) is a neural network obtained by a convex combination of a dilation and an erosion.
This paper introduces the reduced dilation-erosion (r-DEP) classifier.
arXiv Detail & Related papers (2020-03-04T19:50:35Z)