Recursive Self-Similarity in Deep Weight Spaces of Neural Architectures: A Fractal and Coarse Geometry Perspective
- URL: http://arxiv.org/abs/2503.14298v1
- Date: Tue, 18 Mar 2025 14:41:23 GMT
- Title: Recursive Self-Similarity in Deep Weight Spaces of Neural Architectures: A Fractal and Coarse Geometry Perspective
- Authors: Ambarish Moharil, Indika Kumara, Damian Andrew Tamburri, Majid Mohammadi, Willem-Jan van den Heuvel
- Abstract summary: This paper conceptualizes the Deep Weight Spaces as hierarchical, fractal-like, coarse geometric structures observable at discrete integer scales. We introduce a coarse group action termed the fractal transformation, $T_{r_k}$, acting under the symmetry group $G = (\mathbb{Z}, +)$. This perspective adopts a box-counting technique, commonly used to assess the hierarchical and scale-related geometry of physical structures.
- Score: 2.9130383514140292
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: This paper conceptualizes the Deep Weight Spaces (DWS) of neural architectures as hierarchical, fractal-like, coarse geometric structures observable at discrete integer scales through recursive dilation. We introduce a coarse group action termed the fractal transformation, $T_{r_k}$, acting under the symmetry group $G = (\mathbb{Z}, +)$, to analyze neural parameter matrices or tensors by segmenting the underlying discrete grid $\Omega$ into $N(r_k)$ fractals across varying observation scales $r_k$. This perspective adopts a box-counting technique, commonly used to assess the hierarchical and scale-related geometry of physical structures, which has been extensively formalized under the topic of fractal geometry. We assess the structural complexity of neural layers by estimating the Hausdorff-Besicovitch dimension of each layer and evaluating its degree of self-similarity. The fractal transformation features key algebraic properties such as linearity, identity, and asymptotic invertibility, which is a signature of coarse structures. We show that the coarse group action exhibits a set of symmetries such as Discrete Scale Invariance (DSI) under recursive dilation, strong invariance followed by weak equivariance to permutations, alongside respecting the scaling equivariance of activation functions, defined by the intertwiner group relations. Our framework targets large-scale structural properties of DWS, deliberately overlooking minor inconsistencies to focus on significant geometric characteristics of neural networks. Experiments on CIFAR-10 using ResNet-18, VGG-16, and a custom CNN validate our approach, demonstrating effective fractal segmentation and structural analysis.
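The box-counting estimate of the Hausdorff-Besicovitch dimension described in the abstract can be sketched as follows. This is a minimal illustrative implementation, not the authors' code: the magnitude threshold used to decide which grid cells of $\Omega$ are "occupied" is an assumption, as are the dyadic choice of scales $r_k$ and the log-log least-squares fit.

```python
import numpy as np


def box_count(mask: np.ndarray, box: int) -> int:
    """Count boxes of side `box` that contain at least one occupied entry."""
    h, w = mask.shape
    count = 0
    for i in range(0, h, box):
        for j in range(0, w, box):
            if mask[i:i + box, j:j + box].any():
                count += 1
    return count


def box_counting_dimension(weights: np.ndarray, thresh: float = 1e-2) -> float:
    """Estimate a box-counting dimension of a weight matrix.

    Thresholding |w_ij| > thresh gives an occupancy mask on the grid Omega
    (an assumed occupancy rule); the dimension is the slope of
    log N(r_k) versus log(1/r_k) over dyadic box sizes r_k = 2^k.
    """
    mask = np.abs(weights) > thresh
    n = min(mask.shape)
    sizes = [2 ** k for k in range(int(np.log2(n)))]  # r_k = 1, 2, 4, ...
    counts = [box_count(mask, s) for s in sizes]
    slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)),
                          np.log(np.array(counts)), 1)
    return slope


if __name__ == "__main__":
    # A dense Gaussian matrix occupies nearly the whole grid, so the
    # estimated dimension should be close to 2 (the ambient dimension).
    rng = np.random.default_rng(0)
    W = rng.normal(size=(256, 256))
    print(f"estimated dimension: {box_counting_dimension(W):.2f}")
```

A sparse or structured weight matrix would yield a slope strictly between 0 and 2, which is what makes the estimate informative as a per-layer complexity measure.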
Related papers
- Relative Representations: Topological and Geometric Perspectives [53.88896255693922]
Relative representations are an established approach to zero-shot model stitching.
First, we introduce a normalization procedure in the relative transformation, resulting in invariance to non-isotropic rescalings and permutations.
Second, we propose to deploy topological densification when fine-tuning relative representations, a topological regularization loss encouraging clustering within classes.
arXiv Detail & Related papers (2024-09-17T08:09:22Z) - Representing Topological Self-Similarity Using Fractal Feature Maps for Accurate Segmentation of Tubular Structures [12.038095281876071]
In this study, we incorporate fractal features into a deep learning model by extending FD to the pixel-level using a sliding window technique.
The resulting fractal feature maps (FFMs) are then incorporated as additional input to the model and additional weight in the loss function.
Experiments on five tubular structure datasets validate the effectiveness and robustness of our approach.
arXiv Detail & Related papers (2024-07-20T05:22:59Z) - Geometric Generative Models based on Morphological Equivariant PDEs and GANs [3.6498648388765513]
We propose a geometric generative model based on an equivariant partial differential equation (PDE) for group convolutional neural networks (G-CNNs).
The proposed geometric morphological GAN (GM-GAN) is obtained by using the proposed morphological equivariant convolutions in PDE-G-CNNs.
Preliminary results show that the GM-GAN model outperforms classical GANs.
arXiv Detail & Related papers (2024-03-22T01:02:09Z) - Information-Theoretic Thresholds for Planted Dense Cycles [52.076657911275525]
We study a random graph model for small-world networks which are ubiquitous in social and biological sciences.
For both detection and recovery of the planted dense cycle, we characterize the information-theoretic thresholds in terms of $n$, $\tau$, and an edge-wise signal-to-noise ratio $\lambda$.
arXiv Detail & Related papers (2024-02-01T03:39:01Z) - Affine Invariance in Continuous-Domain Convolutional Neural Networks [6.019182604573028]
This research studies affine invariance on continuous-domain convolutional neural networks.
We introduce a new criterion to assess the similarity of two input signals under affine transformations.
Our research could eventually extend the scope of geometrical transformations that practical deep-learning pipelines can handle.
arXiv Detail & Related papers (2023-11-13T14:17:57Z) - Geometric Clifford Algebra Networks [53.456211342585824]
We propose Geometric Clifford Algebra Networks (GCANs) for modeling dynamical systems.
GCANs are based on symmetry group transformations using geometric (Clifford) algebras.
arXiv Detail & Related papers (2023-02-13T18:48:33Z) - Deep Learning Symmetries and Their Lie Groups, Algebras, and Subalgebras from First Principles [55.41644538483948]
We design a deep-learning algorithm for the discovery and identification of the continuous group of symmetries present in a labeled dataset.
We use fully connected neural networks to model the symmetry transformations and the corresponding generators.
Our study also opens the door for using a machine learning approach in the mathematical study of Lie groups and their properties.
arXiv Detail & Related papers (2023-01-13T16:25:25Z) - Dist2Cycle: A Simplicial Neural Network for Homology Localization [66.15805004725809]
Simplicial complexes can be viewed as high dimensional generalizations of graphs that explicitly encode multi-way ordered relations.
We propose a graph convolutional model for learning functions parametrized by the $k$-homological features of simplicial complexes.
arXiv Detail & Related papers (2021-10-28T14:59:41Z) - Geometric Deep Learning and Equivariant Neural Networks [0.9381376621526817]
We survey the mathematical foundations of geometric deep learning, focusing on group equivariant and gauge equivariant neural networks.
We develop gauge equivariant convolutional neural networks on an arbitrary manifold $\mathcal{M}$ using principal bundles with structure group $K$ and equivariant maps between sections of associated vector bundles.
We analyze several applications of this formalism, including semantic segmentation and object detection networks.
arXiv Detail & Related papers (2021-05-28T15:41:52Z) - Invariant Deep Compressible Covariance Pooling for Aerial Scene Categorization [80.55951673479237]
We propose a novel invariant deep compressible covariance pooling (IDCCP) to solve nuisance variations in aerial scene categorization.
We conduct extensive experiments on the publicly released aerial scene image data sets and demonstrate the superiority of this method compared with state-of-the-art methods.
arXiv Detail & Related papers (2020-11-11T11:13:07Z) - Fractal Gaussian Networks: A sparse random graph model based on Gaussian Multiplicative Chaos [12.096252285460814]
We propose a novel network model, called Fractal Gaussian Network (FGN).
FGN embodies well-defined and analytically tractable fractal structures.
arXiv Detail & Related papers (2020-08-07T08:37:36Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this information and is not responsible for any consequences of its use.