Improved Robustness and Functional Localization in Topographic CNNs Through Weight Similarity
- URL: http://arxiv.org/abs/2508.00043v1
- Date: Thu, 31 Jul 2025 14:02:40 GMT
- Title: Improved Robustness and Functional Localization in Topographic CNNs Through Weight Similarity
- Authors: Nhut Truong, Uri Hasson
- Abstract summary: We compare topographic convolutional neural networks trained with two spatial constraints: Weight Similarity (WS), which pushes neighboring units to develop similar incoming weights, and Activation Similarity (AS), which enforces similarity in unit activations.
- Score: 0.0
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Topographic neural networks are computational models that can simulate the spatial and functional organization of the brain. Topographic constraints in neural networks can be implemented in multiple ways, with potentially different impacts on the representations learned by the network. The impact of such different implementations has not been systematically examined. To this end, here we compare topographic convolutional neural networks trained with two spatial constraints: Weight Similarity (WS), which pushes neighboring units to develop similar incoming weights, and Activation Similarity (AS), which enforces similarity in unit activations. We evaluate the resulting models on classification accuracy, robustness to weight perturbations and input degradation, and the spatial organization of learned representations. Compared to both AS and standard CNNs, WS provided three main advantages: i) improved robustness to input noise and higher accuracy under weight corruption; ii) greater input sensitivity, reflected in higher activation variance; and iii) stronger functional localization, with units showing similar activations positioned at closer distances. In addition, WS produced differences in orientation tuning, symmetry sensitivity, and eccentricity profiles of units, indicating an influence of this spatial constraint on the representational geometry of the network. Our findings suggest that during end-to-end training, WS constraints produce more robust representations than AS or non-topographic CNNs. These findings also suggest that weight-based spatial constraints can shape feature learning and functional organization in biophysically inspired models.
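The two constraints described in the abstract can be pictured as grid-neighbor penalties added to the training loss. The NumPy sketch below is a hypothetical illustration under assumed conventions (units arranged on a 2D grid, 4-neighbor adjacency, squared-difference penalties); it is not the paper's actual implementation, and the function names are invented for this example.

```python
import numpy as np

def weight_similarity_loss(W, grid_shape):
    """Hypothetical WS penalty. Each row of W is the incoming weight
    vector of one unit; units are assumed to sit on a 2D grid. The
    penalty is the summed squared difference between the weight
    vectors of horizontally and vertically adjacent units."""
    grid_h, grid_w = grid_shape
    W = W.reshape(grid_h, grid_w, -1)
    return (np.sum((W[:, 1:] - W[:, :-1]) ** 2)   # right neighbors
            + np.sum((W[1:, :] - W[:-1, :]) ** 2))  # down neighbors

def activation_similarity_loss(A, grid_shape):
    """Hypothetical AS penalty: the same neighbor penalty, but applied
    to unit activations over a batch (A has shape batch x units)
    rather than to incoming weights."""
    grid_h, grid_w = grid_shape
    A = A.T.reshape(grid_h, grid_w, -1)  # (grid_h, grid_w, batch)
    return (np.sum((A[:, 1:] - A[:, :-1]) ** 2)
            + np.sum((A[1:, :] - A[:-1, :]) ** 2))
```

Either penalty would be added, scaled by a coefficient, to the task loss; the key difference the paper studies is whether the smoothness pressure acts on the weights themselves (WS) or only on the resulting activations (AS).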
Related papers
- TDSNNs: Competitive Topographic Deep Spiking Neural Networks for Visual Cortex Modeling [1.732019193517103]
We propose a novel Spatio-Temporal Constraints (STC) loss function for topographic deep spiking neural networks (SNNs). Our results show that STC effectively generates representative topographic features across simulated visual cortical areas. We also reveal that topographic organization facilitates efficient and stable temporal information processing via the spike mechanism in TDSNNs.
arXiv Detail & Related papers (2025-08-06T09:53:39Z)
- Self-similarity Analysis in Deep Neural Networks [33.89368443432198]
Current research has found that some deep neural networks exhibit strong hierarchical self-similarity in feature representation or parameter distribution. This paper proposes a complex network method based on the output features of hidden-layer neurons to investigate the self-similarity of feature networks constructed at different hidden layers.
arXiv Detail & Related papers (2025-07-23T09:01:53Z)
- Topology-Aware Modeling for Unsupervised Simulation-to-Reality Point Cloud Recognition [63.55828203989405]
We introduce a novel Topology-Aware Modeling (TAM) framework for Sim2Real UDA on object point clouds. Our approach mitigates the domain gap by leveraging global spatial topology, characterized by low-level, high-frequency 3D structures. We propose an advanced self-training strategy that combines cross-domain contrastive learning with self-training.
arXiv Detail & Related papers (2025-06-26T11:53:59Z)
- Physics-Driven Autoregressive State Space Models for Medical Image Reconstruction [5.208643222679356]
We introduce a physics-driven autoregressive state-space model (MambaRoll) for medical image reconstruction. MambaRoll consistently outperforms state-of-the-art data-driven and physics-driven methods.
arXiv Detail & Related papers (2024-12-12T14:59:56Z)
- Cognitive Networks and Performance Drive fMRI-Based State Classification Using DNN Models [0.0]
We employ two structurally different and complementary DNN-based models to classify individual cognitive states.
We show that despite the architectural differences, both models consistently produce a robust relationship between prediction accuracy and individual cognitive performance.
arXiv Detail & Related papers (2024-08-14T15:25:51Z)
- Enhancing lattice kinetic schemes for fluid dynamics with Lattice-Equivariant Neural Networks [79.16635054977068]
We present a new class of equivariant neural networks, dubbed Lattice-Equivariant Neural Networks (LENNs).
Our approach develops within a recently introduced framework for learning neural-network surrogate models of Lattice Boltzmann collision operators.
Our work opens the way toward practical use of machine-learning-augmented Lattice Boltzmann CFD in real-world simulations.
arXiv Detail & Related papers (2024-05-22T17:23:15Z)
- Improved Generalization of Weight Space Networks via Augmentations [53.87011906358727]
Learning in deep weight spaces (DWS) is an emerging research direction, with applications to 2D and 3D neural fields (INRs, NeRFs).
We empirically analyze the reasons for this overfitting and find that a key reason is the lack of diversity in DWS datasets.
To address this, we explore strategies for data augmentation in weight spaces and propose a MixUp method adapted for weight spaces.
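The MixUp idea mentioned in this summary can be sketched in weight space as a convex interpolation of two flattened weight vectors with a Beta-distributed coefficient. The snippet below is a minimal illustration of plain MixUp transplanted to weight vectors; the paper's actual adaptation presumably also handles weight-space symmetries (e.g. neuron permutations), which this sketch omits, and the function name is invented here.

```python
import numpy as np

def weight_space_mixup(weights_a, weights_b, alpha=0.2, rng=None):
    """Hypothetical weight-space MixUp: draw lambda ~ Beta(alpha, alpha)
    and return the convex combination of two flattened weight vectors.
    Any associated labels would be mixed with the same coefficient."""
    if rng is None:
        rng = np.random.default_rng()
    lam = rng.beta(alpha, alpha)
    mixed = lam * weights_a + (1.0 - lam) * weights_b
    return mixed, lam
```

As in input-space MixUp, a small `alpha` concentrates `lambda` near 0 or 1, so most augmented samples stay close to one of the two original weight vectors.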
arXiv Detail & Related papers (2024-02-06T15:34:44Z)
- Multilayer Multiset Neuronal Networks -- MMNNs [55.2480439325792]
The present work describes multilayer multiset neuronal networks incorporating two or more layers of coincidence similarity neurons.
The work also explores the utilization of counter-prototype points, which are assigned to the image regions to be avoided.
arXiv Detail & Related papers (2023-08-28T12:55:13Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- On the benefits of robust models in modulation recognition [53.391095789289736]
Deep Neural Networks (DNNs) using convolutional layers are state-of-the-art in many tasks in communications.
In other domains, like image classification, DNNs have been shown to be vulnerable to adversarial perturbations.
We propose a novel framework to test the robustness of current state-of-the-art models.
arXiv Detail & Related papers (2021-03-27T19:58:06Z)
This list is automatically generated from the titles and abstracts of the papers in this site.