Emergence of Grid-like Representations by Training Recurrent Networks with Conformal Normalization
- URL: http://arxiv.org/abs/2310.19192v2
- Date: Tue, 20 Feb 2024 04:47:50 GMT
- Title: Emergence of Grid-like Representations by Training Recurrent Networks with Conformal Normalization
- Authors: Dehong Xu, Ruiqi Gao, Wen-Hao Zhang, Xue-Xin Wei, Ying Nian Wu
- Abstract summary: We study the emergence of hexagon grid patterns of grid cells based on a general recurrent neural network model.
We propose a simple yet general conformal normalization of the input velocity of the RNN.
We conduct extensive experiments to verify that conformal normalization is crucial for the emergence of hexagon grid patterns.
- Score: 48.99772993899573
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Grid cells in the entorhinal cortex of mammalian brains exhibit striking
hexagon grid firing patterns in their response maps as the animal (e.g., a rat)
navigates in a 2D open environment. In this paper, we study the emergence of
the hexagon grid patterns of grid cells based on a general recurrent neural
network (RNN) model that captures the navigation process. The responses of grid
cells collectively form a high dimensional vector, representing the 2D
self-position of the agent. As the agent moves, the vector is transformed by an
RNN that takes the velocity of the agent as input. We propose a simple yet
general conformal normalization of the input velocity of the RNN, so that the
local displacement of the position vector in the high-dimensional neural space
is proportional to the local displacement of the agent in the 2D physical
space, regardless of the direction of the input velocity. We apply this
mechanism to both a linear RNN and nonlinear RNNs. Theoretically, we provide an
analysis that explains the connection between conformal normalization and
the emergence of hexagon grid patterns. Empirically, we conduct extensive
experiments to verify that conformal normalization is crucial for the emergence
of hexagon grid patterns across various types of RNNs. The learned patterns
share profiles similar to those of biological grid cells, and the topological
properties of the patterns also align with our theoretical understanding.
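To make the mechanism concrete, the following is a minimal NumPy sketch of conformal normalization for a linear RNN of the form h(x + v) ≈ (I + v_1 B_1 + v_2 B_2) h(x). The matrices `B`, the neural dimension `d`, and the epsilon guard are illustrative assumptions, not the authors' trained parameters; the sketch only shows how rescaling the input velocity makes the neural displacement norm proportional to the physical displacement norm in every direction.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 128                                       # neural dimension (assumed)
B = rng.normal(size=(2, d, d)) / np.sqrt(d)   # stand-in motion matrices B_1, B_2
h = rng.normal(size=d)
h /= np.linalg.norm(h)                        # current position embedding h(x)

def conformal_normalize(v, h, s=1.0, eps=1e-8):
    """Rescale the 2D velocity v so that the local neural displacement
    dh = (v_1 B_1 + v_2 B_2) h has norm s * ||v|| for every direction of v."""
    dh = (v[0] * B[0] + v[1] * B[1]) @ h
    scale = s * np.linalg.norm(v) / (np.linalg.norm(dh) + eps)
    return scale * v

# Toy check: after normalization, ||dh|| / ||v|| equals the same constant s
# for all headings -- the direction-independent (conformal) scaling.
for theta in np.linspace(0.0, 2 * np.pi, 8, endpoint=False):
    v = 0.05 * np.array([np.cos(theta), np.sin(theta)])
    v_n = conformal_normalize(v, h)
    dh = (v_n[0] * B[0] + v_n[1] * B[1]) @ h
    print(round(np.linalg.norm(dh) / np.linalg.norm(v), 6))   # ~1.0 everywhere
```

In the paper this normalization is applied to the velocity input at every step of path integration, for both linear and nonlinear RNNs; the sketch isolates only the scaling property.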
Related papers
- Recurrent Neural Networks Learn to Store and Generate Sequences using Non-Linear Representations [54.17275171325324]
We present a counterexample to the Linear Representation Hypothesis (LRH).
When trained to repeat an input token sequence, neural networks learn to represent the token at each position with a particular order of magnitude, rather than a direction.
These findings strongly indicate that interpretability research should not be confined to the LRH.
arXiv Detail & Related papers (2024-08-20T15:04:37Z)
- An Investigation of Conformal Isometry Hypothesis for Grid Cells [45.67079714578615]
The conformal isometry hypothesis is a potential explanation for the hexagonal periodic patterns in grid cell response maps.
We conduct numerical experiments to show that this hypothesis leads to the hexagon periodic patterns of grid cells.
We propose a conformal modulation of the agent's input velocity, enabling the recurrent neural network of grid cells to satisfy the conformal isometry hypothesis automatically.
arXiv Detail & Related papers (2024-05-27T06:31:39Z)
- Semisupervised regression in latent structure networks on unknown manifolds [7.5722195869569]
We consider random dot product graphs, in which an edge is formed between two nodes with probability given by the inner product of their respective latent positions.
We propose a manifold learning and graph embedding technique to predict the response variable on out-of-sample nodes; a toy sampler for this graph model is sketched after this list.
arXiv Detail & Related papers (2023-05-04T00:41:04Z)
- Conformal Isometry of Lie Group Representation in Recurrent Network of Grid Cells [52.425628028229156]
We study the properties of grid cells using recurrent network models.
We focus on a simple non-linear recurrent model that underlies the continuous attractor neural networks of grid cells.
arXiv Detail & Related papers (2022-10-06T05:26:49Z)
- VNT-Net: Rotational Invariant Vector Neuron Transformers [3.04585143845864]
We introduce a rotationally invariant neural network by combining recently introduced vector neurons with self-attention layers.
Experiments demonstrate that our network efficiently handles 3D point cloud objects in arbitrary poses.
arXiv Detail & Related papers (2022-05-19T16:51:56Z)
- On Path Integration of Grid Cells: Group Representation and Isotropic Scaling [135.0473739504851]
We conduct theoretical analysis of a general representation model of path integration by grid cells.
We learn hexagon grid patterns that share properties with the grid cells in the rodent brain.
The learned model is capable of accurate long distance path integration.
arXiv Detail & Related papers (2020-06-18T03:44:35Z)
- Gauge Equivariant Mesh CNNs: Anisotropic convolutions on geometric graphs [81.12344211998635]
A common approach to defining convolutions on meshes is to interpret them as a graph and apply graph convolutional networks (GCNs).
We propose Gauge Equivariant Mesh CNNs which generalize GCNs to apply anisotropic gauge equivariant kernels.
Our experiments validate the significantly improved expressivity of the proposed model over conventional GCNs and other methods.
arXiv Detail & Related papers (2020-03-11T17:21:15Z)
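As a companion to the semisupervised-regression entry above, here is a small, hypothetical NumPy sampler for the random dot product graph model it builds on; the latent positions, sizes, and helper name are arbitrary illustrative choices, not taken from that paper.

```python
import numpy as np

def sample_rdpg(X, rng):
    """Sample an undirected random dot product graph: nodes i and j are
    joined with probability <X[i], X[j]>. The latent positions X must
    yield inner products in [0, 1]."""
    n = X.shape[0]
    P = np.clip(X @ X.T, 0.0, 1.0)            # pairwise edge probabilities
    upper = np.triu(rng.random((n, n)) < P, k=1)
    return upper | upper.T                    # symmetric boolean adjacency

rng = np.random.default_rng(0)
X = rng.uniform(0.2, 0.8, size=(50, 2)) / np.sqrt(2)   # keeps probabilities <= 1
A = sample_rdpg(X, rng)
print(int(A.sum()) // 2, "edges among", A.shape[0], "nodes")
```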
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences arising from its use.