Conformal Isometry of Lie Group Representation in Recurrent Network of Grid Cells
- URL: http://arxiv.org/abs/2210.02684v1
- Date: Thu, 6 Oct 2022 05:26:49 GMT
- Title: Conformal Isometry of Lie Group Representation in Recurrent Network of Grid Cells
- Authors: Dehong Xu, Ruiqi Gao, Wen-Hao Zhang, Xue-Xin Wei, Ying Nian Wu
- Abstract summary: We study the properties of grid cells using recurrent network models.
We focus on a simple non-linear recurrent model that underlies the continuous attractor neural networks of grid cells.
- Score: 52.425628028229156
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: The activity of the grid cell population in the medial entorhinal cortex
(MEC) of the brain forms a vector representation of the self-position of the
animal. Recurrent neural networks have been developed to explain the properties
of the grid cells by transforming the vector based on the input velocity, so
that the grid cells can perform path integration. In this paper, we investigate
the algebraic, geometric, and topological properties of grid cells using
recurrent network models. Algebraically, we study the Lie group and Lie algebra
of the recurrent transformation as a representation of self-motion.
Geometrically, we study the conformal isometry of the Lie group representation
of the recurrent network where the local displacement of the vector in the
neural space is proportional to the local displacement of the agent in the 2D
physical space. We then focus on a simple non-linear recurrent model that
underlies the continuous attractor neural networks of grid cells. Our numerical
experiments show that conformal isometry leads to hexagon periodic patterns of
the response maps of grid cells and our model is capable of accurate path
integration.
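The conformal isometry property described in the abstract can be illustrated with a small numerical sketch. This is an assumed toy model, not the paper's recurrent network: it maps 2D positions to a 6D neural response built from three plane waves whose wave vectors are 60 degrees apart (a standard idealization of hexagonal grid patterns). For such a map, the Jacobian J of the position-to-neural-space map satisfies J^T J = s^2 I, so every local displacement of the agent is scaled by the same factor s in neural space, regardless of direction.

```python
import numpy as np

# Toy grid-cell map (illustrative assumption, not the paper's model):
# three plane waves with wave vectors 60 degrees apart, magnitude K.
K = 2.0  # arbitrary wave-vector magnitude
angles = np.deg2rad([0.0, 60.0, 120.0])
ks = K * np.stack([np.cos(angles), np.sin(angles)], axis=1)  # shape (3, 2)

def neural_map(x):
    """Map a 2D position x to a 6D neural response vector."""
    phases = ks @ x                                           # (3,)
    return np.concatenate([np.cos(phases), np.sin(phases)])   # (6,)

def jacobian(x, eps=1e-6):
    """Numerical Jacobian of neural_map at x via central differences."""
    J = np.zeros((6, 2))
    for j in range(2):
        e = np.zeros(2)
        e[j] = eps
        J[:, j] = (neural_map(x + e) - neural_map(x - e)) / (2 * eps)
    return J

# Conformal isometry check: J^T J should equal (3/2) K^2 I at every x,
# i.e. local displacements are scaled isotropically by s = sqrt(3/2) K.
rng = np.random.default_rng(0)
x = rng.uniform(-5, 5, size=2)
M = jacobian(x).T @ jacobian(x)
print(np.round(M, 4))  # approximately 6 * I, since (3/2) * K^2 = 6
```

The design choice of three wave vectors at 60-degree intervals is what makes the sum of outer products k_i k_i^T proportional to the identity; with only one or two plane waves the map would stretch some directions more than others and the isometry would fail.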
Related papers
- An Investigation of Conformal Isometry Hypothesis for Grid Cells [45.67079714578615]
The conformal isometry hypothesis is a potential explanation for the hexagonal periodic patterns in grid cell response maps.
We conduct numerical experiments to show that this hypothesis leads to the hexagon periodic patterns of grid cells.
We propose a conformal modulation of the agent's input velocity, enabling the recurrent neural network of grid cells to satisfy the conformal isometry hypothesis automatically.
arXiv Detail & Related papers (2024-05-27T06:31:39Z)
- Image segmentation with traveling waves in an exactly solvable recurrent neural network [71.74150501418039]
We show that a recurrent neural network can effectively divide an image into groups according to a scene's structural characteristics.
We present a precise description of the mechanism underlying object segmentation in this network.
We then demonstrate a simple algorithm for object segmentation that generalizes across inputs ranging from simple geometric objects in grayscale images to natural images.
arXiv Detail & Related papers (2023-11-28T16:46:44Z)
- Emergence of Grid-like Representations by Training Recurrent Networks with Conformal Normalization [48.99772993899573]
We study the emergence of hexagon grid patterns of grid cells based on a general recurrent neural network model.
We propose a simple yet general conformal normalization of the input velocity of the RNN.
We conduct extensive experiments to verify that conformal normalization is crucial for the emergence of hexagon grid patterns.
arXiv Detail & Related papers (2023-10-29T23:12:56Z)
- Semisupervised regression in latent structure networks on unknown manifolds [7.5722195869569]
We consider random dot product graphs, in which an edge is formed between two nodes with probability given by the inner product of their respective latent positions.
We propose a manifold learning and graph embedding technique to predict the response variable on out-of-sample nodes.
arXiv Detail & Related papers (2023-05-04T00:41:04Z)
- LieTransformer: Equivariant self-attention for Lie Groups [49.9625160479096]
Group equivariant neural networks are used as building blocks of group invariant neural networks.
We extend the scope of the literature to self-attention, which is emerging as a prominent building block of deep learning models.
We propose the LieTransformer, an architecture composed of LieSelfAttention layers that are equivariant to arbitrary Lie groups and their discrete subgroups.
arXiv Detail & Related papers (2020-12-20T11:02:49Z)
- On Path Integration of Grid Cells: Group Representation and Isotropic Scaling [135.0473739504851]
We conduct theoretical analysis of a general representation model of path integration by grid cells.
We learn hexagon grid patterns that share similar properties with the grid cells in the rodent brain.
The learned model is capable of accurate long distance path integration.
arXiv Detail & Related papers (2020-06-18T03:44:35Z)
- Grid Cells Are Ubiquitous in Neural Networks [0.0]
Grid cells are believed to play an important role in both spatial and non-spatial cognition tasks.
A recent study observed the emergence of grid cells in an LSTM trained for path integration.
arXiv Detail & Related papers (2020-03-07T01:40:56Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.