On Path Integration of Grid Cells: Group Representation and Isotropic
Scaling
- URL: http://arxiv.org/abs/2006.10259v6
- Date: Wed, 3 Nov 2021 07:33:35 GMT
- Title: On Path Integration of Grid Cells: Group Representation and Isotropic
Scaling
- Authors: Ruiqi Gao, Jianwen Xie, Xue-Xin Wei, Song-Chun Zhu, Ying Nian Wu
- Abstract summary: We conduct a theoretical analysis of a general representation model of path integration by grid cells.
We learn hexagon grid patterns that share properties with the grid cells in the rodent brain.
The learned model is capable of accurate long-distance path integration.
- Score: 135.0473739504851
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Understanding how grid cells perform path integration calculations remains a
fundamental problem. In this paper, we conduct a theoretical analysis of a
general representation model of path integration by grid cells, where the 2D
self-position is encoded as a higher dimensional vector, and the 2D self-motion
is represented by a general transformation of the vector. We identify two
conditions on the transformation. One is a group representation condition that
is necessary for path integration. The other is an isotropic scaling condition
that ensures locally conformal embedding, so that the error in the vector
representation translates conformally to the error in the 2D self-position.
Then we investigate the simplest transformation, i.e., the linear
transformation, uncover its explicit algebraic and geometric structure as
a matrix Lie group of rotations, and explore the connection between the isotropic
scaling condition and a special class of hexagon grid patterns. Finally, with
our optimization-based approach, we manage to learn hexagon grid patterns that
share properties with the grid cells in the rodent brain. The learned
model is capable of accurate long-distance path integration. Code is available
at https://github.com/ruiqigao/grid-cell-path.
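The linear-transformation case described in the abstract can be sketched numerically. The following is an illustrative toy (not the authors' released code, whose names and structure differ): each 2D block of the high-dimensional encoding vector rotates by an angle proportional to the projection of the 2D displacement onto a preferred direction, so the transformations compose as a group and path integration is exact; three preferred directions 60 degrees apart correspond to the special class linked to hexagon grid patterns.

```python
import numpy as np

def rotation(angle):
    """2x2 rotation matrix."""
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s], [s, c]])

class LinearPathIntegrator:
    """Toy block-rotation model: v(x + dx) = M(dx) v(x)."""

    def __init__(self, freqs, thetas):
        # freqs[k]: spatial frequency of block k (hypothetical parameter)
        # thetas[k]: preferred direction of block k
        self.freqs = np.asarray(freqs)
        self.dirs = np.stack([np.cos(thetas), np.sin(thetas)], axis=1)

    def encode(self, x):
        # v(x): one unit 2D phase vector per block
        phases = self.freqs * (self.dirs @ x)
        return np.stack([np.cos(phases), np.sin(phases)], axis=1)

    def transform(self, v, dx):
        # Path integration: rotate each block by its phase increment,
        # which depends linearly on the displacement dx (group action)
        dphi = self.freqs * (self.dirs @ dx)
        return np.stack([rotation(a) @ b for a, b in zip(dphi, v)])

thetas = np.deg2rad([0.0, 60.0, 120.0])   # hexagonal direction set
model = LinearPathIntegrator([1.0, 1.0, 1.0], thetas)

x = np.array([0.3, -0.2])
dx = np.array([0.05, 0.1])
v_step = model.transform(model.encode(x), dx)  # integrate the motion
v_true = model.encode(x + dx)                  # encode the end point
assert np.allclose(v_step, v_true)             # rotation is an exact group action
```

Because each block's rotation angle is linear in `dx`, transforming and then encoding agree exactly, which is the group representation condition in miniature.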
Related papers
- Emergence of Grid-like Representations by Training Recurrent Networks
with Conformal Normalization [48.99772993899573]
We study the emergence of hexagon grid patterns of grid cells based on a general recurrent neural network model.
We propose a simple yet general conformal normalization of the input velocity of the RNN.
We conduct extensive experiments to verify that conformal normalization is crucial for the emergence of hexagon grid patterns.
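The isotropy this entry targets can be illustrated in a small numerical sketch (details of the paper's normalization differ; all names here are hypothetical): a local embedding of 2D motion into neural space, h' = h + G @ dx, is conformal when G^T G is a multiple of the identity, so equal physical displacements in any direction produce equal-magnitude neural displacements.

```python
import numpy as np

def conformal_error(G):
    # Deviation of G^T G from an isotropic (scaled-identity) matrix;
    # zero means the embedding of 2D velocity is locally conformal
    M = G.T @ G
    scale = np.trace(M) / 2.0
    return np.linalg.norm(M - scale * np.eye(2))

rng = np.random.default_rng(0)
G = rng.normal(size=(8, 2))          # generic (non-conformal) embedding
U, _, Vt = np.linalg.svd(G, full_matrices=False)
G_conf = U @ Vt                      # nearest matrix with G^T G = I

assert conformal_error(G_conf) < 1e-10          # isotropic after normalization
assert conformal_error(G) > conformal_error(G_conf)
```

Normalizing the velocity pathway so that this error vanishes at every state is one way to read the conformal constraint the summary describes.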
arXiv Detail & Related papers (2023-10-29T23:12:56Z)
- Geometric Clifford Algebra Networks [53.456211342585824]
We propose Geometric Clifford Algebra Networks (GCANs) for modeling dynamical systems.
GCANs are based on symmetry group transformations using geometric (Clifford) algebras.
arXiv Detail & Related papers (2023-02-13T18:48:33Z)
- Conformal Isometry of Lie Group Representation in Recurrent Network of Grid Cells [52.425628028229156]
We study the properties of grid cells using recurrent network models.
We focus on a simple non-linear recurrent model that underlies the continuous attractor neural networks of grid cells.
arXiv Detail & Related papers (2022-10-06T05:26:49Z)
- Hybrid Model-based / Data-driven Graph Transform for Image Coding [54.31406300524195]
We present a hybrid model-based / data-driven approach to encode an intra-prediction residual block.
The first $K$ eigenvectors of a transform matrix are derived from a statistical model, e.g., the asymmetric discrete sine transform (ADST) for stability.
Using WebP as a baseline image codec, experimental results show that our hybrid graph transform achieved better energy compaction than the default discrete cosine transform (DCT) and better stability than the KLT.
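The model/data split this summary describes can be sketched as follows (a hypothetical illustration, not the paper's implementation): the first K basis vectors come from a fixed statistical model (a DST-VII-style basis standing in for the ADST), and the remaining vectors are learned as principal components of training residuals restricted to the orthogonal complement.

```python
import numpy as np

def adst_basis(N):
    # DST-VII-style basis; columns orthonormalized via QR for safety
    n, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    B = np.sin(np.pi * (n + 1) * (2 * k + 1) / (2 * N + 1))
    Q, _ = np.linalg.qr(B)
    return Q

def hybrid_transform(residuals, K):
    """First K vectors model-based, the rest data-driven (PCA on the
    orthogonal complement), combined into one orthonormal transform."""
    N = residuals.shape[1]
    model_part = adst_basis(N)[:, :K]                  # model-based part
    P = np.eye(N) - model_part @ model_part.T          # complement projector
    cov = P @ np.cov(residuals.T) @ P                  # restricted covariance
    w, V = np.linalg.eigh(cov)
    data_part = V[:, np.argsort(w)[::-1][: N - K]]     # top data-driven vectors
    return np.hstack([model_part, data_part])          # N x N transform

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 8)) @ rng.normal(size=(8, 8))  # toy residual blocks
T = hybrid_transform(X, K=3)
assert np.allclose(T.T @ T, np.eye(8), atol=1e-8)        # orthonormal basis
```

Fixing the first K vectors gives stability under sparse training data, while the data-driven remainder recovers compaction, which is the trade-off the entry claims.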
arXiv Detail & Related papers (2022-03-02T15:36:44Z)
- Frame Averaging for Equivariant Shape Space Learning [85.42901997467754]
A natural way to incorporate symmetries in shape space learning is to ask that the mapping to the shape space (encoder) and mapping from the shape space (decoder) are equivariant to the relevant symmetries.
We present a framework for incorporating equivariance in encoders and decoders by introducing two contributions.
arXiv Detail & Related papers (2021-12-03T06:41:19Z)
- 5* Knowledge Graph Embeddings with Projective Transformations [13.723120574076127]
We present a novel knowledge graph embedding model (5*E) in projective geometry.
It supports multiple simultaneous transformations - specifically inversion, reflection, translation, rotation, and homothety.
It outperforms existing approaches on the most widely used link prediction benchmarks.
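How one family of projective maps covers all five transformations can be illustrated with a toy sketch (purely illustrative; 5*E's actual parameterization differs): a Moebius transformation z -> (a z + b) / (c z + d) on a complex embedding recovers each named transformation for special choices of a, b, c, d.

```python
import numpy as np

def moebius(z, a, b, c, d):
    # Projective (Moebius) transformation of a complex embedding z
    return (a * z + b) / (c * z + d)

z = 1.0 + 2.0j
theta = np.pi / 4
assert np.isclose(moebius(z, 1, 3 - 1j, 0, 1), z + (3 - 1j))  # translation
assert np.isclose(moebius(z, np.exp(1j * theta), 0, 0, 1),
                  z * np.exp(1j * theta))                      # rotation
assert np.isclose(moebius(z, 2, 0, 0, 1), 2 * z)               # homothety
assert np.isclose(moebius(z, 0, 1, 1, 0), 1 / z)               # inversion
```

A single relation thus only needs four complex parameters to express any mix of these geometric operations.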
arXiv Detail & Related papers (2020-06-08T23:28:07Z)
- The general theory of permutation equivarant neural networks and higher order graph variational encoders [6.117371161379209]
We derive formulae for general permutation equivariant layers, including the case where the layer acts on matrices by permuting their rows and columns simultaneously.
This case arises naturally in graph learning and relation learning applications.
We present a second order graph variational encoder, and show that the latent distribution of equivariant generative models must be exchangeable.
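The matrix-acting case in this summary can be sketched concretely (a minimal toy using a small subset of the full equivariant basis; the paper derives the general formulae): a linear layer built from the input matrix, its transpose, its row/column means, and its overall mean satisfies f(P A P^T) = P f(A) P^T for every permutation matrix P.

```python
import numpy as np

def equivariant_layer(A, w):
    # Second-order permutation equivariant linear map with 5 scalar
    # weights; permuting rows and columns of A simultaneously permutes
    # the output in exactly the same way.
    n = A.shape[0]
    ones = np.ones((n, n))
    row = A.mean(axis=1, keepdims=True) * np.ones((1, n))  # row means, broadcast
    col = A.mean(axis=0, keepdims=True) * np.ones((n, 1))  # col means, broadcast
    return (w[0] * A + w[1] * A.T + w[2] * row + w[3] * col
            + w[4] * A.mean() * ones)

rng = np.random.default_rng(2)
n = 5
w = rng.normal(size=5)
A = rng.normal(size=(n, n))
P = np.eye(n)[rng.permutation(n)]        # random permutation matrix

# Equivariance check: permute-then-apply equals apply-then-permute
assert np.allclose(equivariant_layer(P @ A @ P.T, w),
                   P @ equivariant_layer(A, w) @ P.T)
```

Each of the five terms is individually equivariant, so any weighted combination is too; the full theory enumerates all such basis operations.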
arXiv Detail & Related papers (2020-04-08T13:29:56Z)
- Bipartite Link Prediction based on Topological Features via 2-hop Path [0.8223798883838329]
The Linear Graph Autoencoder (LGAE) shows promising performance on challenging tasks such as link prediction and node clustering.
In this paper, we consider the case of bipartite link predictions where node attributes are unavailable.
Our approach consistently outperforms the Graph Autoencoder and Linear Graph Autoencoder models on 10 out of 12 bipartite datasets and reaches competitive performance on the other 2.
arXiv Detail & Related papers (2020-03-19T05:07:54Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.