Grid Cells Are Ubiquitous in Neural Networks
- URL: http://arxiv.org/abs/2003.03482v2
- Date: Wed, 9 Sep 2020 09:38:15 GMT
- Title: Grid Cells Are Ubiquitous in Neural Networks
- Authors: Li Songlin, Deng Yangdong, Wang Zhihua
- Abstract summary: Grid cells are believed to play an important role in both spatial and non-spatial cognition tasks.
A recent study observed the emergence of grid cells in an LSTM trained for path integration.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Grid cells are believed to play an important role in both spatial and
non-spatial cognition tasks. A recent study observed the emergence of grid
cells in an LSTM for path integration. The connection between biological and
artificial neural networks underlying this seeming similarity, as well as the
application domain of grid cells in deep neural networks (DNNs), await further
exploration. This work demonstrates that grid cells can be replicated in
either purely vision-based or vision-guided path-integration DNNs for navigation
under a proper setting of training parameters. We also show that grid-like
behaviors arise in feedforward DNNs for non-spatial tasks. Our findings support
that the grid coding is an effective representation for both biological and
artificial networks.
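The path-integration setup the abstract describes can be illustrated with a small sketch: an agent's 2-D velocity sequence is integrated into position, and a recurrent network is asked to track that position from velocity alone. Everything here (shapes, names, the untrained vanilla-RNN stand-in for the LSTM) is illustrative, not the authors' implementation; a real experiment would train the recurrent weights and then inspect hidden-unit rate maps for grid-like firing.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic path-integration task: an agent moves with random 2-D
# velocities; the network must report its displacement from the origin.
T, D_in, D_hid = 50, 2, 16
velocities = rng.normal(scale=0.1, size=(T, D_in))  # (time, vx/vy)
positions = np.cumsum(velocities, axis=0)           # integration targets

# Minimal vanilla-RNN forward pass (a stand-in for the LSTM).
W_in = rng.normal(scale=0.1, size=(D_hid, D_in))
W_rec = rng.normal(scale=0.1, size=(D_hid, D_hid))
W_out = rng.normal(scale=0.1, size=(2, D_hid))

h = np.zeros(D_hid)
readout = []
for v in velocities:
    h = np.tanh(W_in @ v + W_rec @ h)  # hidden state must carry a position code
    readout.append(W_out @ h)          # decoded position estimate
readout = np.array(readout)
```

Training would minimize the squared error between `readout` and `positions`; grid-like tuning, when it appears, shows up in the learned hidden states, not in this random initialization.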
Related papers
- Coding schemes in neural networks learning classification tasks [52.22978725954347]
We investigate fully-connected, wide neural networks learning classification tasks.
We show that the networks acquire strong, data-dependent features.
Surprisingly, the nature of the internal representations depends crucially on the neuronal nonlinearity.
arXiv Detail & Related papers (2024-06-24T14:50:05Z)
- Fully Spiking Actor Network with Intra-layer Connections for Reinforcement Learning [51.386945803485084]
We focus on the task where the agent needs to learn multi-dimensional deterministic policies for control.
Most existing spike-based RL methods take the firing rate as the output of SNNs, and convert it to represent continuous action space (i.e., the deterministic policy) through a fully-connected layer.
To develop a fully spiking actor network without any floating-point matrix operations, we draw inspiration from the non-spiking interneurons found in insects.
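The rate decoding mentioned in the entry above — averaging spikes over time and mapping the firing rates to a continuous action through a fully-connected layer — can be sketched as follows; the shapes and the random readout weights are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

# Binary spike trains from an SNN's output layer: (timesteps, neurons).
T, N, A = 100, 32, 4
spikes = (rng.random((T, N)) < 0.2).astype(float)

# Firing rate = mean spike count per neuron over the simulation window.
rates = spikes.mean(axis=0)        # shape (N,), each entry in [0, 1]

# A fully-connected readout maps rates to a deterministic continuous action.
W = rng.normal(scale=0.1, size=(A, N))
action = W @ rates                 # shape (A,)
```

The paper's point is that this final floating-point matrix multiply is what a fully spiking actor network avoids.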
arXiv Detail & Related papers (2024-01-09T07:31:34Z)
- Self-Supervised Learning of Representations for Space Generates Multi-Modular Grid Cells [16.208253624969142]
The mammalian lineage has developed striking spatial representations.
One important spatial representation is the Nobel-prize-winning grid cell.
Grid cells represent self-location, a local and aperiodic quantity.
arXiv Detail & Related papers (2023-11-04T03:59:37Z)
- Inferring Gene Regulatory Neural Networks for Bacterial Decision Making in Biofilms [4.459301404374565]
Bacterial cells are sensitive to a range of external signals that they use to learn about their environment.
An inherited Gene Regulatory Neural Network (GRNN) behavior enables the cellular decision-making.
GRNNs can perform computational tasks for bio-hybrid computing systems.
arXiv Detail & Related papers (2023-01-10T22:07:33Z)
- Functional Connectome: Approximating Brain Networks with Artificial Neural Networks [1.952097552284465]
We show that trained deep neural networks are able to capture the computations performed by synthetic biological networks with high accuracy.
We show that trained deep neural networks are able to perform zero-shot generalisation in novel environments.
Our study reveals a novel and promising direction in systems neuroscience, and can be expanded upon with a multitude of downstream applications.
arXiv Detail & Related papers (2022-11-23T13:12:13Z)
- Conformal Isometry of Lie Group Representation in Recurrent Network of Grid Cells [52.425628028229156]
We study the properties of grid cells using recurrent network models.
We focus on a simple non-linear recurrent model that underlies the continuous attractor neural networks of grid cells.
arXiv Detail & Related papers (2022-10-06T05:26:49Z)
- Deep Neural Networks as Complex Networks [1.704936863091649]
We use Complex Network Theory to represent Deep Neural Networks (DNNs) as directed weighted graphs.
We introduce metrics to study DNNs as dynamical systems, with a granularity that spans from weights to layers, including neurons.
We show that our metrics discriminate low vs. high performing networks.
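The graph view described in this entry can be sketched for a toy MLP: each weight matrix defines directed weighted edges between consecutive layers of neurons. The "node strength" statistic below is one illustrative neuron-level metric, not necessarily one of the paper's own.

```python
import numpy as np

rng = np.random.default_rng(2)

# Weight matrices of a small MLP, viewed as a directed weighted graph:
# an edge i -> j between layers has weight W[j, i].
W1 = rng.normal(size=(8, 4))   # input (4 nodes) -> hidden (8 nodes)
W2 = rng.normal(size=(3, 8))   # hidden (8 nodes) -> output (3 nodes)

# Node strength: summed absolute weight of edges incident to a neuron,
# a simple graph metric at neuron granularity.
in_strength_hidden = np.abs(W1).sum(axis=1)   # edges arriving at each hidden node
out_strength_hidden = np.abs(W2).sum(axis=0)  # edges leaving each hidden node
node_strength = in_strength_hidden + out_strength_hidden
```

Metrics like this can then be compared between low- and high-performing trained networks, which is the kind of discrimination the entry reports.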
arXiv Detail & Related papers (2022-09-12T16:26:04Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Grid Cell Path Integration For Movement-Based Visual Object Recognition [0.0]
We show how grid cell-based path integration in a cortical network can support reliable recognition of objects given an arbitrary sequence of inputs.
Our network (GridCellNet) uses grid cell computations to integrate visual information and make predictions based on movements.
arXiv Detail & Related papers (2021-02-17T23:52:57Z)
- Exploiting Heterogeneity in Operational Neural Networks by Synaptic Plasticity [87.32169414230822]
The recently proposed Operational Neural Networks (ONNs) generalize conventional Convolutional Neural Networks (CNNs).
This study focuses on searching for the best-possible operator set(s) for the hidden neurons of the network, based on the Synaptic Plasticity paradigm that poses the essential learning theory in biological neurons.
Experimental results over highly challenging problems demonstrate that elite ONNs, even with few neurons and layers, can achieve superior learning performance compared to GIS-based ONNs.
arXiv Detail & Related papers (2020-08-21T19:03:23Z)
- Binarized Graph Neural Network [65.20589262811677]
We develop a binarized graph neural network to learn the binary representations of the nodes with binary network parameters.
Our proposed method can be seamlessly integrated into the existing GNN-based embedding approaches.
Experiments indicate that the proposed binarized graph neural network, namely BGN, is orders of magnitude more efficient in terms of both time and space.
arXiv Detail & Related papers (2020-04-19T09:43:14Z)
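As an illustration of the binarization idea in the last entry, the sketch below sign-thresholds real-valued node embeddings to ±1; the actual BGN method learns binary representations and parameters end-to-end rather than thresholding after the fact, and the shapes here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Real-valued node embeddings from some GNN layer: (nodes, dims).
H = rng.normal(size=(6, 8))

# Sign binarization: each entry becomes +1 or -1, so a node vector can
# be bit-packed instead of stored as 32-bit floats.
H_bin = np.where(H >= 0, 1.0, -1.0)

# Node-to-node similarity reduces to cheap dot products
# (equivalently, XOR/popcount after bit-packing).
sim = H_bin @ H_bin.T
```

The time and space savings the entry reports come from replacing float arithmetic with such bit-level operations.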
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.