Developing hierarchical anticipations via neural network-based event
segmentation
- URL: http://arxiv.org/abs/2206.02042v1
- Date: Sat, 4 Jun 2022 18:54:31 GMT
- Title: Developing hierarchical anticipations via neural network-based event
segmentation
- Authors: Christian Gumbsch, Maurits Adam, Birgit Elsner, Georg Martius, Martin V. Butz
- Abstract summary: We model the development of hierarchical predictions via autonomously learned latent event codes.
We present a hierarchical recurrent neural network architecture, whose inductive learning biases foster the development of sparsely changing latent states.
A higher level network learns to predict the situations in which the latent states tend to change.
- Score: 14.059479351946386
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Humans can make predictions on various time scales and hierarchical levels.
Thereby, the learning of event encodings seems to play a crucial role. In this
work we model the development of hierarchical predictions via autonomously
learned latent event codes. We present a hierarchical recurrent neural network
architecture, whose inductive learning biases foster the development of
sparsely changing latent states that compress sensorimotor sequences. A higher
level network learns to predict the situations in which the latent states tend
to change. Using a simulated robotic manipulator, we demonstrate that the
system (i) learns latent states that accurately reflect the event structure of
the data, (ii) develops meaningful temporal abstract predictions on the higher
level, and (iii) generates goal-anticipatory behavior similar to gaze behavior
found in eye-tracking studies with infants. The architecture offers a step
towards autonomous, self-motivated learning of compressed hierarchical
encodings of gathered experiences and the exploitation of these encodings for
the generation of highly versatile, adaptive behavior.
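The core idea of sparsely changing latent event codes can be illustrated with a toy surprise-gated segmentation routine. This is a minimal sketch, not the paper's learned architecture: the "latent code" here is just a running mean over the current event, and a fixed surprise threshold stands in for the network's learned gating.

```python
def segment_events(stream, threshold=1.0):
    """Toy surprise-gated event segmentation.

    The latent event code is the running mean of observations within the
    current event. When the prediction error (surprise) of a new observation
    exceeds the threshold, an event boundary is declared and a new latent
    code is started; otherwise the code persists and is only refined,
    yielding a sparsely changing latent state.
    """
    boundaries = []
    latent = list(stream[0])  # initial latent code = first observation
    count = 1
    for t in range(1, len(stream)):
        obs = stream[t]
        # Euclidean distance between observation and current latent code
        surprise = sum((o - l) ** 2 for o, l in zip(obs, latent)) ** 0.5
        if surprise > threshold:
            boundaries.append(t)      # event boundary: switch latent code
            latent = list(obs)
            count = 1
        else:
            count += 1                # within event: refine latent code
            latent = [l + (o - l) / count for l, o in zip(latent, obs)]
    return boundaries

# Two piecewise-constant "events": boundary detected at the switch.
stream = [(0.0, 0.0)] * 10 + [(3.0, 3.0)] * 10
print(segment_events(stream))  # -> [10]
```

In the paper, both the latent codes and the gating are learned end-to-end, and a higher-level network additionally predicts when such boundaries are likely to occur.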
Related papers
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- A developmental approach for training deep belief networks [0.46699574490885926]
Deep belief networks (DBNs) are neural networks that can extract rich internal representations of the environment from the sensory data.
We present iDBN, an iterative learning algorithm for DBNs that jointly updates the connection weights across all layers of the hierarchy.
Our work paves the way to the use of iDBN for modeling neurocognitive development.
arXiv Detail & Related papers (2022-07-12T11:37:58Z)
- Data-driven emergence of convolutional structure in neural networks [83.4920717252233]
We show how fully-connected neural networks solving a discrimination task can learn a convolutional structure directly from their inputs.
By carefully designing data models, we show that the emergence of this pattern is triggered by the non-Gaussian, higher-order local structure of the inputs.
arXiv Detail & Related papers (2022-02-01T17:11:13Z)
- Differentiable Generalised Predictive Coding [2.868176771215219]
This paper deals with differentiable dynamical models congruent with neural process theories that cast brain function as the hierarchical refinement of an internal generative model explaining observations.
Our work extends existing implementations of gradient-based predictive coding and allows deep neural networks to be integrated for non-linear state parameterization.
arXiv Detail & Related papers (2021-12-02T22:02:56Z)
- Backprop-Free Reinforcement Learning with Active Neural Generative Coding [84.11376568625353]
We propose a computational framework for learning action-driven generative models without backpropagation of errors (backprop) in dynamic environments.
We develop an intelligent agent that operates even with sparse rewards, drawing inspiration from the cognitive theory of planning as inference.
The robust performance of our agent offers promising evidence that a backprop-free approach for neural inference and learning can drive goal-directed behavior.
arXiv Detail & Related papers (2021-07-10T19:02:27Z)
- Towards a Predictive Processing Implementation of the Common Model of Cognition [79.63867412771461]
We describe an implementation of the common model of cognition grounded in neural generative coding and holographic associative memory.
The proposed system creates the groundwork for developing agents that learn continually from diverse tasks as well as model human performance at larger scales.
arXiv Detail & Related papers (2021-05-15T22:55:23Z)
- Latent Event-Predictive Encodings through Counterfactual Regularization [0.9449650062296823]
We introduce a SUrprise-GAted Recurrent neural network (SUGAR) using a novel form of counterfactual regularization.
We test the model on a hierarchical sequence prediction task, where sequences are generated by alternating hidden graph structures.
arXiv Detail & Related papers (2021-05-12T18:30:09Z)
- Learning to Abstract and Predict Human Actions [60.85905430007731]
We model the hierarchical structure of human activities in videos and demonstrate the power of such structure in action prediction.
We propose Hierarchical-Refresher-Anticipator, a multi-level neural machine that can learn the structure of human activities by observing a partial hierarchy of events and roll-out such structure into a future prediction in multiple levels of abstraction.
arXiv Detail & Related papers (2020-08-20T23:57:58Z)
- Self-organization of multi-layer spiking neural networks [4.859525864236446]
A key mechanism that enables the formation of complex architecture in the developing brain is the emergence of traveling-temporal waves of neuronal activity.
We propose a modular tool-kit in the form of a dynamical system that can be seamlessly stacked to assemble multi-layer neural networks.
Our framework leads to the self-organization of a wide variety of architectures, ranging from multi-layer perceptrons to autoencoders.
arXiv Detail & Related papers (2020-06-12T01:44:48Z)
- Towards a Neural Model for Serial Order in Frontal Cortex: a Brain Theory from Memory Development to Higher-Level Cognition [53.816853325427424]
We propose that the immature prefrontal cortex (PFC) uses its primary functionality of detecting hierarchical patterns in temporal signals.
Our hypothesis is that the PFC detects the hierarchical structure in temporal sequences in the form of ordinal patterns and uses them to index information hierarchically in different parts of the brain.
By doing so, it gives the tools to the language-ready brain for manipulating abstract knowledge and planning temporally ordered information.
arXiv Detail & Related papers (2020-05-22T14:29:51Z)
- Fostering Event Compression using Gated Surprise [0.5801044612920815]
Generative, event-predictive models are formed by segmenting sensorimotor data into chunks of contextual experiences.
Here, we introduce a hierarchical, surprise-gated recurrent neural network architecture, which models this process.
Our model is shown to develop distinct event compressions and achieves the best performance on multiple event processing tasks.
arXiv Detail & Related papers (2020-05-12T11:57:46Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.