Learning beyond sensations: how dreams organize neuronal representations
- URL: http://arxiv.org/abs/2308.01830v2
- Date: Tue, 5 Dec 2023 12:20:33 GMT
- Title: Learning beyond sensations: how dreams organize neuronal representations
- Authors: Nicolas Deperrois, Mihai A. Petrovici, Walter Senn, and Jakob Jordan
- Abstract summary: We discuss two complementary learning principles that organize representations through the generation of virtual experiences.
These principles are compatible with known cortical structure and dynamics and the phenomenology of sleep.
- Score: 1.749248408967819
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: Semantic representations in higher sensory cortices form the basis for
robust, yet flexible behavior. These representations are acquired over the
course of development in an unsupervised fashion and continuously maintained
over an organism's lifespan. Predictive learning theories propose that these
representations emerge from predicting or reconstructing sensory inputs.
However, brains are known to generate virtual experiences, such as during
imagination and dreaming, that go beyond previously experienced inputs. Here,
we suggest that virtual experiences may be just as relevant as actual sensory
inputs in shaping cortical representations. In particular, we discuss two
complementary learning principles that organize representations through the
generation of virtual experiences. First, "adversarial dreaming" proposes that
creative dreams support a cortical implementation of adversarial learning in
which feedback and feedforward pathways engage in a productive game of trying
to fool each other. Second, "contrastive dreaming" proposes that the invariance
of neuronal representations to irrelevant factors of variation is acquired by
trying to map similar virtual experiences together via a contrastive learning
process. These principles are compatible with known cortical structure and
dynamics and the phenomenology of sleep, thus providing promising directions to
explain cortical learning beyond the classical predictive learning paradigm.
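The "contrastive dreaming" principle described above can be illustrated with a minimal numpy sketch of a standard contrastive (InfoNCE-style) objective: two "virtual experiences" of the same underlying content are encoded and pulled together, while unrelated pairs are pushed apart. The linear encoder, weights, and noise-based variation here are illustrative stand-ins, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(0)

def encode(x, W):
    """Toy linear 'cortical' encoder followed by L2 normalization."""
    z = x @ W
    return z / np.linalg.norm(z, axis=-1, keepdims=True)

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE loss: pull paired representations together,
    push all other pairs in the batch apart."""
    logits = z1 @ z2.T / temperature             # (B, B) similarity matrix
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))          # positives on the diagonal

# Two "virtual experiences" of the same underlying content:
# the same base input under irrelevant variation (here, additive noise).
B, D, H = 8, 16, 4
base = rng.normal(size=(B, D))
view1 = base + 0.1 * rng.normal(size=(B, D))
view2 = base + 0.1 * rng.normal(size=(B, D))

W = rng.normal(size=(D, H))
loss = info_nce(encode(view1, W), encode(view2, W))
print(f"contrastive loss: {loss:.3f}")
```

Minimizing this loss with respect to the encoder weights would make the representation invariant to the injected noise, the "irrelevant factor of variation" in this toy setup.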
Related papers
- How does the primate brain combine generative and discriminative computations in vision? [4.691670689443386]
Two contrasting conceptions of the inference process have each been influential in research on biological vision and machine vision.
We show that vision inverts a generative model through an interrogation of the evidence in a process often thought to involve top-down predictions of sensory data.
We explain and clarify the terminology, review the key empirical evidence, and propose an empirical research program that transcends and sets the stage for revealing the mysterious hybrid algorithm of primate vision.
arXiv Detail & Related papers (2024-01-11T16:07:58Z)
- Contrastive-Signal-Dependent Plasticity: Self-Supervised Learning in Spiking Neural Circuits [61.94533459151743]
This work addresses the challenge of designing neurobiologically-motivated schemes for adjusting the synapses of spiking networks.
Our experimental simulations demonstrate a consistent advantage over other biologically-plausible approaches when training recurrent spiking networks.
arXiv Detail & Related papers (2023-03-30T02:40:28Z)
- Rejecting Cognitivism: Computational Phenomenology for Deep Learning [5.070542698701158]
We propose a non-representationalist framework for deep learning relying on a novel method: computational phenomenology.
We reject the modern cognitivist interpretation of deep learning, according to which artificial neural networks encode representations of external entities.
arXiv Detail & Related papers (2023-02-16T20:05:06Z)
- Consciousness is entailed by compositional learning of new causal structures in deep predictive processing systems [0.0]
In humans, such learning includes specific declarative memory formation and is closely associated with consciousness.
We extend predictive processing by adding online, single-example new structure learning via hierarchical binding of unpredicted inferences.
Our proposal naturally unifies the feature binding, recurrent processing, predictive processing, and global theories of consciousness.
arXiv Detail & Related papers (2023-01-17T17:06:48Z)
- A-ACT: Action Anticipation through Cycle Transformations [89.83027919085289]
We take a step back to analyze how the human capability to anticipate the future can be transferred to machine learning algorithms.
A recent study on human psychology explains that, in anticipating an occurrence, the human brain counts on both systems.
In this work, we study the impact of each system for the task of action anticipation and introduce a paradigm to integrate them in a learning framework.
arXiv Detail & Related papers (2022-04-02T21:50:45Z)
- The world seems different in a social context: a neural network analysis of human experimental data [57.729312306803955]
We show that it is possible to replicate human behavioral data in both individual and social task settings by modifying the precision of prior and sensory signals.
An analysis of the neural activation traces of the trained networks provides evidence that information is coded in fundamentally different ways in the network in the individual and in the social conditions.
arXiv Detail & Related papers (2022-03-03T17:19:12Z)
- How to build a cognitive map: insights from models of the hippocampal formation [0.45880283710344055]
The concept of a cognitive map has emerged as one of the leading metaphors for these capacities.
Unravelling the learning and neural representation of such a map has become a central focus of neuroscience.
arXiv Detail & Related papers (2022-02-03T16:49:37Z)
- Memory semantization through perturbed and adversarial dreaming [0.7874708385247353]
We propose that rapid-eye-movement (REM) dreaming is essential for efficient memory semantization.
We implement a cortical architecture with hierarchically organized feedforward and feedback pathways, inspired by generative adversarial networks (GANs).
Our results suggest that adversarial dreaming during REM sleep is essential for extracting memory contents, while dreaming during NREM sleep improves the robustness of the latent representation to noisy sensory inputs.
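The GAN-inspired "adversarial dreaming" game described in this entry can be sketched in a few lines of numpy: a feedback ("generative") pathway turns latent activity into a dreamed input, while a feedforward ("discriminative") pathway tries to tell dreamed inputs from real ones. The weights, shapes, and data here are hypothetical stand-ins used only to show the structure of the two objectives, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

D, B = 8, 32
real = rng.normal(loc=1.0, size=(B, D))    # stand-in for sensory input
G = rng.normal(scale=0.1, size=(D, D))     # feedback ("generative") weights
w = rng.normal(scale=0.1, size=D)          # feedforward (discriminator) readout

def dream(z):
    return z @ G                           # generate a virtual input from latent activity

def discriminate(x):
    return sigmoid(x @ w)                  # estimated probability the input is "real"

z = rng.normal(size=(B, D))
p_real, p_dream = discriminate(real), discriminate(dream(z))

# The adversarial game: the feedforward pathway minimizes d_loss
# (separating real from dreamed), while the feedback pathway
# minimizes g_loss (making dreams indistinguishable from reality).
d_loss = -np.mean(np.log(p_real + 1e-9) + np.log(1 - p_dream + 1e-9))
g_loss = -np.mean(np.log(p_dream + 1e-9))
print(f"discriminator loss: {d_loss:.3f}, generator loss: {g_loss:.3f}")
```

Alternating gradient steps on these two losses would constitute the "productive game of trying to fool each other" that the main abstract attributes to feedback and feedforward pathways.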
arXiv Detail & Related papers (2021-09-09T13:31:13Z)
- Constellation: Learning relational abstractions over objects for compositional imagination [64.99658940906917]
We introduce Constellation, a network that learns relational abstractions of static visual scenes.
This work is a first step in the explicit representation of visual relationships and using them for complex cognitive procedures.
arXiv Detail & Related papers (2021-07-23T11:59:40Z)
- Backprop-Free Reinforcement Learning with Active Neural Generative Coding [84.11376568625353]
We propose a computational framework for learning action-driven generative models without backpropagation of errors (backprop) in dynamic environments.
We develop an intelligent agent that operates even with sparse rewards, drawing inspiration from the cognitive theory of planning as inference.
The robust performance of our agent offers promising evidence that a backprop-free approach for neural inference and learning can drive goal-directed behavior.
arXiv Detail & Related papers (2021-07-10T19:02:27Z)
- Towards a Neural Model for Serial Order in Frontal Cortex: a Brain Theory from Memory Development to Higher-Level Cognition [53.816853325427424]
We propose that the immature prefrontal cortex (PFC) uses its primary functionality of detecting hierarchical patterns in temporal signals.
Our hypothesis is that the PFC detects the hierarchical structure in temporal sequences in the form of ordinal patterns and uses them to index information hierarchically in different parts of the brain.
By doing so, it gives the language-ready brain the tools for manipulating abstract knowledge and planning temporally ordered information.
arXiv Detail & Related papers (2020-05-22T14:29:51Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.