How the (Tensor-) Brain uses Embeddings and Embodiment to Encode Senses and Decode Symbols
- URL: http://arxiv.org/abs/2409.12846v1
- Date: Thu, 19 Sep 2024 15:45:38 GMT
- Title: How the (Tensor-) Brain uses Embeddings and Embodiment to Encode Senses and Decode Symbols
- Authors: Volker Tresp, Hang Li
- Abstract summary: We provide an overview of the tensor brain model, including recent developments.
The representation layer is a model for the subsymbolic global workspace from consciousness research.
The index layer contains symbols for concepts, time instances, and predicates.
- Score: 26.516135696182392
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: The tensor brain has been introduced as a computational model for perception and memory. We provide an overview of the tensor brain model, including recent developments. The tensor brain has two major layers: the representation layer and the index layer. The representation layer is a model for the subsymbolic global workspace from consciousness research. The state of the representation layer is the cognitive brain state. The index layer contains symbols for concepts, time instances, and predicates. In a bottom-up operation, the cognitive brain state is encoded by the index layer as symbolic labels. In a top-down operation, symbols are decoded and written to the representation layer; this feeds back to earlier processing layers as embodiment. The top-down operation became the basis for semantic memory. The embedding vector of a concept forms the connection weights between its index and the representation layer. The embedding is the signature or "DNA" of a concept, which is decoded by the brain when its index is activated. It integrates all that is known about a concept from different experiences, modalities, and symbolic decodings. Although the tensor brain is a computational model, it has been suggested that it might be related to the actual operation of the brain. The sequential nature of symbol generation might have been a prerequisite for the emergence of natural language. We describe an attention mechanism and discuss multitasking by multiplexing. We emphasize the inherent multimodality of the tensor brain. Finally, we discuss embedded and symbolic reasoning.
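The two operations described in the abstract (bottom-up encoding of the cognitive brain state into symbolic labels, and top-down decoding of a symbol back into the representation layer) can be made concrete with a small sketch. The following is a minimal NumPy illustration, not the authors' implementation: the layer dimensions, the softmax temperature beta, and the blending factor gamma are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes: d-dimensional representation layer, n symbols in
# the index layer. Both values are illustrative only.
d, n = 64, 10
symbols = [f"concept_{i}" for i in range(n)]

# Embedding matrix A: row k holds the embedding ("DNA") of symbol k.
# These weights connect the index layer to the representation layer.
A = rng.normal(scale=1.0 / np.sqrt(d), size=(n, d))

def encode_bottom_up(r, beta=1.0):
    """Bottom-up: score each index against the cognitive brain state r
    and return an attention-like softmax distribution over symbols."""
    scores = beta * (A @ r)
    p = np.exp(scores - scores.max())
    return p / p.sum()

def decode_top_down(r, k, gamma=0.5):
    """Top-down: activating index k writes its embedding back into the
    representation layer (embodiment); gamma controls the blend."""
    r_new = (1.0 - gamma) * r + gamma * A[k]
    return r_new / np.linalg.norm(r_new)

# A noisy brain state near the embedding of concept_3:
r = A[3] + 0.3 * rng.normal(size=d)
r /= np.linalg.norm(r)

p = encode_bottom_up(r, beta=8.0)
k = int(np.argmax(p))              # symbolic label generated bottom-up
r = decode_top_down(r, k)          # its embedding re-embodied top-down
print(symbols[k], float(p[k]))
```

Note that the same matrix A is used in both directions: its rows score symbols bottom-up and write content back top-down, which is the sense in which an embedding acts as the signature of a concept.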
Related papers
- Connectivity-Inspired Network for Context-Aware Recognition [1.049712834719005]
We focus on the effect of incorporating circuit motifs found in biological brains to address visual recognition.
Our convolutional architecture is inspired by the connectivity of human cortical and subcortical streams.
We present a new plug-and-play module to model context awareness.
arXiv Detail & Related papers (2024-09-06T15:42:10Z)
- Discrete, compositional, and symbolic representations through attractor dynamics [51.20712945239422]
We introduce a novel neural systems model that integrates attractor dynamics with symbolic representations to model cognitive processes akin to the probabilistic language of thought (PLoT).
Our model segments the continuous representational space into discrete basins, with attractor states corresponding to symbolic sequences that reflect the semanticity and compositionality characteristic of symbolic systems, learned in an unsupervised manner rather than from pre-defined primitives.
This approach establishes a unified framework that integrates both symbolic and sub-symbolic processing through neural dynamics, a neuroplausible substrate with proven expressivity in AI, offering a more comprehensive model that mirrors the complex duality of cognitive operations.
arXiv Detail & Related papers (2023-10-03T05:40:56Z)
- LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic parser that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
arXiv Detail & Related papers (2023-09-24T05:43:19Z)
- Language Knowledge-Assisted Representation Learning for Skeleton-Based Action Recognition [71.35205097460124]
How humans understand and recognize the actions of others is a complex neuroscientific problem.
LA-GCN is a graph convolutional network that uses knowledge assistance from large-scale language models (LLMs).
arXiv Detail & Related papers (2023-05-21T08:29:16Z)
- Semantic Brain Decoding: from fMRI to conceptually similar image reconstruction of visual stimuli [0.29005223064604074]
We propose a novel approach to brain decoding that also relies on semantic and contextual similarity.
We employ an fMRI dataset of natural image vision and create a deep learning decoding pipeline inspired by the existence of both bottom-up and top-down processes in human vision.
We produce reconstructions of visual stimuli that match the original content very well on a semantic level, surpassing the state of the art in previous literature.
arXiv Detail & Related papers (2022-12-13T16:54:08Z)
- The Tensor Brain: A Unified Theory of Perception, Memory and Semantic Decoding [16.37225919719441]
We present a unified computational theory of perception and memory.
In our model, perception, episodic memory, and semantic memory are realized by different functional and operational modes.
arXiv Detail & Related papers (2021-09-27T23:32:44Z)
- Hierarchical Associative Memory [2.66512000865131]
Associative Memories or Modern Hopfield Networks have many appealing properties.
They can do pattern completion, store a large number of memories, and can be described using a recurrent neural network.
This paper tackles this gap and describes a fully recurrent model of associative memory with an arbitrarily large number of layers (a minimal single-layer pattern-completion sketch appears after this list).
arXiv Detail & Related papers (2021-07-14T01:38:40Z)
- pix2rule: End-to-end Neuro-symbolic Rule Learning [84.76439511271711]
This paper presents a complete neuro-symbolic method for processing images into objects, learning relations and logical rules.
The main contribution is a differentiable layer in a deep learning architecture from which symbolic relations and rules can be extracted.
We demonstrate that our model scales beyond state-of-the-art symbolic learners and outperforms deep relational neural network architectures.
arXiv Detail & Related papers (2021-06-14T15:19:06Z)
- Emotional EEG Classification using Connectivity Features and Convolutional Neural Networks [81.74442855155843]
We introduce a new classification system that utilizes brain connectivity with a CNN and validate its effectiveness via emotional video classification.
The level of concentration of the brain connectivity related to the emotional property of the target video is correlated with classification performance.
arXiv Detail & Related papers (2021-01-18T13:28:08Z)
- Towards a Neural Model for Serial Order in Frontal Cortex: a Brain Theory from Memory Development to Higher-Level Cognition [53.816853325427424]
We propose that the immature prefrontal cortex (PFC) uses its primary functionality of detecting hierarchical patterns in temporal signals.
Our hypothesis is that the PFC detects the hierarchical structure in temporal sequences in the form of ordinal patterns and uses them to index information hierarchically in different parts of the brain.
By doing so, it provides the language-ready brain with the tools for manipulating abstract knowledge and planning temporally ordered information.
arXiv Detail & Related papers (2020-05-22T14:29:51Z)
- The Tensor Brain: Semantic Decoding for Perception and Memory [25.49830575143093]
We analyse perception and memory using mathematical models for knowledge graphs and tensors.
We argue that a biological realization of perception and memory imposes constraints on information processing.
In particular, we propose that explicit perception and declarative memories require a semantic decoder.
arXiv Detail & Related papers (2020-01-29T07:48:01Z)
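As a companion to the Hierarchical Associative Memory entry above, here is a minimal sketch of pattern completion in a classical single-layer Hopfield network. This is the textbook model, not the paper's hierarchical architecture, and the pattern sizes and corruption level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Store a few random bipolar patterns with the classical Hebbian rule.
n, m = 100, 3
patterns = rng.choice([-1.0, 1.0], size=(m, n))
W = (patterns.T @ patterns) / n
np.fill_diagonal(W, 0.0)  # no self-connections

def complete(x, steps=10):
    """Recurrent update until the state settles into an attractor."""
    for _ in range(steps):
        x = np.sign(W @ x)
        x[x == 0] = 1.0
    return x

# Corrupt 20% of the first pattern, then recover it.
probe = patterns[0].copy()
flip = rng.choice(n, size=20, replace=False)
probe[flip] *= -1.0
recovered = complete(probe)
print(np.array_equal(recovered, patterns[0]))  # True in most runs
```

With only a few stored patterns relative to the number of units, the corrupted probe lies in the basin of attraction of the stored pattern, so the recurrent update recovers it; the paper above extends this kind of associative dynamics to arbitrarily many layers.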