Emergent Symbols through Binding in External Memory
- URL: http://arxiv.org/abs/2012.14601v2
- Date: Wed, 10 Mar 2021 01:13:38 GMT
- Title: Emergent Symbols through Binding in External Memory
- Authors: Taylor W. Webb, Ishan Sinha, Jonathan D. Cohen
- Abstract summary: We introduce the Emergent Symbol Binding Network (ESBN), a recurrent network augmented with an external memory.
This binding mechanism allows symbol-like representations to emerge through the learning process without the need to explicitly incorporate symbol-processing machinery.
Across a series of tasks, we show that this architecture displays nearly perfect generalization of learned rules to novel entities.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: A key aspect of human intelligence is the ability to infer abstract rules
directly from high-dimensional sensory data, and to do so given only a limited
amount of training experience. Deep neural network algorithms have proven to be
a powerful tool for learning directly from high-dimensional data, but currently
lack this capacity for data-efficient induction of abstract rules, leading some
to argue that symbol-processing mechanisms will be necessary to account for
this capacity. In this work, we take a step toward bridging this gap by
introducing the Emergent Symbol Binding Network (ESBN), a recurrent network
augmented with an external memory that enables a form of variable-binding and
indirection. This binding mechanism allows symbol-like representations to
emerge through the learning process without the need to explicitly incorporate
symbol-processing machinery, enabling the ESBN to learn rules in a manner that
is abstracted away from the particular entities to which those rules apply.
Across a series of tasks, we show that this architecture displays nearly
perfect generalization of learned rules to novel entities given only a limited
number of training examples, and outperforms a number of other competitive
neural network architectures.
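To make the binding mechanism concrete, here is a minimal PyTorch sketch of the kind of key-value external memory the abstract describes. It is an illustrative reading, not the authors' implementation: the module name, dimensions, and the omitted perceptual encoder and output head are all assumptions.

```python
import torch
import torch.nn as nn

class ESBNMemorySketch(nn.Module):
    """Illustrative ESBN-style binding (hypothetical, not the authors' code).
    The controller LSTM never sees perceptual embeddings directly; it only
    exchanges keys with a memory that binds keys to embeddings, giving a
    form of variable-binding and indirection."""

    def __init__(self, z_dim=128, key_dim=64, hidden_dim=256):
        super().__init__()
        self.controller = nn.LSTMCell(key_dim, hidden_dim)
        self.write_key = nn.Linear(hidden_dim, key_dim)
        self.key_dim, self.hidden_dim = key_dim, hidden_dim

    def forward(self, z_seq):
        # z_seq: (T, B, z_dim) embeddings from a separate perceptual encoder
        T, B, _ = z_seq.shape
        h = z_seq.new_zeros(B, self.hidden_dim)
        c = z_seq.new_zeros(B, self.hidden_dim)
        k_read = z_seq.new_zeros(B, self.key_dim)
        keys, values = [], []  # external memory of bound (key, value) pairs
        for t in range(T):
            z_t = z_seq[t]
            if values:
                V = torch.stack(values, dim=1)             # (B, t, z_dim)
                K = torch.stack(keys, dim=1)               # (B, t, key_dim)
                # match the current embedding against stored values ...
                w = torch.softmax((V * z_t.unsqueeze(1)).sum(-1), dim=1)
                # ... and retrieve the keys bound to the best matches
                k_read = (w.unsqueeze(-1) * K).sum(dim=1)
            h, c = self.controller(k_read, (h, c))
            keys.append(self.write_key(h))  # bind a fresh key ...
            values.append(z_t)              # ... to the current embedding
        return h  # a task head (not shown) would decode the answer from h
```

Because the controller's learned dynamics operate over keys while entities appear only in the value slots, a rule learned on one set of entities can be applied unchanged to novel ones, which matches the generalization behavior the abstract reports.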
Related papers
- Towards Scalable and Versatile Weight Space Learning (arXiv 2024-06-14)
This paper introduces the SANE approach to weight-space learning.
Our method extends the idea of hyper-representations towards sequential processing of subsets of neural network weights.
- LARS-VSA: A Vector Symbolic Architecture For Learning with Abstract Rules (arXiv 2024-05-23)
We propose a "relational bottleneck" that separates object-level features from abstract rules, allowing learning from limited amounts of data.
We adapt the "relational bottleneck" strategy to a high-dimensional space, incorporating explicit vector binding operations between symbols and relational representations.
Our system benefits from the low overhead of operations in hyperdimensional space, making it significantly more efficient than the state of the art on a variety of test datasets (see the binding sketch after this entry).
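The "explicit vector binding operations" mentioned above presumably correspond to standard Vector Symbolic Architecture binding. A minimal sketch, assuming elementwise (Hadamard) binding of random bipolar hypervectors; the dimension and helper names are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10_000  # illustrative hyperdimensional size

def hypervector():
    return rng.choice([-1.0, 1.0], size=D)  # random bipolar code

def bind(a, b):
    return a * b  # elementwise product: O(D), self-inverse for bipolar codes

def similarity(a, b):
    return float(a @ b) / D  # ~1 for identical codes, ~0 for unrelated ones

# bind an abstract role (symbol) to an object-level filler
role, filler = hypervector(), hypervector()
pair = bind(role, filler)
recovered = bind(pair, role)                    # unbind by rebinding the role
print(round(similarity(recovered, filler), 3))  # 1.0: exact for bipolar codes
print(round(similarity(pair, filler), 3))       # ~0.0: binding hides the filler
```

The low-overhead claim follows from bind and similarity each being a single O(D) vector operation.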
arXiv Detail & Related papers (2024-05-23T11:05:42Z) - The Role of Foundation Models in Neuro-Symbolic Learning and Reasoning [54.56905063752427]
Neuro-Symbolic AI (NeSy) holds promise to ensure the safe deployment of AI systems.
Existing pipelines that train the neural and symbolic components sequentially require extensive labelling.
A new architecture, NeSyGPT, fine-tunes a vision-language foundation model to extract symbolic features from raw data.
arXiv Detail & Related papers (2024-02-02T20:33:14Z) - Heterogenous Memory Augmented Neural Networks [84.29338268789684]
We introduce a novel heterogeneous memory augmentation approach for neural networks.
By introducing learnable memory tokens with an attention mechanism, we can effectively boost performance without significant computational overhead (a sketch of this mechanism follows this entry).
We demonstrate our approach on various image- and graph-based tasks under both in-distribution (ID) and out-of-distribution (OOD) conditions.
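As referenced above, here is a minimal PyTorch sketch of cross-attention over a bank of learnable memory tokens; the class name and hyperparameters are assumptions for illustration, not drawn from the paper:

```python
import torch
import torch.nn as nn

class MemoryTokenAttention(nn.Module):
    """Hypothetical sketch: augment features by cross-attending to a small
    bank of learnable memory tokens shared across all inputs."""

    def __init__(self, dim=256, num_tokens=16, num_heads=4):
        super().__init__()
        self.memory = nn.Parameter(torch.randn(num_tokens, dim) * 0.02)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x):
        # x: (B, N, dim) feature tokens (image patches, graph nodes, ...)
        mem = self.memory.unsqueeze(0).expand(x.size(0), -1, -1)
        out, _ = self.attn(query=x, key=mem, value=mem)
        return x + out  # residual: memory acts as a cheap global store

x = torch.randn(2, 49, 256)             # e.g. a 7x7 grid of image features
print(MemoryTokenAttention()(x).shape)  # torch.Size([2, 49, 256])
```

The extra cost is attention over only num_tokens memory slots, which is how such a scheme stays cheap relative to full self-attention over longer sequences.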
arXiv Detail & Related papers (2023-10-17T01:05:28Z) - Symbolic Synthesis of Neural Networks [0.0]
I present Graph-based Symbolically Synthesized Neural Networks (GSSNNs).
GSSNNs are a form of neural network whose topology and parameters are informed by the output of a symbolic program.
I demonstrate that by developing symbolic abstractions at a population level, I can elicit reliable patterns of improved generalization with small quantities of data known to contain local and discrete features.
arXiv Detail & Related papers (2023-03-06T18:13:14Z) - Symbolic Visual Reinforcement Learning: A Scalable Framework with
Object-Level Abstraction and Differentiable Expression Search [63.3745291252038]
We propose DiffSES, a novel symbolic learning approach that discovers discrete symbolic policies.
By using object-level abstractions instead of raw pixel-level inputs, DiffSES is able to leverage the simplicity and scalability advantages of symbolic expressions.
Our experiments demonstrate that DiffSES is able to generate symbolic policies that are simpler and more scalable than state-of-the-art symbolic RL methods.
arXiv Detail & Related papers (2022-12-30T17:50:54Z) - On Binding Objects to Symbols: Learning Physical Concepts to Understand
Real from Fake [155.6741526791004]
We revisit the classic signal-to-symbol barrier in light of the remarkable ability of deep neural networks to generate synthetic data.
We characterize physical objects as abstract concepts and use the previous analysis to show that physical objects can be encoded by finite architectures.
We conclude that binding physical entities to digital identities is possible in finite time with finite resources.
arXiv Detail & Related papers (2022-07-25T17:21:59Z) - Neuro-Symbolic Learning of Answer Set Programs from Raw Data [54.56905063752427]
Neuro-Symbolic AI aims to combine interpretability of symbolic techniques with the ability of deep learning to learn from raw data.
We introduce Neuro-Symbolic Inductive Learner (NSIL), an approach that trains a general neural network to extract latent concepts from raw data.
NSIL learns expressive knowledge, solves computationally complex problems, and achieves state-of-the-art performance in terms of accuracy and data efficiency.
arXiv Detail & Related papers (2022-05-25T12:41:59Z) - A Memory-Augmented Neural Network Model of Abstract Rule Learning [2.3562267625320352]
We focus on neural networks' capacity for arbitrary role-filler binding.
We introduce the Emergent Symbol Binding Network (ESBN), a recurrent neural network model that learns to use an external memory as a binding mechanism.
This mechanism enables symbol-like variable representations to emerge through the ESBN's training process without the need for explicit symbol-processing machinery.
This list is automatically generated from the titles and abstracts of the papers on this site.