Systematic Abductive Reasoning via Diverse Relation Representations in Vector-symbolic Architecture
- URL: http://arxiv.org/abs/2501.11896v2
- Date: Wed, 22 Jan 2025 03:23:22 GMT
- Title: Systematic Abductive Reasoning via Diverse Relation Representations in Vector-symbolic Architecture
- Authors: Zhong-Hua Sun, Ru-Yuan Zhang, Zonglei Zhen, Da-Hui Wang, Yong-Jie Li, Xiaohong Wan, Hongzhi You
- Abstract summary: We propose a Systematic Abductive Reasoning model with diverse relation representations (Rel-SAR) in Vector-symbolic Architecture (VSA).
To derive representations with symbolic reasoning potential, we introduce not only various types of atomic vectors that represent numeric, periodic and logical semantics, but also the structured high-dimensional representation (SHDR).
For systematic reasoning, we propose novel numerical and logical relation functions and perform rule abduction and execution in a unified framework that integrates these relation representations.
- Score: 10.27696004820717
- License:
- Abstract: In abstract visual reasoning, monolithic deep learning models suffer from limited interpretability and generalization, while existing neuro-symbolic approaches fall short in capturing the diversity and systematicity of attributes and relation representations. To address these challenges, we propose a Systematic Abductive Reasoning model with diverse relation representations (Rel-SAR) in Vector-symbolic Architecture (VSA) to solve Raven's Progressive Matrices (RPM). To derive attribute representations with symbolic reasoning potential, we introduce not only various types of atomic vectors that represent numeric, periodic and logical semantics, but also the structured high-dimensional representation (SHDR) for the overall Grid component. For systematic reasoning, we propose novel numerical and logical relation functions and perform rule abduction and execution in a unified framework that integrates these relation representations. Experimental results demonstrate that Rel-SAR achieves significant improvement on RPM tasks and exhibits robust out-of-distribution generalization. Rel-SAR leverages the synergy between HD attribute representations and symbolic reasoning to achieve systematic abductive reasoning with both interpretable and computable semantics.
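As a rough, hedged illustration of the vector-symbolic primitives the abstract alludes to (atomic vectors with numeric, periodic and logical semantics, bound into a high-dimensional representation), the sketch below uses fractional power encoding for a numeric attribute, a phasor code with period 7 for a periodic attribute, and random bipolar role vectors. All names, dimensions and encoding choices here are illustrative assumptions, not the Rel-SAR implementation.

```python
import numpy as np

D = 1024                                   # hypervector dimensionality (illustrative)
rng = np.random.default_rng(0)

def random_bipolar():
    """Random bipolar atomic vector, e.g. for a logical/categorical symbol or a role."""
    return rng.choice([-1.0, 1.0], size=D)

def numeric_vec(value, base):
    """Fractional power encoding: similarity falls off smoothly with the value difference."""
    return base ** value

def periodic_vec(value, period, base):
    """Periodic encoding: value and value + period map to exactly the same vector."""
    return base ** (value % period)

num_base = np.exp(1j * rng.uniform(-np.pi, np.pi, D))          # unitary phasor base vector
per_base = np.exp(2j * np.pi * rng.integers(0, 7, D) / 7)      # base vector with period 7
color_role, size_role = random_bipolar(), random_bipolar()     # roles used for binding

# bind values to roles (element-wise product) and bundle them (sum) into one representation
obj = color_role * numeric_vec(3, num_base) + size_role * periodic_vec(9, 7, per_base)

# unbinding with the (self-inverse) role recovers a noisy copy of the bound value vector
approx = obj * color_role
similarity = np.abs(np.vdot(approx, numeric_vec(3, num_base))) / D
print(f"similarity to the true value vector: {similarity:.2f}")   # close to 1.0
```

Binding here is element-wise multiplication and bundling is addition, the standard VSA choices; unbinding with a self-inverse role vector yields a noisy copy of the stored value that can be compared against a codebook of atomic vectors.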
Related papers
- NeSyCoCo: A Neuro-Symbolic Concept Composer for Compositional Generalization [17.49136753589057]
NeSyCoCo is a neuro-symbolic framework that generates symbolic representations and maps them to differentiable neural computations.
Our framework achieves state-of-the-art results on the ReaSCAN and CLEVR-CoGenT compositional generalization benchmarks.
arXiv Detail & Related papers (2024-12-20T05:48:58Z)
- LARS-VSA: A Vector Symbolic Architecture For Learning with Abstract Rules [1.3049516752695616]
We propose a "relational bottleneck" that separates object-level features from abstract rules, allowing learning from limited amounts of data.
We adapt the "relational bottleneck" strategy to a high-dimensional space, incorporating explicit vector binding operations between symbols and relational representations.
Our system benefits from the low overhead of operations in hyperdimensional space, making it significantly more efficient than the state of the art when evaluated on a variety of test datasets.
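To make the "relational bottleneck" concrete, here is a minimal sketch under assumed names and shapes: the representation handed to the rule learner is built only from role-bound pairwise similarities between object hypervectors, so object-level features never reach it directly. This illustrates the general strategy, not the LARS-VSA architecture.

```python
import numpy as np

D = 2048
rng = np.random.default_rng(1)
proj = rng.standard_normal((D, 8))             # one shared random projection for all objects

def encode_object(features):
    """Map raw 8-dim object features to a bipolar hypervector (shared projection)."""
    return np.sign(proj @ features)

def relational_bottleneck(objects, role_vectors):
    """Bundle role-bound pairwise similarities; raw object features never pass through."""
    rel = np.zeros(D)
    for (i, j), role in role_vectors.items():
        sim = float(objects[i] @ objects[j]) / D   # abstract relation: normalized similarity
        rel += sim * role                          # bind the scalar relation to its role vector
    return rel

# toy usage: same/different relations between three objects
a, b = rng.standard_normal(8), rng.standard_normal(8)
objs = [encode_object(a), encode_object(a), encode_object(b)]
roles = {(0, 1): rng.choice([-1.0, 1.0], D), (0, 2): rng.choice([-1.0, 1.0], D)}
r = relational_bottleneck(objs, roles)
# reading out a role recovers the (noisy) similarity score for that pair: ~1.0 and ~0.0
print(round(float(r @ roles[(0, 1)]) / D, 2), round(float(r @ roles[(0, 2)]) / D, 2))
```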
arXiv Detail & Related papers (2024-05-23T11:05:42Z)
- Discovering Abstract Symbolic Relations by Learning Unitary Group Representations [7.303827428956944]
We investigate a principled approach to symbolic operation completion (SOC).
SOC poses a unique challenge in modeling abstract relationships between discrete symbols.
We demonstrate that SOC can be efficiently solved by a minimal model - a bilinear map - with a novel factorized architecture.
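The "bilinear map" can be pictured as follows: each discrete symbol gets an embedding, a third-order tensor maps two operand embeddings to a predicted result embedding, and candidate symbols are scored by similarity to that prediction. The shapes below are assumptions, and the parameters are random purely to show the computation; in the paper they would be learned from observed triples (with a factorized parameterization).

```python
import numpy as np

n_symbols, d = 8, 16
rng = np.random.default_rng(2)
E = rng.standard_normal((n_symbols, d)) / np.sqrt(d)   # one embedding per discrete symbol
W = rng.standard_normal((d, d, d)) / d                 # bilinear map R^d x R^d -> R^d

def complete(a, b):
    """Score every candidate c for the partially observed operation a o b = ?"""
    c_hat = np.einsum('i,ijk,j->k', E[a], W, E[b])     # bilinear combination of the operands
    return E @ c_hat                                   # similarity of prediction to each symbol

logits = complete(0, 3)
print("predicted completion:", int(np.argmax(logits)))
```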
arXiv Detail & Related papers (2024-02-26T20:18:43Z)
- Labeling Neural Representations with Inverse Recognition [25.867702786273586]
Inverse Recognition (INVERT) is a scalable approach for connecting learned representations with human-understandable concepts.
In contrast to prior work, INVERT is capable of handling diverse types of neurons, exhibits less computational complexity, and does not rely on the availability of segmentation masks.
We demonstrate the applicability of INVERT in various scenarios, including the identification of representations affected by spurious correlations.
arXiv Detail & Related papers (2023-11-22T18:55:25Z)
- Discrete, compositional, and symbolic representations through attractor dynamics [51.20712945239422]
We introduce a novel neural systems model that integrates attractor dynamics with symbolic representations to capture cognitive processes akin to the probabilistic language of thought (PLoT).
Our model segments the continuous representational space into discrete basins, with attractor states corresponding to symbolic sequences that reflect the semanticity and compositionality characteristic of symbolic systems, acquired through unsupervised learning rather than by relying on pre-defined primitives.
This approach establishes a unified framework that integrates both symbolic and sub-symbolic processing through neural dynamics, a neuroplausible substrate with proven expressivity in AI, offering a more comprehensive model that mirrors the complex duality of cognitive operations.
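A classical Hopfield network gives a minimal picture of the mechanism described above: stored bipolar patterns act as discrete "symbolic" attractor states, and any nearby continuous vector relaxes into one of their basins. This is only an analogy for the dynamics the summary describes, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(3)
d, n_symbols = 64, 4
symbols = rng.choice([-1.0, 1.0], size=(n_symbols, d))   # discrete "symbolic" attractor states
W = symbols.T @ symbols / d                               # Hebbian weights storing the patterns
np.fill_diagonal(W, 0.0)

def relax(x, steps=20):
    """Iterate the attractor dynamics until a continuous input settles into a basin."""
    for _ in range(steps):
        x = np.sign(W @ x)
    return x

noisy = symbols[2] + 0.8 * rng.standard_normal(d)         # continuous point near attractor 2
settled = relax(np.sign(noisy))
print("recovered symbol:", int(np.argmax(symbols @ settled)))   # should print 2
```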
arXiv Detail & Related papers (2023-10-03T05:40:56Z)
- LOGICSEG: Parsing Visual Semantics with Neural Logic Learning and Reasoning [73.98142349171552]
LOGICSEG is a holistic visual semantic parser that integrates neural inductive learning and logic reasoning with both rich data and symbolic knowledge.
During fuzzy logic-based continuous relaxation, logical formulae are grounded onto data and neural computational graphs, hence enabling logic-induced network training.
These designs together make LOGICSEG a general and compact neural-logic machine that is readily integrated into existing segmentation models.
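The fuzzy relaxation step can be sketched as follows: Boolean connectives are replaced by differentiable t-norms over predicted probabilities, so a symbolic rule becomes a penalty that can be added to the usual segmentation loss. The rule and the product t-norm below are illustrative assumptions, not LOGICSEG's actual formulae.

```python
# predicted class probabilities for one pixel from a (hypothetical) segmentation head
p = {"dog": 0.7, "animal": 0.6, "cat": 0.1}

# product t-norm / co-norm: differentiable stand-ins for AND, OR, NOT
t_and = lambda a, b: a * b
t_or = lambda a, b: a + b - a * b
t_not = lambda a: 1.0 - a

# ground the hierarchy rule "dog -> animal" (i.e. NOT dog OR animal) onto the predictions
rule_satisfaction = t_or(t_not(p["dog"]), p["animal"])
logic_loss = 1.0 - rule_satisfaction        # added to the standard segmentation loss
print(f"rule satisfaction {rule_satisfaction:.2f}, logic loss {logic_loss:.2f}")
```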
arXiv Detail & Related papers (2023-09-24T05:43:19Z)
- Modeling Hierarchical Reasoning Chains by Linking Discourse Units and Key Phrases for Reading Comprehension [80.99865844249106]
We propose a holistic graph network (HGN) which deals with context at both discourse level and word level, as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z)
- Neural-Symbolic Recursive Machine for Systematic Generalization [113.22455566135757]
We introduce the Neural-Symbolic Recursive Machine (NSR), whose core is a Grounded Symbol System (GSS).
NSR integrates neural perception, syntactic parsing, and semantic reasoning.
We evaluate NSR's efficacy across four challenging benchmarks designed to probe systematic generalization capabilities.
arXiv Detail & Related papers (2022-10-04T13:27:38Z)
- On Neural Architecture Inductive Biases for Relational Tasks [76.18938462270503]
We introduce a simple architecture based on similarity-distribution scores, which we name the Compositional Relational Network (CoRelNet).
We find that simple architectural choices can outperform existing models in out-of-distribution generalizations.
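One hedged reading of "similarity-distribution scores" is sketched below: compute all pairwise similarities between object encodings, normalize each row into a distribution, and let the task head operate only on that matrix. The cosine similarity and softmax normalization are assumptions made for illustration.

```python
import numpy as np

def similarity_distribution(objects):
    """objects: (n, d) array of encodings -> (n, n) row-normalized similarity matrix."""
    z = objects / np.linalg.norm(objects, axis=1, keepdims=True)
    sims = z @ z.T                                        # pairwise cosine similarities
    e = np.exp(sims - sims.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)               # softmax over each row

rng = np.random.default_rng(4)
objs = rng.standard_normal((3, 32))
objs[1] = objs[0] + 0.05 * rng.standard_normal(32)        # objects 0 and 1 are near-duplicates
R = similarity_distribution(objs)
# a downstream classifier sees only R, e.g. to answer "are any two objects the same?"
print(np.round(R, 2))
```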
arXiv Detail & Related papers (2022-06-09T16:24:01Z)
- Structural Landmarking and Interaction Modelling: on Resolution Dilemmas in Graph Classification [50.83222170524406]
We study the intrinsic difficulty in graph classification under the unified concept of "resolution dilemmas".
We propose "SLIM", an inductive neural network model for Structural Landmarking and Interaction Modelling.
arXiv Detail & Related papers (2020-06-29T01:01:42Z)
This list is automatically generated from the titles and abstracts of the papers on this site.