Semantic Construction Grammar: Bridging the NL / Logic Divide
- URL: http://arxiv.org/abs/2112.05256v1
- Date: Fri, 10 Dec 2021 00:02:40 GMT
- Title: Semantic Construction Grammar: Bridging the NL / Logic Divide
- Authors: Dave Schneider, Michael Witbrock
- Abstract summary: We discuss Semantic Construction Grammar (SCG), a system developed to facilitate translation between natural language and logical representations.
SCG is designed to support a variety of different methods of representation, ranging from those that are fairly close to the NL structure to those that are quite different from the NL structure.
- Score: 3.245535754791156
- License: http://creativecommons.org/licenses/by/4.0/
- Abstract: In this paper, we discuss Semantic Construction Grammar (SCG), a system
developed over the past several years to facilitate translation between natural
language and logical representations. Crucially, SCG is designed to support a
variety of different methods of representation, ranging from those that are
fairly close to the NL structure (e.g. so-called 'logical forms'), to those
that are quite different from the NL structure, with higher-order and
high-arity relations. Semantic constraints and checks on representations are
integral to the process of NL understanding with SCG, and are easily carried
out due to the SCG's integration with Cyc's Knowledge Base and inference
engine.
Related papers
- Compositional Program Generation for Few-Shot Systematic Generalization [59.57656559816271]
This study presents a neuro-symbolic architecture called the Compositional Program Generator (CPG).
CPG has three key features: modularity, composition, and abstraction, realized in the form of grammar rules.
It achieves perfect generalization on both the SCAN and COGS benchmarks using just 14 examples for SCAN and 22 examples for COGS.
arXiv Detail & Related papers (2023-09-28T14:33:20Z) - ChatRule: Mining Logical Rules with Large Language Models for Knowledge
Graph Reasoning [107.61997887260056]
We propose a novel framework, ChatRule, unleashing the power of large language models for mining logical rules over knowledge graphs.
Specifically, the framework is initiated with an LLM-based rule generator, leveraging both the semantic and structural information of KGs.
To refine the generated rules, a rule ranking module estimates the rule quality by incorporating facts from existing KGs.
arXiv Detail & Related papers (2023-09-04T11:38:02Z) - Modeling Hierarchical Reasoning Chains by Linking Discourse Units and
Key Phrases for Reading Comprehension [80.99865844249106]
We propose a holistic graph network (HGN) which deals with context at both discourse level and word level, as the basis for logical reasoning.
Specifically, node-level and type-level relations, which can be interpreted as bridges in the reasoning process, are modeled by a hierarchical interaction mechanism.
arXiv Detail & Related papers (2023-06-21T07:34:27Z) - Enhancing Language Representation with Constructional Information for
Natural Language Understanding [5.945710973349298]
We introduce construction grammar (CxG), which highlights the pairings of form and meaning.
We adopt usage-based construction grammar as the basis of our work.
A HyCxG framework is proposed to enhance language representation through a three-stage solution.
arXiv Detail & Related papers (2023-06-05T12:15:12Z) - Query Structure Modeling for Inductive Logical Reasoning Over Knowledge
Graphs [67.043747188954]
We propose a structure-modeled textual encoding framework for inductive logical reasoning over KGs.
It encodes linearized query structures and entities using pre-trained language models to find answers.
We conduct experiments on two inductive logical reasoning datasets and three transductive datasets.
arXiv Detail & Related papers (2023-05-23T01:25:29Z) - Variational Cross-Graph Reasoning and Adaptive Structured Semantics
Learning for Compositional Temporal Grounding [143.5927158318524]
Temporal grounding is the task of locating a specific segment from an untrimmed video according to a query sentence.
We introduce a new Compositional Temporal Grounding task and construct two new dataset splits.
We argue that the inherent structured semantics inside the videos and language is the crucial factor to achieve compositional generalization.
arXiv Detail & Related papers (2023-01-22T08:02:23Z) - The Better Your Syntax, the Better Your Semantics? Probing Pretrained
Language Models for the English Comparative Correlative [7.03497683558609]
Construction Grammar (CxG) is a paradigm from cognitive linguistics emphasising the connection between syntax and semantics.
We present an investigation of pretrained language models' (PLMs) capability to classify and understand one of the most commonly studied constructions, the English comparative correlative (CC).
Our results show that all three investigated PLMs are able to recognise the structure of the CC but fail to use its meaning.
arXiv Detail & Related papers (2022-10-24T13:01:24Z) - Comparison by Conversion: Reverse-Engineering UCCA from Syntax and
Lexical Semantics [29.971739294416714]
Building robust natural language understanding systems will require a clear characterization of whether and how various linguistic meaning representations complement each other.
We evaluate the mapping between meaning representations from different frameworks using two complementary methods: (i) a rule-based converter, and (ii) a supervised delexicalized parser that parses to one framework using only information from the other as features.
arXiv Detail & Related papers (2020-11-02T09:03:46Z) - Logical Inferences with Comparatives and Generalized Quantifiers [18.58482811176484]
A logical inference system for comparatives has not been sufficiently developed for use in the Natural Language Inference task.
We present a compositional semantics that maps various comparative constructions in English to semantic representations via Combinatory Categorial Grammar (CCG).
We show that the system outperforms previous logic-based systems as well as recent deep learning-based models.
arXiv Detail & Related papers (2020-05-16T11:11:48Z) - A Benchmark for Systematic Generalization in Grounded Language
Understanding [61.432407738682635]
Humans easily interpret expressions that describe unfamiliar situations composed from familiar parts.
Modern neural networks, by contrast, struggle to interpret novel compositions.
We introduce a new benchmark, gSCAN, for evaluating compositional generalization in situated language understanding.
arXiv Detail & Related papers (2020-03-11T08:40:15Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information presented and is not responsible for any consequences of its use.