Compositional Semantics and Inference System for Temporal Order based on Japanese CCG
- URL: http://arxiv.org/abs/2204.09245v1
- Date: Wed, 20 Apr 2022 06:21:21 GMT
- Title: Compositional Semantics and Inference System for Temporal Order based on Japanese CCG
- Authors: Tomoki Sugimoto, Hitomi Yanaka
- Abstract summary: We present a logic-based Natural Language Inference system that considers temporal order in Japanese.
Our system performs inference involving temporal order by using axioms for temporal relations and automated theorem provers.
We show that our system outperforms previous logic-based systems as well as current deep learning-based models.
- Score: 9.683269364766426
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Natural Language Inference (NLI) is the task of determining whether a premise
entails a hypothesis. NLI with temporal order is a challenging task because
tense and aspect are complex linguistic phenomena involving interactions with
temporal adverbs and temporal connectives. To tackle this, temporal and
aspectual inference has been analyzed in various ways in the field of formal
semantics. However, a Japanese NLI system for temporal order based on the
analysis of formal semantics has not been sufficiently developed. We present a
logic-based NLI system that considers temporal order in Japanese based on
compositional semantics via Combinatory Categorial Grammar (CCG) syntactic
analysis. Our system performs inference involving temporal order by using
axioms for temporal relations and automated theorem provers. We evaluate our
system by experimenting with Japanese NLI datasets that involve temporal order.
We show that our system outperforms previous logic-based systems as well as
current deep learning-based models.
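The abstract does not spell out the system's internals, but the core idea (encode premise and hypothesis as formulas over events and times, add axioms for temporal relations, and check whether the hypothesis follows) can be illustrated with a small self-contained sketch. The predicate names, event names, and the naive forward-chaining closure below are illustrative stand-ins, not the paper's actual semantic representations or automated theorem provers:

```python
from itertools import product

# Hypothetical event-semantics facts for a premise such as
# "Ken arrived before noon; after that, he ate lunch."
facts = {
    ("precedes", "e_arrive", "t_noon"),  # arrival time < noon
    ("precedes", "t_noon", "e_lunch"),   # lunch happens after noon
}

def forward_chain(facts):
    """Saturate the fact set under an axiom for temporal relations:
    transitivity of 'precedes'."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        pairs = [f for f in facts if f[0] == "precedes"]
        for (_, a, b), (_, c, d) in product(pairs, pairs):
            if b == c and ("precedes", a, d) not in facts:
                facts.add(("precedes", a, d))
                changed = True
    return facts

# Hypothesis: "Ken arrived before he ate lunch."
hypothesis = ("precedes", "e_arrive", "e_lunch")
entailed = hypothesis in forward_chain(facts)
print("entailment" if entailed else "unknown")  # -> entailment
```

A real pipeline of this kind would emit first-order formulas from the CCG derivation and hand them, together with the temporal axioms, to an off-the-shelf theorem prover rather than this toy closure.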
Related papers
- An Overview Of Temporal Commonsense Reasoning and Acquisition [20.108317515225504]
Temporal commonsense reasoning refers to the ability to understand the typical temporal context of phrases, actions, and events.
Recent research on the performance of large language models suggests that they often take shortcuts in their reasoning and fall prey to simple linguistic traps.
arXiv Detail & Related papers (2023-07-28T01:30:15Z)
- Unlocking Temporal Question Answering for Large Language Models with Tailor-Made Reasoning Logic [84.59255070520673]
Large language models (LLMs) face a challenge when engaging in temporal reasoning.
We propose TempLogic, a novel framework designed specifically for temporal question-answering tasks.
arXiv Detail & Related papers (2023-05-24T10:57:53Z)
- Variational Cross-Graph Reasoning and Adaptive Structured Semantics Learning for Compositional Temporal Grounding [143.5927158318524]
Temporal grounding is the task of locating a specific segment from an untrimmed video according to a query sentence.
We introduce a new Compositional Temporal Grounding task and construct two new dataset splits.
We argue that the inherent structured semantics in videos and language is the crucial factor for achieving compositional generalization.
arXiv Detail & Related papers (2023-01-22T08:02:23Z)
- Generic Temporal Reasoning with Differential Analysis and Explanation [61.96034987217583]
We introduce a novel task named TODAY that bridges the gap with temporal differential analysis.
TODAY evaluates whether systems can correctly understand the effect of incremental changes.
We show that TODAY's supervision style and explanation annotations can be used in joint learning.
arXiv Detail & Related papers (2022-12-20T17:40:03Z)
- Neural-Symbolic Recursive Machine for Systematic Generalization [113.22455566135757]
We introduce the Neural-Symbolic Recursive Machine (NSR), whose core is a Grounded Symbol System (GSS).
NSR integrates neural perception, syntactic parsing, and semantic reasoning.
We evaluate NSR's efficacy across four challenging benchmarks designed to probe systematic generalization capabilities.
arXiv Detail & Related papers (2022-10-04T13:27:38Z)
- Interpretable Time-series Representation Learning With Multi-Level Disentanglement [56.38489708031278]
Disentangle Time Series (DTS) is a novel disentanglement enhancement framework for sequential data.
DTS generates hierarchical semantic concepts as the interpretable and disentangled representation of time-series.
DTS achieves superior performance in downstream applications, with high interpretability of semantic concepts.
arXiv Detail & Related papers (2021-05-17T22:02:24Z)
- Temporal Answer Set Programming [3.263632801414296]
We present an overview of Temporal Logic Programming from the perspective of its application to Knowledge Representation and declarative problem solving.
We focus on recent results on the non-monotonic formalism called Temporal Equilibrium Logic (TEL), which is defined for the full syntax.
In a second part, we focus on practical aspects, defining a syntactic fragment called temporal logic programs closer to ASP, and explain how this has been exploited in the construction of the solver TELINGO.
arXiv Detail & Related papers (2020-09-14T16:13:36Z)
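TEL's full equilibrium semantics is beyond a short example, but as a flavor of reasoning over the finite traces that TELINGO targets, here is a small self-contained evaluator for a few future-time operators. It implements ordinary finite-trace satisfaction, not TEL's additional non-monotonic minimality condition, and the trace and formulas are invented for illustration:

```python
# Minimal evaluator for future-time temporal operators over a finite
# trace (a list of sets of atoms). This is plain finite-trace temporal
# logic, not TEL's equilibrium semantics, which adds a non-monotonic
# minimality condition on top.

def holds(formula, trace, i=0):
    """Check a formula at position i of a finite trace."""
    op = formula[0]
    if op == "atom":                 # ("atom", name)
        return formula[1] in trace[i]
    if op == "not":
        return not holds(formula[1], trace, i)
    if op == "and":
        return holds(formula[1], trace, i) and holds(formula[2], trace, i)
    if op == "next":                 # strong next: fails at the last state
        return i + 1 < len(trace) and holds(formula[1], trace, i + 1)
    if op == "eventually":
        return any(holds(formula[1], trace, j) for j in range(i, len(trace)))
    if op == "always":
        return all(holds(formula[1], trace, j) for j in range(i, len(trace)))
    raise ValueError(f"unknown operator: {op}")

# Trace for a toggling light: on, off, on
trace = [{"on"}, set(), {"on"}]
print(holds(("eventually", ("atom", "on")), trace))     # True
print(holds(("always", ("atom", "on")), trace))         # False
print(holds(("next", ("not", ("atom", "on"))), trace))  # True
```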
- Systematic Generalization on gSCAN with Language Conditioned Embedding [19.39687991647301]
Systematic Generalization refers to a learning algorithm's ability to extrapolate learned behavior to unseen situations.
We propose a novel method that learns objects' contextualized embeddings with dynamic message passing conditioned on the input natural language.
arXiv Detail & Related papers (2020-09-11T17:35:05Z)
- Supporting Optimal Phase Space Reconstructions Using Neural Network Architecture for Time Series Modeling [68.8204255655161]
We propose an artificial neural network with a mechanism to implicitly learn the properties of phase spaces.
Our approach is competitive with or better than most state-of-the-art strategies.
arXiv Detail & Related papers (2020-06-19T21:04:47Z)
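The summary above gives no architectural detail; for background, the classical way to reconstruct a phase space from a scalar time series is time-delay embedding (Takens' theorem). The sketch below shows only that baseline construction; the paper's neural mechanism is not reproduced here, and the delay and dimension values are arbitrary illustrations:

```python
import numpy as np

def delay_embed(x, dim=3, tau=2):
    """Time-delay embedding: map a scalar series x[t] to vectors
    (x[t], x[t+tau], ..., x[t+(dim-1)*tau]) in a dim-dimensional
    reconstructed phase space."""
    n = len(x) - (dim - 1) * tau
    if n <= 0:
        raise ValueError("series too short for this dim/tau")
    return np.stack([x[i * tau : i * tau + n] for i in range(dim)], axis=1)

# A noisy sine wave as a stand-in time series
t = np.linspace(0, 8 * np.pi, 400)
x = np.sin(t) + 0.05 * np.random.randn(t.size)
points = delay_embed(x, dim=3, tau=10)
print(points.shape)  # (380, 3): 380 reconstructed phase-space points
```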
- Logical Inferences with Comparatives and Generalized Quantifiers [18.58482811176484]
A logical inference system for comparatives has not been sufficiently developed for use in the Natural Language Inference task.
We present a compositional semantics that maps various comparative constructions in English to semantic representations via Combinatory Categorial Grammar (CCG) syntactic analysis.
We show that the system outperforms previous logic-based systems as well as recent deep learning-based models.
arXiv Detail & Related papers (2020-05-16T11:11:48Z)
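Neither that abstract nor this page shows the semantic composition itself. The sketch below illustrates the general CCG recipe (each word contributes a lambda term, and syntactic application drives semantic application), using one standard degree-semantics treatment of comparatives, the so-called A-not-A analysis. The lexicon and output format are invented for illustration, not taken from the paper:

```python
# CCG-style composition: syntactic application mirrors semantic
# (lambda) application. Semantic values are built as strings so the
# resulting formula is easy to read.

def forward_apply(fn, arg):    # X/Y  Y  =>  X
    return fn(arg)

def backward_apply(arg, fn):   # Y  X\Y  =>  X
    return fn(arg)

# Toy lexicon (categories in comments; representations are invented):
sem_john = "john"                                   # NP
sem_mary = "mary"                                   # NP
# "is taller than": (S\NP)/NP. Under the A-not-A analysis, a comparative
# asserts a degree the subject reaches and the standard does not.
sem_is_taller_than = lambda y: lambda x: (
    f"exists d.(tall({x},d) & -tall({y},d))"
)

# "John is taller than Mary":
vp = forward_apply(sem_is_taller_than, sem_mary)    # S\NP
s = backward_apply(sem_john, vp)                    # S
print(s)  # exists d.(tall(john,d) & -tall(mary,d))
```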
- Probing Linguistic Systematicity [11.690179162556353]
There is accumulating evidence that neural models often generalize non-systematically.
We identify ways in which network architectures can generalize non-systematically, and discuss why such forms of generalization may be unsatisfying.
arXiv Detail & Related papers (2020-05-08T23:31:31Z)
This list is automatically generated from the titles and abstracts of the papers on this site.