Fuzzy Temporal Protoforms for the Quantitative Description of Processes
in Natural Language
- URL: http://arxiv.org/abs/2305.09506v1
- Date: Tue, 16 May 2023 14:59:38 GMT
- Title: Fuzzy Temporal Protoforms for the Quantitative Description of Processes
in Natural Language
- Authors: Yago Fontenla-Seco, Alberto Bugarín-Diz and Manuel Lama
- Abstract summary: The model includes temporal and causal information from processes and attributes, quantifies attributes in time during the process life-span and recalls causal relations and temporal distances between events.
A real use-case in the cardiology domain is presented, showing the potential of our model for providing natural language explanations addressed to domain experts.
- Score: 0.0
- License: http://creativecommons.org/licenses/by-nc-nd/4.0/
- Abstract: In this paper, we propose a series of fuzzy temporal protoforms in the
framework of the automatic generation of quantitative and qualitative natural
language descriptions of processes. The model includes temporal and causal
information from processes and attributes, quantifies attributes in time during
the process life-span and recalls causal relations and temporal distances
between events, among other features. Through integrating process mining
techniques and fuzzy sets within the usual Data-to-Text architecture, our
framework is able to extract relevant quantitative temporal as well as
structural information from a process and describe it in natural language
involving uncertain terms. A real use-case in the cardiology domain is
presented, showing the potential of our model for providing natural language
explanations addressed to domain experts.
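To make the protoform idea concrete, below is a minimal sketch of how the truth degree of one such statement ("In most of the first days, the waiting time was short") could be computed, assuming trapezoidal membership functions and Zadeh's relative sigma-count for evaluating quantified sentences. The membership parameters, data, and function names are illustrative assumptions, not the paper's implementation.

```python
# Minimal sketch: truth degree of the fuzzy temporal protoform
# "In most of the first days, the waiting time was short".
# Membership parameters and data are illustrative assumptions,
# not the paper's implementation.

def trapezoid(x, a, b, c, d):
    """Trapezoidal membership with support (a, d) and core [b, c]."""
    if x < a or x > d:
        return 0.0
    if b <= x <= c:
        return 1.0
    if x < b:
        return (x - a) / (b - a)
    return (d - x) / (d - c)

def most(p):
    """Fuzzy quantifier 'most': 0 below 30%, 1 above 80%, linear between."""
    return min(1.0, max(0.0, (p - 0.3) / 0.5))

def truth_degree(values, time_span, attribute, quantifier):
    """Zadeh's relative sigma-count of 'attribute' restricted to 'time_span'."""
    num = sum(min(time_span(t), attribute(v)) for t, v in enumerate(values))
    den = sum(time_span(t) for t in range(len(values)))
    return quantifier(num / den)

first_days = lambda t: trapezoid(t, -1, 0, 9, 14)   # fully "first" up to day 9
short = lambda v: trapezoid(v, -1, 0, 15, 30)       # waiting time in minutes

waiting_minutes = [5, 12, 8, 20, 10, 7, 25, 14, 9, 11, 40, 35, 50, 60, 55]
print(truth_degree(waiting_minutes, first_days, short, most))
```

The resulting degree (0.9 for this toy data) quantifies how well the linguistic description fits the data; a Data-to-Text realiser can then pick the best-fitting quantifier and attribute labels when generating the final sentence.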
Related papers
- Nature-Inspired Local Propagation [68.63385571967267]
Natural learning processes rely on mechanisms where data representation and learning are intertwined in such a way as to respect locality.
We show that the algorithmic interpretation of the derived "laws of learning", which takes the structure of Hamiltonian equations, reduces to Backpropagation when the speed of propagation goes to infinity.
This opens the door to machine learning based on fully on-line information processing, in which Backpropagation is replaced by the proposed local algorithm.
arXiv Detail & Related papers (2024-02-04T21:43:37Z)
- Injecting linguistic knowledge into BERT for Dialogue State Tracking [60.42231674887294]
This paper proposes a method that extracts linguistic knowledge via an unsupervised framework.
We then utilize this knowledge to augment BERT's performance and interpretability in Dialogue State Tracking (DST) tasks.
We benchmark this framework on various DST tasks and observe a notable improvement in accuracy.
arXiv Detail & Related papers (2023-11-27T08:38:42Z)
- Subspace Chronicles: How Linguistic Information Emerges, Shifts and Interacts during Language Model Training [56.74440457571821]
We analyze tasks covering syntax, semantics and reasoning, across 2M pre-training steps and five seeds.
We identify critical learning phases across tasks and time, during which subspaces emerge, share information, and later disentangle to specialize.
Our findings have implications for model interpretability, multi-task learning, and learning from limited data.
arXiv Detail & Related papers (2023-10-25T09:09:55Z)
- Spatio-Temporal Branching for Motion Prediction using Motion Increments [55.68088298632865]
Human motion prediction (HMP) has emerged as a popular research topic due to its diverse applications.
Traditional methods rely on hand-crafted features and machine learning techniques.
We propose a novel spatio-temporal branching network using incremental information for HMP.
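As a concrete illustration of "incremental information", the sketch below represents a pose sequence by frame-to-frame displacements and extrapolates the last increment; the array shapes and the constant-velocity rule are our own assumptions, a stand-in for a learned predictor rather than the paper's branching network.

```python
import numpy as np

# Toy illustration of motion increments for human motion prediction (HMP):
# work with frame-to-frame displacements instead of absolute poses.
# Constant-velocity extrapolation stands in for a learned predictor.

def to_increments(poses):
    """poses: (T, J, 3) joints over T frames -> (T-1, J, 3) displacements."""
    return np.diff(poses, axis=0)

def predict_constant_velocity(poses, horizon):
    """Repeat the last observed increment for `horizon` future frames."""
    last_increment = to_increments(poses)[-1]
    steps = np.arange(1, horizon + 1).reshape(-1, 1, 1)
    return poses[-1] + steps * last_increment  # (horizon, J, 3)

observed = np.cumsum(np.random.randn(10, 17, 3) * 0.01, axis=0)  # synthetic motion
print(predict_constant_velocity(observed, horizon=5).shape)      # (5, 17, 3)
```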
arXiv Detail & Related papers (2023-08-02T12:04:28Z)
- An Overview Of Temporal Commonsense Reasoning and Acquisition [20.108317515225504]
Temporal commonsense reasoning refers to the ability to understand the typical temporal context of phrases, actions, and events.
Recent research on the performance of large language models suggests that they often take shortcuts in their reasoning and fall prey to simple linguistic traps.
arXiv Detail & Related papers (2023-07-28T01:30:15Z)
- Unlocking Temporal Question Answering for Large Language Models with Tailor-Made Reasoning Logic [84.59255070520673]
Large language models (LLMs) face a challenge when engaging in temporal reasoning.
We propose TempLogic, a novel framework designed specifically for temporal question-answering tasks.
arXiv Detail & Related papers (2023-05-24T10:57:53Z)
- Process-To-Text: A Framework for the Quantitative Description of Processes in Natural Language [0.0]
We present the Process-To-Text (P2T) framework for the automatic generation of descriptive explanations of processes.
P2T integrates three AI paradigms: process mining for extracting temporal and structural information from a process, fuzzy linguistic protoforms for modelling uncertain terms, and natural language generation for building the explanations.
A real use-case in the cardiology domain is presented, showing the potential of P2T for providing natural language explanations addressed to specialists.
arXiv Detail & Related papers (2023-05-23T13:14:34Z)
- Feature Importance Explanations for Temporal Black-Box Models [3.655021726150369]
We propose TIME, a method to explain models that are inherently temporal in nature.
We use a model-agnostic, permutation-based approach to analyze global feature importance.
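To illustrate what permutation-based global importance means for temporal input, here is a minimal sketch that shuffles one feature channel at a time across instances of an (N, T, F) dataset and records the score drop; the `model.predict` interface and function names are our own assumptions, not TIME's actual API.

```python
import numpy as np

# Sketch of model-agnostic permutation importance for a temporal model.
# A feature's importance is the drop in score when its values are
# shuffled across instances, breaking their relation to the target.
# `model` is any object with predict(X) over (N, T, F) input.

def temporal_permutation_importance(model, X, y, score_fn, n_repeats=10, rng=None):
    rng = rng or np.random.default_rng(0)
    base = score_fn(y, model.predict(X))
    importances = np.zeros(X.shape[2])
    for f in range(X.shape[2]):
        drops = []
        for _ in range(n_repeats):
            Xp = X.copy()
            perm = rng.permutation(X.shape[0])
            Xp[:, :, f] = X[perm, :, f]   # shuffle feature f across instances
            drops.append(base - score_fn(y, model.predict(Xp)))
        importances[f] = np.mean(drops)
    return importances
```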
arXiv Detail & Related papers (2021-02-23T20:41:07Z)
- Modelling Compositionality and Structure Dependence in Natural Language [0.12183405753834563]
Drawing on linguistics and set theory, the first half of this thesis presents a formalisation of compositionality and structure dependence.
We see how cognitive systems that process language must satisfy certain functional constraints.
Using advances in word-embedding techniques, a model of relational learning is simulated.
arXiv Detail & Related papers (2020-11-22T17:28:50Z)
- A Survey on Temporal Reasoning for Temporal Information Extraction from Text (Extended Abstract) [21.62977556227642]
Temporal reasoning plays a central role in temporal information extraction.
This article presents a comprehensive survey of the research on temporal reasoning for automatic temporal information extraction from text.
It provides a case study on the integration of symbolic reasoning with machine learning-based information extraction systems.
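As a minimal concrete example of the symbolic side of such integration, the sketch below closes a set of extracted "before" relations under transitivity, a tiny fragment of the point/interval algebras used in temporal information extraction; the event names are invented for illustration.

```python
# Minimal sketch of symbolic temporal reasoning: close a set of extracted
# "before" relations under transitivity. Events are invented examples.

def transitive_closure(before):
    """before: set of (earlier, later) pairs -> its transitive closure."""
    closure = set(before)
    changed = True
    while changed:
        changed = False
        for a, b in list(closure):
            for c, d in list(closure):
                if b == c and (a, d) not in closure:
                    closure.add((a, d))
                    changed = True
    return closure

extracted = {("admission", "surgery"), ("surgery", "discharge")}
print(transitive_closure(extracted))
# -> also contains the inferred pair ("admission", "discharge")
```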
arXiv Detail & Related papers (2020-05-13T18:53:15Z)
- Procedural Reading Comprehension with Attribute-Aware Context Flow [85.34405161075276]
Procedural texts often describe processes that happen over entities.
We introduce an algorithm for procedural reading comprehension by translating the text into a general formalism.
arXiv Detail & Related papers (2020-03-31T00:06:29Z)
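To illustrate the kind of formalism such an algorithm can target, here is a minimal sketch that tracks entity attributes step by step through a procedure; the schema and the cooking example are our own illustration, not the paper's formalism.

```python
# Sketch of a simple formalism for procedural reading comprehension:
# track each entity's attributes (e.g., location, state) per step.
# Schema and example are illustrative, not the paper's formalism.

from collections import defaultdict

class EntityStateTracker:
    def __init__(self):
        # entity -> attribute -> list of (step, value) changes
        self.history = defaultdict(lambda: defaultdict(list))

    def update(self, step, entity, attribute, value):
        self.history[entity][attribute].append((step, value))

    def state_at(self, step, entity, attribute):
        """Latest value of an attribute at or before `step`, else None."""
        changes = sorted(self.history[entity][attribute], key=lambda sv: sv[0])
        valid = [v for s, v in changes if s <= step]
        return valid[-1] if valid else None

tracker = EntityStateTracker()
tracker.update(1, "water", "location", "pot")    # "Pour water into the pot."
tracker.update(2, "water", "state", "boiling")   # "Boil the water."
print(tracker.state_at(2, "water", "location"))  # -> "pot"
```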