Learning Rhetorical Structure Theory-based descriptions of observed
behaviour
- URL: http://arxiv.org/abs/2206.12294v1
- Date: Fri, 24 Jun 2022 13:47:20 GMT
- Title: Learning Rhetorical Structure Theory-based descriptions of observed
behaviour
- Authors: Luis Botelho, Luis Nunes, Ricardo Ribeiro, and Rui J. Lopes
- Abstract summary: This paper proposes a new set of concepts, axiom schemata and algorithms that allow the agent to learn new descriptions of an observed behaviour.
The relations used by agents to represent the descriptions they learn were inspired by Rhetorical Structure Theory (RST).
The paper shows results of the presented proposals in a demonstration scenario, using implemented software.
- Score: 0.5249805590164901
- License: http://creativecommons.org/licenses/by-sa/4.0/
- Abstract: In a previous paper, we have proposed a set of concepts, axiom schemata and
algorithms that can be used by agents to learn to describe their behaviour,
goals, capabilities, and environment. The current paper proposes a new set of
concepts, axiom schemata and algorithms that allow the agent to learn new
descriptions of an observed behaviour (e.g., perplexing actions), of its actor
(e.g., undesired propositions or actions), and of its environment (e.g.,
incompatible propositions). Each learned description (e.g., a certain action
prevents another action from being performed in the future) is represented by a
relationship between entities (either propositions or actions) and is learned
by the agent, just by observation, using domain-independent axiom schemata
and/or learning algorithms. The relations used by agents to represent the
descriptions they learn were inspired by Rhetorical Structure Theory (RST).
The main contribution of the paper is the relation family Although, inspired
by the RST relation Concession. The precise definition of the
relations of the family Although involves a set of deontic concepts whose
definition and corresponding algorithms are presented. The relations of the
family Although, once extracted from the agent's observations, express surprise
at the observed behaviour and, in certain circumstances, present a
justification for it.
The paper shows results of the presented proposals in a demonstration
scenario, using implemented software.
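To make the idea concrete, the hedged Python sketch below shows one plausible way an observer agent could flag Although-style cases: actions that a deontic model marks as undesired for the actor but that are nevertheless performed, optionally paired with an observed justification. The classes, function names, and the toy deontic model are illustrative assumptions, not the paper's implemented software.

```python
# Illustrative sketch only, not the authors' implemented software: one plausible way an
# observer agent could extract an "Although"-style relation, in the spirit of the RST
# Concession relation, from a log of observed actions plus a deontic model. All names
# (ObservedAction, DeonticModel, although_relations) are hypothetical.

from dataclasses import dataclass
from typing import List, Optional


@dataclass
class ObservedAction:
    actor: str
    action: str
    justification: Optional[str] = None  # e.g., a goal the action serves, if one is known


class DeonticModel:
    """Toy deontic model: records which actions are undesired (e.g., forbidden) for an actor."""

    def __init__(self):
        self._undesired = set()

    def mark_undesired(self, actor: str, action: str) -> None:
        self._undesired.add((actor, action))

    def is_undesired(self, actor: str, action: str) -> bool:
        return (actor, action) in self._undesired


def although_relations(observations: List[ObservedAction], deontics: DeonticModel):
    """Yield (actor, action, justification) triples for actions that were performed even
    though the deontic model marks them as undesired; these are the surprising cases an
    Although-style relation is meant to capture."""
    for obs in observations:
        if deontics.is_undesired(obs.actor, obs.action):
            yield (obs.actor, obs.action, obs.justification)


if __name__ == "__main__":
    model = DeonticModel()
    model.mark_undesired("robot1", "enter_restricted_area")

    log = [
        ObservedAction("robot1", "recharge_battery"),
        ObservedAction("robot1", "enter_restricted_area",
                       justification="shortest path to the charging station"),
    ]

    for actor, action, why in although_relations(log, model):
        note = f" (possible justification: {why})" if why else ""
        print(f"Although {action} is undesired for {actor}, it was performed{note}.")
```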
Related papers
- A Complexity-Based Theory of Compositionality [53.025566128892066]
In AI, compositional representations can enable a powerful form of out-of-distribution generalization.
Here, we propose a formal definition of compositionality that accounts for and extends our intuitions about compositionality.
The definition is conceptually simple, quantitative, grounded in algorithmic information theory, and applicable to any representation.
arXiv Detail & Related papers (2024-10-18T18:37:27Z)
- A Semantic Approach to Decidability in Epistemic Planning (Extended Version) [72.77805489645604]
We use a novel semantic approach to achieve decidability.
Specifically, we augment the logic of knowledge S5$_n$ with an interaction axiom called (knowledge) commutativity.
We prove that our framework admits a finitary non-fixpoint characterization of common knowledge, which is of independent interest.
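One plausible shape for such an interaction axiom, stated here only as an assumption about what "(knowledge) commutativity" could look like rather than a quotation from the paper, is that the knowledge operators of distinct agents commute:

```latex
% Assumed shape of the "(knowledge) commutativity" interaction axiom added to S5_n;
% this is an illustrative guess at its form, not a quotation from the paper.
\[
  K_i K_j \varphi \;\rightarrow\; K_j K_i \varphi \qquad \text{for all agents } i, j
\]
```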
arXiv Detail & Related papers (2023-07-28T11:26:26Z)
- Explainable Representations for Relation Prediction in Knowledge Graphs [0.0]
We propose SEEK, a novel approach for explainable representations to support relation prediction in knowledge graphs.
It is based on identifying relevant shared semantic aspects between entities and learning representations for each subgraph.
We evaluate SEEK on two real-world relation prediction tasks: protein-protein interaction prediction and gene-disease association prediction.
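As a rough, hedged illustration of the first step, the sketch below computes shared semantic aspects of two entities as their common ancestors in a toy is-a hierarchy; in a SEEK-like pipeline each such aspect would then anchor a subgraph whose representation is learned. The knowledge-graph fragment and function names are invented for illustration.

```python
# Minimal sketch under stated assumptions, not the SEEK implementation: shared semantic
# aspects of two entities computed as their common ancestors in a toy is-a hierarchy.

from collections import deque
from typing import Dict, List, Set


def ancestors(graph: Dict[str, List[str]], node: str) -> Set[str]:
    """All nodes reachable from `node` via is-a edges (breadth-first traversal)."""
    seen, queue = set(), deque([node])
    while queue:
        current = queue.popleft()
        for parent in graph.get(current, []):
            if parent not in seen:
                seen.add(parent)
                queue.append(parent)
    return seen


def shared_aspects(graph: Dict[str, List[str]], a: str, b: str) -> Set[str]:
    """Aspects shared by two entities; each would anchor a subgraph to be embedded."""
    return ancestors(graph, a) & ancestors(graph, b)


if __name__ == "__main__":
    # Invented ontology fragment: child -> list of parents.
    kg = {
        "proteinA": ["kinase", "membrane_component"],
        "proteinB": ["kinase", "nucleus_component"],
        "kinase": ["enzyme"],
        "enzyme": ["protein"],
    }
    print(shared_aspects(kg, "proteinA", "proteinB"))  # e.g. {'kinase', 'enzyme', 'protein'}
```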
arXiv Detail & Related papers (2023-06-22T06:18:40Z)
- Variational Cross-Graph Reasoning and Adaptive Structured Semantics Learning for Compositional Temporal Grounding [143.5927158318524]
Temporal grounding is the task of locating a specific segment from an untrimmed video according to a query sentence.
We introduce a new Compositional Temporal Grounding task and construct two new dataset splits.
We argue that the inherent structured semantics inside the videos and language is the crucial factor to achieve compositional generalization.
arXiv Detail & Related papers (2023-01-22T08:02:23Z)
- Homomorphism Autoencoder -- Learning Group Structured Representations from Observed Transitions [51.71245032890532]
We propose methods enabling an agent acting upon the world to learn internal representations of sensory information consistent with actions that modify it.
In contrast to existing work, our approach does not require prior knowledge of the group and does not restrict the set of actions the agent can perform.
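A minimal sketch of the general idea follows, with invented dimensions and training details rather than the paper's actual homomorphism autoencoder: learn an encoder together with one latent-space matrix per action, so that acting in the world corresponds to applying that matrix in latent space.

```python
# Hedged sketch of the general idea (not the paper's homomorphism autoencoder): learn an
# encoder phi and, per discrete action, a matrix rho(a) acting on the latent space, so that
# phi(next_obs) is approximately rho(action) @ phi(obs). Dimensions and data are invented.

import torch
import torch.nn as nn

OBS_DIM, LATENT_DIM, NUM_ACTIONS = 16, 4, 3


class ActionStructuredEncoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(OBS_DIM, 32), nn.ReLU(), nn.Linear(32, LATENT_DIM)
        )
        # One learnable latent-space matrix per action (the candidate group representation).
        self.action_mats = nn.Parameter(
            torch.stack([torch.eye(LATENT_DIM) for _ in range(NUM_ACTIONS)])
        )

    def forward(self, obs, action, next_obs):
        z, z_next = self.encoder(obs), self.encoder(next_obs)
        z_pred = torch.einsum("bij,bj->bi", self.action_mats[action], z)
        return ((z_pred - z_next) ** 2).mean()  # consistency loss between prediction and encoding


model = ActionStructuredEncoder()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

# Synthetic transitions just to show one training step; real data would come from the agent.
obs = torch.randn(64, OBS_DIM)
action = torch.randint(0, NUM_ACTIONS, (64,))
next_obs = torch.randn(64, OBS_DIM)

loss = model(obs, action, next_obs)
loss.backward()
opt.step()
print(float(loss))
```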
arXiv Detail & Related papers (2022-07-25T11:22:48Z)
- On Neural Architecture Inductive Biases for Relational Tasks [76.18938462270503]
We introduce a simple architecture based on similarity-distribution scores which we name Compositional Relational Network (CoRelNet).
We find that simple architectural choices can outperform existing models in out-of-distribution generalizations.
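A hedged sketch of what a similarity-distribution-score architecture can look like (the dimensions and output head here are assumptions, not the paper's exact model): encode the objects, compute all pairwise similarities, turn each row into a distribution, and classify from those relational scores alone.

```python
# Minimal sketch of a similarity-distribution-score architecture in the spirit of CoRelNet;
# layer sizes and the classification head are illustrative assumptions.

import torch
import torch.nn as nn


class SimilarityDistributionNet(nn.Module):
    def __init__(self, obj_dim=8, num_objects=5, num_classes=2):
        super().__init__()
        self.encode = nn.Linear(obj_dim, 16)
        self.head = nn.Linear(num_objects * num_objects, num_classes)

    def forward(self, objects):  # objects: (batch, num_objects, obj_dim)
        z = self.encode(objects)                    # (batch, n, d) object encodings
        sim = torch.matmul(z, z.transpose(1, 2))    # pairwise similarity scores
        sim = torch.softmax(sim, dim=-1)            # similarity *distribution* per object
        return self.head(sim.flatten(start_dim=1))  # classify from the relations alone


model = SimilarityDistributionNet()
logits = model(torch.randn(4, 5, 8))
print(logits.shape)  # torch.Size([4, 2])
```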
arXiv Detail & Related papers (2022-06-09T16:24:01Z)
- AGM Belief Revision, Semantically [1.7403133838762446]
We establish a generic, model-theoretic characterization of belief revision operators implementing the paradigm of minimal change.
Our characterization applies to all Tarskian logics, that is, all logics with a classical model-theoretic semantics.
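For propositional logic, the familiar model-theoretic reading of minimal change, which this work generalizes to arbitrary Tarskian logics, can be written as follows (a standard Grove / Katsuno-Mendelzon style formulation, shown for context rather than quoted from the paper):

```latex
% Revision by phi keeps exactly the phi-models that are minimal in a total preorder
% \leq_K faithful to the current belief set K (standard formulation, shown for context).
\[
  \mathrm{Mod}(K \ast \varphi) \;=\; \min_{\leq_K} \mathrm{Mod}(\varphi)
\]
```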
arXiv Detail & Related papers (2021-12-27T07:53:21Z)
- Generating Contrastive Explanations for Inductive Logic Programming Based on a Near Miss Approach [0.7734726150561086]
We introduce an explanation generation algorithm for relational concepts learned with Inductive Logic Programming (GeNME).
A modified rule which covers the near miss but not the original instance is given as an explanation.
We also present a psychological experiment comparing human preferences of rule-based, example-based, and near miss explanations in the family and the arches domains.
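A toy illustration of the near-miss idea in the family domain (the facts and rules are invented for illustration, not GeNME's output): the original rule covers the target instance, while a minimally modified rule covers a near-miss instance but not the original one, and presenting that contrast is what makes the explanation contrastive.

```python
# Invented family-domain facts; rules written as plain Python predicates for illustration.
facts = {
    ("parent", "tom", "anna"), ("parent", "anna", "eve"), ("parent", "mary", "anna"),
    ("male", "tom"), ("female", "mary"), ("female", "anna"), ("female", "eve"),
}


def grandparent(x, z):
    parents = {f[1:] for f in facts if f[0] == "parent"}
    return any((x, y) in parents and (y, z) in parents for (_, y) in parents)


def grandfather(x, z):
    # Original learned rule: grandfather(X,Z) :- male(X), parent(X,Y), parent(Y,Z).
    return ("male", x) in facts and grandparent(x, z)


def grandmother(x, z):
    # Modified rule for the near miss: the literal male(X) is replaced by female(X).
    return ("female", x) in facts and grandparent(x, z)


# tom is explained as a grandfather of eve; mary is the near miss, covered only by the
# modified rule and not by the original one.
print(grandfather("tom", "eve"), grandmother("tom", "eve"))    # True False
print(grandfather("mary", "eve"), grandmother("mary", "eve"))  # False True
```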
arXiv Detail & Related papers (2021-06-15T11:42:05Z)
- Modelling Compositionality and Structure Dependence in Natural Language [0.12183405753834563]
Drawing on linguistics and set theory, a formalisation of these ideas is presented in the first half of this thesis.
We see how cognitive systems that process language need to have certain functional constraints.
Using the advances of word embedding techniques, a model of relational learning is simulated.
arXiv Detail & Related papers (2020-11-22T17:28:50Z)
- What can I do here? A Theory of Affordances in Reinforcement Learning [65.70524105802156]
We develop a theory of affordances for agents who learn and plan in Markov Decision Processes.
Affordances play a dual role in this case, by reducing the number of actions available in any given situation.
We propose an approach to learn affordances and use it to estimate transition models that are simpler and generalize better.
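A minimal sketch of how affordances can prune the action set, using an invented data structure rather than the paper's formalism: record which actions have been observed to complete an intent in a state, and let the planner or transition model consider only those.

```python
# Hedged sketch under stated assumptions; the class and method names are hypothetical.
from collections import defaultdict


class Affordances:
    def __init__(self, all_actions):
        self.all_actions = set(all_actions)
        self._afforded = defaultdict(set)  # (state, intent) -> actions seen to achieve the intent

    def record(self, state, intent, action, intent_achieved: bool):
        """Learn affordances from experience: keep actions observed to complete the intent."""
        if intent_achieved:
            self._afforded[(state, intent)].add(action)

    def available_actions(self, state, intent):
        """Reduced action set for planning; fall back to all actions if nothing is known yet."""
        return self._afforded.get((state, intent), self.all_actions)


aff = Affordances(all_actions=["open", "push", "pick_up", "wait"])
aff.record(state="at_door", intent="enter_room", action="open", intent_achieved=True)
aff.record(state="at_door", intent="enter_room", action="wait", intent_achieved=False)
print(aff.available_actions("at_door", "enter_room"))  # {'open'}
```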
arXiv Detail & Related papers (2020-06-26T16:34:53Z)
- Expressiveness and machine processability of Knowledge Organization Systems (KOS): An analysis of concepts and relations [0.0]
The expressiveness and machine processability of each Knowledge Organization System are largely determined by its structural rules.
Ontologies explicitly define diverse types of relations, and are by their nature machine-processable.
arXiv Detail & Related papers (2020-03-11T12:35:52Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of the information it presents (including all summaries) and is not responsible for any consequences.