Automated Action Model Acquisition from Narrative Texts
- URL: http://arxiv.org/abs/2307.10247v1
- Date: Mon, 17 Jul 2023 07:04:31 GMT
- Title: Automated Action Model Acquisition from Narrative Texts
- Authors: Ruiqi Li, Leyang Cui, Songtuan Lin, Patrik Haslum
- Abstract summary: We present NaRuto, a system that extracts structured events from narrative text and generates planning-language-style action models.
Experimental results in classical narrative planning domains show that NaRuto can generate action models of significantly better quality than existing fully automated methods.
- Score: 13.449750550301992
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Action models, which take the form of precondition/effect axioms, facilitate
causal and motivational connections between actions for AI agents. Action model
acquisition has been identified as a bottleneck in the application of planning
technology, especially within narrative planning. Acquiring action models from
narrative texts in an automated way is essential, but challenging because of
the inherent complexities of such texts. We present NaRuto, a system that
extracts structured events from narrative text and subsequently generates
planning-language-style action models based on predictions of commonsense event
relations, as well as textual contradictions and similarities, in an
unsupervised manner. Experimental results in classical narrative planning
domains show that NaRuto can generate action models of significantly better
quality than existing fully automated methods, and even on par with those of
semi-automated methods.
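To make "precondition/effect axioms" concrete, here is a minimal Python sketch of the shape of such an action model, with a STRIPS-style applicability check and state update. The action name, predicates, and narrative event are invented for illustration; NaRuto's actual output is expressed in a planning language, not Python.

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class ActionModel:
        """Precondition/effect axiom: preconditions must hold before the
        action fires; effects add facts to and delete facts from the state."""
        name: str
        preconditions: frozenset
        add_effects: frozenset
        del_effects: frozenset

        def applicable(self, state: frozenset) -> bool:
            # Every precondition must hold in the current state.
            return self.preconditions <= state

        def apply(self, state: frozenset) -> frozenset:
            # STRIPS-style update: remove deleted facts, then add new ones.
            return (state - self.del_effects) | self.add_effects

    # Hypothetical model for a narrative event such as
    # "the thief steals the jewel from the vault".
    steal = ActionModel(
        name="steal",
        preconditions=frozenset({"at(thief, vault)", "in(jewel, vault)"}),
        add_effects=frozenset({"has(thief, jewel)"}),
        del_effects=frozenset({"in(jewel, vault)"}),
    )

    state = frozenset({"at(thief, vault)", "in(jewel, vault)"})
    if steal.applicable(state):
        state = steal.apply(state)
    print(sorted(state))  # ['at(thief, vault)', 'has(thief, jewel)']

Per the abstract, NaRuto's contribution is filling in the precondition and effect sets automatically, from commonsense event-relation predictions and textual contradiction/similarity signals, rather than by hand.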
Related papers
- Attacks against Abstractive Text Summarization Models through Lead Bias and Influence Functions [1.7863534204867277]
Large Language Models are vulnerable to adversarial perturbations and data poisoning attacks.
In this work, we unveil a novel approach by exploiting the inherent lead bias in summarization models.
We also introduce an innovative application of influence functions to execute data poisoning, which compromises the model's integrity.
arXiv Detail & Related papers (2024-10-26T00:35:15Z)
- Visual Storytelling with Question-Answer Plans [70.89011289754863]
We present a novel framework which integrates visual representations with pretrained language models and planning.
Our model translates the image sequence into a visual prefix, a sequence of continuous embeddings which language models can interpret.
It also leverages a sequence of question-answer pairs as a blueprint plan for selecting salient visual concepts and determining how they should be assembled into a narrative.
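The "visual prefix" above can be pictured with the sketch below, which assumes (as in common prefix-style vision-language models; the summary does not give the exact architecture) that pooled image features are linearly projected into the language model's embedding space and prepended to the token embeddings. All dimensions and names are illustrative.

    import torch
    import torch.nn as nn

    class VisualPrefix(nn.Module):
        """Projects pooled image features into a fixed-length sequence of
        embeddings that a language model can consume as a prefix."""
        def __init__(self, vision_dim: int = 512, lm_dim: int = 768,
                     prefix_len: int = 10):
            super().__init__()
            self.prefix_len, self.lm_dim = prefix_len, lm_dim
            self.proj = nn.Linear(vision_dim, lm_dim * prefix_len)

        def forward(self, image_feats: torch.Tensor) -> torch.Tensor:
            # (batch, vision_dim) -> (batch, prefix_len, lm_dim)
            return self.proj(image_feats).view(-1, self.prefix_len, self.lm_dim)

    # Usage: prepend the visual prefix to ordinary token embeddings before
    # feeding the combined sequence to a decoder-only language model.
    prefix = VisualPrefix()(torch.randn(2, 512))    # (2, 10, 768)
    tokens = torch.randn(2, 32, 768)                # stand-in token embeddings
    lm_input = torch.cat([prefix, tokens], dim=1)   # (2, 42, 768)
    print(lm_input.shape)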
arXiv Detail & Related papers (2023-10-08T21:45:34Z)
- Structured Like a Language Model: Analysing AI as an Automated Subject [0.0]
We argue that the intentional fictional projection of subjectivity onto large language models can yield an alternate frame through which AI behaviour can be analysed.
We trace a brief history of language models, culminating in the releases of systems that realise state-of-the-art natural language processing performance.
We conclude that critical media methods and psychoanalytic theory together offer a productive frame for grasping the powerful new capacities of AI-driven language systems.
arXiv Detail & Related papers (2022-12-08T21:58:43Z)
- EtriCA: Event-Triggered Context-Aware Story Generation Augmented by Cross Attention [17.049035309926637]
We present EtriCA, a novel neural generation model, which improves the relevance and coherence of the generated stories.
We show that our model significantly outperforms state-of-the-art baselines.
arXiv Detail & Related papers (2022-10-22T14:51:12Z)
- Zero-Shot On-the-Fly Event Schema Induction [61.91468909200566]
We present a new approach in which large language models are utilized to generate source documents that allow predicting, given a high-level event definition, the specific events, arguments, and relations between them.
Using our model, complete schemas on any topic can be generated on the fly without any manual data collection, i.e., in a zero-shot manner.
arXiv Detail & Related papers (2022-10-12T14:37:00Z)
- Generating Coherent Narratives by Learning Dynamic and Discrete Entity States with a Contrastive Framework [68.1678127433077]
We extend the Transformer model to dynamically conduct entity state updates and sentence realization for narrative generation.
Experiments on two narrative datasets show that our model can generate more coherent and diverse narratives than strong baselines.
arXiv Detail & Related papers (2022-08-08T09:02:19Z)
- Text-Based Action-Model Acquisition for Planning [13.110360825201044]
We propose a novel approach to learning action models from natural language texts by integrating Constraint Satisfaction and Natural Language Processing techniques.
Specifically, we first build a novel language model to extract plan traces from texts, and then build a set of constraints to generate action models based on the extracted plan traces.
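As a simplified illustration of what learning from plan traces involves, the toy sketch below reads candidate preconditions and effects off observed state transitions. The paper itself formulates this as constraint satisfaction over the extracted traces; the facts, actions, and names here are invented.

    # Each trace step is (state_before, action_name, state_after).
    trace = [
        ({"at(hero, home)"},                    "travel",
         {"at(hero, cave)"}),
        ({"at(hero, cave)", "in(sword, cave)"}, "take",
         {"at(hero, cave)", "has(hero, sword)"}),
    ]

    models = {}
    for pre_state, action, post_state in trace:
        m = models.setdefault(action, {"pre": set(pre_state),
                                       "add": set(), "del": set()})
        m["pre"] &= pre_state               # preconditions: facts always true before
        m["add"] |= post_state - pre_state  # add effects: facts that appear
        m["del"] |= pre_state - post_state  # delete effects: facts that vanish

    for name, m in models.items():
        print(name, m)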
arXiv Detail & Related papers (2022-02-15T02:23:31Z)
- Goal-Aware Prediction: Learning to Model What Matters [105.43098326577434]
One of the fundamental challenges in using a learned forward dynamics model is the mismatch between the objective of the learned model and that of the downstream planner or policy.
We propose to direct prediction towards task-relevant information, enabling the model to be aware of the current task and encouraging it to only model relevant quantities of the state space.
We find that our method more effectively models the relevant parts of the scene conditioned on the goal, and as a result outperforms standard task-agnostic dynamics models and model-free reinforcement learning.
arXiv Detail & Related papers (2020-07-14T16:42:59Z)
- Improving Adversarial Text Generation by Modeling the Distant Future [155.83051741029732]
We consider a text planning scheme and present a model-based imitation-learning approach to alleviate long-range coherence issues in adversarial text generation.
We propose a novel guider network to focus on the generative process over a longer horizon, which can assist next-word prediction and provide intermediate rewards for generator optimization.
arXiv Detail & Related papers (2020-05-04T05:45:13Z)
- Temporal Embeddings and Transformer Models for Narrative Text Understanding [72.88083067388155]
We present two approaches to narrative text understanding for character relationship modelling.
The temporal evolution of these relations is described by dynamic word embeddings, that are designed to learn semantic changes over time.
A supervised learning approach based on the state-of-the-art transformer model BERT is used, by contrast, to detect static relations between characters.
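A toy sketch of the dynamic-word-embedding idea, under assumptions the summary does not confirm: one embedding space is trained per time slice of the text, the spaces are aligned with orthogonal Procrustes so vectors are comparable across time, and character vectors are then tracked across slices. Requires gensim and scipy; the corpus slices and character names are invented.

    import numpy as np
    from gensim.models import Word2Vec
    from scipy.linalg import orthogonal_procrustes

    # Two toy "time slices" of a story; each is a list of tokenised sentences.
    slices = {
        "early": [["tom", "trusts", "becky"], ["tom", "helps", "becky"]] * 50,
        "late":  [["tom", "avoids", "becky"], ["becky", "blames", "tom"]] * 50,
    }
    models = {t: Word2Vec(s, vector_size=20, min_count=1, epochs=30, seed=0)
              for t, s in slices.items()}
    early, late = models["early"].wv, models["late"].wv

    # Rotate the late space onto the early one (orthogonal Procrustes
    # over the shared vocabulary) so cross-time comparisons make sense.
    shared = sorted(set(early.index_to_key) & set(late.index_to_key))
    R, _ = orthogonal_procrustes(np.stack([late[w] for w in shared]),
                                 np.stack([early[w] for w in shared]))

    def cos(u, v):
        return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

    # Within-slice relation strength between the two characters:
    print("early tom~becky:", round(cos(early["tom"], early["becky"]), 3))
    print("late  tom~becky:", round(cos(late["tom"], late["becky"]), 3))
    # Cross-time drift of each character's vector (needs the alignment R):
    for w in ("tom", "becky"):
        print(w, "self-drift:", round(cos(early[w], late[w] @ R), 3))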
arXiv Detail & Related papers (2020-03-19T14:23:12Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.