Generating Coherent Narratives by Learning Dynamic and Discrete Entity
States with a Contrastive Framework
- URL: http://arxiv.org/abs/2208.03985v1
- Date: Mon, 8 Aug 2022 09:02:19 GMT
- Title: Generating Coherent Narratives by Learning Dynamic and Discrete Entity
States with a Contrastive Framework
- Authors: Jian Guan, Zhenyu Yang, Rongsheng Zhang, Zhipeng Hu, Minlie Huang
- Abstract summary: We extend the Transformer model to dynamically conduct entity state updates and sentence realization for narrative generation.
Experiments on two narrative datasets show that our model can generate more coherent and diverse narratives than strong baselines.
- Score: 68.1678127433077
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite advances in generating fluent texts, existing pretraining models tend
to attach incoherent event sequences to involved entities when generating
narratives such as stories and news. We conjecture that such issues result from
representing entities as static embeddings of superficial words, while
neglecting to model their ever-changing states, i.e., the information they
carry, as the text unfolds. Therefore, we extend the Transformer model to
dynamically conduct entity state updates and sentence realization for narrative
generation. We propose a contrastive framework to learn the state
representations in a discrete space, and insert additional attention layers
into the decoder to better exploit these states. Experiments on two narrative
datasets show that our model can generate more coherent and diverse narratives
than strong baselines with the guidance of meaningful entity states.
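
Below is a minimal, hypothetical sketch of the kind of mechanism the abstract describes: a decoder block with an extra cross-attention layer over entity states, and a module that updates those states after each sentence and snaps them to a learned discrete codebook with an InfoNCE-style contrastive objective. This is not the authors' released code; all module names, dimensions, and the exact form of the loss are illustrative assumptions.

```python
# Hedged sketch (assumed design, not the paper's implementation) of
# entity-state-conditioned narrative decoding with a discrete state codebook.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EntityStateDecoderBlock(nn.Module):
    """Transformer decoder block with an added cross-attention over entity states."""

    def __init__(self, d_model=512, n_heads=8):
        super().__init__()
        self.self_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # Additional attention layer inserted to exploit entity states (per the abstract).
        self.entity_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ffn = nn.Sequential(nn.Linear(d_model, 4 * d_model), nn.GELU(),
                                 nn.Linear(4 * d_model, d_model))
        self.norms = nn.ModuleList([nn.LayerNorm(d_model) for _ in range(3)])

    def forward(self, x, entity_states, causal_mask=None):
        # Masked self-attention over the partially generated narrative.
        h, _ = self.self_attn(x, x, x, attn_mask=causal_mask)
        x = self.norms[0](x + h)
        # Cross-attention: token positions attend to the current entity states.
        h, _ = self.entity_attn(x, entity_states, entity_states)
        x = self.norms[1](x + h)
        return self.norms[2](x + self.ffn(x))


class DiscreteStateUpdater(nn.Module):
    """Updates entity states after each realized sentence and maps them to a discrete codebook."""

    def __init__(self, d_model=512, n_codes=128, temperature=0.1):
        super().__init__()
        self.codebook = nn.Embedding(n_codes, d_model)  # discrete state space (assumed size)
        self.update = nn.GRUCell(d_model, d_model)      # recurrent per-entity state update
        self.temperature = temperature

    def forward(self, sentence_repr, prev_states):
        # sentence_repr: (batch, d_model); prev_states: (batch, n_entities, d_model)
        b, e, d = prev_states.shape
        new_states = self.update(
            sentence_repr.unsqueeze(1).expand(b, e, d).reshape(b * e, d),
            prev_states.reshape(b * e, d),
        ).reshape(b, e, d)
        # Contrastive (InfoNCE-style) term: pull each updated state toward its
        # nearest codebook entry and push it away from the rest.
        logits = F.normalize(new_states, dim=-1) @ F.normalize(self.codebook.weight, dim=-1).T
        targets = logits.argmax(dim=-1)
        loss = F.cross_entropy(
            logits.reshape(-1, logits.size(-1)) / self.temperature,
            targets.reshape(-1),
        )
        # Expose the discrete code embeddings as the states the decoder attends to.
        return self.codebook(targets), loss
```

In such a setup, the updater would run once per generated sentence, and the resulting discrete states would be fed to every `EntityStateDecoderBlock` while realizing the next sentence; the contrastive term would be added to the usual language-modeling loss.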