Graph Based Network with Contextualized Representations of Turns in Dialogue
- URL: http://arxiv.org/abs/2109.04008v1
- Date: Thu, 9 Sep 2021 03:09:08 GMT
- Title: Graph Based Network with Contextualized Representations of Turns in Dialogue
- Authors: Bongseok Lee and Yong Suk Choi
- Abstract summary: Dialogue-based relation extraction (RE) aims to extract relation(s) between two arguments that appear in a dialogue.
We propose the TUrn COntext awaRE Graph Convolutional Network (TUCORE-GCN), modeled on the way people understand dialogues.
- Score: 0.0
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Dialogue-based relation extraction (RE) aims to extract relation(s) between
two arguments that appear in a dialogue. Because dialogues have the
characteristics of high personal pronoun occurrences and low information
density, and since most relational facts in dialogues are not supported by any
single sentence, dialogue-based relation extraction requires a comprehensive
understanding of dialogue. In this paper, we propose the TUrn COntext awaRE
Graph Convolutional Network (TUCORE-GCN), modeled on the way people understand
dialogues. In addition, we propose a novel approach that treats the task of
emotion recognition in conversations (ERC) as a dialogue-based RE task.
Experiments on a dialogue-based RE dataset and three ERC
datasets demonstrate that our model is very effective in various dialogue-based
natural language understanding tasks. In these experiments, TUCORE-GCN
outperforms the state-of-the-art models on most of the benchmark datasets. Our
code is available at https://github.com/BlackNoodle/TUCORE-GCN.
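The abstract describes the architecture only at a high level, so the following is a minimal illustrative sketch of the core idea rather than the authors' implementation (the released code at the URL above is authoritative): contextualized token representations from a pretrained encoder are pooled into one node per turn, and graph convolution is then applied over a dialogue graph connecting the turns. All class, function, and tensor names here (TurnGCNLayer, turn_node_features, turn_mask, adjacency) are assumptions made for illustration.

```python
# Illustrative sketch only: pool encoder outputs into per-turn nodes and apply
# one GCN step over a turn-level dialogue graph. Not the TUCORE-GCN reference code.
import torch
import torch.nn as nn


class TurnGCNLayer(nn.Module):
    """One graph-convolution step over turn-level node features."""

    def __init__(self, hidden_size: int):
        super().__init__()
        self.linear = nn.Linear(hidden_size, hidden_size)
        self.act = nn.ReLU()

    def forward(self, node_feats: torch.Tensor, adjacency: torch.Tensor) -> torch.Tensor:
        # node_feats: (batch, num_turns, hidden); adjacency: (batch, num_turns, num_turns), float.
        # Row-normalize the adjacency so each node averages over its neighbors.
        degree = adjacency.sum(dim=-1, keepdim=True).clamp(min=1.0)
        aggregated = torch.bmm(adjacency / degree, node_feats)  # neighborhood aggregation
        return self.act(self.linear(aggregated))                # transform + non-linearity


def turn_node_features(token_feats: torch.Tensor, turn_mask: torch.Tensor) -> torch.Tensor:
    """Mean-pool contextualized token features into one vector per turn.

    token_feats: (batch, seq_len, hidden) from a pretrained encoder such as BERT.
    turn_mask:   (batch, num_turns, seq_len) float mask, 1 where a token belongs to a turn.
    """
    counts = turn_mask.sum(dim=-1, keepdim=True).clamp(min=1.0)
    return torch.bmm(turn_mask, token_feats) / counts
```

A full model along these lines would stack several such layers over a graph that also contains speaker and argument nodes, then classify the relation from the resulting node representations.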
Related papers
- PK-Chat: Pointer Network Guided Knowledge Driven Generative Dialogue Model [79.64376762489164]
PK-Chat is a Pointer network guided generative dialogue model, incorporating a unified pretrained language model and a pointer network over knowledge graphs.
The words generated by PK-Chat are derived both from the predicted word list and from direct prediction over the external knowledge graph.
Based on the PK-Chat, a dialogue system is built for academic scenarios in the case of geosciences.
arXiv Detail & Related papers (2023-04-02T18:23:13Z)
- CTRLStruct: Dialogue Structure Learning for Open-Domain Response Generation [38.60073402817218]
Well-structured topic flow can leverage background information and predict future topics to help generate controllable and explainable responses.
We present a new framework for dialogue structure learning to effectively explore topic-level dialogue clusters as well as their transitions with unlabelled information.
Experiments on two popular open-domain dialogue datasets show our model can generate more coherent responses than strong dialogue baselines.
arXiv Detail & Related papers (2023-03-02T09:27:11Z)
- A Benchmark for Understanding and Generating Dialogue between Characters in Stories [75.29466820496913]
We present the first study to explore whether machines can understand and generate dialogue in stories.
We propose two new tasks including Masked Dialogue Generation and Dialogue Speaker Recognition.
We show the difficulty of the proposed tasks by testing existing models with automatic and manual evaluation on DialStory.
arXiv Detail & Related papers (2022-09-18T10:19:04Z)
- Back to the Future: Bidirectional Information Decoupling Network for Multi-turn Dialogue Modeling [80.51094098799736]
We propose Bidirectional Information Decoupling Network (BiDeN) as a universal dialogue encoder.
BiDeN explicitly incorporates both the past and future contexts and can be generalized to a wide range of dialogue-related tasks.
Experimental results on datasets of different downstream tasks demonstrate the universality and effectiveness of our BiDeN.
arXiv Detail & Related papers (2022-04-18T03:51:46Z)
- Commonsense-Focused Dialogues for Response Generation: An Empirical Study [39.49727190159279]
We present an empirical study of commonsense in dialogue response generation.
We first auto-extract commonsensical dialogues from existing dialogue datasets by leveraging ConceptNet.
We then collect a new dialogue dataset with 25K dialogues aimed at exhibiting social commonsense in an interactive setting.
arXiv Detail & Related papers (2021-09-14T04:32:09Z)
- Incorporating Commonsense Knowledge into Abstractive Dialogue Summarization via Heterogeneous Graph Networks [34.958271247099]
We present a novel multi-speaker dialogue summarizer to demonstrate how large-scale commonsense knowledge can facilitate dialogue understanding and summary generation.
We consider utterance and commonsense knowledge as two different types of data and design a Dialogue Heterogeneous Graph Network (D-HGN) for modeling both information.
arXiv Detail & Related papers (2020-10-20T05:44:55Z)
- Dialogue Relation Extraction with Document-level Heterogeneous Graph Attention Networks [21.409522845011907]
Dialogue relation extraction (DRE) aims to detect the relation between two entities mentioned in a multi-party dialogue.
We present a graph attention network-based method for DRE where a graph contains meaningfully connected speaker, entity, entity-type, and utterance nodes.
We empirically show that this graph-based approach quite effectively captures the relations between different entity pairs in a dialogue as it outperforms the state-of-the-art approaches.
arXiv Detail & Related papers (2020-09-10T18:51:48Z)
- Rethinking Dialogue State Tracking with Reasoning [76.0991910623001]
This paper proposes to track dialogue states gradually by reasoning over dialogue turns with the help of back-end data.
Empirical results demonstrate that our method significantly outperforms the state-of-the-art methods by 38.6% in terms of joint belief accuracy for MultiWOZ 2.1.
arXiv Detail & Related papers (2020-05-27T02:05:33Z)
- Dialogue-Based Relation Extraction [53.2896545819799]
We present the first human-annotated dialogue-based relation extraction (RE) dataset DialogRE.
We argue that speaker-related information plays a critical role in the proposed task, based on an analysis of similarities and differences between dialogue-based and traditional RE tasks.
Experimental results demonstrate that a speaker-aware extension on the best-performing model leads to gains in both the standard and conversational evaluation settings.
arXiv Detail & Related papers (2020-04-17T03:51:57Z)
- TOD-BERT: Pre-trained Natural Language Understanding for Task-Oriented Dialogue [113.45485470103762]
In this work, we unify nine human-human and multi-turn task-oriented dialogue datasets for language modeling.
To better model dialogue behavior during pre-training, we incorporate user and system tokens into the masked language modeling.
arXiv Detail & Related papers (2020-04-15T04:09:05Z)
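As a rough illustration of the speaker-token scheme mentioned in the TOD-BERT entry above, a dialogue can be flattened into a single sequence in which every turn is prefixed with a user or system token before masked language modeling. This is a hedged sketch under assumed helper and token names, not the paper's released preprocessing code.

```python
# Illustrative sketch: flatten a dialogue into one string with speaker tokens,
# as described in the TOD-BERT summary above. Helper and token names are
# assumptions for illustration, not the paper's released code.
from typing import List, Tuple

USER_TOKEN = "[usr]"
SYSTEM_TOKEN = "[sys]"


def flatten_dialogue(turns: List[Tuple[str, str]]) -> str:
    """turns: list of (speaker, utterance) pairs, with speaker in {"user", "system"}."""
    pieces = []
    for speaker, utterance in turns:
        prefix = USER_TOKEN if speaker == "user" else SYSTEM_TOKEN
        pieces.append(f"{prefix} {utterance.strip()}")
    return " ".join(pieces)


if __name__ == "__main__":
    dialogue = [
        ("user", "I need a cheap hotel in the centre."),
        ("system", "There are two options. Do you need parking?"),
        ("user", "Yes, please book it for two nights."),
    ]
    # The flattened string would then be tokenized and randomly masked for MLM pre-training.
    print(flatten_dialogue(dialogue))
```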
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of the listed information and is not responsible for any consequences of its use.