Cross Copy Network for Dialogue Generation
- URL: http://arxiv.org/abs/2010.11539v1
- Date: Thu, 22 Oct 2020 09:03:23 GMT
- Title: Cross Copy Network for Dialogue Generation
- Authors: Changzhen Ji, Xin Zhou, Yating Zhang, Xiaozhong Liu, Changlong Sun,
Conghui Zhu and Tiejun Zhao
- Abstract summary: We propose a novel network architecture, Cross Copy Networks (CCN), to explore the current dialogue context and the logical structure of similar dialogue instances simultaneously.
Experiments on two tasks, court debate and customer service content generation, show that the proposed algorithm is superior to existing state-of-the-art content generation models.
- Score: 44.593899479668416
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: In the past few years, audiences from different fields have witnessed
the achievements of sequence-to-sequence models (e.g., LSTM+attention, Pointer
Generator Networks, and Transformer) in enhancing dialogue content generation.
While content fluency and accuracy often serve as the major indicators for
model training, dialogue logics, which carry critical information for some
particular domains, are often ignored. Taking customer service and court debate
dialogues as examples, compatible logics can be observed across different
dialogue instances, and this information can provide vital evidence for
utterance generation. In this paper, we propose a novel network architecture,
Cross Copy Networks (CCN), to explore the current dialogue context and the
logical structure of similar dialogue instances simultaneously. Experiments on
two tasks, court debate and customer service content generation, show that the
proposed algorithm is superior to existing state-of-the-art content generation
models.
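
The abstract describes the cross-copy idea only at a high level: presumably, at each decoding step the model can draw tokens both from the current dialogue context and from a retrieved similar dialogue instance. The sketch below is a minimal, hedged illustration of one way such a dual-source copy mechanism could look in PyTorch; the class name `DualSourceCopyDecoderStep`, the tensor shapes, and the three-way gating are assumptions made for illustration, not the authors' CCN implementation.

```python
# Minimal sketch of a dual-source copy step (illustrative; not the paper's exact CCN design).
import torch
import torch.nn as nn
import torch.nn.functional as F


class DualSourceCopyDecoderStep(nn.Module):
    """One decoder step that mixes (i) vocabulary generation, (ii) copying from the
    current dialogue context, and (iii) copying from a retrieved similar dialogue.
    Shapes and the gating scheme are illustrative assumptions."""

    def __init__(self, hidden_size: int, vocab_size: int):
        super().__init__()
        self.ctx_attn = nn.Linear(hidden_size, hidden_size, bias=False)
        self.sim_attn = nn.Linear(hidden_size, hidden_size, bias=False)
        self.gen_proj = nn.Linear(3 * hidden_size, vocab_size)
        self.mode_gate = nn.Linear(3 * hidden_size, 3)  # generate / copy-context / copy-similar

    def forward(self, dec_state, ctx_enc, ctx_ids, sim_enc, sim_ids):
        # dec_state: (B, H); ctx_enc: (B, Lc, H); sim_enc: (B, Ls, H)
        # ctx_ids / sim_ids: token ids of each source, shapes (B, Lc) / (B, Ls)
        ctx_scores = torch.bmm(ctx_enc, self.ctx_attn(dec_state).unsqueeze(-1)).squeeze(-1)
        sim_scores = torch.bmm(sim_enc, self.sim_attn(dec_state).unsqueeze(-1)).squeeze(-1)
        ctx_attn = F.softmax(ctx_scores, dim=-1)  # copy distribution over context tokens
        sim_attn = F.softmax(sim_scores, dim=-1)  # copy distribution over similar-dialogue tokens

        # Attention-weighted summaries of the two sources, concatenated with the decoder state.
        ctx_vec = torch.bmm(ctx_attn.unsqueeze(1), ctx_enc).squeeze(1)
        sim_vec = torch.bmm(sim_attn.unsqueeze(1), sim_enc).squeeze(1)
        features = torch.cat([dec_state, ctx_vec, sim_vec], dim=-1)

        p_gen_vocab = F.softmax(self.gen_proj(features), dim=-1)
        gate = F.softmax(self.mode_gate(features), dim=-1)  # (B, 3), sums to 1 per example

        # Scatter the two copy distributions into vocabulary space and mix all three modes.
        p_copy_ctx = torch.zeros_like(p_gen_vocab).scatter_add(1, ctx_ids, ctx_attn)
        p_copy_sim = torch.zeros_like(p_gen_vocab).scatter_add(1, sim_ids, sim_attn)
        return (gate[:, 0:1] * p_gen_vocab
                + gate[:, 1:2] * p_copy_ctx
                + gate[:, 2:3] * p_copy_sim)


if __name__ == "__main__":
    # Tiny shape check with random inputs (hypothetical sizes).
    B, Lc, Ls, H, V = 2, 12, 9, 256, 30000
    step = DualSourceCopyDecoderStep(hidden_size=H, vocab_size=V)
    probs = step(torch.randn(B, H),
                 torch.randn(B, Lc, H), torch.randint(0, V, (B, Lc)),
                 torch.randn(B, Ls, H), torch.randint(0, V, (B, Ls)))
    assert probs.shape == (B, V)
    assert torch.allclose(probs.sum(-1), torch.ones(B))  # a valid distribution over the vocabulary
```

The three-way gate follows the familiar pointer-generator pattern of mixing a generation distribution with copy distributions, here extended to two copy sources (current context and a retrieved similar dialogue); because each component is a proper distribution and the gate weights sum to one, the output sums to one over the vocabulary by construction.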
Related papers
- A Static and Dynamic Attention Framework for Multi Turn Dialogue Generation [37.79563028123686]
In open-domain multi-turn dialogue generation, it is essential to model the contextual semantics of the dialogue history.
Previous research has verified the effectiveness of the hierarchical recurrent encoder-decoder framework for open-domain multi-turn dialogue generation.
We propose a static and dynamic attention-based approach to model the dialogue history and then generate open-domain multi-turn dialogue responses.
arXiv Detail & Related papers (2024-10-28T06:05:34Z)
- Contextual Data Augmentation for Task-Oriented Dialog Systems [8.085645180329417]
We develop a novel dialog augmentation model that generates a user turn, conditioning on the full dialog context.
With a new prompt design for the language model and output re-ranking, the dialogs generated from our model can be directly used to train downstream dialog systems.
arXiv Detail & Related papers (2023-10-16T13:22:34Z)
- Multi-turn Dialogue Comprehension from a Topic-aware Perspective [70.37126956655985]
This paper proposes to model multi-turn dialogues from a topic-aware perspective.
We use a dialogue segmentation algorithm to split a dialogue passage into topic-concentrated fragments in an unsupervised way.
We also present a novel model, the Topic-Aware Dual-Attention Matching (TADAM) Network, which takes topic segments as processing elements.
arXiv Detail & Related papers (2023-09-18T11:03:55Z)
- Unsupervised Dialogue Topic Segmentation with Topic-aware Utterance Representation [51.22712675266523]
Dialogue Topic Segmentation (DTS) plays an essential role in a variety of dialogue modeling tasks.
We propose a novel unsupervised DTS framework, which learns topic-aware utterance representations from unlabeled dialogue data.
arXiv Detail & Related papers (2023-05-04T11:35:23Z)
- CTRLStruct: Dialogue Structure Learning for Open-Domain Response Generation [38.60073402817218]
Well-structured topic flow can leverage background information and predict future topics to help generate controllable and explainable responses.
We present a new framework for dialogue structure learning that effectively explores topic-level dialogue clusters as well as their transitions, using unlabelled information.
Experiments on two popular open-domain dialogue datasets show that our model generates more coherent responses than several strong dialogue models.
arXiv Detail & Related papers (2023-03-02T09:27:11Z)
- Manual-Guided Dialogue for Flexible Conversational Agents [84.46598430403886]
How to build and use dialogue data efficiently, and how to deploy models in different domains at scale, are critical issues in building a task-oriented dialogue system.
We propose a novel manual-guided dialogue scheme, where the agent learns the tasks from both dialogue and manuals.
Our proposed scheme reduces the dependence of dialogue models on fine-grained domain ontology and makes them more flexible to adapt to various domains.
arXiv Detail & Related papers (2022-08-16T08:21:12Z)
- Back to the Future: Bidirectional Information Decoupling Network for Multi-turn Dialogue Modeling [80.51094098799736]
We propose the Bidirectional Information Decoupling Network (BiDeN) as a universal dialogue encoder.
BiDeN explicitly incorporates both the past and future contexts and can be generalized to a wide range of dialogue-related tasks.
Experimental results on datasets of different downstream tasks demonstrate the universality and effectiveness of our BiDeN.
arXiv Detail & Related papers (2022-04-18T03:51:46Z)
- Ranking Enhanced Dialogue Generation [77.8321855074999]
How to effectively utilize the dialogue history is a crucial problem in multi-turn dialogue generation.
Previous works usually employ various neural network architectures to model the history.
This paper proposes a Ranking Enhanced Dialogue generation framework.
arXiv Detail & Related papers (2020-08-13T01:49:56Z)