Generating Pertinent and Diversified Comments with Topic-aware
Pointer-Generator Networks
- URL: http://arxiv.org/abs/2005.04396v1
- Date: Sat, 9 May 2020 09:04:09 GMT
- Title: Generating Pertinent and Diversified Comments with Topic-aware
Pointer-Generator Networks
- Authors: Junheng Huang, Lu Pan, Kang Xu, Weihua Peng, Fayuan Li
- Abstract summary: We propose a novel generation model based on Topic-aware Pointer-Generator Networks (TPGN).
We design a keyword-level and topic-level encoder attention mechanism to capture topic information in the articles.
We integrate the topic information into pointer-generator networks to guide comment generation.
- Score: 5.046104800241757
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Comment generation, a new and challenging task in Natural Language
Generation (NLG), has attracted a lot of attention in recent years. However,
comments generated by previous work tend to lack pertinence and diversity. In
this paper, we propose a novel generation model based on Topic-aware
Pointer-Generator Networks (TPGN), which utilizes the topic information hidden
in articles to guide the generation of pertinent and diversified comments.
First, we design a keyword-level and topic-level encoder attention mechanism to
capture topic information in the articles. Next, we integrate the topic
information into pointer-generator networks to guide comment generation.
Experiments on a large-scale comment generation dataset show that our model
produces valuable comments and significantly outperforms competitive baseline
models.
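The paper ships no reference code; as a rough illustration of the idea, the sketch below implements one decoding step of a pointer-generator whose generate/copy mixture is conditioned on an extra topic-attention context. All module names and tensor shapes here are hypothetical, not the authors' implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopicAwarePointerGeneratorStep(nn.Module):
    """One decoding step of a pointer-generator with an extra topic context.

    Hypothetical sketch: `hid` is the hidden size, `vocab_size` the target
    vocabulary size. Not the authors' implementation.
    """
    def __init__(self, hid, vocab_size):
        super().__init__()
        self.word_attn = nn.Linear(2 * hid, 1)   # attention over encoder word states
        self.topic_attn = nn.Linear(2 * hid, 1)  # attention over keyword/topic states
        self.out = nn.Linear(3 * hid, vocab_size)
        self.gen_gate = nn.Linear(3 * hid, 1)    # generate-vs-copy switch

    def forward(self, dec_h, enc_h, topic_h, src_ids):
        # dec_h: (B, hid); enc_h: (B, Ts, hid); topic_h: (B, Tk, hid)
        # src_ids: (B, Ts) source token ids, used for the copy distribution.
        q = dec_h.unsqueeze(1)
        a_w = F.softmax(self.word_attn(
            torch.cat([q.expand_as(enc_h), enc_h], -1)).squeeze(-1), -1)
        a_t = F.softmax(self.topic_attn(
            torch.cat([q.expand_as(topic_h), topic_h], -1)).squeeze(-1), -1)
        ctx_w = (a_w.unsqueeze(-1) * enc_h).sum(1)    # source word context
        ctx_t = (a_t.unsqueeze(-1) * topic_h).sum(1)  # topic context steers word choice
        feats = torch.cat([dec_h, ctx_w, ctx_t], -1)
        p_vocab = F.softmax(self.out(feats), -1)
        p_gen = torch.sigmoid(self.gen_gate(feats))
        # copy distribution: scatter word-attention mass onto source token ids
        p_copy = torch.zeros_like(p_vocab).scatter_add_(1, src_ids, a_w)
        return p_gen * p_vocab + (1 - p_gen) * p_copy
```

The final distribution mixes a vocabulary softmax with a copy distribution over source tokens, which is what lets a pointer-generator reuse topical words verbatim from the article.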
Related papers
- Recommendation with Generative Models [35.029116616023586]
Generative models are AI models capable of creating new instances of data by learning and sampling from their statistical distributions.
These models have applications across various domains, such as image generation, text synthesis, and music composition.
In recommender systems, generative models, referred to as Gen-RecSys, improve the accuracy and diversity of recommendations.
arXiv Detail & Related papers (2024-09-18T18:29:15Z)
- Exploring Precision and Recall to assess the quality and diversity of LLMs [82.21278402856079]
We introduce a novel evaluation framework for Large Language Models (LLMs) such as Llama-2 and Mistral.
This approach allows for a nuanced assessment of the quality and diversity of generated text without the need for aligned corpora.
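The abstract does not spell out the estimator; a common way to make distribution-level precision and recall concrete (borrowed from the generative-modeling literature, and not necessarily this paper's exact method) is k-nearest-neighbor support estimation over text embeddings, sketched below with placeholder random embeddings.

```python
import numpy as np

def knn_radii(x, k=3):
    # distance from each point to its k-th nearest neighbor (excluding itself)
    d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(d, np.inf)
    return np.sort(d, axis=1)[:, k - 1]

def support_coverage(a, b, k=3):
    # fraction of points in `a` that land inside the estimated support of `b`
    radii = knn_radii(b, k)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=-1)
    return float(np.mean((d <= radii[None, :]).any(axis=1)))

# precision: generated samples inside real support (quality);
# recall: real samples inside generated support (diversity).
real = np.random.randn(200, 64)  # placeholder sentence embeddings
gen = np.random.randn(200, 64)
precision = support_coverage(gen, real)
recall = support_coverage(real, gen)
```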
arXiv Detail & Related papers (2024-02-16T13:53:26Z)
- Controllable Topic-Focused Abstractive Summarization [57.8015120583044]
Controlled abstractive summarization focuses on producing condensed versions of a source article to cover specific aspects.
This paper presents a new Transformer-based architecture capable of producing topic-focused summaries.
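The paper's architecture is not reproduced here; as a point of reference, a much simpler baseline for topic-focused summarization steers a generic seq2seq summarizer by prepending the target topic to the input, e.g. with Hugging Face `transformers`. The topic string and article below are placeholders.

```python
from transformers import pipeline

# A simpler baseline than the paper's architecture: bias a generic
# seq2seq summarizer toward a topic via a prompt prefix.
summarizer = pipeline("summarization", model="t5-small")

article = (
    "The city council debated a new transit plan alongside proposals "
    "to cut building emissions and expand rooftop solar incentives."
)
topic = "climate policy"  # hypothetical target topic

# T5 is prompt-driven; the topic-bearing prefix biases content selection.
summary = summarizer(f"summarize the {topic} aspects: {article}",
                     max_length=40, min_length=10)[0]["summary_text"]
print(summary)
```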
arXiv Detail & Related papers (2023-11-12T03:51:38Z)
- An Overview on Controllable Text Generation via Variational Auto-Encoders [15.97186478109836]
Recent advances in neural-based generative modeling have reignited the hopes of having computer systems capable of conversing with humans.
Latent variable models (LVM) such as variational auto-encoders (VAEs) are designed to characterize the distributional pattern of textual data.
This overview gives an introduction to existing generation schemes, problems associated with text variational auto-encoders, and a review of several applications of controllable generation.
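For orientation, a minimal bag-of-words text VAE (illustrative only; the survey covers much richer sequence models) can be written as follows. Controllable generation then amounts to steering the latent z before decoding.

```python
import torch
import torch.nn as nn

class TextVAE(nn.Module):
    """Minimal bag-of-words text VAE sketch (not from the survey itself)."""
    def __init__(self, vocab=5000, hid=256, z_dim=32):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(vocab, hid), nn.ReLU())
        self.mu = nn.Linear(hid, z_dim)
        self.logvar = nn.Linear(hid, z_dim)
        self.dec = nn.Linear(z_dim, vocab)

    def forward(self, x_bow):
        h = self.enc(x_bow)
        mu, logvar = self.mu(h), self.logvar(h)
        z = mu + torch.randn_like(mu) * (0.5 * logvar).exp()  # reparameterization
        logits = self.dec(z)
        # multinomial reconstruction term plus KL to the standard normal prior
        recon = -(x_bow * torch.log_softmax(logits, -1)).sum(-1).mean()
        kl = -0.5 * (1 + logvar - mu.pow(2) - logvar.exp()).sum(-1).mean()
        return recon + kl  # negative ELBO to minimize
```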
arXiv Detail & Related papers (2022-11-15T07:36:11Z)
- Unsupervised Neural Stylistic Text Generation using Transfer learning and Adapters [66.17039929803933]
We propose a novel transfer learning framework which updates only 0.3% of model parameters to learn style-specific attributes for response generation.
We learn style-specific attributes from the PERSONALITY-CAPTIONS dataset.
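A sketch of the general adapter recipe behind such parameter counts (not the paper's exact modules): freeze the backbone and train only small bottleneck layers applied residually, so the trainable share of parameters stays tiny.

```python
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter applied residually to a frozen backbone (sketch)."""
    def __init__(self, dim=768, bottleneck=16):
        super().__init__()
        self.down = nn.Linear(dim, bottleneck)  # project down
        self.up = nn.Linear(bottleneck, dim)    # project back up
        self.act = nn.ReLU()

    def forward(self, h):
        return h + self.up(self.act(self.down(h)))  # residual update

def add_style_adapters(model, num_layers, dim=768):
    # Freeze every backbone parameter; only the adapters will train,
    # which keeps the trainable fraction tiny (the paper reports ~0.3%).
    for p in model.parameters():
        p.requires_grad = False
    return nn.ModuleList(Adapter(dim) for _ in range(num_layers))
```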
arXiv Detail & Related papers (2022-10-07T00:09:22Z)
- Knowledge-Aware Bayesian Deep Topic Model [50.58975785318575]
We propose a Bayesian generative model for incorporating prior domain knowledge into hierarchical topic modeling.
Our proposed model efficiently integrates the prior knowledge and improves both hierarchical topic discovery and document representation.
arXiv Detail & Related papers (2022-09-20T09:16:05Z)
- Twist Decoding: Diverse Generators Guide Each Other [116.20780037268801]
We introduce Twist decoding, a simple and general inference algorithm that generates text while benefiting from diverse models.
Our method does not assume the vocabulary, tokenization or even generation order is shared.
arXiv Detail & Related papers (2022-05-19T01:27:53Z)
- Generating Diversified Comments via Reader-Aware Topic Modeling and Saliency Detection [25.16392119801612]
We propose a reader-aware topic modeling and saliency information detection framework to enhance the quality of generated comments.
For reader-aware topic modeling, we design a variational generative clustering algorithm for latent semantic learning and topic mining from reader comments.
For saliency information detection, we introduce Bernoulli distribution estimation on the news content to select salient information.
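A minimal sketch of Bernoulli-style saliency selection, assuming precomputed sentence representations; the straight-through trick shown is one common way to keep the sampling differentiable and may differ from the paper's estimator.

```python
import torch
import torch.nn as nn

class SaliencySelector(nn.Module):
    """Score each news sentence and sample a Bernoulli keep/drop mask (sketch)."""
    def __init__(self, hid=256):
        super().__init__()
        self.score = nn.Linear(hid, 1)

    def forward(self, sent_h):
        # sent_h: (num_sentences, hid) sentence representations
        p_keep = torch.sigmoid(self.score(sent_h)).squeeze(-1)  # Bernoulli params
        if self.training:
            # hard samples with gradients via the straight-through estimator
            mask = torch.bernoulli(p_keep)
            mask = mask + p_keep - p_keep.detach()
        else:
            mask = (p_keep > 0.5).float()
        return sent_h * mask.unsqueeze(-1)  # keep only salient sentences
```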
arXiv Detail & Related papers (2021-02-13T03:50:31Z)
- Neural Topic Modeling with Cycle-Consistent Adversarial Training [17.47328718035538]
We propose Topic Modeling with Cycle-consistent Adversarial Training (ToMCAT) and its supervised version sToMCAT.
ToMCAT employs a generator network to interpret topics and an encoder network to infer document topics.
sToMCAT extends ToMCAT by incorporating document labels into the topic modeling process to help discover more coherent topics.
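Schematically, the cycle-consistency part of such training pairs the two networks in both directions; the sketch below shows only the reconstruction terms, omitting ToMCAT's adversarial discriminators and full training loop, and the exact losses may differ.

```python
import torch
import torch.nn.functional as F

def cycle_losses(G, E, docs, n_topics=50):
    """Schematic cycle-consistency terms (not ToMCAT's full objective).

    G: topic proportions -> document word distributions (generator)
    E: documents -> topic proportions (encoder)
    """
    # document cycle: doc -> inferred topics -> reconstructed doc
    doc_cycle = F.mse_loss(G(E(docs)), docs)
    # topic cycle: sampled topics -> synthetic doc -> recovered topics
    z = torch.distributions.Dirichlet(
        torch.ones(n_topics)).sample((docs.size(0),))
    topic_cycle = F.mse_loss(E(G(z)), z)
    return doc_cycle + topic_cycle
```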
arXiv Detail & Related papers (2020-09-29T12:41:27Z)
- Injecting Entity Types into Entity-Guided Text Generation [39.96689831978859]
In this paper, we aim to model the entity type in the decoding phase to generate contextual words accurately.
Our model has a multi-step decoder that injects the entity types into the process of entity mention generation.
Experiments on two public news datasets demonstrate that type injection performs better than existing type-embedding concatenation baselines.
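For contrast, the concatenation baseline that the paper reports improving on can be sketched in a few lines (hypothetical sizes; the paper's multi-step injection decoder is more involved).

```python
import torch
import torch.nn as nn

class TypeConcatDecoderStep(nn.Module):
    """The type-embedding concatenation baseline (hypothetical sizes);
    the paper's multi-step injection decoder is reported to beat this."""
    def __init__(self, hid=512, n_types=18, vocab_size=32000):
        super().__init__()
        self.type_emb = nn.Embedding(n_types, hid)
        self.proj = nn.Linear(2 * hid, vocab_size)

    def forward(self, dec_h, type_id):
        # dec_h: (B, hid) decoder state; type_id: (B,) entity-type index
        t = self.type_emb(type_id)                         # e.g., PERSON, ORG, ...
        logits = self.proj(torch.cat([dec_h, t], dim=-1))  # type-aware word scores
        return torch.log_softmax(logits, dim=-1)
```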
arXiv Detail & Related papers (2020-09-28T15:19:28Z)
- Topic Adaptation and Prototype Encoding for Few-Shot Visual Storytelling [81.33107307509718]
We propose a topic adaptive storyteller to model inter-topic generalization.
We also propose a prototype encoding structure to model intra-topic derivation.
Experimental results show that topic adaptation and prototype encoding structure mutually bring benefit to the few-shot model.
arXiv Detail & Related papers (2020-08-11T03:55:11Z)