Hooks in the Headline: Learning to Generate Headlines with Controlled
Styles
- URL: http://arxiv.org/abs/2004.01980v3
- Date: Fri, 29 May 2020 02:21:37 GMT
- Title: Hooks in the Headline: Learning to Generate Headlines with Controlled
Styles
- Authors: Di Jin, Zhijing Jin, Joey Tianyi Zhou, Lisa Orii, Peter Szolovits
- Abstract summary: We propose a new task, Stylistic Headline Generation (SHG), to enrich the headlines with three style options.
TitleStylist generates style-specific headlines by combining the summarization and reconstruction tasks into a multitasking framework.
The attraction score of our model-generated headlines surpasses that of the state-of-the-art summarization model by 9.68%, and even outperforms human-written references.
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Current summarization systems only produce plain, factual headlines, but do
not meet the practical needs of creating memorable titles to increase exposure.
We propose a new task, Stylistic Headline Generation (SHG), to enrich the
headlines with three style options (humor, romance and clickbait), in order to
attract more readers. With no style-specific article-headline pair (only a
standard headline summarization dataset and mono-style corpora), our method
TitleStylist generates style-specific headlines by combining the summarization
and reconstruction tasks into a multitasking framework. We also introduce a
novel parameter-sharing scheme to further disentangle the style from the text.
Through both automatic and human evaluation, we demonstrate that TitleStylist
can generate relevant, fluent headlines with three target styles: humor,
romance, and clickbait. The attraction score of our model-generated headlines
surpasses that of the state-of-the-art summarization model by 9.68%, and even
outperforms human-written references.
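The multitask framework described in the abstract (jointly training one shared model on supervised headline summarization and on denoising reconstruction of mono-style corpora) can be sketched roughly as follows. This is a minimal illustrative sketch: the token-dropout noising scheme, the `alpha` mixing weight, and all function names are assumptions for exposition, not the authors' released implementation.

```python
import random

def add_noise(tokens, drop_prob=0.2, rng=None):
    """Corrupt a mono-style sentence for the denoising reconstruction task.
    Token dropout is an assumed noising scheme, not necessarily the paper's."""
    rng = rng or random.Random(0)
    kept = [t for t in tokens if rng.random() >= drop_prob]
    return kept if kept else tokens[:1]  # never hand the model an empty input

def multitask_loss(summarization_loss, reconstruction_loss, alpha=0.5):
    """Joint objective: a weighted sum of the supervised summarization loss
    (article -> plain headline) and the reconstruction loss (noisy style
    sentence -> original sentence). alpha is an assumed mixing weight."""
    return alpha * summarization_loss + (1.0 - alpha) * reconstruction_loss

# Example: with equal weighting, the joint loss is the average of the two.
loss = multitask_loss(2.0, 4.0, alpha=0.5)
```

Because the style corpora have no paired articles, only the reconstruction term touches them; the shared parameters are what let style learned there carry over to headline generation.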
Related papers
- StyleBART: Decorate Pretrained Model with Style Adapters for
Unsupervised Stylistic Headline Generation [13.064106986202294]
StyleBART is an unsupervised approach for stylistic headline generation.
Our method decorates the pretrained BART model with adapters that are responsible for different styles.
We show that StyleBART achieves new state-of-the-art performance in the unsupervised stylistic headline generation task.
arXiv Detail & Related papers (2023-10-26T19:31:22Z)
- Stylized Data-to-Text Generation: A Case Study in the E-Commerce Domain [53.22419717434372]
We propose a new task, namely stylized data-to-text generation, whose aim is to generate coherent text according to a specific style.
This task is non-trivial, due to three challenges: the logic of the generated text, unstructured style reference, and biased training samples.
We propose a novel stylized data-to-text generation model, named StyleD2T, comprising three components: logic planning-enhanced data embedding, mask-based style embedding, and unbiased stylized text generation.
arXiv Detail & Related papers (2023-05-05T03:02:41Z)
- Contrastive Learning enhanced Author-Style Headline Generation [15.391087541824279]
We propose a novel Seq2Seq model called CLH3G (Contrastive Learning enhanced Historical Headlines based Headline Generation).
By taking historical headlines into account, we can integrate the stylistic features of the author into our model, and generate a headline consistent with the author's style.
Experimental results show that historical headlines of the same user can improve the headline generation significantly.
arXiv Detail & Related papers (2022-11-07T04:51:03Z)
- Unsupervised Neural Stylistic Text Generation using Transfer learning and Adapters [66.17039929803933]
We propose a novel transfer-learning framework which updates only 0.3% of model parameters to learn style-specific attributes for response generation.
We learn style specific attributes from the PERSONALITY-CAPTIONS dataset.
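The 0.3% figure above corresponds to freezing the pretrained backbone and training only small adapter layers. A back-of-the-envelope sketch of that arithmetic (the parameter counts below are illustrative assumptions, not the paper's actual model sizes):

```python
def trainable_fraction(backbone_params: int, adapter_params: int) -> float:
    """Fraction of all parameters actually updated when the backbone is
    frozen and only the adapter layers are trained."""
    return adapter_params / (backbone_params + adapter_params)

# e.g. a hypothetical 400M-parameter backbone with ~1.2M adapter parameters
# yields roughly 0.3% trainable parameters.
frac = trainable_fraction(400_000_000, 1_200_000)
```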
arXiv Detail & Related papers (2022-10-07T00:09:22Z)
- Generating More Pertinent Captions by Leveraging Semantics and Style on Multi-Source Datasets [56.018551958004814]
This paper addresses the task of generating fluent descriptions by training on a non-uniform combination of data sources.
Large-scale datasets with noisy image-text pairs provide a sub-optimal source of supervision.
We propose to leverage and separate semantics and descriptive style through the incorporation of a style token and keywords extracted through a retrieval component.
arXiv Detail & Related papers (2021-11-24T19:00:05Z)
- Stylized Story Generation with Style-Guided Planning [38.791298336259146]
We propose a new task, stylized story generation, namely generating stories with a specified style given a leading context.
Our model can controllably generate emotion-driven or event-driven stories based on the ROCStories dataset.
arXiv Detail & Related papers (2021-05-18T15:55:38Z)
- The Style-Content Duality of Attractiveness: Learning to Write Eye-Catching Headlines via Disentanglement [59.58372539336339]
Eye-catching headlines function as the first device to trigger more clicks, bringing a reciprocal effect between producers and viewers.
We propose a Disentanglement-based Attractive Headline Generator (DAHG) that generates headlines which capture the attractive content while following the attractive style.
arXiv Detail & Related papers (2020-12-14T11:11:43Z)
- What's New? Summarizing Contributions in Scientific Literature [85.95906677964815]
We introduce a new task of disentangled paper summarization, which seeks to generate separate summaries for the paper contributions and the context of the work.
We extend the S2ORC corpus of academic articles by adding disentangled "contribution" and "context" reference labels.
We propose a comprehensive automatic evaluation protocol which reports the relevance, novelty, and disentanglement of generated outputs.
arXiv Detail & Related papers (2020-11-06T02:23:01Z)
- Generating Representative Headlines for News Stories [31.67864779497127]
Grouping articles that are reporting the same event into news stories is a common way of assisting readers in their news consumption.
It remains a challenging research problem to efficiently and effectively generate a representative headline for each story.
We develop a distant supervision approach to train large-scale generation models without any human annotation.
arXiv Detail & Related papers (2020-01-26T02:08:22Z)
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences.