MOVER: Mask, Over-generate and Rank for Hyperbole Generation
- URL: http://arxiv.org/abs/2109.07726v1
- Date: Thu, 16 Sep 2021 05:25:13 GMT
- Title: MOVER: Mask, Over-generate and Rank for Hyperbole Generation
- Authors: Yunxiang Zhang, Xiaojun Wan
- Abstract summary: We introduce a new task of hyperbole generation to transfer a literal sentence into its hyperbolic paraphrase.
We construct HYPO-XL, the first large-scale hyperbole corpus (17,862 hyperbolic sentences), in a non-trivial way.
Based on our corpus, we propose an unsupervised method for hyperbole generation with no need for parallel literal-hyperbole pairs.
- Score: 82.63394952538292
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: Despite being a common figure of speech, hyperbole is under-researched with
only a few studies addressing its identification task. In this paper, we
introduce a new task of hyperbole generation to transfer a literal sentence
into its hyperbolic paraphrase. To tackle the lack of available hyperbolic
sentences, we construct HYPO-XL, the first large-scale hyperbole corpus (17,862
hyperbolic sentences), in a non-trivial way. Based on our
corpus, we propose an unsupervised method for hyperbole generation with no need
for parallel literal-hyperbole pairs. During training, we fine-tune BART to
infill masked hyperbolic spans of sentences from HYPO-XL. During inference, we
mask part of an input literal sentence and over-generate multiple possible
hyperbolic versions. Then a BERT-based ranker selects the best candidate by
hyperbolicity and paraphrase quality. Human evaluation results show that our
model is capable of generating hyperbolic paraphrase sentences and outperforms
several baseline systems.
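The following is a minimal sketch of the mask, over-generate and rank pipeline described above, assuming the Hugging Face Transformers API. The checkpoint names, the hand-written mask, and the single hyperbolicity score used for ranking are illustrative assumptions: the paper fine-tunes BART on HYPO-XL for span infilling and ranks candidates by both hyperbolicity and paraphrase quality.
```python
# A minimal sketch of the mask / over-generate / rank pipeline, assuming the
# Hugging Face Transformers API. Checkpoint names, the hand-written mask and
# the ranking criterion are illustrative, not the authors' released models.
import torch
from transformers import (
    BartForConditionalGeneration,
    BartTokenizer,
    BertForSequenceClassification,
    BertTokenizer,
)

# Infill model: the paper fine-tunes BART to reconstruct masked hyperbolic
# spans from HYPO-XL; here we fall back to the generic pretrained checkpoint.
bart_tok = BartTokenizer.from_pretrained("facebook/bart-large")
bart = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

# Ranker: a BERT classifier assumed to be fine-tuned for hyperbole detection
# (e.g. on HYPO-XL); the vanilla checkpoint is only a placeholder.
bert_tok = BertTokenizer.from_pretrained("bert-base-uncased")
ranker = BertForSequenceClassification.from_pretrained("bert-base-uncased")


def over_generate(masked_sentence: str, n: int = 10) -> list[str]:
    """Sample n candidate infills for a sentence containing a <mask> span."""
    inputs = bart_tok(masked_sentence, return_tensors="pt")
    outputs = bart.generate(
        **inputs,
        do_sample=True,
        top_p=0.9,
        num_return_sequences=n,
        max_length=60,
    )
    return bart_tok.batch_decode(outputs, skip_special_tokens=True)


def hyperbolicity(sentence: str) -> float:
    """Probability that the ranker labels the sentence as hyperbolic."""
    inputs = bert_tok(sentence, return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = ranker(**inputs).logits
    return torch.softmax(logits, dim=-1)[0, 1].item()


# Mask part of the literal input (a hand-written mask for illustration),
# over-generate candidates, then keep the highest-scoring one.
masked = "I waited <mask> for the bus this morning."
candidates = over_generate(masked)
best = max(candidates, key=hyperbolicity)
print(best)
```
In the paper the masked span is chosen automatically rather than by hand, and the ranker combines the hyperbolicity score with a paraphrase-quality score before selecting the final output.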
Related papers
- Hyperbolic vs Euclidean Embeddings in Few-Shot Learning: Two Sides of the Same Coin [49.12496652756007]
We show that the best few-shot results are attained for hyperbolic embeddings at a common hyperbolic radius.
In contrast to prior benchmark results, we demonstrate that better performance can be achieved by a fixed-radius encoder equipped with the Euclidean metric.
arXiv Detail & Related papers (2023-09-18T14:51:46Z)
- Image Matters: A New Dataset and Empirical Study for Multimodal Hyperbole Detection [52.04083398850383]
We create a multimodal hyperbole detection dataset from Weibo (a Chinese social media platform).
We treat the text and image of a Weibo post as two modalities and explore the role each plays in hyperbole detection.
Different pre-trained multimodal encoders are also evaluated on this downstream task to show their performance.
arXiv Detail & Related papers (2023-07-01T03:23:56Z)
- Hyperbolic Convolution via Kernel Point Aggregation [4.061135251278187]
We propose HKConv, a novel trainable hyperbolic convolution which first correlates trainable local hyperbolic features with fixed kernel points placed in the hyperbolic space.
We show that neural networks with HKConv layers advance state-of-the-art in various tasks.
arXiv Detail & Related papers (2023-06-15T05:15:13Z)
- A Match Made in Heaven: A Multi-task Framework for Hyperbole and Metaphor Detection [27.85834441076481]
Hyperbole and metaphor are common in day-to-day communication.
Existing approaches to automatically detect metaphor and hyperbole have studied these language phenomena independently.
We propose a multi-task deep learning framework to detect hyperbole and metaphor simultaneously.
arXiv Detail & Related papers (2023-05-27T14:17:59Z)
- HRCF: Enhancing Collaborative Filtering via Hyperbolic Geometric Regularization [52.369435664689995]
We introduce a Hyperbolic Regularization powered Collaborative Filtering (HRCF) method and design a geometric-aware hyperbolic regularizer.
Specifically, the proposal boosts the optimization procedure via root alignment and an origin-aware penalty.
Our proposal tackles the over-smoothing problem caused by hyperbolic aggregation and also gives the models better discriminative ability.
arXiv Detail & Related papers (2022-04-18T06:11:44Z)
- HypoGen: Hyperbole Generation with Commonsense and Counterfactual Knowledge [11.93269712166532]
A hyperbole is an intentional and creative exaggeration not to be taken literally.
We tackle the under-explored and challenging task of sentence-level hyperbole generation.
Our generation method is able to generate hyperboles creatively, with high success rates and intensity scores.
arXiv Detail & Related papers (2021-09-10T20:19:52Z)
- Semantic-Preserving Adversarial Text Attacks [85.32186121859321]
We propose a Bigram and Unigram based adaptive Semantic Preservation Optimization (BU-SPO) method to examine the vulnerability of deep models.
Compared with existing methods, our approach achieves the highest attack success rates and semantic preservation rates while changing the fewest words.
arXiv Detail & Related papers (2021-08-23T09:05:18Z)
- Long Text Generation by Modeling Sentence-Level and Discourse-Level Coherence [59.51720326054546]
We propose a long text generation model, which can represent the prefix sentences at sentence level and discourse level in the decoding process.
Our model can generate more coherent texts than state-of-the-art baselines.
arXiv Detail & Related papers (2021-05-19T07:29:08Z)
- Unsupervised Hyperbolic Representation Learning via Message Passing Auto-Encoders [29.088604461911892]
In this paper, we analyze how unsupervised tasks can benefit from learned representations in hyperbolic space.
To explore how well the hierarchical structure of unlabeled data can be represented in hyperbolic spaces, we design a novel hyperbolic message passing auto-encoder.
The proposed model auto-encodes the networks by fully utilizing hyperbolic geometry in message passing.
arXiv Detail & Related papers (2021-03-30T03:09:53Z)
This list is automatically generated from the titles and abstracts of the papers in this site.
This site does not guarantee the quality of this list (including all information) and is not responsible for any consequences.