Metaphoric Paraphrase Generation
- URL: http://arxiv.org/abs/2002.12854v1
- Date: Fri, 28 Feb 2020 16:30:33 GMT
- Title: Metaphoric Paraphrase Generation
- Authors: Kevin Stowe and Leonardo Ribeiro and Iryna Gurevych
- Abstract summary: We use crowdsourcing to evaluate our results and also develop an automatic metric for evaluating metaphoric paraphrases.
We show that while the lexical replacement baseline is capable of producing accurate paraphrases, its outputs often lack metaphoricity.
Our metaphor masking model excels at generating metaphoric sentences while performing nearly as well in terms of fluency and paraphrase quality.
- Score: 58.592750281138265
- License: http://arxiv.org/licenses/nonexclusive-distrib/1.0/
- Abstract: This work describes the task of metaphoric paraphrase generation, in which we
are given a literal sentence and are charged with generating a metaphoric
paraphrase. We propose two different models for this task: a lexical
replacement baseline and a novel sequence-to-sequence model, 'metaphor
masking', that generates free metaphoric paraphrases. We use crowdsourcing to
evaluate our results and also develop an automatic metric for evaluating
metaphoric paraphrases. We show that while the lexical replacement baseline is
capable of producing accurate paraphrases, its outputs often lack
metaphoricity, whereas our metaphor masking model excels at generating
metaphoric sentences while performing nearly as well in terms of fluency and
paraphrase quality.
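To make the metaphor masking idea concrete, here is a minimal sketch. The <metaphor> sentinel, the pre-identified verbs, and the t5-small checkpoint are illustrative assumptions rather than the paper's exact setup, and an off-the-shelf checkpoint would still need fine-tuning on such masked pairs before it produced genuinely metaphoric output:

```python
# Minimal sketch of the metaphor-masking idea. The mask token, the
# pre-identified verbs, and the t5-small checkpoint are illustrative
# assumptions, not the paper's exact configuration.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

MASK = "<metaphor>"  # hypothetical sentinel standing in for the masked verb

def mask_verb(sentence: str, verb: str) -> str:
    """Replace the (already identified) target verb with the mask token."""
    return sentence.replace(verb, MASK, 1)

# Training pair: the input masks the metaphoric verb, the target restores it,
# so the model learns to fill the slot with metaphoric language.
train_input = mask_verb("The fire swallowed the forest.", "swallowed")
train_target = "The fire swallowed the forest."

# At test time, mask the *literal* verb instead and let the model rewrite it.
tokenizer = AutoTokenizer.from_pretrained("t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
inputs = tokenizer(mask_verb("The fire destroyed the forest.", "destroyed"),
                   return_tensors="pt")
outputs = model.generate(**inputs, num_beams=4, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```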
Related papers
- Verifying Claims About Metaphors with Large-Scale Automatic Metaphor Identification [14.143299702954023]
This study presents a large-scale, corpus-based analysis of existing claims about verb metaphors by applying metaphor detection to sentences extracted from Common Crawl.
The verification results indicate that the direct objects of verbs used as metaphors tend to have lower degrees of concreteness, imageability, and familiarity, and that metaphors are more likely to be used in emotional and subjective sentences.
arXiv Detail & Related papers (2024-04-01T10:17:45Z)
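The reported tendency is easy to picture with a toy computation; the mini concreteness lexicon (a stand-in for Brysbaert-style 1-5 norms) and the detector output triples below are fabricated for illustration:

```python
# Toy version of the corpus analysis: compare mean concreteness of direct
# objects of metaphoric vs. literal verb uses. Lexicon values and usage
# triples are made up for illustration.
from statistics import mean

concreteness = {"table": 4.9, "forest": 4.8, "idea": 1.6, "hope": 1.4}  # 1-5 scale

# (verb, direct object, flagged as metaphoric by the detector)
usages = [
    ("swallow", "idea", True),
    ("devour", "hope", True),
    ("eat", "table", False),
    ("burn", "forest", False),
]

metaphoric = [concreteness[obj] for _, obj, m in usages if m]
literal = [concreteness[obj] for _, obj, m in usages if not m]
print(f"metaphoric objects: {mean(metaphoric):.2f}  literal objects: {mean(literal):.2f}")
```

The same comparison would be run against imageability and familiarity norms.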
- ParaAMR: A Large-Scale Syntactically Diverse Paraphrase Dataset by AMR Back-Translation [59.91139600152296]
ParaAMR is a large-scale syntactically diverse paraphrase dataset created by Abstract Meaning Representation (AMR) back-translation.
We show that ParaAMR can be used to improve on three NLP tasks: learning sentence embeddings, syntactically controlled paraphrase generation, and data augmentation for few-shot learning.
arXiv Detail & Related papers (2023-05-26T02:27:33Z)
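The parse-then-generate backbone of AMR back-translation can be sketched with the amrlib package, assuming its pretrained parse (stog) and generate (gtos) models are installed; ParaAMR's distinctive step, permuting the linearization of the parsed graph so that each permutation yields a syntactically different paraphrase, is noted in a comment rather than implemented here:

```python
# Round-trip sketch of AMR back-translation, assuming amrlib and its
# pretrained models are installed. ParaAMR additionally permutes the
# linearized AMR graph before generation so each permutation yields a
# syntactically different paraphrase.
import amrlib

stog = amrlib.load_stog_model()  # sentence -> AMR graph
gtos = amrlib.load_gtos_model()  # AMR graph -> sentence

graphs = stog.parse_sents(["The committee rejected the proposal because it was too costly."])
paraphrases, _ = gtos.generate(graphs)
print(paraphrases[0])
```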
- Metaphorical Polysemy Detection: Conventional Metaphor meets Word Sense Disambiguation [9.860944032009847]
Linguists distinguish between novel and conventional metaphor, a distinction which the metaphor detection task in NLP does not take into account.
In this paper, we investigate the limitations of treating conventional metaphors in this way.
We develop the first metaphorical polysemy detection (MPD) model, which learns to identify conventional metaphors in the English WordNet.
arXiv Detail & Related papers (2022-12-16T10:39:22Z)
- Metaphorical Paraphrase Generation: Feeding Metaphorical Language Models with Literal Texts [2.6397379133308214]
The proposed algorithm focuses not only on verbs but also on nouns and adjectives.
Human evaluation showed that our system-generated metaphors are considered more creative and metaphorical than human-generated ones.
arXiv Detail & Related papers (2022-10-10T15:11:27Z)
- Towards Document-Level Paraphrase Generation with Sentence Rewriting and Reordering [88.08581016329398]
We propose CoRPG (Coherence Relationship guided Paraphrase Generation) for document-level paraphrase generation.
We use a graph GRU to encode the coherence relationship graph and obtain a coherence-aware representation for each sentence.
Our model can generate document-level paraphrases with greater diversity and better semantic preservation.
arXiv Detail & Related papers (2021-09-15T05:53:40Z)
- Metaphor Generation with Conceptual Mappings [58.61307123799594]
We aim to generate a metaphoric sentence given a literal expression by replacing relevant verbs.
We propose to control the generation process by encoding conceptual mappings between cognitive domains.
We show that the unsupervised CM-Lex model is competitive with recent deep learning metaphor generation systems.
arXiv Detail & Related papers (2021-06-02T15:27:05Z)
- Interpreting Verbal Metaphors by Paraphrasing [12.750941606061877]
We show that our paraphrasing method significantly outperforms the state-of-the-art baseline.
We also demonstrate that our method can help a machine translation system improve its accuracy in translating English metaphors into eight target languages.
arXiv Detail & Related papers (2021-04-07T21:00:23Z)
- MERMAID: Metaphor Generation with Symbolism and Discriminative Decoding [22.756157298168127]
Based on a theoretically grounded connection between metaphors and symbols, we propose a method to automatically construct a parallel corpus.
For the generation task, we incorporate a metaphor discriminator to guide the decoding of a sequence-to-sequence model fine-tuned on our parallel data.
A task-based evaluation shows that human-written poems enhanced with metaphors are preferred 68% of the time compared to poems without metaphors.
arXiv Detail & Related papers (2021-03-11T16:39:19Z)
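One common way to realize discriminator-guided decoding is candidate reranking, sketched below. The checkpoints, the label-1-means-metaphoric convention, and reranking whole candidates (rather than guiding each decoding step) are illustrative simplifications, not MERMAID's exact method:

```python
# Sketch of discriminator-guided decoding via reranking: generate beam
# candidates with a seq2seq model, then keep the candidate a metaphor
# classifier scores highest. Checkpoints and the label-1-is-metaphoric
# convention are illustrative assumptions.
import torch
from transformers import (AutoModelForSeq2SeqLM,
                          AutoModelForSequenceClassification, AutoTokenizer)

gen_tok = AutoTokenizer.from_pretrained("facebook/bart-base")
generator = AutoModelForSeq2SeqLM.from_pretrained("facebook/bart-base")
disc_tok = AutoTokenizer.from_pretrained("distilbert-base-uncased")
discriminator = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased")  # stand-in; MERMAID trains a real metaphor discriminator

def rerank(literal: str, n_beams: int = 8) -> str:
    """Return the beam candidate the discriminator rates most metaphoric."""
    beams = generator.generate(**gen_tok(literal, return_tensors="pt"),
                               num_beams=n_beams, num_return_sequences=n_beams,
                               max_new_tokens=40)
    candidates = [gen_tok.decode(b, skip_special_tokens=True) for b in beams]
    with torch.no_grad():
        logits = discriminator(**disc_tok(candidates, return_tensors="pt",
                                          padding=True, truncation=True)).logits
    return candidates[int(logits[:, 1].argmax())]  # assume label 1 = metaphoric

print(rerank("The news made her very sad."))
```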
- Generating similes effortlessly like a Pro: A Style Transfer Approach for Simile Generation [65.22565071742528]
Figurative language such as similes goes beyond plain expression to give readers new insights and inspiration.
Generating a simile requires a proper understanding of both concepts so that their properties can be mapped effectively.
We show that replacing literal sentences in machine-generated stories with similes from our best model improves evocativeness and leads to better acceptance by human judges.
arXiv Detail & Related papers (2020-09-18T17:37:13Z)
- Neural Syntactic Preordering for Controlled Paraphrase Generation [57.5316011554622]
Our work uses syntactic transformations to softly "reorder" the source sentence and guide our neural paraphrasing model.
First, given an input sentence, we derive a set of feasible syntactic rearrangements using an encoder-decoder model.
Next, we use each proposed rearrangement to produce a sequence of position embeddings, which encourages our final encoder-decoder paraphrase model to attend to the source words in a particular order.
arXiv Detail & Related papers (2020-05-05T09:02:25Z)
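The position-embedding trick lends itself to a few-line illustration: give the encoder position ids that follow the proposed word order rather than the left-to-right order. This is a reconstruction from the abstract, not the authors' code:

```python
# Illustrative reconstruction of the position-id trick: each source token
# receives the position it would occupy under the proposed rearrangement,
# nudging the paraphraser to attend to source words in that order.
def reordered_position_ids(n_tokens: int, order: list[int]) -> list[int]:
    """order[k] = index of the token proposed to appear in slot k."""
    rank = {tok: slot for slot, tok in enumerate(order)}
    return [rank[i] for i in range(n_tokens)]

# "the cat sat" rearranged as "sat the cat" -> position ids [1, 2, 0]
print(reordered_position_ids(3, [2, 0, 1]))
```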
This list is automatically generated from the titles and abstracts of the papers on this site.
This site does not guarantee the quality of its content (including all information) and is not responsible for any consequences of its use.