Stowe, Kevin ; Beck, Nils ; Gurevych, Iryna (2021)
Exploring Metaphoric Paraphrase Generation.
25th Conference on Computational Natural Language Learning (CoNLL 2021), virtual conference, 10-11 November 2021.
doi: 10.18653/v1/2021.conll-1.26
Conference publication, bibliography entry
Abstract
Metaphor generation is a difficult task, and has seen tremendous improvement with the advent of deep pretrained models. We focus here on the specific task of metaphoric paraphrase generation, in which we provide a literal sentence and generate a metaphoric sentence which paraphrases that input. We compare naive, “free” generation models with those that exploit forms of control over the generation process, adding additional information based on conceptual metaphor theory. We evaluate two methods for generating paired training data, which is then used to train T5 models for free and controlled generation. We use crowdsourcing to evaluate the results, showing that free models tend to generate more fluent paraphrases, while controlled models are better at generating novel metaphors. We then analyze evaluation metrics, showing that different metrics are necessary to capture different aspects of metaphoric paraphrasing. We release our data and models, as well as our annotated results in order to facilitate development of better evaluation metrics.
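As a rough illustration of the setup described in the abstract (a T5 model producing a metaphoric paraphrase of a literal input), the following is a minimal sketch using the Hugging Face transformers API. The checkpoint name, task prefix, and control-string format are assumptions for illustration only, not the authors' released models or data format; see the paper and its repository for the actual artifacts.

```python
# Minimal sketch of metaphoric paraphrase generation with a T5 model.
# NOTE: "t5-base" and the "paraphrase metaphorically:" prefix are illustrative
# assumptions, not the authors' released checkpoints or input format.
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "t5-base"  # placeholder; a fine-tuned paraphrase checkpoint would go here
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# "Free" generation: the model sees only the literal sentence.
literal = "The criticism hurt him deeply."
free_input = "paraphrase metaphorically: " + literal

# "Controlled" generation could additionally encode conceptual-metaphor
# information, e.g. a source-domain cue (format here is hypothetical).
controlled_input = "paraphrase metaphorically | source domain: PHYSICAL FORCE | " + literal

for text in (free_input, controlled_input):
    ids = tokenizer(text, return_tensors="pt").input_ids
    out = model.generate(ids, max_length=48, num_beams=5)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
```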
| Field | Value |
|---|---|
| Type of entry | Conference publication |
| Published | 2021 |
| Author(s) | Stowe, Kevin ; Beck, Nils ; Gurevych, Iryna |
| Type of record | Bibliography |
| Title | Exploring Metaphoric Paraphrase Generation |
| Language | English |
| Date of publication | 13 September 2021 |
| Place of publication | Stroudsburg, Pennsylvania, United States |
| Publisher | ACL |
| Book title | Proceedings of the 25th Conference on Computational Natural Language Learning |
| Event title | 25th Conference on Computational Natural Language Learning (CoNLL 2021) |
| Event location | Virtual conference |
| Event dates | 10-11 November 2021 |
| DOI | 10.18653/v1/2021.conll-1.26 |
| URL / URN | https://aclanthology.org/2021.conll-1.26 |
| Department(s) | 20 Department of Computer Science; 20 Department of Computer Science > Ubiquitous Knowledge Processing |
| Date deposited | 21 Sep 2021 14:06 |
| Last modified | 19 Dec 2024 10:40 |