TU Darmstadt / ULB / TUbiblio

CiteBench: A Benchmark for Scientific Citation Text Generation

Funkquist, Martin ; Kuznetsov, Ilia ; Hou, Yufang ; Gurevych, Iryna (2023)
CiteBench: A Benchmark for Scientific Citation Text Generation.
2023 Conference on Empirical Methods in Natural Language Processing. Singapore (06.-10.12.2023)
doi: 10.18653/v1/2023.emnlp-main.455
Conference publication, Bibliography

Abstract

Science progresses by building upon the prior body of knowledge documented in scientific publications. The acceleration of research makes it hard to stay up-to-date with the recent developments and to summarize the ever-growing body of prior work. To address this, the task of citation text generation aims to produce accurate textual summaries given a set of papers-to-cite and the citing paper context. Due to otherwise rare explicit anchoring of cited documents in the citing paper, citation text generation provides an excellent opportunity to study how humans aggregate and synthesize textual knowledge from sources. Yet, existing studies are based upon widely diverging task definitions, which makes it hard to study this task systematically. To address this challenge, we propose CiteBench: a benchmark for citation text generation that unifies multiple diverse datasets and enables standardized evaluation of citation text generation models across task designs and domains. Using the new benchmark, we investigate the performance of multiple strong baselines, test their transferability between the datasets, and deliver new insights into the task definition and evaluation to guide future research in citation text generation. We make the code for CiteBench publicly available at https://github.com/UKPLab/citebench.

Entry type: Conference publication
Published: 2023
Author(s): Funkquist, Martin ; Kuznetsov, Ilia ; Hou, Yufang ; Gurevych, Iryna
Record type: Bibliography
Title: CiteBench: A Benchmark for Scientific Citation Text Generation
Language: English
Year of publication: December 2023
Place: Singapore
Publisher: Association for Computational Linguistics
Book title: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Event title: 2023 Conference on Empirical Methods in Natural Language Processing
Event location: Singapore
Event dates: 6–10 December 2023
DOI: 10.18653/v1/2023.emnlp-main.455
URL / URN: https://aclanthology.org/2023.emnlp-main.455/

Keywords: UKP_p_PEER, UKP_p_InterText, UKP_p_code_transformers
Department(s)/field(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
Date deposited: 18 Jan 2024 13:45
Last modified: 15 Mar 2024 12:02
PPN: 516341227