
NLPeer: A Unified Resource for the Computational Study of Peer Review

Dycke, Nils ; Kuznetsov, Ilia ; Gurevych, Iryna (2024)
NLPeer: A Unified Resource for the Computational Study of Peer Review.
The 61st Annual Meeting of the Association for Computational Linguistics. Toronto, Canada (09.07.2023 - 14.07.2023)
doi: 10.26083/tuprints-00027661
Conference publication, secondary publication, publisher's version

Warning: A newer version of this entry is available.

Abstract

Peer review constitutes a core component of scholarly publishing; yet it demands substantial expertise and training, and is susceptible to errors and biases. Various applications of NLP for peer reviewing assistance aim to support reviewers in this complex process, but the lack of clearly licensed datasets and multi-domain corpora prevents the systematic study of NLP for peer review. To remedy this, we introduce NLPeer, the first ethically sourced multi-domain corpus of more than 5k papers and 11k review reports from five different venues. In addition to the new datasets of paper drafts, camera-ready versions and peer reviews from the NLP community, we establish a unified data representation and augment previous peer review datasets to include parsed and structured paper representations, rich metadata and versioning information. We complement our resource with implementations and analysis of three reviewing assistance tasks, including a novel guided skimming task. Our work paves the path towards a systematic, multi-faceted, evidence-based study of peer review in NLP and beyond. The data and code are publicly available.

Entry type: Conference publication
Published: 2024
Author(s): Dycke, Nils ; Kuznetsov, Ilia ; Gurevych, Iryna
Type of entry: Secondary publication
Title: NLPeer: A Unified Resource for the Computational Study of Peer Review
Language: English
Date of publication: 16 July 2024
Place: Darmstadt
Date of first publication: 2023
Place of first publication: Kerrville, TX, USA
Publisher: ACL
Book title: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Event title: The 61st Annual Meeting of the Association for Computational Linguistics
Event location: Toronto, Canada
Event dates: 09.07.2023 - 14.07.2023
DOI: 10.26083/tuprints-00027661
URL / URN: https://tuprints.ulb.tu-darmstadt.de/27661
Origin: Secondary publication service

ID number: 2023.acl-long.277
Status: Publisher's version
URN: urn:nbn:de:tuda-tuprints-276618
Dewey Decimal Classification (DDC): 000 Generalities, computer science, information science > 004 Computer science
Department(s)/field(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
Deposit date: 16 Jul 2024 12:16
Last modified: 18 Jul 2024 07:08