Overview of PragTag-2023: Low-Resource Multi-Domain Pragmatic Tagging of Peer Reviews

Dycke, Nils ; Kuznetsov, Ilia ; Gurevych, Iryna (2023)
Overview of PragTag-2023: Low-Resource Multi-Domain Pragmatic Tagging of Peer Reviews.
2023 Conference on Empirical Methods in Natural Language Processing. Singapore (06.12.2023-10.12.2023)
doi: 10.18653/v1/2023.argmining-1.21
Conference publication, Bibliography

Abstract

Peer review is the key quality control mechanism in science. The core components of peer review are the review reports – argumentative texts where the reviewers evaluate the work and make suggestions to the authors. Reviewing is a demanding expert task prone to bias. An active line of research in NLP aims to support peer review via automatic analysis of review reports. This research faces two key challenges. First, NLP to date has focused on peer reviews from machine learning conferences. Yet, NLP models are prone to domain shift and might underperform when applied to reviews from a new research community. Second, while some venues make their reviewing processes public, peer reviewing data is generally hard to obtain and expensive to label. Approaches to low-data NLP processing for peer review remain under-investigated. Enabled by the recent release of open multi-domain corpora of peer reviews, the PragTag-2023 Shared Task explored ways to increase domain robustness and address data scarcity in pragmatic tagging – a sentence tagging task where review statements are classified by their argumentative function. This paper describes the shared task, outlines the participating systems, and summarizes the results.

Item type: Conference publication
Published: 2023
Author(s): Dycke, Nils ; Kuznetsov, Ilia ; Gurevych, Iryna
Entry type: Bibliography
Title: Overview of PragTag-2023: Low-Resource Multi-Domain Pragmatic Tagging of Peer Reviews
Language: English
Year of publication: 7 December 2023
Publisher: ACL
Book title: Proceedings of the 10th Workshop on Argument Mining
Event title: 2023 Conference on Empirical Methods in Natural Language Processing
Event location: Singapore
Event dates: 06.12.2023-10.12.2023
DOI: 10.18653/v1/2023.argmining-1.21
URL / URN: https://aclanthology.org/2023.argmining-1.21
Uncontrolled keywords: UKP_p_PEER, UKP_p_InterText
Additional information:

10th Workshop on Argument Mining, 07.12.2023; first publication

Department(s)/field(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
Date deposited: 20 Mar 2024 15:36
Last modified: 29 Jul 2024 08:07
PPN: 520189434