
Missing Counter-Evidence Renders NLP Fact-Checking Unrealistic for Misinformation

Glockner, Max ; Hou, Yufang ; Gurevych, Iryna (2022)
Missing Counter-Evidence Renders NLP Fact-Checking Unrealistic for Misinformation.
2022 Conference on Empirical Methods in Natural Language Processing. Abu Dhabi, United Arab Emirates (07.12.2022-11.12.2022)
Conference publication, Bibliography

Abstract

Misinformation emerges in times of uncertainty when credible information is limited. This is challenging for NLP-based fact-checking as it relies on counter-evidence, which may not yet be available. Despite increasing interest in automatic fact-checking, it is still unclear if automated approaches can realistically refute harmful real-world misinformation. Here, we contrast and compare NLP fact-checking with how professional fact-checkers combat misinformation in the absence of counter-evidence. In our analysis, we show that, by design, existing NLP task definitions for fact-checking cannot refute misinformation as professional fact-checkers do for the majority of claims. We then define two requirements that the evidence in datasets must fulfill for realistic fact-checking: It must be (1) sufficient to refute the claim and (2) not leaked from existing fact-checking articles. We survey existing fact-checking datasets and find that all of them fail to satisfy both criteria. Finally, we perform experiments to demonstrate that models trained on a large-scale fact-checking dataset rely on leaked evidence, which makes them unsuitable in real-world scenarios. Taken together, we show that current NLP fact-checking cannot realistically combat real-world misinformation because it depends on unrealistic assumptions about counter-evidence in the data.
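Requirement (2), that evidence must not be leaked from existing fact-checking articles, can be illustrated with a simple source-based filter: discard any evidence document that itself originates from a fact-checking outlet, since such an article would not have existed before the claim was debunked. The Python sketch below is only a hypothetical illustration of that criterion, not the method used in the paper; the domain list, function names, and URL-based heuristic are assumptions made here.

```python
from urllib.parse import urlparse

# Hypothetical list of fact-checking outlets; the paper does not prescribe
# this set, it is only an illustrative assumption for this sketch.
FACT_CHECKING_DOMAINS = {
    "snopes.com",
    "politifact.com",
    "factcheck.org",
    "fullfact.org",
}

def is_leaked_evidence(evidence_url: str) -> bool:
    """Flag evidence that originates from a fact-checking article.

    Evidence taken from an existing fact-checking article would not have
    been available before the claim was debunked, so it violates
    requirement (2) from the abstract.
    """
    host = urlparse(evidence_url).netloc.lower()
    # Strip a leading "www." so "www.snopes.com" matches "snopes.com".
    host = host.removeprefix("www.")
    return host in FACT_CHECKING_DOMAINS

def filter_realistic_evidence(evidence_urls: list[str]) -> list[str]:
    """Keep only evidence documents that are not leaked from fact-checkers."""
    return [url for url in evidence_urls if not is_leaked_evidence(url)]

if __name__ == "__main__":
    urls = [
        "https://www.snopes.com/fact-check/some-claim/",
        "https://www.who.int/news/item/some-report",
    ]
    print(filter_realistic_evidence(urls))  # only the non-fact-checking URL remains
```

Note that such a filter addresses only the leakage criterion; requirement (1), that the remaining evidence is actually sufficient to refute the claim, cannot be established by a source filter alone.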

Item type: Conference publication
Published: 2022
Author(s): Glockner, Max ; Hou, Yufang ; Gurevych, Iryna
Entry type: Bibliography
Title: Missing Counter-Evidence Renders NLP Fact-Checking Unrealistic for Misinformation
Language: English
Year of publication: December 2022
Publisher: ACL
Book title: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Event title: 2022 Conference on Empirical Methods in Natural Language Processing
Event location: Abu Dhabi, United Arab Emirates
Event dates: 07.12.2022-11.12.2022
URL / URN: https://aclanthology.org/2022.emnlp-main.397

Uncontrolled keywords: UKP_p_texprax, UKP_p_seditrah_factcheck
Department(s)/division(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
Date deposited: 27 Feb 2023 15:22
Last modified: 13 Jun 2023 16:30
PPN: 507922670