
Missci: Reconstructing Fallacies in Misrepresented Science

Glockner, Max ; Hou, Yufang ; Nakov, Preslav ; Gurevych, Iryna (2024)
Missci: Reconstructing Fallacies in Misrepresented Science.
62nd Annual Meeting of the Association for Computational Linguistics. Bangkok, Thailand (11.08.2024 - 16.08.2024)
Conference publication, Bibliography

Abstract

Health-related misinformation on social networks can lead to poor decision-making and real-world dangers. Such misinformation often misrepresents scientific publications and cites them as “proof” to gain perceived credibility. To effectively counter such claims automatically, a system must explain how the claim was falsely derived from the cited publication. Current methods for automated fact-checking or fallacy detection neglect to assess the (mis)used evidence in relation to misinformation claims, which is required to detect the mismatch between them. To address this gap, we introduce Missci, a novel argumentation-theoretical model for fallacious reasoning, together with a new dataset of real-world misinformation that misrepresents biomedical publications. Unlike previous fallacy detection datasets, Missci (i) focuses on implicit fallacies between the relevant content of the cited publication and the inaccurate claim, and (ii) requires models to verbalize the fallacious reasoning in addition to classifying it. We present Missci as a dataset to test the critical reasoning abilities of large language models (LLMs) that are required to reconstruct real-world fallacious arguments, in a zero-shot setting. We evaluate two representative LLMs and the impact of different levels of detail about the fallacy classes provided to the LLM via prompts. Our experiments and human evaluation show promising results for GPT-4, while also demonstrating the difficulty of this task.
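The abstract describes a zero-shot prompting setup in which an LLM must both name a fallacy class and verbalize the fallacious reasoning that links the inaccurate claim to the cited publication. The Python sketch below only illustrates what such a prompt could look like; the fallacy class names, prompt wording, and use of the OpenAI chat API are illustrative assumptions, not the actual prompts, fallacy inventory, or evaluation code of the paper.

    # Illustrative zero-shot prompt for fallacy classification and verbalization.
    # The fallacy classes and wording are placeholders, not the Missci inventory.
    from openai import OpenAI

    FALLACY_CLASSES = [
        "Hasty Generalization",
        "False Equivalence",
        "Causal Oversimplification",
    ]

    def build_prompt(claim: str, publication_context: str) -> str:
        classes = "\n".join(f"- {c}" for c in FALLACY_CLASSES)
        return (
            "A social-media claim cites a scientific publication as evidence.\n"
            f"Claim: {claim}\n"
            f"Relevant content of the cited publication: {publication_context}\n\n"
            "1. Pick the fallacy class (from the list below) that best explains how "
            "the claim was falsely derived from the publication.\n"
            "2. Verbalize the implicit fallacious reasoning in one sentence.\n"
            f"Fallacy classes:\n{classes}"
        )

    def classify_and_verbalize(claim: str, publication_context: str) -> str:
        client = OpenAI()  # reads OPENAI_API_KEY from the environment
        response = client.chat.completions.create(
            model="gpt-4",
            messages=[{"role": "user",
                       "content": build_prompt(claim, publication_context)}],
            temperature=0.0,  # keep outputs as deterministic as possible for evaluation
        )
        return response.choices[0].message.content

Varying how much detail about each fallacy class appears in the prompt (name only, definition, or worked example) would correspond to the different levels of prompt detail that the abstract says were evaluated.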

Item type: Conference publication
Published: 2024
Author(s): Glockner, Max ; Hou, Yufang ; Nakov, Preslav ; Gurevych, Iryna
Type of entry: Bibliography
Title: Missci: Reconstructing Fallacies in Misrepresented Science
Language: English
Year of publication: August 2024
Publisher: ACL
Book title: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Event title: 62nd Annual Meeting of the Association for Computational Linguistics
Event location: Bangkok, Thailand
Event dates: 11.08.2024 - 16.08.2024
URL / URN: https://aclanthology.org/2024.acl-long.240/

Free keywords: UKP_p_square, UKP_p_seditrah_factcheck
Department(s)/field(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
Date deposited: 20 Aug 2024 09:00
Last modified: 20 Aug 2024 09:00