Representation Learning for Answer Selection with LSTM-Based Importance Weighting

Rücklé, Andreas ; Gurevych, Iryna (2017)
Representation Learning for Answer Selection with LSTM-Based Importance Weighting.
Montpellier, France
Conference publication, Bibliography

Abstract

We present an approach to non-factoid answer selection with a separate component based on BiLSTM to determine the importance of segments in the input. In contrast to other recently proposed attention-based models within the same area, we determine the importance while assuming the independence of questions and candidate answers. Experimental results show the effectiveness of our approach, which outperforms several state-of-the-art attention-based models on the recent non-factoid answer selection datasets InsuranceQA v1 and v2. We show that it is possible to perform effective importance weighting for answer selection without relying on the relatedness of questions and answers. The source code of our experiments is publicly available.
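The abstract describes the architecture only at a high level. The following is a minimal sketch (in PyTorch, not the authors' released code) of how LSTM-based importance weighting with question/answer independence can be realized: a BiLSTM encodes a sequence, a separate scoring layer assigns each position a weight computed from that sequence alone, and the weighted hidden states are pooled into a fixed-size vector; questions and candidate answers are encoded independently and then compared, e.g. by cosine similarity. All layer sizes, names, and the similarity function are illustrative assumptions.

import torch
import torch.nn as nn


class ImportanceWeightedEncoder(nn.Module):
    """Sketch of a BiLSTM encoder with input-only importance weighting."""

    def __init__(self, embed_dim: int = 100, hidden_dim: int = 128):
        super().__init__()
        # BiLSTM producing contextualized representations of the input segments.
        self.bilstm = nn.LSTM(embed_dim, hidden_dim,
                              bidirectional=True, batch_first=True)
        # Separate importance component: scores each position using only the
        # sequence's own hidden states (no question-answer interaction).
        self.importance = nn.Linear(2 * hidden_dim, 1)

    def forward(self, embeddings: torch.Tensor) -> torch.Tensor:
        # embeddings: (batch, seq_len, embed_dim)
        states, _ = self.bilstm(embeddings)           # (batch, seq_len, 2*hidden)
        scores = self.importance(states).squeeze(-1)  # (batch, seq_len)
        weights = torch.softmax(scores, dim=-1)       # importance distribution
        # Weighted pooling into a fixed-size representation.
        return torch.sum(weights.unsqueeze(-1) * states, dim=1)


# Question and answer are encoded independently; candidates are ranked by
# similarity of the pooled representations (dummy embeddings shown here).
encoder = ImportanceWeightedEncoder()
question = torch.randn(1, 12, 100)
answer = torch.randn(1, 60, 100)
score = torch.cosine_similarity(encoder(question), encoder(answer), dim=-1)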

Type of entry: Conference publication
Published: 2017
Author(s): Rücklé, Andreas ; Gurevych, Iryna
Kind of entry: Bibliography
Title: Representation Learning for Answer Selection with LSTM-Based Importance Weighting
Language: English
Year of publication: September 2017
Publisher: Association for Computational Linguistics
Book title: Proceedings of the 12th International Conference on Computational Semantics (IWCS 2017)
Series volume: Volume 2: Short papers
Event location: Montpellier, France
URL / URN: http://aclweb.org/anthology/W17-6935
Free keywords: UKP_p_QAEduInf;UKP_reviewed;reviewed
ID number: TUD-CS-2017-0184
Department(s)/Field(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
Date deposited: 14 Jul 2017 21:30
Last modified: 24 Jan 2020 12:03