
AdaSent: Efficient Domain-Adapted Sentence Embeddings for Few-Shot Classification

Huang, Yongxin ; Wang, Kexin ; Dutta, Sourav ; Patel, Raj Nath ; Glavaš, Goran ; Gurevych, Iryna (2023)
AdaSent: Efficient Domain-Adapted Sentence Embeddings for Few-Shot Classification.
2023 Conference on Empirical Methods in Natural Language Processing. Singapore (December 6–10, 2023)
doi: 10.18653/v1/2023.emnlp-main.208
Conference publication, Bibliography

Abstract

Recent work has found that few-shot sentence classification based on pre-trained Sentence Encoders (SEs) is efficient, robust, and effective. In this work, we investigate strategies for domain-specialization in the context of few-shot sentence classification with SEs. We first establish that unsupervised Domain-Adaptive Pre-Training (DAPT) of a base Pre-trained Language Model (PLM) (i.e., not an SE) substantially improves the accuracy of few-shot sentence classification by up to 8.4 points. However, applying DAPT on SEs, on the one hand, disrupts the effects of their (general-domain) Sentence Embedding Pre-Training (SEPT). On the other hand, applying general-domain SEPT on top of a domain-adapted base PLM (i.e., after DAPT) is effective but inefficient, since the computationally expensive SEPT needs to be executed on top of a DAPT-ed PLM of each domain. As a solution, we propose AdaSent, which decouples SEPT from DAPT by training a SEPT adapter on the base PLM. The adapter can be inserted into DAPT-ed PLMs from any domain. We demonstrate AdaSent’s effectiveness in extensive experiments on 17 different few-shot sentence classification datasets. AdaSent matches or surpasses the performance of full SEPT on DAPT-ed PLM, while substantially reducing the training costs. The code for AdaSent is available.
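
To make the decoupling concrete, below is a minimal sketch of the adapter workflow the abstract describes, written with the AdapterHub "adapters" library. The model names, the adapter configuration, and the elided training step are illustrative assumptions, not the authors' released code (see the paper's repository for that).

    # Sketch of the AdaSent decoupling (illustrative; not the authors' released code).
    # Step 1 trains a SEPT adapter once on the general-domain base PLM; step 2 plugs
    # the trained adapter into a DAPT-ed PLM of an arbitrary target domain.
    from adapters import AutoAdapterModel

    # Step 1: SEPT adapter on the base PLM (trained once, general domain).
    base = AutoAdapterModel.from_pretrained("distilroberta-base")  # placeholder base PLM
    base.add_adapter("sept", config="seq_bn")  # bottleneck adapter
    base.train_adapter("sept")                 # freezes the PLM; only the adapter trains
    # ... contrastive sentence-embedding training (e.g., on general-domain pairs) ...
    base.save_adapter("sept_adapter/", "sept")

    # Step 2: insert the trained adapter into a DAPT-ed PLM (hypothetical checkpoint
    # obtained by unsupervised MLM training on in-domain text from the same base PLM).
    domain_model = AutoAdapterModel.from_pretrained("path/to/dapted-plm")
    domain_model.load_adapter("sept_adapter/", load_as="sept")
    domain_model.set_active_adapters("sept")
    # domain_model now yields domain-adapted sentence embeddings for few-shot
    # classification, with no per-domain SEPT run.

The point of this arrangement is that the expensive step (SEPT) runs once on the base PLM, while only the cheap unsupervised step (DAPT) is repeated per domain.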

Entry type: Conference publication
Published: 2023
Author(s): Huang, Yongxin ; Wang, Kexin ; Dutta, Sourav ; Patel, Raj Nath ; Glavaš, Goran ; Gurevych, Iryna
Record type: Bibliography
Title: AdaSent: Efficient Domain-Adapted Sentence Embeddings for Few-Shot Classification
Language: English
Year of publication: December 2023
Place of publication: Singapore
Publisher: Association for Computational Linguistics
Book title: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Event title: 2023 Conference on Empirical Methods in Natural Language Processing
Event location: Singapore
Event dates: December 6–10, 2023
DOI: 10.18653/v1/2023.emnlp-main.208
URL / URN: https://aclanthology.org/2023.emnlp-main.208

Uncontrolled keywords: UKP_p_MISRIK, UKP_p_HUAWEI
Department(s)/division(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
Date deposited: 18 Jan 2024 13:49
Last modified: 19 Mar 2024 17:40
PPN: 516403923