
Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning

Poth, Clifton ; Sterz, Hannah ; Paul, Indraneil ; Purkayastha, Sukannya ; Englander, Leon ; Imhof, Timo ; Vulić, Ivan ; Ruder, Sebastian ; Gurevych, Iryna ; Pfeiffer, Jonas (2023)
Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning.
2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations. Singapore (06.12.2023-10.12.2023)
doi: 10.18653/v1/2023.emnlp-demo.13
Conference publication, Bibliography

Abstract

We introduce Adapters, an open-source library that unifies parameter-efficient and modular transfer learning in large language models. By integrating 10 diverse adapter methods into a unified interface, Adapters offers ease of use and flexible configuration. Our library allows researchers and practitioners to leverage adapter modularity through composition blocks, enabling the design of complex adapter setups. We demonstrate the library’s efficacy by evaluating its performance against full fine-tuning on various NLP tasks. Adapters provides a powerful tool for addressing the challenges of conventional fine-tuning paradigms and promoting more efficient and modular transfer learning. The library is available via https://adapterhub.ml/adapters.
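For illustration only (not part of the original record): the following minimal sketch shows what the adapter-based workflow described in the abstract might look like with the adapters package. The model name, adapter names, and configuration strings ("roberta-base", "sst2", "en", "seq_bn") are assumptions based on the library's public documentation at https://adapterhub.ml/adapters, not details taken from this record.

# Minimal sketch of parameter-efficient fine-tuning with the `adapters` library.
# Model name, adapter names, and config strings are illustrative assumptions.
from adapters import AutoAdapterModel
from adapters.composition import Stack

# Load a pre-trained Transformer with adapter support.
model = AutoAdapterModel.from_pretrained("roberta-base")

# Add a bottleneck adapter and a matching classification head, then freeze
# the base-model weights so only the adapter (and head) are trained.
model.add_adapter("sst2", config="seq_bn")
model.add_classification_head("sst2", num_labels=2)
model.train_adapter("sst2")

# Composition blocks: stack a (hypothetical) language adapter under the
# task adapter so activations flow through both modules in sequence.
model.add_adapter("en", config="seq_bn")
model.active_adapters = Stack("en", "sst2")

In this sketch, swapping the config string (e.g. to "lora" or "prefix_tuning") would select a different one of the library's adapter methods, which is the kind of unified, flexible configuration the abstract refers to.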

Item type: Conference publication
Published: 2023
Author(s): Poth, Clifton ; Sterz, Hannah ; Paul, Indraneil ; Purkayastha, Sukannya ; Englander, Leon ; Imhof, Timo ; Vulić, Ivan ; Ruder, Sebastian ; Gurevych, Iryna ; Pfeiffer, Jonas
Entry type: Bibliography
Title: Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning
Language: English
Year of publication: December 2023
Place: Singapore
Publisher: Association for Computational Linguistics
Book title: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Event title: 2023 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
Event location: Singapore
Event dates: 06.12.2023-10.12.2023
DOI: 10.18653/v1/2023.emnlp-demo.13
URL / URN: https://aclanthology.org/2023.emnlp-demo.13

Free keywords: UKP_p_HUAWEI, UKP_p_KRITIS
Division(s)/department(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
Date deposited: 18 Jan 2024 14:13
Last modified: 11 Apr 2024 12:34
PPN: 517105837