
Efficient Methods for Natural Language Processing: A Survey

Treviso, Marcos ; Lee, Ji-Ung ; Ji, Tianchu ; Aken, Betty van ; Cao, Qingqing ; Ciosici, Manuel R. ; Hassid, Michael ; Heafield, Kenneth ; Hooker, Sara ; Raffel, Colin ; Martins, Pedro H. ; Martins, André F. T. ; Forde, Jessica Zosa ; Milder, Peter ; Simpson, Edwin ; Slonim, Noam ; Dodge, Jesse ; Strubell, Emma ; Balasubramanian, Niranjan ; Derczynski, Leon ; Gurevych, Iryna ; Schwartz, Roy (2023)
Efficient Methods for Natural Language Processing: A Survey.
In: Transactions of the Association for Computational Linguistics, 11
doi: 10.1162/tacl_a_00577
Article, Bibliography

Abstract

Recent work in natural language processing (NLP) has yielded appealing results from scaling model parameters and training data; however, using only scale to improve performance means that resource consumption also grows. Such resources include data, time, storage, or energy, all of which are naturally limited and unevenly distributed. This motivates research into efficient methods that require fewer resources to achieve similar results. This survey synthesizes and relates current methods and findings in efficient NLP. We aim to provide both guidance for conducting NLP under limited resources, and point towards promising research directions for developing more efficient methods.

Item Type: Article
Published: 2023
Creators: Treviso, Marcos ; Lee, Ji-Ung ; Ji, Tianchu ; Aken, Betty van ; Cao, Qingqing ; Ciosici, Manuel R. ; Hassid, Michael ; Heafield, Kenneth ; Hooker, Sara ; Raffel, Colin ; Martins, Pedro H. ; Martins, André F. T. ; Forde, Jessica Zosa ; Milder, Peter ; Simpson, Edwin ; Slonim, Noam ; Dodge, Jesse ; Strubell, Emma ; Balasubramanian, Niranjan ; Derczynski, Leon ; Gurevych, Iryna ; Schwartz, Roy
Type of entry: Bibliography
Title: Efficient Methods for Natural Language Processing: A Survey
Language: English
Date: 12 July 2023
Publisher: MIT Press
Journal or Publication Title: Transactions of the Association for Computational Linguistics
Volume of the journal: 11
DOI: 10.1162/tacl_a_00577
Divisions: 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
Date Deposited: 25 Jul 2023 11:49
Last Modified: 26 Jul 2023 10:26
PPN: 509928870