Dodge, Jesse; Gurevych, Iryna; Schwartz, Roy; Strubell, Emma; Aken, Betty van (2023)
Efficient and Equitable Natural Language Processing in the Age of Deep Learning.
In: Dagstuhl Reports, 12 (6)
doi: 10.4230/DagRep.12.6.14
Article, Bibliography
Abstract
This report documents the program and outcomes of Dagstuhl Seminar 22232, “Efficient and Equitable Natural Language Processing in the Age of Deep Learning”. Since 2012, the field of artificial intelligence (AI) has reported remarkable progress on a broad range of capabilities, including object recognition, game playing, speech recognition, and machine translation. Much of this progress has been achieved by increasingly large and computationally intensive deep learning models: training costs for state-of-the-art deep learning models increased 300,000-fold between 2012 and 2018 [1]. Perhaps the epitome of this trend is the subfield of natural language processing (NLP), which over the past three years has experienced even sharper growth in model size and corresponding computational requirements, driven by the pretrained word embedding and language model approaches (e.g., ELMo, BERT, GPT-2, Megatron-LM, T5, and GPT-3, one of the largest models ever trained, with 175B dense parameters) that are now the basic building blocks of nearly all NLP models. Recent studies indicate that this trend is both environmentally unfriendly and prohibitively expensive, raising barriers to participation in NLP research [2, 3]. The goal of this seminar was to mitigate these concerns and promote equity of access in NLP.
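To put the growth figure in perspective, a 300,000-fold increase over the six years from 2012 to 2018 implies training compute doubling roughly every four months. A minimal Python sketch of that arithmetic (the exact six-year window and the constant-doubling assumption are illustrative assumptions, not claims from the report):

```python
# Implied doubling time for the 300,000x growth in training compute
# between 2012 and 2018 cited in the abstract [1].
import math

growth_factor = 300_000        # reported increase in training cost/compute
window_months = 6 * 12         # 2012 -> 2018, assumed as a six-year span

doublings = math.log2(growth_factor)       # ~18.2 doublings
doubling_time = window_months / doublings  # ~4 months per doubling

print(f"{doublings:.1f} doublings, one every {doubling_time:.1f} months")
```

Depending on the exact endpoints chosen, this is consistent with the roughly 3.4-month doubling time commonly quoted from [1].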
Type of entry: | Article |
---|---|
Published: | 2023 |
Author(s): | Dodge, Jesse; Gurevych, Iryna; Schwartz, Roy; Strubell, Emma; Aken, Betty van |
Entry category: | Bibliography |
Title: | Efficient and Equitable Natural Language Processing in the Age of Deep Learning |
Language: | English |
Date of publication: | 20 January 2023 |
Publisher: | Schloss Dagstuhl - Leibniz-Zentrum für Informatik |
Journal or series title: | Dagstuhl Reports |
Volume: | 12 |
Issue: | 6 |
DOI: | 10.4230/DagRep.12.6.14 |
URL / URN: | urn:nbn:de:0030-drops-174549 |
Keywords: | deep learning, efficiency, equity, natural language processing (NLP) |
Department(s): | 20 Department of Computer Science; 20 Department of Computer Science > Ubiquitous Knowledge Processing |
Date deposited: | 23 Jan 2023 14:22 |
Last modified: | 07 Mar 2023 11:15 |
PPN: | 505584115 |