
Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks

Reimers, Nils ; Gurevych, Iryna (2017)
Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks.
In: arXiv preprint arXiv:1707.06799
Article, Bibliography

Abstract

Selecting optimal parameters for a neural network architecture can often make the difference between mediocre and state-of-the-art performance. However, little is published on which parameters and design choices should be evaluated or selected, making correct hyperparameter optimization often a "black art that requires expert experiences" (Snoek et al., 2012). In this paper, we evaluate the importance of different network design choices and hyperparameters for five common linguistic sequence tagging tasks (POS, Chunking, NER, Entity Recognition, and Event Detection). We evaluated over 50,000 different setups and found that some parameters, like the pre-trained word embeddings or the last layer of the network, have a large impact on performance, while other parameters, for example the number of LSTM layers or the number of recurrent units, are of minor importance. We give a recommendation for a configuration that performs well across different tasks.
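
To make the studied design choices concrete, the following is a minimal PyTorch sketch of a bidirectional LSTM sequence tagger exposing the knobs the abstract mentions: pre-trained word embeddings, the number of LSTM layers, the number of recurrent units, and the last layer. It is not the authors' implementation; the class name BiLSTMTagger, the dimensions, and the plain softmax output layer are illustrative assumptions (a CRF output layer is the common alternative to softmax for such taggers).

# Minimal BiLSTM sequence-tagger sketch (illustrative, not the authors' code).
# Values marked "assumption" are placeholder choices for the hyperparameters
# the paper studies.
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    def __init__(self, pretrained_embeddings, num_tags,
                 lstm_layers=2, recurrent_units=100, dropout=0.25):
        super().__init__()
        # Pre-trained word embeddings: one of the high-impact choices
        # reported in the abstract. `pretrained_embeddings` is a
        # (vocab_size, embedding_dim) float tensor.
        self.embedding = nn.Embedding.from_pretrained(
            pretrained_embeddings, freeze=False)
        # Depth and width (lstm_layers, recurrent_units) are reported as
        # being of minor importance; the defaults here are assumptions.
        self.lstm = nn.LSTM(
            input_size=pretrained_embeddings.size(1),
            hidden_size=recurrent_units,
            num_layers=lstm_layers,
            batch_first=True,
            bidirectional=True,
            dropout=dropout if lstm_layers > 1 else 0.0)
        # Last layer: a plain linear/softmax classifier here; the paper
        # highlights the last layer as a high-impact choice.
        self.classifier = nn.Linear(2 * recurrent_units, num_tags)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices
        embedded = self.embedding(token_ids)
        hidden, _ = self.lstm(embedded)
        return self.classifier(hidden)  # (batch, seq_len, num_tags) logits

# Usage sketch with random vectors standing in for pre-trained embeddings:
if __name__ == "__main__":
    vectors = torch.randn(1000, 50)            # assumption: 1000 words, 50-dim
    model = BiLSTMTagger(vectors, num_tags=9)  # e.g. 9 BIO tags for NER
    tokens = torch.randint(0, 1000, (4, 20))   # 4 sentences of length 20
    logits = model(tokens)
    print(logits.shape)                        # torch.Size([4, 20, 9])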

Item type: Article
Published: 2017
Author(s): Reimers, Nils ; Gurevych, Iryna
Type of entry: Bibliography
Title: Optimal Hyperparameters for Deep LSTM-Networks for Sequence Labeling Tasks
Language: English
Publication date: July 2017
Journal or publication title: arXiv preprint arXiv:1707.06799
URL / URN: https://arxiv.org/abs/1707.06799
ID number: TUD-CS-2017-0196
Department(s)/field(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
DFG Research Training Groups
DFG Research Training Groups > Research Training Group 1994 Adaptive Preparation of Information from Heterogeneous Sources
Date deposited: 25 Jul 2017 10:47
Last modified: 24 Jan 2020 12:03