Schnober, Carsten; Eger, Steffen; Do Dinh, Erik-Lân; Gurevych, Iryna (2016):
Still not there? Comparing Traditional Sequence-to-Sequence Models to Encoder-Decoder Neural Networks on Monotone String Translation Tasks.
In: Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, pp. 1703–1714,
The COLING 2016 Organizing Committee, Osaka, Japan. [Conference or Workshop Item]
Abstract
We analyze the performance of encoder-decoder neural models and compare them with well-known established methods. The latter represent different classes of traditional approaches that are applied to the monotone sequence-to-sequence tasks OCR post-correction, spelling correction, grapheme-to-phoneme conversion, and lemmatization. Such tasks are of practical relevance for various higher-level research fields including digital humanities, automatic text correction, and speech recognition. We investigate how well generic deep-learning approaches adapt to these tasks, and how they perform in comparison with established and more specialized methods, including our own adaptation of pruned CRFs.
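For orientation, the sketch below illustrates the generic architecture class the abstract refers to: a character-level encoder-decoder applied to a monotone string translation task such as spelling correction. It is not the authors' implementation; the model sizes, the toy training pair, and all identifiers are assumptions chosen purely for illustration.

```python
# Illustrative sketch only (not from the paper): a minimal character-level
# GRU encoder-decoder for monotone string translation (here: spelling
# correction). Hyperparameters and the toy example are assumptions.
import torch
import torch.nn as nn

PAD, SOS, EOS = 0, 1, 2
chars = "abcdefghijklmnopqrstuvwxyz"
stoi = {c: i + 3 for i, c in enumerate(chars)}   # char -> id (ids 0-2 reserved)
V = len(stoi) + 3                                # vocabulary size

def encode(s):
    """Map a lowercase string to a tensor of character ids, EOS-terminated."""
    return torch.tensor([stoi[c] for c in s] + [EOS])

class CharSeq2Seq(nn.Module):
    """GRU encoder-decoder mapping an input character sequence to an output one."""
    def __init__(self, vocab, emb=32, hid=64):
        super().__init__()
        self.emb = nn.Embedding(vocab, emb, padding_idx=PAD)
        self.encoder = nn.GRU(emb, hid, batch_first=True)
        self.decoder = nn.GRU(emb, hid, batch_first=True)
        self.out = nn.Linear(hid, vocab)

    def forward(self, src, tgt_in):
        _, h = self.encoder(self.emb(src))        # final encoder state
        dec_out, _ = self.decoder(self.emb(tgt_in), h)
        return self.out(dec_out)                  # logits over characters

# Toy training pair (hypothetical data): "correct" a misspelled word.
src = encode("acommodate").unsqueeze(0)
tgt = encode("accommodate").unsqueeze(0)
tgt_in = torch.cat([torch.tensor([[SOS]]), tgt[:, :-1]], dim=1)  # teacher forcing

model = CharSeq2Seq(V)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss(ignore_index=PAD)

for _ in range(100):                              # tiny overfitting loop
    logits = model(src, tgt_in)
    loss = loss_fn(logits.reshape(-1, V), tgt.reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()
```

The traditional baselines the paper compares against (e.g. pruned CRFs) would instead model the output character at each position discriminatively; the sketch only shows the neural side of the comparison.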
| Item Type: | Conference or Workshop Item |
| --- | --- |
| Published: | 2016 |
| Creators: | Schnober, Carsten; Eger, Steffen; Do Dinh, Erik-Lân; Gurevych, Iryna |
| Title: | Still not there? Comparing Traditional Sequence-to-Sequence Models to Encoder-Decoder Neural Networks on Monotone String Translation Tasks |
| Language: | English |
| Book Title: | Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers |
| Publisher: | The COLING 2016 Organizing Committee |
| Uncontrolled Keywords: | UKP-DIPF;UKP_reviewed;UKP_a_DLinNLP |
| Divisions: | 20 Department of Computer Science; 20 Department of Computer Science > Ubiquitous Knowledge Processing |
| Event Location: | Osaka, Japan |
| Date Deposited: | 31 Dec 2016 14:29 |
| URL / URN: | http://aclweb.org/anthology/C16-1160 |
| Identification Number: | TUD-CS-2016-1450 |