Interoperability = f(community, division of labour)

Eckart de Castilho, Richard (2016)
Interoperability = f(community, division of labour).
Portoroz, Slovenia
doi: 10.5281/zenodo.161848
Conference publication, Bibliography

Abstract

This paper aims to motivate the hypothesis that practical interoperability can be seen as a function of whether and how stakeholder communities duplicate or divide work in a given area or market. We focus on the area of language processing, which traditionally produces many diverse tools that are not immediately interoperable. However, there is also a strong desire to combine these tools into processing pipelines and to apply them to a wide range of different corpora. The space opened between generic, inherently "empty" interoperability frameworks that offer no NLP capabilities themselves and dedicated NLP tools gave rise to a new class of NLP-related projects that focus specifically on interoperability: component collections. This new class of projects drives interoperability in a very pragmatic way that could well be more successful than, e.g., past efforts towards standardised formats, which ultimately saw little adoption or support by software tools.
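The "component collections" mentioned in the abstract wrap third-party NLP tools as interoperable components of a generic processing framework. As an illustration only (not taken from the paper), the following sketch shows how a pipeline might be assembled from DKPro Core components via Apache uimaFIT; class and parameter names follow the DKPro Core 1.x / uimaFIT 2.x APIs and may differ in other versions, and the file paths are placeholders.

import static org.apache.uima.fit.factory.AnalysisEngineFactory.createEngineDescription;
import static org.apache.uima.fit.factory.CollectionReaderFactory.createReaderDescription;

import java.io.IOException;

import org.apache.uima.UIMAException;
import org.apache.uima.fit.pipeline.SimplePipeline;

import de.tudarmstadt.ukp.dkpro.core.io.conll.Conll2006Writer;
import de.tudarmstadt.ukp.dkpro.core.io.text.TextReader;
import de.tudarmstadt.ukp.dkpro.core.opennlp.OpenNlpPosTagger;
import de.tudarmstadt.ukp.dkpro.core.opennlp.OpenNlpSegmenter;

public class InteropPipelineSketch {
    public static void main(String[] args) throws UIMAException, IOException {
        SimplePipeline.runPipeline(
            // Reader: plain-text corpus (source location is a placeholder)
            createReaderDescription(TextReader.class,
                TextReader.PARAM_SOURCE_LOCATION, "corpus/*.txt",
                TextReader.PARAM_LANGUAGE, "en"),
            // Analysis components wrapping a third-party tool (OpenNLP)
            createEngineDescription(OpenNlpSegmenter.class),
            createEngineDescription(OpenNlpPosTagger.class),
            // Writer: serialise the shared annotation model as CoNLL 2006
            createEngineDescription(Conll2006Writer.class,
                Conll2006Writer.PARAM_TARGET_LOCATION, "output"));
    }
}

In this division of labour, the framework (UIMA/uimaFIT) provides only the interoperability plumbing, the component collection (DKPro Core) wraps external tools against a common annotation model, and the wrapped tools contribute the actual NLP capability.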

Type of entry: Conference publication
Published: 2016
Author(s): Eckart de Castilho, Richard
Kind of entry: Bibliography
Title: Interoperability = f(community, division of labour)
Language: English
Year of publication: May 2016
Book title: Proceedings of the Workshop on Cross-Platform Text Mining and Natural Language Processing Interoperability collocated with LREC 2016
Event location: Portoroz, Slovenia
DOI: 10.5281/zenodo.161848
URL / URN: https://zenodo.org/record/161848
Keywords: CEDIFOR;UKP_reviewed;UKP_s_DKPro_Core;UKP_p_OpenMinTeD;UKP_a_LangTech4eHum
ID number: TUD-CS-2016-0074
Department(s)/field(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
DFG Research Training Groups
DFG Research Training Groups > Research Training Group 1994 Adaptive Preparation of Information from Heterogeneous Sources
Date deposited: 31 Dec 2016 14:29
Last modified: 18 Sep 2018 09:29