
IRCoder: Intermediate Representations Make Language Models Robust Multilingual Code Generators

Paul, Indraneil ; Glavaš, Goran ; Gurevych, Iryna (2024)
IRCoder: Intermediate Representations Make Language Models Robust Multilingual Code Generators.
The 62nd Annual Meeting of the Association for Computational Linguistics. Bangkok, Thailand (12-16.08.2024)
Conference publication, Bibliography

Abstract

Code generation has fast become one of the most popular applications of language models (LMs). Nonetheless, research on multilingual aspects of Code-LMs, such as cross-lingual transfer between different programming languages, language-specific data augmentation, and post-hoc LM adaptation, alongside the exploitation of data sources other than the original textual content, has been much sparser than for their natural language counterparts. In particular, most mainstream Code-LMs have been pre-trained on source code files alone. In this work, we investigate the prospect of leveraging readily available compiler intermediate representations (IR)—shared across programming languages—to improve the multilingual capabilities of Code-LMs and facilitate cross-lingual transfer. To this end, we first compile SLTrans, a parallel dataset consisting of nearly 4M self-contained source code files coupled with their respective intermediate representations. Next, starting from various base Code-LMs (ranging from 1.1B to 7.3B parameters), we carry out continued causal language modelling training on SLTrans, forcing the Code-LMs to (1) learn the IR language and (2) align the IR constructs with respective constructs of various programming languages. Our resulting models, dubbed IRCoder, display sizeable and consistent gains across various code generation tasks and metrics, including prompt robustness, multilingual code completion, code understanding, and instruction following.
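The abstract describes concatenated continued-pretraining on parallel (source, IR) pairs from SLTrans so the model learns to align constructs across the two representations. A minimal sketch of how such a pair might be serialized into one causal-LM training sequence is given below; the delimiter tokens and ordering are illustrative assumptions, not the paper's exact format.

```python
# Illustrative sketch: serializing a source file together with its compiler
# intermediate representation (IR) into one training sequence, in the spirit
# of SLTrans. The delimiter tokens below are hypothetical.

SOURCE_TAG = "<source>"  # assumed delimiter, not from the paper
IR_TAG = "<ir>"          # assumed delimiter, not from the paper

def make_training_example(source_code: str, ir_code: str) -> str:
    """Concatenate a source file and its IR so a causal LM is trained on
    both and can learn to align constructs across representations."""
    return f"{SOURCE_TAG}\n{source_code.strip()}\n{IR_TAG}\n{ir_code.strip()}"

# Toy pair: a C function and an abbreviated LLVM-IR-style counterpart.
c_src = "int add(int a, int b) { return a + b; }"
llvm_ir = (
    "define i32 @add(i32 %a, i32 %b) {\n"
    "  %s = add nsw i32 %a, %b\n"
    "  ret i32 %s\n"
    "}"
)

example = make_training_example(c_src, llvm_ir)
```

Feeding such concatenated sequences to standard next-token training requires no architectural changes, which matches the paper's framing of the approach as continued causal language modelling on existing base Code-LMs.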

Entry type: Conference publication
Published: 2024
Author(s): Paul, Indraneil ; Glavaš, Goran ; Gurevych, Iryna
Type of entry: Bibliography
Title: IRCoder: Intermediate Representations Make Language Models Robust Multilingual Code Generators
Language: English
Year of publication: August 2024
Publisher: Association for Computational Linguistics
Book title: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Event title: The 62nd Annual Meeting of the Association for Computational Linguistics
Event location: Bangkok, Thailand
Event date: 12-16.08.2024
URL / URN: https://aclanthology.org/2024.acl-long.802/
Free keywords: moveUKP_p_HUAWEI, UKP_p_code_transformers
Department(s)/field(s): 20 Department of Computer Science
20 Department of Computer Science > Ubiquitous Knowledge Processing
Date deposited: 11 Sep 2024 08:30
Last modified: 28 Oct 2024 13:23
PPN: 522512070