Atlas-Chat: Adapting Large Language Models for Low-Resource Moroccan Arabic Dialect

Shang, Guokan
Abdine, Hadi
Khoubrane, Yousef
Mohamed, Amr
Abbahaddou, Yassine
Ennadir, Sofiane
Momayiz, Imane
Ren, Xuguang
Moulines, Eric
Nakov, Preslav
Abstract
We introduce Atlas-Chat, the first-ever collection of LLMs specifically developed for dialectal Arabic. Focusing on Moroccan Arabic, also known as Darija, we construct our instruction dataset by consolidating existing Darija language resources, creating novel datasets both manually and synthetically, and translating English instructions under stringent quality control. Atlas-Chat-2B, 9B, and 27B models, fine-tuned on this dataset, exhibit superior ability in following Darija instructions and performing standard NLP tasks. Notably, our models outperform both state-of-the-art and Arabic-specialized LLMs such as LLaMa, Jais, and AceGPT; for example, our 9B model gains a 13% performance boost over a larger 13B model on DarijaMMLU, part of our newly introduced evaluation suite for Darija covering both discriminative and generative tasks. Furthermore, we perform an experimental analysis of various fine-tuning strategies and base model choices to determine optimal configurations. All our resources are publicly accessible, and we believe our work offers a comprehensive design methodology for instruction-tuning in low-resource languages, which are often neglected in favor of data-rich languages by contemporary LLMs.
Citation
G. Shang et al., “Atlas-Chat: Adapting Large Language Models for Low-Resource Moroccan Arabic Dialect,” Proceedings - International Conference on Computational Linguistics, COLING, pp. 9–30, Jan. 2025.
Source
Proceedings - International Conference on Computational Linguistics, COLING
Conference
Keywords
Atlas-Chat, Moroccan Arabic, Dialectal Arabic, Large Language Models, Low-resource languages
Publisher
Association for Computational Linguistics