
Nile-Chat: Egyptian Language Models for Arabic and Latin Scripts

Shang, Guokan
Abdine, Hadi
Chamma, Ahmad
Mohamed, Amr
Anwar, Mohamed
Bounhar, Abdelaziz
El Herraoui, Omar
Nakov, Preslav
Vazirgiannis, Michalis
Xing, Eric P.
Abstract
We introduce Nile-Chat-4B, 3x4B-A6B, and 12B, a collection of LLMs for the Egyptian dialect, uniquely designed to understand and generate texts written in both Arabic and Latin scripts. Specifically, with Nile-Chat-3x4B-A6B, we introduce a novel language adaptation approach that leverages the Branch-Train-MiX strategy to merge script-specialized experts into a single MoE model. Our Nile-Chat models significantly outperform leading multilingual and Arabic LLMs, such as LLaMa, Jais, and ALLaM, on our newly introduced Egyptian evaluation benchmarks, which span both understanding and generative tasks. Notably, our 12B model delivers a 14.4% performance gain over Qwen2.5-14B-Instruct on Latin-script benchmarks. All our resources are publicly available. We believe this work presents a comprehensive methodology for adapting LLMs to a single language with dual-script usage, addressing an often overlooked aspect of contemporary LLM development.
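
The Branch-Train-MiX-style merging mentioned in the abstract can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch example, not the authors' released code: the feed-forward blocks of script-specialized dense branches become the experts of a single MoE layer with a freshly initialized router, while the remaining (non-FFN) parameters are averaged. All class and function names here are illustrative assumptions.

```python
# Minimal sketch of Branch-Train-MiX-style merging (illustrative, not the
# authors' implementation): FFN blocks of script-specialized dense models
# become experts of one MoE layer; other parameters are averaged.
import copy
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoEFFN(nn.Module):
    """MoE feed-forward layer built from the FFNs of specialized branches."""
    def __init__(self, expert_ffns: list[nn.Module], d_model: int):
        super().__init__()
        self.experts = nn.ModuleList(copy.deepcopy(f) for f in expert_ffns)
        # Router is initialized from scratch and trained after merging.
        self.router = nn.Linear(d_model, len(self.experts), bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Top-1 routing per token: each token is sent to one expert,
        # weighted by its gate probability.
        gate = F.softmax(self.router(x), dim=-1)         # (..., n_experts)
        top_w, top_i = gate.max(dim=-1, keepdim=True)    # one expert per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = (top_i.squeeze(-1) == i)
            if mask.any():
                out[mask] = top_w[mask] * expert(x[mask])
        return out

def average_params(branches: list[nn.Module]) -> nn.Module:
    """Average the non-FFN parameters (e.g., attention, norms) across branches."""
    merged = copy.deepcopy(branches[0])
    with torch.no_grad():
        for name, p in merged.named_parameters():
            stacked = torch.stack(
                [dict(b.named_parameters())[name] for b in branches])
            p.copy_(stacked.mean(dim=0))
    return merged

# Toy usage: merge two hypothetical script-specialized FFNs.
d_model = 16
arabic_ffn = nn.Sequential(nn.Linear(d_model, 64), nn.GELU(), nn.Linear(64, d_model))
latin_ffn = nn.Sequential(nn.Linear(d_model, 64), nn.GELU(), nn.Linear(64, d_model))
moe = MoEFFN([arabic_ffn, latin_ffn], d_model)
y = moe(torch.randn(2, 8, d_model))  # (batch, seq, d_model)
```

In the Branch-Train-MiX recipe, the merged model is then fine-tuned so the router learns to dispatch tokens (here, Arabic-script vs. Latin-script) to the appropriate expert; the top-1 gating above is one simple choice among possible routing schemes.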
Source
Proceedings of The Third Arabic Natural Language Processing Conference
Conference
Third Arabic Natural Language Processing Conference
Keywords
Egyptian Dialect LLMs, Arabic and Latin Scripts, Script-Specialized Expert Merging, MoE Model Architecture, Dialect-Specific Benchmarks, Large Language Model Adaptation, Dual-Script Tokenization Strategy, Public Resource Release
Publisher
Association for Computational Linguistics