
Bilingual medical mixture of experts large language model

Pieri, Sara
Mullappilly, Sahal Shaji
Khan, Fahad
Anwer, Rao
Khan, Salman
Baldwin, Timothy
Cholakkal, Hisham
Department
Computer Vision
Type
Patent
Date
2025
Language
English
Abstract
A computer-implemented system, and computer instructions stored on a non-transitory computer-readable medium, for bilingual medical inquiry in both Arabic and English, including multiple-choice question answering, open-ended question answering, and multi-turn question answering. The system and instructions use a mixture-of-experts large language model (MoE LLM) having a router network connected to multiple expert networks. The MoE LLM is trained on medical-domain data, receives bilingual input text formatted as a medical inquiry, and outputs, in sequence, text formatted as a response to the medical inquiry. The system and instructions incorporate an English-to-Arabic translation pipeline having a language translation model that generates Arabic-language medical instruction sets from English-language medical instructions, for large-scale use in Arabic and English medical inquiry.
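
The abstract's core mechanism is a router network dispatching tokens among multiple expert networks. The sketch below illustrates that general pattern in PyTorch; the layer sizes, the top-2 routing rule, and all names are illustrative assumptions, not the architecture claimed in the patent.

```python
# Minimal mixture-of-experts sketch: a router (gating) network scores
# experts per token and the top-k experts' outputs are combined.
# All dimensions and the top-2 rule are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class MoELayer(nn.Module):
    def __init__(self, d_model: int, d_hidden: int, n_experts: int, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        # Router network: scores every expert for each token.
        self.router = nn.Linear(d_model, n_experts)
        # Expert networks: independent feed-forward blocks.
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(n_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, d_model) -> flatten to (tokens, d_model)
        tokens = x.reshape(-1, x.size(-1))
        logits = self.router(tokens)                       # (tokens, n_experts)
        weights, chosen = logits.topk(self.top_k, dim=-1)  # top-k experts per token
        weights = F.softmax(weights, dim=-1)

        out = torch.zeros_like(tokens)
        for e, expert in enumerate(self.experts):
            # Positions (token, slot) routed to expert e.
            token_idx, slot_idx = (chosen == e).nonzero(as_tuple=True)
            if token_idx.numel() == 0:
                continue
            out[token_idx] += weights[token_idx, slot_idx].unsqueeze(-1) * expert(
                tokens[token_idx]
            )
        return out.reshape_as(x)


# Quick smoke test with toy dimensions.
layer = MoELayer(d_model=64, d_hidden=128, n_experts=4, top_k=2)
y = layer(torch.randn(2, 10, 64))
print(y.shape)  # torch.Size([2, 10, 64])
```

Because only the top-k experts run per token, an MoE layer adds parameter capacity without a proportional increase in per-token compute, which is the usual motivation for this design in large language models.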
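The abstract also describes an English-to-Arabic translation pipeline for generating Arabic instruction sets. The sketch below shows one plausible shape for such a step; the Hugging Face `pipeline` API and the Helsinki-NLP/opus-mt-en-ar checkpoint are assumptions chosen for illustration, since the patent abstract does not name a specific translation model.

```python
# Illustrative English-to-Arabic translation step for converting English
# medical instructions into Arabic ones. Model choice is an assumption.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-en-ar")

english_instructions = [
    "Describe the common symptoms of type 2 diabetes.",
    "List the first-aid steps for a minor burn.",
]

# Translate each instruction; in practice, quality filtering would
# typically follow before the pairs enter a training set.
arabic_instructions = [
    t["translation_text"] for t in translator(english_instructions)
]
for en, ar in zip(english_instructions, arabic_instructions):
    print(en, "->", ar)
```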
Citation
Sara Pieri, Sahal Shaji Mullappilly, Fahad Khan, Rao Anwer, Salman Khan, Timothy Baldwin, and Hisham Cholakkal, “Bilingual medical mixture of experts large language model,” U.S. Patent Application 18/905,019, filed Oct. 2, 2024; published Aug. 21, 2025.
Source
US Patent
Publisher
Google Patents