Authors: Akinade, I.; Alabi, J. O.; Adelani, D.; Odoje, C.; Klakow, D.
Date deposited: 2026-02-13
Date issued: 2023
Identifier: ui_inpro_akinade_integrating_2023
Venue: AfricaNLP Workshop at ICLR 2023
URI: https://repository.ui.edu.ng/handle/123456789/12163
Abstract: This paper investigates the performance of massively multilingual neural machine translation (NMT) systems in translating Yorùbá greetings (ε kú <mask>), which are a big part of Yorùbá language and culture, into English. To evaluate these models, we present IkiniYorùbá, a Yorùbá-English translation dataset containing some Yorùbá greetings and sample use cases. We analysed the performance of different multilingual NMT systems, including Google Translate and NLLB, and show that these models struggle to accurately translate Yorùbá greetings into English. In addition, we trained a Yorùbá-English model by finetuning an existing NMT model on the training split of IkiniYorùbá, and this achieved better performance than the pre-trained multilingual NMT models, even though the latter were trained on a large volume of data.
Language: en
Keywords: Neural Machine Translation (NMT); Yorùbá greetings; Yorùbá language; IkiniYorùbá
Title: ε kú <mask>: Integrating Yorùbá cultural greetings into machine translation
Type: Article