Collective wisdom: improving low-resource Neural Machine Translation using adaptive knowledge distillation

Fahimeh Saleh, Wray Buntine, Reza Haffari

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

6 Citations (Scopus)

Abstract

Scarcity of parallel sentence-pairs poses a significant hurdle for training high-quality Neural Machine Translation (NMT) models in bilingually low-resource scenarios. A standard approach is transfer learning, which involves taking a model trained on a high-resource language-pair and fine-tuning it on the data of the low-resource MT condition of interest. However, it is generally not clear which high-resource language-pair offers the best transfer learning for the target MT setting. Furthermore, different transferred models may have complementary semantic and/or syntactic strengths, hence using only one model may be sub-optimal. In this paper, we tackle this problem using knowledge distillation, where we propose to distill the knowledge of an ensemble of teacher models into a single student model. As the quality of these teacher models varies, we propose an effective adaptive knowledge distillation approach to dynamically adjust the contribution of the teacher models during the distillation process. Experiments on transferring from a collection of six language pairs from IWSLT to five low-resource language-pairs from TED Talks demonstrate the effectiveness of our approach, achieving up to +0.9 BLEU improvement compared to strong baselines.
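To make the idea concrete, the sketch below illustrates one plausible form of multi-teacher adaptive distillation for a sequence model: a standard NLL term is combined with a KL-based distillation term, and each teacher's contribution is re-weighted per batch. The specific weighting heuristic (softmax over each teacher's negative loss on the gold target), the function name, and the hyper-parameters (alpha, tau) are illustrative assumptions and not necessarily the exact formulation used in the paper.

```python
# Hypothetical sketch of multi-teacher adaptive knowledge distillation for NMT.
# The per-teacher weighting rule (softmax over negative teacher losses on the
# gold target) is an illustrative assumption, not the paper's exact scheme.
import torch
import torch.nn.functional as F

def adaptive_distillation_loss(student_logits, teacher_logits_list,
                               gold_ids, pad_id, alpha=0.5, tau=1.0):
    """student_logits: (B, T, V); teacher_logits_list: list of (B, T, V);
    gold_ids: (B, T). Returns a scalar loss mixing NLL and weighted KD."""
    mask = (gold_ids != pad_id).float()                       # (B, T)

    # Standard NLL of the student against the gold references.
    nll = F.cross_entropy(student_logits.transpose(1, 2), gold_ids,
                          ignore_index=pad_id)

    # Score each teacher by how well it predicts the gold target, then
    # turn the scores into weights with a softmax (assumed heuristic).
    teacher_losses = [
        F.cross_entropy(t_logits.transpose(1, 2), gold_ids, ignore_index=pad_id)
        for t_logits in teacher_logits_list
    ]
    weights = torch.softmax(-torch.stack(teacher_losses), dim=0)  # (K,)

    # Token-level KL between each teacher's soft targets and the student,
    # weighted by the adaptive teacher weights.
    log_p_student = F.log_softmax(student_logits / tau, dim=-1)
    kd = 0.0
    for w, t_logits in zip(weights, teacher_logits_list):
        p_teacher = F.softmax(t_logits / tau, dim=-1)
        kl = (p_teacher * (p_teacher.clamp_min(1e-9).log() - log_p_student)).sum(-1)
        kd = kd + w * (kl * mask).sum() / mask.sum()

    return (1 - alpha) * nll + alpha * kd
```

In this reading, teachers that explain the current batch well receive larger weights, so the student is pulled more strongly toward the better-suited transferred models; alternative weighting schemes (e.g., learned or annealed weights) fit the same interface.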

Original language: English
Title of host publication: COLING 2020, The 28th International Conference on Computational Linguistics, Proceedings of the Conference
Editors: Nuria Bel, Chengqing Zong
Place of Publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Pages: 3413-3421
Number of pages: 9
ISBN (Electronic): 9781952148279
DOIs
Publication status: Published - 2020
Event: International Conference on Computational Linguistics 2020 - Virtual, Barcelona, Spain
Duration: 8 Dec 2020 - 13 Dec 2020
Conference number: 28th
https://coling2020.org (Website)
https://www.aclweb.org/anthology/volumes/2020.coling-main/ (Proceedings)

Conference

Conference: International Conference on Computational Linguistics 2020
Abbreviated title: COLING 2020
Country/Territory: Spain
City: Barcelona
Period: 8/12/20 - 13/12/20
