Neural machine translation for bilingually scarce scenarios: a deep multi-task learning approach

Poorya Zaremoodi, Gholamreza Haffari

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

11 Citations (Scopus)


Neural machine translation requires large amounts of parallel training text to learn a reasonable-quality translation model. This is particularly problematic for language pairs for which enough parallel text is not available. In this paper, we use monolingual linguistic resources on the source side to address this challenging problem via a multi-task learning approach. More specifically, we scaffold the machine translation task on auxiliary tasks, including semantic parsing, syntactic parsing, and named-entity recognition. This effectively injects semantic and/or syntactic knowledge into the translation model, which would otherwise require a large amount of training bitext. We empirically evaluate and show the effectiveness of our multi-task learning approach on three translation tasks: English-to-French, English-to-Farsi, and English-to-Vietnamese.
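The core idea in the abstract, sharing model parameters between translation and auxiliary source-side tasks so that auxiliary supervision shapes the shared representation, can be illustrated with a minimal sketch. The sketch below is not the paper's recurrent sequence-to-sequence model; it substitutes a toy linear "encoder" and per-task linear heads, and all names (`TASKS`, `W_shared`, `heads`) are illustrative assumptions, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d_in, d_hid = 8, 4

# Shared "encoder": one linear map used by every task (stands in for the
# shared recurrent encoder in a real multi-task NMT model).
W_shared = rng.normal(scale=0.1, size=(d_in, d_hid))

# Task-specific heads: translation plus auxiliary source-side tasks.
TASKS = ["translation", "syntactic_parsing", "named_entity_recognition"]
heads = {t: rng.normal(scale=0.1, size=(d_hid, 1)) for t in TASKS}

# Synthetic per-task data; each task has its own inputs and targets.
data = {t: (rng.normal(size=(32, d_in)), rng.normal(size=(32, 1)))
        for t in TASKS}

def task_loss_and_grads(task):
    """Mean-squared-error loss and gradients for one task."""
    X, y = data[task]
    H = X @ W_shared            # shared representation
    pred = H @ heads[task]      # task-specific prediction
    err = pred - y
    loss = float(np.mean(err ** 2))
    g_head = 2 * H.T @ err / len(X)
    g_shared = 2 * X.T @ (err @ heads[task].T) / len(X)
    return loss, g_shared, g_head

init_losses = {t: task_loss_and_grads(t)[0] for t in TASKS}

lr = 0.05
for step in range(300):
    # Round-robin over tasks: every update also moves the shared encoder,
    # which is how auxiliary supervision reaches the translation task.
    task = TASKS[step % len(TASKS)]
    _, g_shared, g_head = task_loss_and_grads(task)
    W_shared -= lr * g_shared
    heads[task] -= lr * g_head

final_losses = {t: task_loss_and_grads(t)[0] for t in TASKS}
```

In a full NMT setting, the shared component would be the encoder (and possibly lower decoder layers), with the training scheduler interleaving bitext batches and auxiliary-task batches in a similar round-robin fashion.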

Original language: English
Title of host publication: NAACL HLT 2018, The 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Subtitle of host publication: Proceedings of the Conference, Volume 1 (Long Papers)
Editors: Heng Ji, Amanda Stent
Place of Publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 10
ISBN (Electronic): 9781948087278
Publication status: Published - Jun 2018
Event: North American Association for Computational Linguistics 2018 - Hyatt Regency, New Orleans, United States of America
Duration: 1 Jun 2018 - 6 Jun 2018
Conference number: 16th


Conference: North American Association for Computational Linguistics 2018
Abbreviated title: NAACL HLT 2018
Country/Territory: United States of America
City: New Orleans
