Abstract
Neural machine translation requires large amounts of parallel training text to learn a reasonable-quality translation model. This is particularly inconvenient for language pairs for which enough parallel text is not available. In this paper, we use monolingual linguistic resources on the source side to address this challenging problem with a multi-task learning approach. More specifically, we scaffold the machine translation task on auxiliary tasks including semantic parsing, syntactic parsing, and named-entity recognition. This effectively injects semantic and/or syntactic knowledge into the translation model, which would otherwise require a large amount of training bitext. We empirically evaluate our multi-task learning approach and show its effectiveness on three translation tasks: English-to-French, English-to-Farsi, and English-to-Vietnamese.
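As a rough illustration of the multi-task scaffolding idea described above, the sketch below shares a single encoder between translation and the auxiliary tasks and alternates training batches among them, so the auxiliary supervision shapes the representations used for translation. This is a minimal, hypothetical PyTorch sketch, not the authors' model: the names and sizes (`SharedEncoder`, `TaskHead`, `VOCAB`, `HIDDEN`, the task list) are invented, and for brevity every task, including translation, is reduced to token-level labelling over the shared encoder, whereas the actual system is a sequence-to-sequence translator.

```python
import random
import torch
import torch.nn as nn

# Hypothetical sizes; the paper does not specify these exact values.
VOCAB, HIDDEN = 10000, 256

class SharedEncoder(nn.Module):
    """Bi-directional LSTM encoder shared across all tasks."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(VOCAB, HIDDEN)
        self.rnn = nn.LSTM(HIDDEN, HIDDEN, batch_first=True, bidirectional=True)

    def forward(self, src):                 # src: (batch, src_len)
        out, _ = self.rnn(self.embed(src))  # out: (batch, src_len, 2*HIDDEN)
        return out

class TaskHead(nn.Module):
    """Task-specific output layer (translation, parsing, or NER labels)."""
    def __init__(self, n_labels):
        super().__init__()
        self.proj = nn.Linear(2 * HIDDEN, n_labels)

    def forward(self, enc_states):
        return self.proj(enc_states)        # (batch, src_len, n_labels)

encoder = SharedEncoder()
heads = {"mt": TaskHead(VOCAB), "ner": TaskHead(9), "parse": TaskHead(50)}
params = list(encoder.parameters()) + [p for h in heads.values() for p in h.parameters()]
optimizer = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

def training_step(task, src, tgt_labels):
    """One multi-task update: gradients flow through the shared encoder and the chosen head."""
    optimizer.zero_grad()
    logits = heads[task](encoder(src))
    loss = loss_fn(logits.reshape(-1, logits.size(-1)), tgt_labels.reshape(-1))
    loss.backward()
    optimizer.step()
    return loss.item()

# Alternate batches among tasks so the auxiliary tasks regularise the shared encoder.
for _ in range(100):
    task = random.choice(list(heads))
    src = torch.randint(0, VOCAB, (8, 20))
    tgt = torch.randint(0, heads[task].proj.out_features, (8, 20))
    training_step(task, src, tgt)
```

In this simplified view, the benefit for low-resource pairs comes from the shared parameters: auxiliary data for parsing and NER updates the same encoder that the translation head relies on, so less bitext is needed to learn useful source-side representations.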
| Original language | English |
| --- | --- |
| Title of host publication | NAACL HLT 2018, The 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies |
| Subtitle of host publication | Proceedings of the Conference, Volume 1 (Long Papers) |
| Editors | Heng Ji, Amanda Stent |
| Place of publication | Stroudsburg, PA, USA |
| Publisher | Association for Computational Linguistics (ACL) |
| Pages | 1356-1365 |
| Number of pages | 10 |
| Volume | 1 |
| ISBN (electronic) | 9781948087278 |
| DOIs | |
| Publication status | Published - Jun 2018 |
| Event | North American Association for Computational Linguistics 2018, Hyatt Regency, New Orleans, United States of America. Duration: 1 Jun 2018 → 6 Jun 2018. Conference number: 16th. http://naacl2018.org/ ; https://www.aclweb.org/anthology/volumes/N18-1/ (Proceedings) |
Conference
| Conference | North American Association for Computational Linguistics 2018 |
| --- | --- |
| Abbreviated title | NAACL HLT 2018 |
| Country/Territory | United States of America |
| City | New Orleans |
| Period | 1/06/18 → 6/06/18 |
| Internet address | http://naacl2018.org/ |