Multi-target machine translation with multi-synchronous context-free grammars

Graham Neubig, Philip Arthur, Kevin Duh

Research output: Conference Paper (Chapter in Book/Report/Conference proceeding), Research, peer-reviewed

4 Citations (Scopus)


We propose a method for simultaneously translating from a single source language into multiple target languages T1, T2, etc. The motivation behind this method is that if we have only a weak language model for T1, but the translations into T1 and T2 are associated, we can use the information from a strong language model over T2 to disambiguate the translations in T1, yielding better translation results. As a specific framework to realize multi-target translation, we expand the formalism of synchronous context-free grammars to handle multiple targets, and describe methods for rule extraction, scoring, pruning, and search with these models. Experiments find that multi-target translation with a strong language model over a similar second target language can provide gains of 0.8-1.5 BLEU points.
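The disambiguation idea in the abstract can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' implementation): each multi-synchronous rule pairs one source side with several target sides, and among candidate rules matching the same source, a language model over T2 picks the joint derivation, thereby also fixing the T1 output. All rule contents and the `lm2_score` table are invented for illustration.

```python
from dataclasses import dataclass

@dataclass
class MSRule:
    """A multi-synchronous CFG rule: one source side, N target sides
    (here N = 2), which are rewritten in lockstep."""
    lhs: str
    src: tuple    # source right-hand side
    tgts: tuple   # one right-hand side per target language

# Toy rule set: "bank" is ambiguous, and both rules match the source,
# but their T2 (French) sides differ in fluency.
rules = [
    MSRule("X", ("the", "bank"), (("die", "Bank"), ("la", "banque"))),  # financial
    MSRule("X", ("the", "bank"), (("das", "Ufer"), ("la", "rive"))),    # riverside
]

def lm2_score(words):
    """Stand-in for a strong language model over T2 (a toy unigram table)."""
    table = {"la": -0.5, "banque": -1.0, "rive": -4.0}
    return sum(table.get(w, -10.0) for w in words)

def best_joint_translation(src, rules):
    """Choose the (T1, T2) pair whose T2 side scores best under the T2 LM;
    because targets are generated jointly, this disambiguates T1 as well."""
    cands = [r for r in rules if r.src == src]
    return max(cands, key=lambda r: lm2_score(r.tgts[1])).tgts

t1, t2 = best_joint_translation(("the", "bank"), rules)
```

In a full system the score would combine translation-model features with LMs over every target, and search would run over derivations rather than single rules; the sketch only shows how a strong T2 model can break a tie that a weak T1 model cannot.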

Original language: English
Title of host publication: NAACL HLT 2015 - 2015 Conference of the North American Chapter of the Association for Computational Linguistics
Subtitle of host publication: Human Language Technologies, Proceedings of the Conference
Editors: Joyce Chai, Anoop Sarkar
Place of publication: Red Hook, NY, USA
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 10
ISBN (Electronic): 9781941643495
Publication status: Published - 2015
Externally published: Yes
Event: North American Association for Computational Linguistics 2015 - Denver, United States of America
Duration: 31 May 2015 - 5 Jun 2015


Conference: North American Association for Computational Linguistics 2015
Abbreviated title: NAACL HLT 2015
Country/Territory: United States of America
