Multi-target machine translation with multi-synchronous context-free grammars

Graham Neubig, Philip Arthur, Kevin Duh

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

2 Citations (Scopus)

Abstract

We propose a method for simultaneously translating from a single source language to multiple target languages T1, T2, etc. The motivation behind this method is that if we only have a weak language model for T1 and translations in T1 and T2 are associated, we can use the information from a strong language model over T2 to disambiguate the translations in T1, providing better translation results. As a specific framework to realize multi-target translation, we expand the formalism of synchronous context-free grammars to handle multiple targets, and describe methods for rule extraction, scoring, pruning, and search with these models. Experiments find that multi-target translation with a strong language model in a similar second target language can provide gains of up to 0.8-1.5 BLEU points.
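The central idea of the abstract is that scores from language models over several synchronized target sides can be combined, so a strong model over T2 helps choose among T1 translations. As a rough illustration of that idea only (not the paper's implementation), the Python sketch below defines a hypothetical multi-synchronous rule with one source side and two target sides and scores it by combining a translation-model weight with toy unigram language-model scores for T1 and T2; all names and the unigram stand-in are assumptions made for illustration.

from dataclasses import dataclass
from typing import Dict, List

@dataclass
class MultiSCFGRule:
    # Hypothetical rule representation: one source side, one target side per language.
    source: List[str]          # e.g. ["bank"]
    targets: List[List[str]]   # e.g. [["banque"], ["banco"]] for T1, T2
    tm_logprob: float          # log translation-model score for this rule

def lm_logprob(words: List[str], unigram: Dict[str, float]) -> float:
    # Toy unigram LM used as a stand-in for the per-target language models.
    return sum(unigram.get(w, -10.0) for w in words)

def combined_score(rule: MultiSCFGRule,
                   lms: List[Dict[str, float]],
                   weights: List[float]) -> float:
    # Linear combination of the TM score and one LM score per target language,
    # so a strong LM over T2 can influence which T1 translation is preferred.
    score = rule.tm_logprob
    for tgt, lm, w in zip(rule.targets, lms, weights):
        score += w * lm_logprob(tgt, lm)
    return score

# Example: a weak T1 LM and a strong T2 LM scoring one candidate rule.
candidate = MultiSCFGRule(source=["bank"],
                          targets=[["banque"], ["banco"]],
                          tm_logprob=-1.2)
weak_t1_lm = {"banque": -2.0}
strong_t2_lm = {"banco": -0.5}
print(combined_score(candidate, [weak_t1_lm, strong_t2_lm], [0.5, 0.5]))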

Original language: English
Title of host publication: NAACL HLT 2015 - 2015 Conference of the North American Chapter of the Association for Computational Linguistics
Subtitle of host publication: Human Language Technologies, Proceedings of the Conference
Editors: Joyce Chai, Anoop Sarkar
Place of Publication: Red Hook NY USA
Publisher: Association for Computational Linguistics (ACL)
Pages: 293-302
Number of pages: 10
ISBN (Electronic): 9781941643495
Publication status: Published - 2015
Externally published: Yes
Event: North American Association for Computational Linguistics 2015: Human Language Technologies - Denver, United States of America
Duration: 31 May 2015 – 5 Jun 2015
http://naacl.org/naacl-hlt-2015/

Conference

Conference: North American Association for Computational Linguistics 2015
Abbreviated title: NAACL HLT 2015
Country: United States of America
City: Denver
Period: 31/05/15 – 5/06/15
Internet address: http://naacl.org/naacl-hlt-2015/