Contextual neural machine translation improves translation of cataphoric pronouns

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

The advent of context-aware NMT has resulted in promising improvements in overall translation quality, and specifically in the translation of discourse phenomena such as pronouns. Previous work has mainly used past sentences as context, with a focus on anaphora translation. In this work, we investigate the effect of future sentences as context by comparing the performance of a contextual NMT model trained with the future context to that of one trained with the past context. Our experiments and evaluation, using generic and pronoun-focused automatic metrics, show that the use of future context not only achieves significant improvements over the context-agnostic Transformer, but also yields comparable, and in some cases better, performance than its counterpart trained on past context. We also perform an evaluation on a targeted cataphora test suite and report significant gains over the context-agnostic Transformer in terms of BLEU.
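The abstract does not specify how future context is fed to the model; a common approach in contextual NMT is simple concatenation of the context sentence to the source input. The sketch below illustrates that idea under stated assumptions (the `<sep>` separator token and the helper function are hypothetical, not the authors' implementation):

```python
# Hedged sketch: supplying "future context" to an NMT encoder via
# concatenation. The next source sentence is appended after a separator
# token; a past-context model would instead prepend the previous
# sentence. This illustrates the general technique, not the paper's
# specific architecture.

SEP = "<sep>"  # hypothetical separator token


def with_future_context(sentences, i):
    """Build the encoder input for sentence i, appending the next
    sentence in the document as context when one exists."""
    current = sentences[i]
    future = sentences[i + 1] if i + 1 < len(sentences) else ""
    return f"{current} {SEP} {future}" if future else current


doc = ["He saw her .", "Mary was waving ."]
print(with_future_context(doc, 0))  # current sentence + future context
print(with_future_context(doc, 1))  # last sentence: no future context
```

In a cataphora example like the one above, the pronoun "her" in the first sentence refers forward to "Mary" in the second, which is exactly the information a future-context model can exploit.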
Original language: English
Title of host publication: ACL 2020 - The 58th Annual Meeting of the Association for Computational Linguistics
Subtitle of host publication: Proceedings of the Conference
Editors: Joyce Chai, Natalie Schluter, Joel Tetreault
Place of Publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Pages: 5971-5978
Number of pages: 8
ISBN (Electronic): 9781952148255
Publication status: Published - Jul 2020
Event: Annual Meeting of the Association for Computational Linguistics 2020 - Virtual, Seattle, United States of America
Duration: 5 Jul 2020 – 10 Jul 2020
Conference number: 58th
https://www.aclweb.org/anthology/volumes/2020.acl-main/

Conference

Conference: Annual Meeting of the Association for Computational Linguistics 2020
Abbreviated title: ACL 2020
Country/Territory: United States of America
City: Seattle
Period: 5/07/20 – 10/07/20
