Document Flattening: Beyond concatenating context for document-level neural machine translation

Minghao Wu, George Foster, Lizhen Qu, Gholamreza Haffari

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

4 Citations (Scopus)


Existing work in document-level neural machine translation commonly concatenates several consecutive sentences into a pseudo-document and then learns inter-sentential dependencies. This strategy limits the model's ability to leverage information from distant context. We overcome this limitation with a novel Document Flattening (DOCFLAT) technique that integrates Flat-Batch Attention (FBA) and a Neural Context Gate (NCG) into the Transformer model to utilize information beyond the pseudo-document boundaries. FBA allows the model to attend to all positions in the batch and to learn the relationships between positions explicitly, while NCG identifies useful information in the distant context. We conduct comprehensive experiments and analyses on three benchmark datasets for English-German translation and validate the effectiveness of two variants of DOCFLAT. Empirical results show that our approach outperforms strong baselines with statistical significance on BLEU, COMET, and accuracy on the contrastive test set. The analyses highlight that DOCFLAT is highly effective at capturing long-range information.
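The abstract describes the two components only at a high level. A minimal NumPy sketch of the underlying idea (single-head, unmasked attention; all function names, weight shapes, and the sigmoid gating form here are illustrative assumptions, not the paper's actual implementation) might look like:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def flat_batch_attention(batch, Wq, Wk, Wv):
    """Attend across the whole batch, not just within one pseudo-document.

    batch: (num_sents, sent_len, d) -- flattened to (num_sents * sent_len, d)
    so every position can attend to every other position in the batch.
    """
    n, L, d = batch.shape
    flat = batch.reshape(n * L, d)
    q, k, v = flat @ Wq, flat @ Wk, flat @ Wv
    scores = softmax(q @ k.T / np.sqrt(d))     # (n*L, n*L) attention weights
    return (scores @ v).reshape(n, L, d)       # batch-wide context vectors

def neural_context_gate(local, context, Wg):
    """Sigmoid gate deciding how much distant context to mix into each position."""
    g = 1.0 / (1.0 + np.exp(-(np.concatenate([local, context], axis=-1) @ Wg)))
    return g * local + (1.0 - g) * context
```

For example, with a batch of 4 sentences of length 6 and model width 8, `flat_batch_attention` lets each of the 24 positions attend to all 24, and the gate then interpolates per dimension between the local representation and that batch-wide context.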

Original language: English
Title of host publication: EACL 2023 - 17th Conference of the European Chapter of the Association for Computational Linguistics, Proceedings of the Conference
Editors: Ryan Cotterell, Carolina Scarton
Place of publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 15
ISBN (electronic): 9781959429449
Publication status: Published - 2023
Event: European Association of Computational Linguistics Conference 2023 - Dubrovnik, Croatia
Duration: 2 May 2023 - 6 May 2023
Conference number: 17th


Conference: European Association of Computational Linguistics Conference 2023
Abbreviated title: EACL 2023
