Multilingual semantic parsing and code-switching

Long Duong, Hadi Afshar, Dominique Estival, Glen Pink, Philip Raymond Cohen, Mark Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

29 Citations (Scopus)


Extending semantic parsing systems to new domains and languages is a highly expensive, time-consuming process, so making effective use of existing resources is critical. In this paper, we describe a transfer learning method using cross-lingual word embeddings in a sequence-to-sequence model. On the NLmaps corpus, our approach achieves state-of-the-art accuracy of 85.7% for English. Most importantly, we observed a consistent improvement for German compared with several baseline domain adaptation techniques. As a by-product of this approach, our models that are trained on a combination of English and German utterances perform reasonably well on code-switching utterances which contain a mixture of English and German, even though the training data does not contain any code-switching. As far as we know, this is the first study of code-switching in semantic parsing. We manually constructed the set of code-switching test utterances for the NLmaps corpus and achieved 78.3% accuracy on this dataset.
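The abstract's key idea, that translation pairs mapped to nearby points in one shared cross-lingual embedding space let a single model handle English, German, and code-switched input, can be illustrated with a minimal sketch. The vocabulary and embedding values below are hypothetical toy data, not the paper's actual embeddings or model:

```python
import numpy as np

# Toy cross-lingual embeddings (hypothetical values): translation pairs
# share nearly the same vector, so a single encoder sees English, German,
# and code-switched utterances in one space.
EMB = {
    "show":   np.array([0.90, 0.10, 0.0]),
    "zeige":  np.array([0.88, 0.12, 0.0]),   # German for "show"
    "map":    np.array([0.10, 0.90, 0.0]),
    "karte":  np.array([0.12, 0.88, 0.0]),   # German for "map"
    "hotels": np.array([0.0, 0.10, 0.90]),
}

def encode(tokens):
    """Embed a (possibly code-switched) utterance; unknown tokens get zeros."""
    return np.stack([EMB.get(t, np.zeros(3)) for t in tokens])

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Mean-pool each utterance into a single vector. An English utterance and
# its code-switched counterpart land close together, which is why a parser
# trained only on monolingual data can still handle the mixture.
en_vec = encode("show hotels map".split()).mean(axis=0)
mix_vec = encode("zeige hotels karte".split()).mean(axis=0)
print(cosine(en_vec, mix_vec))
```

In the paper's actual setup the shared embeddings feed a sequence-to-sequence parser rather than a mean-pooled similarity check; the sketch only shows why code-switched input is not out-of-distribution for such a model.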
Original language: English
Title of host publication: CoNLL 2017 - The 21st Conference on Computational Natural Language Learning - Proceedings of the Conference
Subtitle of host publication: August 3 - August 4, 2017, Vancouver, Canada
Editors: Roger Levy, Lucia Specia
Place of Publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 11
ISBN (Electronic): 9781945626548
Publication status: Published - Aug 2017
Externally published: Yes
Event: Conference on Natural Language Learning 2017 - Vancouver, Canada
Duration: 3 Aug 2017 - 4 Aug 2017
Conference number: 21st


Conference: Conference on Natural Language Learning 2017
Abbreviated title: CoNLL 2017


  • semantic parsing
  • multilingual
  • code switching
  • deep learning
