TOTAL RECALL: a customized continual learning method for neural semantic parsers

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review


This paper investigates continual learning for semantic parsing. In this setting, a neural semantic parser learns tasks sequentially without accessing the full training data of previous tasks. Directly applying SOTA continual learning algorithms to this problem fails to match the performance of re-training a model on all seen tasks, because these algorithms do not account for the special properties of the structured outputs yielded by semantic parsers. Therefore, we propose TotalRecall, a continual learning method designed for neural semantic parsers in two respects: i) a sampling method for memory replay that diversifies logical form templates and balances the distribution of parse actions in a memory; ii) a two-stage training method that significantly improves the generalization capability of the parsers across tasks. We conduct extensive experiments to study the research problems involved in continual semantic parsing and demonstrate that a neural semantic parser trained with TotalRecall achieves superior performance to one trained directly with SOTA continual learning algorithms, with a 3-6 times speedup compared to re-training from scratch.
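The paper defines the exact replay-sampling procedure; purely as an illustrative sketch (not the authors' implementation), a fixed-size replay memory that diversifies logical form templates could be filled by grouping examples by template and sampling round-robin across the groups. The `template` field here (e.g. a logical form with entities anonymized) is a hypothetical stand-in for the paper's template notion.

```python
import itertools
from collections import defaultdict

def fill_replay_memory(examples, memory_size):
    """Fill a fixed-size replay memory, diversifying logical-form templates.

    `examples` is a list of (utterance, logical_form, template) triples.
    This is an illustrative sketch, not the TotalRecall algorithm itself.
    """
    # Group examples by their (hypothetical) logical-form template.
    by_template = defaultdict(list)
    for ex in examples:
        by_template[ex[2]].append(ex)

    memory = []
    # Round-robin over templates so rare templates are retained
    # alongside frequent ones, instead of sampling proportionally.
    for group in itertools.zip_longest(*by_template.values()):
        for ex in group:
            if ex is not None and len(memory) < memory_size:
                memory.append(ex)
        if len(memory) >= memory_size:
            break
    return memory
```

Round-robin filling guarantees that, when the memory budget is at least the number of distinct templates, every template contributes at least one example.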
Original language: English
Title of host publication: 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
Editors: Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Place of Publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 16
ISBN (Electronic): 9781955917094
Publication status: Published - 2021
Event: Empirical Methods in Natural Language Processing 2021 - Online, Dominican Republic
Duration: 7 Nov 2021 - 11 Nov 2021


Conference: Empirical Methods in Natural Language Processing 2021
Abbreviated title: EMNLP 2021
Country/Territory: Dominican Republic
