Total Recall: a Customized Continual Learning Method for Neural Semantic Parsers

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

13 Citations (Scopus)

Abstract

This paper investigates continual learning for semantic parsing, a setting in which a neural semantic parser learns tasks sequentially without access to the full training data of previous tasks. Directly applying state-of-the-art (SOTA) continual learning algorithms to this problem fails to match the performance of retraining on all seen tasks, because those algorithms do not account for the special properties of the structured outputs produced by semantic parsers. We therefore propose TOTAL RECALL, a continual learning method designed for neural semantic parsers, with two components: i) a sampling method for memory replay that diversifies logical form templates and balances the distribution of parse actions in the memory; and ii) a two-stage training method that significantly improves the parsers' generalization across tasks. We conduct extensive experiments on the research problems involved in continual semantic parsing and show that a neural semantic parser trained with TOTAL RECALL outperforms one trained directly with SOTA continual learning algorithms, while achieving a 3-6 times speedup over retraining from scratch. Code and datasets are available at: https://github.com/zhuang-li/cl_nsp.
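To make the memory-replay idea in the abstract concrete, here is a minimal illustrative sketch of a sampler that greedily fills a fixed-size replay memory while covering unseen logical-form templates and keeping parse-action frequencies balanced. The data layout (`template` and `actions` fields) and the greedy scoring heuristic are assumptions for illustration only; they are not the actual TOTAL RECALL algorithm from the paper.

```python
from collections import Counter

def sample_memory(examples, memory_size):
    """Greedily pick examples for a replay memory so that
    (a) unseen logical-form templates are covered first, and
    (b) parse actions already frequent in the memory are penalized.
    Each example is a dict with keys "template" (str) and "actions"
    (list of str). Illustrative heuristic, not the paper's method."""
    memory = []
    seen_templates = set()
    action_counts = Counter()
    pool = list(examples)
    while pool and len(memory) < memory_size:
        def score(ex):
            # prefer a template the memory has not seen yet
            novelty = 0 if ex["template"] in seen_templates else 1
            # prefer examples whose actions are under-represented so far
            rarity = -sum(action_counts[a] for a in ex["actions"]) / max(len(ex["actions"]), 1)
            return (novelty, rarity)
        best = max(pool, key=score)
        pool.remove(best)
        memory.append(best)
        seen_templates.add(best["template"])
        action_counts.update(best["actions"])
    return memory
```

In a continual-learning loop, such a sampler would be called once per task to decide which of that task's examples survive into the shared replay memory used when training on later tasks.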

Original language: English
Title of host publication: 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
Editors: Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Place of Publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Pages: 3816–3831
Number of pages: 16
ISBN (Electronic): 9781955917094
Publication status: Published - 2021
Event: Empirical Methods in Natural Language Processing 2021 - Online, Punta Cana, Dominican Republic
Duration: 7 Nov 2021 – 11 Nov 2021
https://2021.emnlp.org/ (Website)
https://aclanthology.org/2021.emnlp-main.0/ (Proceedings)
https://aclanthology.org/2021.findings-emnlp.0/ (Proceedings - findings)

Conference

Conference: Empirical Methods in Natural Language Processing 2021
Abbreviated title: EMNLP 2021
Country/Territory: Dominican Republic
City: Punta Cana
Period: 7/11/21 – 11/11/21
