Lifelong Explainer for Lifelong Learners

Snow Situ, Sameen Maruf, Ingrid Zukerman, Cécile Paris, Reza Haffari

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review


Lifelong Learning (LL) black-box models are dynamic in that they keep learning from new tasks and constantly update their parameters. Owing to the need to utilize information from previously seen tasks, and capture commonalities in potentially diverse data, it is hard for automatic explanation methods to explain the outcomes of these models. In addition, existing explanation methods, e.g., LIME, which are computationally expensive when explaining a static black-box model, are even more inefficient in the LL setting. In this paper, we propose a novel Lifelong Explanation (LLE) approach that continuously trains a student explainer under the supervision of a teacher – an arbitrary explanation algorithm – on different tasks undertaken in LL. We also leverage the Experience Replay (ER) mechanism to prevent catastrophic forgetting in the student explainer. Our experiments comparing LLE to three baselines on text classification tasks show that LLE can enhance the stability of the explanations for all seen tasks and maintain the same level of faithfulness to the black-box model as the teacher, while being up to 10^2 times faster at test time. Our ablation study shows that the ER mechanism in our LLE approach enhances the learning capabilities of the student explainer. Our code is available at
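The teacher–student training with Experience Replay described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the toy `teacher_explain` stand-in, the linear student, the update rule, and the buffer sizes are all illustrative assumptions; in the paper the teacher would be an expensive explainer such as LIME and the student a neural model.

```python
import random

def teacher_explain(x):
    # Stand-in for an expensive teacher explainer (e.g., LIME).
    # Here it simply returns the input features as importance scores.
    return list(x)

class StudentExplainer:
    """Toy linear student that predicts per-feature importance scores."""

    def __init__(self, dim, lr=0.1):
        self.w = [0.0] * dim  # one weight per feature
        self.lr = lr

    def explain(self, x):
        return [wi * xi for wi, xi in zip(self.w, x)]

    def update(self, x, target):
        # One gradient step on squared error between the student's
        # scores and the teacher's scores.
        pred = self.explain(x)
        for i, (p, t, xi) in enumerate(zip(pred, target, x)):
            self.w[i] -= self.lr * 2 * (p - t) * xi

def train_lle(tasks, dim, replay_size=32, replay_k=4, seed=0):
    rng = random.Random(seed)
    student = StudentExplainer(dim)
    buffer = []  # ER buffer of (input, teacher scores) pairs
    for task in tasks:  # tasks arrive sequentially, as in LL
        for x in task:
            target = teacher_explain(x)  # query the teacher
            student.update(x, target)    # distill into the student
            # Replay a few stored examples from earlier tasks to
            # mitigate catastrophic forgetting.
            for xr, tr in rng.sample(buffer, min(replay_k, len(buffer))):
                student.update(xr, tr)
            if len(buffer) < replay_size:
                buffer.append((x, target))
    return student
```

At test time only the cheap `student.explain` call is needed, which is where the reported speed-up over re-running the teacher comes from.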
Original language: English
Title of host publication: 2021 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
Editors: Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Place of publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 8
ISBN (electronic): 9781955917094
Publication status: Published - 2021
Event: Empirical Methods in Natural Language Processing 2021 - Online, Dominican Republic
Duration: 7 Nov 2021 - 11 Nov 2021


Conference: Empirical Methods in Natural Language Processing 2021
Abbreviated title: EMNLP 2021
Country/Territory: Dominican Republic
