Semi-supervised continual learning with meta self-training

Stella Ho, Ming Liu, Lan Du, Yunfeng Li, Longxiang Gao, Shang Gao

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

2 Citations (Scopus)

Abstract

Continual learning (CL) aims to enhance sequential learning by alleviating the forgetting of previously acquired knowledge. Recent advances in CL give little consideration to real-world scenarios, where labeled data are scarce and unlabeled data are abundant. To narrow this gap, we focus on semi-supervised continual learning (SSCL). We exploit unlabeled data under limited supervision in the CL setting and demonstrate the feasibility of semi-supervised learning in CL. In this work, we propose a novel method, namely Meta-SSCL, which combines meta-learning with pseudo-labeling and data augmentation to learn a sequence of semi-supervised tasks without catastrophic forgetting. Extensive experiments on CL benchmark text classification datasets show that our method achieves promising results in SSCL.
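The self-training ingredient mentioned in the abstract rests on pseudo-labeling: a model trained on the scarce labeled data assigns provisional labels to unlabeled examples, and only its confident predictions are kept as extra training data. The sketch below illustrates that confidence-thresholding step only; the function and the toy model are hypothetical stand-ins, not the paper's actual Meta-SSCL procedure.

```python
# Illustrative sketch of confidence-thresholded pseudo-labeling, the core
# self-training step used in semi-supervised learning. `predict_proba` is a
# stand-in for any trained classifier returning class probabilities.

def pseudo_label(unlabeled, predict_proba, threshold=0.9):
    """Return (example, label) pairs whose top predicted probability
    meets `threshold`; low-confidence examples are discarded."""
    selected = []
    for x in unlabeled:
        probs = predict_proba(x)
        label = max(range(len(probs)), key=probs.__getitem__)
        if probs[label] >= threshold:
            selected.append((x, label))
    return selected

# Hypothetical toy model: texts containing "good" get high
# probability for class 1 (positive), others stay uncertain.
def toy_predict(text):
    return [0.05, 0.95] if "good" in text else [0.6, 0.4]

batch = ["good movie", "unclear review", "good plot"]
print(pseudo_label(batch, toy_predict))
# → [('good movie', 1), ('good plot', 1)]
```

The "unclear review" example is dropped because its top probability (0.6) falls below the threshold; in practice this filtering limits the noise that wrong pseudo-labels would otherwise inject into the next training round.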

Original language: English
Title of host publication: Proceedings of the 31st ACM International Conference on Information and Knowledge Management
Editors: Esra Akbas, Badrul Sarwar
Place of Publication: New York, NY, USA
Publisher: Association for Computing Machinery (ACM)
Pages: 4024-4028
Number of pages: 5
ISBN (Electronic): 9781450392365
ISBN (Print): 9781450392365
DOIs
Publication status: Published - 2022
Event: ACM International Conference on Information and Knowledge Management 2022 - Atlanta, United States of America
Duration: 17 Oct 2022 - 21 Oct 2022
Conference number: 31st
https://dl.acm.org/doi/proceedings/10.1145/3511808 (Proceedings)
https://www.cikm2022.org/calls/call-for-applied-research-papers (Website)

Conference

Conference: ACM International Conference on Information and Knowledge Management 2022
Abbreviated title: CIKM 2022
Country/Territory: United States of America
City: Atlanta
Period: 17/10/22 - 21/10/22
Internet address

Keywords

  • continual learning
  • text classification
  • semi-supervised learning
  • meta-learning
