Abstract
Continual learning (CL) aims to enhance sequential learning by alleviating the forgetting of previously acquired knowledge. Recent advances in CL give little consideration to real-world scenarios, where labeled data are scarce and unlabeled data are abundant. To narrow this gap, we focus on semi-supervised continual learning (SSCL). We exploit unlabeled data under limited supervision in the CL setting and demonstrate the feasibility of semi-supervised learning in CL. In this work, we propose a novel method, namely Meta-SSCL, which combines meta-learning with pseudo-labeling and data augmentation to learn a sequence of semi-supervised tasks without catastrophic forgetting. Extensive experiments on CL benchmark text classification datasets show that our method achieves promising results in SSCL.
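The abstract mentions pseudo-labeling as one ingredient for exploiting unlabeled data. As a rough illustration only (not the paper's implementation; the `pseudo_label` helper and the 0.9 confidence threshold are assumptions), a minimal confidence-thresholded pseudo-labeling step might look like this:

```python
import numpy as np

def pseudo_label(probs: np.ndarray, threshold: float = 0.9):
    """Assign pseudo-labels to unlabeled examples whose top predicted
    class probability exceeds `threshold`; the rest are left unlabeled.

    probs: (n_unlabeled, n_classes) softmax outputs of the current model.
    Returns (kept indices, pseudo-labels) for the confident examples.
    """
    confidence = probs.max(axis=1)   # peak probability per example
    labels = probs.argmax(axis=1)    # predicted class per example
    keep = confidence >= threshold   # confidence filter
    return np.flatnonzero(keep), labels[keep]

# Toy softmax outputs for 4 unlabeled examples over 3 classes.
probs = np.array([
    [0.95, 0.03, 0.02],   # confident -> pseudo-labeled as class 0
    [0.40, 0.35, 0.25],   # uncertain -> discarded
    [0.05, 0.91, 0.04],   # confident -> pseudo-labeled as class 1
    [0.33, 0.33, 0.34],   # uncertain -> discarded
])
idx, labels = pseudo_label(probs, threshold=0.9)
print(idx.tolist(), labels.tolist())  # -> [0, 2] [0, 1]
```

In semi-supervised training, the confidently pseudo-labeled examples would then be mixed with the labeled set (here, presumably after data augmentation and within the meta-learning loop described in the paper).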
| Original language | English |
|---|---|
| Title of host publication | Proceedings of the 31st ACM International Conference on Information and Knowledge Management |
| Editors | Esra Akbas, Badrul Sarwar |
| Place of Publication | New York NY USA |
| Publisher | Association for Computing Machinery (ACM) |
| Pages | 4024-4028 |
| Number of pages | 5 |
| ISBN (Electronic) | 9781450392365 |
| ISBN (Print) | 9781450392365 |
| DOIs | |
| Publication status | Published - 2022 |
| Event | ACM International Conference on Information and Knowledge Management 2022 - Atlanta, United States of America<br>Duration: 17 Oct 2022 → 21 Oct 2022<br>Conference number: 31st<br>https://dl.acm.org/doi/proceedings/10.1145/3511808 (Proceedings)<br>https://www.cikm2022.org/calls/call-for-applied-research-papers (Website) |
Conference
| Conference | ACM International Conference on Information and Knowledge Management 2022 |
|---|---|
| Abbreviated title | CIKM 2022 |
| Country/Territory | United States of America |
| City | Atlanta |
| Period | 17/10/22 → 21/10/22 |
| Internet address | |
Keywords
- continual learning
- text classification
- semi-supervised learning
- meta-learning