Multi-label Few/Zero-shot learning with knowledge aggregated from multiple label graphs

Jueqing Lu, Lan Du, Joanna Dipnall, Ming Liu

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

Few/zero-shot learning is a significant challenge in many classification tasks, where a classifier must recognise instances of classes that have very few or even no training samples. The problem becomes harder in multi-label classification, where each instance is labelled with more than one class. In this paper, we present a simple multi-graph aggregation model that fuses knowledge from multiple label graphs encoding different semantic label relationships, in order to study how the aggregated knowledge can benefit multi-label zero/few-shot document classification. The model utilises three kinds of semantic information: pre-trained word embeddings, label descriptions, and pre-defined label relations. Experimental results on two large clinical datasets (MIMIC-II and MIMIC-III) and the EU legislation dataset show that methods equipped with multi-graph knowledge aggregation achieve significant performance improvements across almost all measures on few/zero-shot labels.
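
The abstract describes propagating label information over several label graphs and fusing the results. Below is a minimal sketch of that general idea, not the authors' code: each pre-defined label graph gets one GCN-style propagation pass over label features built from pre-trained word embeddings of the label descriptions, and the per-graph label embeddings are fused with learned attention weights. All names (MultiGraphAggregator, normalize_adj, the toy dimensions) are hypothetical, assuming PyTorch.

# A minimal sketch (not the authors' code) of multi-graph knowledge
# aggregation for label embeddings. Assumes one adjacency matrix per
# pre-defined label relation graph and initial label features built from
# pre-trained word embeddings of the label descriptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalise an adjacency matrix with self-loops:
    A_hat = D^{-1/2} (A + I) D^{-1/2}."""
    a = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)


class MultiGraphAggregator(nn.Module):
    """Propagate label features over several label graphs with a
    one-layer GCN per graph, then fuse the per-graph label embeddings
    with learned attention weights over the graphs."""

    def __init__(self, num_graphs: int, in_dim: int, out_dim: int):
        super().__init__()
        self.gcn_weights = nn.ParameterList(
            nn.Parameter(torch.randn(in_dim, out_dim) * 0.01)
            for _ in range(num_graphs)
        )
        self.attn = nn.Linear(out_dim, 1)  # scores each graph's view of a label

    def forward(self, label_feats, adjs):
        # One GCN pass per graph: H_g = ReLU(A_hat_g X W_g)
        views = [
            F.relu(normalize_adj(a) @ label_feats @ w)
            for a, w in zip(adjs, self.gcn_weights)
        ]
        stacked = torch.stack(views, dim=1)          # (L, G, out_dim)
        scores = self.attn(stacked).softmax(dim=1)   # attention over graphs
        return (scores * stacked).sum(dim=1)         # fused (L, out_dim)


# Toy usage: 5 labels, 2 label graphs, 16-dim label-description features
# fused into 8-dim label embeddings.
L, G, D_IN, D_OUT = 5, 2, 16, 8
feats = torch.randn(L, D_IN)
graphs = [torch.randint(0, 2, (L, L)).float() for _ in range(G)]
graphs = [(g + g.t()).clamp(max=1) for g in graphs]  # make symmetric
fused = MultiGraphAggregator(G, D_IN, D_OUT)(feats, graphs)
print(fused.shape)  # torch.Size([5, 8])

In the toy usage, two random symmetric graphs stand in for, e.g., a label hierarchy and a label co-occurrence graph; in a full classifier the fused label embeddings would then be matched against document representations to score each label, including few/zero-shot ones that inherit signal from their graph neighbours.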
Original language: English
Title of host publication: 2020 Conference on Empirical Methods in Natural Language Processing, Proceedings of the Conference
Editors: Trevor Cohn, Yulan He, Yang Liu
Place of publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Pages: 2935-2943
Number of pages: 9
ISBN (electronic): 9781952148606
DOIs
Publication status: Published - 2020
Event: Empirical Methods in Natural Language Processing 2020 - Virtual, Punta Cana, Dominican Republic
Duration: 16 Nov 2020 - 20 Nov 2020
Website: https://2020.emnlp.org/
Proceedings: www.aclweb.org/anthology/volumes/2020.emnlp-main/

Conference

Conference: Empirical Methods in Natural Language Processing 2020
Abbreviated title: EMNLP
Country: Dominican Republic
City: Punta Cana
Period: 16/11/20 - 20/11/20