Probabilistic task modelling for meta-learning

Cuong C. Nguyen, Thanh-Toan Do, Gustavo Carneiro

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

We propose probabilistic task modelling – a generative probabilistic model for collections of tasks used in meta-learning. The proposed model combines variational auto-encoding and latent Dirichlet allocation to model each task as a mixture of Gaussian distributions in an embedding space. Such modelling provides an explicit representation of a task through its task-theme mixture. We present an efficient approximate inference technique based on variational inference for empirical Bayes parameter estimation. We perform empirical evaluations to validate the task uncertainty and task distance produced by the proposed method through correlation diagrams of prediction accuracy on testing tasks. We also carry out task-selection experiments in meta-learning to demonstrate how the task relatedness inferred from the proposed model helps to facilitate meta-learning algorithms.
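The abstract's idea of representing a task by its task-theme mixture can be illustrated with a minimal sketch. The code below is not the paper's model: it assumes fixed Gaussian "task themes" (the paper learns them via variational inference), soft-assigns each sample embedding to the themes, averages the responsibilities into a per-task mixture vector, and compares tasks with a symmetrised KL divergence. All function names and the distance choice are illustrative.

```python
import numpy as np

def task_theme_mixture(embeddings, theme_means, theme_vars):
    """Represent a task by its mixture over K Gaussian 'task themes'.

    embeddings:  (N, D) sample embeddings of one task
    theme_means: (K, D) theme means (assumed known here; learned in the paper)
    theme_vars:  (K, D) diagonal theme variances
    Returns a (K,) mixture vector that sums to 1.
    """
    # log N(x | mu_k, diag(sigma_k^2)) up to an additive constant
    diffs = embeddings[:, None, :] - theme_means[None, :, :]        # (N, K, D)
    log_probs = (-0.5 * np.sum(diffs ** 2 / theme_vars[None, :, :], axis=-1)
                 - 0.5 * np.sum(np.log(theme_vars), axis=-1)[None, :])
    log_probs -= log_probs.max(axis=1, keepdims=True)               # stabilise
    resp = np.exp(log_probs)
    resp /= resp.sum(axis=1, keepdims=True)                         # (N, K)
    return resp.mean(axis=0)                                        # (K,)

def task_distance(mix_a, mix_b, eps=1e-12):
    """Symmetrised KL divergence between two task-theme mixtures."""
    a, b = mix_a + eps, mix_b + eps
    return 0.5 * (np.sum(a * np.log(a / b)) + np.sum(b * np.log(b / a)))
```

A distance of this kind is what the task-selection experiments exploit: tasks whose theme mixtures are close are treated as related, so they can be prioritised when training a meta-learner.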

Original language: English
Title of host publication: Proceedings of the Thirty-Seventh Conference on Uncertainty in Artificial Intelligence (UAI 2021)
Editors: Cassio Campos, Marloes H. Maathuis
Place of Publication: London, UK
Publisher: Proceedings of Machine Learning Research (PMLR)
Pages: 781-791
Number of pages: 11
Publication status: Published - 2021
Event: Conference on Uncertainty in Artificial Intelligence 2021 - Online
Duration: 27 Jul 2021 - 30 Jul 2021
Conference number: 37th
https://proceedings.mlr.press/v161/ (Proceedings)

Publication series

Name: Proceedings of Machine Learning Research
Volume: 161
ISSN (Electronic): 2640-3498

Conference

Conference: Conference on Uncertainty in Artificial Intelligence 2021
Abbreviated title: UAI 2021
Period: 27/07/21 - 30/07/21
