Factorial multi-task learning: A Bayesian nonparametric approach

Sunil Kumar Gupta, Dinh Phung, Svetha Venkatesh

Research output: Contribution to conference › Paper

10 Citations (Scopus)

Abstract

Multi-task learning is a paradigm shown to improve the performance of related tasks through their joint learning. However, for real-world data, it is usually difficult to assess task relatedness, and joint learning with unrelated tasks may lead to serious performance degradation. To this end, we propose a framework that groups the tasks based on their relatedness in a subspace and allows a varying degree of relatedness among tasks by sharing the subspace bases across the groups. This provides the flexibility of no sharing when two sets of tasks are unrelated and partial/total sharing when the tasks are related. Importantly, the number of task-groups and the subspace dimensionality are automatically inferred from the data. To realize our framework, we introduce a novel Bayesian nonparametric prior that extends the traditional hierarchical beta process prior using a Dirichlet process to permit a potentially infinite number of child beta processes. We apply our model to multi-task regression and classification applications. Experimental results using several synthetic and real datasets show the superiority of our model over other recent multi-task learning methods.
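The sharing mechanism the abstract describes can be illustrated with a toy sketch: each task's weight vector is a sparse combination of a shared dictionary of subspace bases, and a binary selection matrix controls which bases each task uses, so two tasks may share all, some, or none of their bases. The sketch below is a hand-constructed illustration of that idea only, not the paper's model; in the paper the selection pattern and the number of bases are inferred via the proposed hierarchical beta process / Dirichlet process prior, whereas here both are fixed by hand.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumed for illustration): D features, K shared bases.
D, K = 8, 4
bases = rng.normal(size=(K, D))  # shared dictionary of subspace bases

# Binary selection matrix Z (tasks x bases): which bases each task uses.
# In the paper this sharing pattern is nonparametric and inferred from
# data; here it is fixed to show no / partial / total sharing.
Z = np.array([
    [1, 1, 0, 0],   # tasks 0-1: same group, total sharing (bases 0, 1)
    [1, 1, 0, 0],
    [0, 1, 1, 0],   # task 2: partial sharing with the group (base 1)
    [0, 0, 0, 1],   # task 3: unrelated, uses its own base only
])

# Task-specific weights = sparse combinations of the selected bases.
coeffs = rng.normal(size=Z.shape) * Z   # zero out unselected bases
W = coeffs @ Z.shape[1] ** 0 * coeffs @ bases if False else coeffs @ bases

# Tasks that share bases have structurally related weight vectors;
# the Gram matrix of Z counts the bases shared by each task pair.
overlap = Z @ Z.T
print(overlap)
```

Running this prints a 4x4 overlap matrix: tasks 0 and 1 share two bases, task 2 shares one basis with them, and task 3 shares none, mirroring the "no / partial / total sharing" regimes described in the abstract.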

Original language: English
Pages: 1694-1702
Number of pages: 9
Publication status: Published - 1 Jan 2013
Event: 30th International Conference on Machine Learning, ICML 2013 - Atlanta, GA, United States of America
Duration: 16 Jun 2013 - 21 Jun 2013

Conference

Conference: 30th International Conference on Machine Learning, ICML 2013
Country: United States of America
City: Atlanta, GA
Period: 16/06/13 - 21/06/13

Cite this

Gupta, S. K., Phung, D., & Venkatesh, S. (2013). Factorial multi-task learning: A Bayesian nonparametric approach. 1694-1702. Paper presented at 30th International Conference on Machine Learning, ICML 2013, Atlanta, GA, United States of America.