Abstract
In natural language processing and related fields, it has been shown that word embeddings can successfully capture both the semantic and syntactic features of words. They can serve as complementary information to topic models, especially in cases where word co-occurrence data is insufficient, such as with short texts. In this paper, we propose a focused topic model in which how a topic focuses on words is informed by word embeddings. Our model is able to discover more informed and focused topics with more representative words, leading to better modelling accuracy and topic quality. Using a data augmentation technique, we derive an efficient Gibbs sampling algorithm that benefits from the full local conjugacy of the model. We conduct extensive experiments on several real-world datasets, which demonstrate that our model achieves comparable or improved performance in terms of both perplexity and topic coherence, particularly in handling short text data.
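To make the general idea concrete, the following is a minimal illustrative sketch, not the paper's actual model or inference: a collapsed Gibbs sampler for an LDA-style topic model whose topic-word prior is scaled by word-embedding similarity, so that each topic places extra prior mass on semantically related words. The function name `embedding_informed_gibbs` and the `topic_vecs` parameter are hypothetical, and the paper's data-augmentation scheme is not reproduced here.

```python
import numpy as np

def embedding_informed_gibbs(docs, embeddings, topic_vecs, K,
                             n_iter=200, alpha=0.1, beta0=0.01, rng=None):
    """Collapsed Gibbs sampling for an LDA-style topic model whose
    topic-word prior is scaled by word-embedding similarity.

    docs:       list of documents, each a list of word ids in [0, V)
    embeddings: (V, d) word-embedding matrix
    topic_vecs: (K, d) topic-embedding matrix (hypothetical parameter)
    """
    if rng is None:
        rng = np.random.default_rng(0)
    V = embeddings.shape[0]

    # Embedding-informed asymmetric prior: words whose embeddings align
    # with a topic's vector receive more prior mass, so the topic
    # "focuses" on semantically related words.
    sim = embeddings @ topic_vecs.T                  # (V, K)
    beta = beta0 * np.exp(sim - sim.max(axis=0))     # positive prior, (V, K)

    n_kw = np.zeros((K, V))                          # topic-word counts
    n_dk = np.zeros((len(docs), K))                  # doc-topic counts
    z = []                                           # topic assignments
    for d, doc in enumerate(docs):
        zd = rng.integers(K, size=len(doc))
        z.append(zd)
        for w, k in zip(doc, zd):
            n_kw[k, w] += 1
            n_dk[d, k] += 1

    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]
                n_kw[k, w] -= 1
                n_dk[d, k] -= 1
                # Full conditional for the topic assignment; the
                # embedding-informed prior enters the topic-word term.
                p = (n_dk[d] + alpha) * (n_kw[:, w] + beta[w]) \
                    / (n_kw.sum(axis=1) + beta.sum(axis=0))
                k = rng.choice(K, p=p / p.sum())
                z[d][i] = k
                n_kw[k, w] += 1
                n_dk[d, k] += 1
    return n_kw, n_dk
```

Exponentiating the (max-shifted) similarity is just one simple way to keep the prior strictly positive while letting embedding geometry reshape it; the paper's focusing mechanism and its conjugate data-augmented sampler differ from this sketch.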
Original language | English |
---|---|
Title of host publication | 2017 Ninth Asian Conference on Machine Learning, ACML 2017 |
Subtitle of host publication | 15-17 November 2017, Seoul, Korea, Proceedings |
Editors | Min-Ling Zhang, Yung-Kyun Noh |
Place of Publication | USA |
Publisher | Proceedings of Machine Learning Research (PMLR) |
Pages | 423-438 |
Number of pages | 16 |
Publication status | Published - 2017 |
Event | Asian Conference on Machine Learning 2017 - Yonsei University, Seoul, Korea, Republic of (South). Duration: 15 Nov 2017 → 17 Nov 2017. Conference number: 9th. http://www.acml-conf.org/2017/ ; http://proceedings.mlr.press/v77/ (Proceedings)
Publication series
Name | Proceedings of Machine Learning Research |
---|---|
Publisher | Proceedings of Machine Learning Research (PMLR) |
Volume | 77 |
ISSN (Print) | 1938-7228 |
Conference
Conference | Asian Conference on Machine Learning 2017 |
---|---|
Abbreviated title | ACML 2017 |
Country/Territory | Korea, Republic of (South) |
City | Seoul |
Period | 15/11/17 → 17/11/17 |
Internet address | http://www.acml-conf.org/2017/