Neighborhood mixture model for knowledge base completion

Dat Quoc Nguyen, Kairit Sirts, Lizhen Qu, Mark Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

21 Citations (Scopus)


Knowledge bases are useful resources for many natural language processing tasks; however, they are far from complete. In this paper, we define a novel entity representation as a mixture of its neighborhood in the knowledge base, and we apply this technique to TransE, a well-known embedding model for knowledge base completion. Experimental results show that the neighborhood information significantly improves the results of TransE, leading to better performance than that of other state-of-the-art embedding models on three benchmark datasets for the triple classification, entity prediction and relation prediction tasks.
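The core idea described above can be sketched in code: plain TransE scores a triple (h, r, t) by the distance ||h + r - t||, and the neighborhood mixture replaces an entity's embedding with a weighted combination of its own vector and translated neighbor vectors. The sketch below is a minimal illustration under assumed simplifications (random embeddings, softmax mixture weights, L1 distance); the paper's exact parameterisation and training objective differ.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 8                    # embedding dimension (illustrative)
n_ent, n_rel = 5, 3      # toy knowledge base size (illustrative)

# Randomly initialised entity and relation embeddings (for illustration only;
# in practice these are learned from the knowledge base triples).
ent = rng.normal(size=(n_ent, k))
rel = rng.normal(size=(n_rel, k))

def transe_score(h, r, t):
    """Plain TransE: score(h, r, t) = ||h + r - t||_1 (lower = more plausible)."""
    return np.abs(h + r - t).sum()

def mixture_embedding(e, neighbors, weights):
    """Represent entity e as a weighted mixture of its own embedding and
    translated neighbor embeddings (ent[e'] + rel[r'] for each neighbor
    (e', r')). `weights` holds one unnormalised weight per mixture component
    (the entity itself first, then its neighbors)."""
    w = np.exp(weights - weights.max())
    w /= w.sum()                      # softmax-normalised mixture weights
    parts = [ent[e]] + [ent[e2] + rel[r2] for (e2, r2) in neighbors]
    return sum(wi * p for wi, p in zip(w, parts))

# Hypothetical example: head entity 0 with two neighbors, tail entity 3 with one.
h = mixture_embedding(0, [(1, 0), (2, 1)], np.zeros(3))
t = mixture_embedding(3, [(4, 2)], np.zeros(2))
print(transe_score(h, rel[0], t))
```

With uniform (zero) weights the mixture reduces to a simple average over the entity and its translated neighbors; learning the weights lets the model decide how much each neighbor should contribute.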

Original language: English
Title of host publication: CoNLL 2016 - The 20th SIGNLL Conference on Computational Natural Language Learning (CoNLL) - Proceedings of the Conference
Editors: Yoav Goldberg, Stefan Riezler
Place of publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Number of pages: 11
ISBN (Electronic): 9781945626197
Publication status: Published - 2016
Externally published: Yes
Event: Conference on Natural Language Learning 2016 - Berlin, Germany
Duration: 11 Aug 2016 - 12 Aug 2016
Conference number: 20th


Conference: Conference on Natural Language Learning 2016
Abbreviated title: CoNLL 2016


Keywords:
  • Embedding model
  • Entity prediction
  • Knowledge base completion
  • Link prediction
  • Mixture model
  • Relation prediction
  • Triple classification
