Neighborhood mixture model for knowledge base completion

Dat Quoc Nguyen, Kairit Sirts, Lizhen Qu, Mark Johnson

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

34 Citations (Scopus)

Abstract

Knowledge bases are useful resources for many natural language processing tasks; however, they are far from complete. In this paper, we define a novel entity representation as a mixture of its neighborhood in the knowledge base and apply this technique to TransE, a well-known embedding model for knowledge base completion. Experimental results show that the neighborhood information significantly helps to improve the results of TransE, leading to better performance than that obtained by other state-of-the-art embedding models on three benchmark datasets for the triple classification, entity prediction and relation prediction tasks.
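To make the abstract's idea concrete, the sketch below (a rough NumPy illustration inferred from the abstract, not the paper's exact formulation) scores a triple with the standard TransE translation score ||h + r - t||, but replaces each entity vector with a weighted mixture of its own embedding and vectors derived from its (relation, neighbour) pairs in the knowledge base. The toy entities, relations, the fixed mixture weight alpha, and all function names are assumptions made for this example.

    # Minimal sketch of a neighborhood-mixture entity representation plugged
    # into the TransE score. Simplified assumption: the mixture weight alpha
    # is a fixed constant rather than a learned parameter.
    import numpy as np

    rng = np.random.default_rng(0)
    dim = 4

    # Toy embeddings for entities and relations (illustrative identifiers).
    entity_emb = {e: rng.normal(size=dim) for e in ["Berlin", "Germany", "Europe"]}
    relation_emb = {r: rng.normal(size=dim) for r in ["capital_of", "located_in"]}

    # Neighbourhood of each entity: (relation, neighbour entity) pairs from the KB.
    neighbours = {
        "Berlin": [("capital_of", "Germany")],
        "Germany": [("located_in", "Europe")],
        "Europe": [],
    }

    def mixed_entity(e, alpha=0.5):
        # Mix the entity's own embedding with the average of its neighbours'
        # (entity + relation) vectors; fall back to the own embedding if the
        # entity has no neighbours.
        own = entity_emb[e]
        neigh = neighbours[e]
        if not neigh:
            return own
        neigh_vec = np.mean([entity_emb[n] + relation_emb[r] for r, n in neigh], axis=0)
        return alpha * own + (1.0 - alpha) * neigh_vec

    def transe_score(h, r, t):
        # Standard TransE plausibility score ||h + r - t||_1; lower is more plausible.
        return np.abs(mixed_entity(h) + relation_emb[r] - mixed_entity(t)).sum()

    print(transe_score("Berlin", "capital_of", "Germany"))

In the actual model the mixture weights would be learned jointly with the embeddings during training rather than fixed; see the paper for the exact parameterisation.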

Original language: English
Title of host publication: CoNLL 2016 - The 20th SIGNLL Conference on Computational Natural Language Learning (CoNLL) - Proceedings of the Conference
Editors: Yoav Goldberg, Stefan Riezler
Place of publication: Stroudsburg PA USA
Publisher: Association for Computational Linguistics (ACL)
Pages: 40-50
Number of pages: 11
ISBN (Electronic): 9781945626197
DOIs
Publication status: Published - 2016
Externally published: Yes
Event: Conference on Natural Language Learning 2016 - Berlin, Germany
Duration: 11 Aug 2016 – 12 Aug 2016
Conference number: 20th
https://www.conll.org/2016
https://www.aclweb.org/anthology/volumes/K16-1/ (Proceedings)

Conference

Conference: Conference on Natural Language Learning 2016
Abbreviated title: CoNLL 2016
Country/Territory: Germany
City: Berlin
Period: 11/08/16 – 12/08/16
Internet address: https://www.conll.org/2016

Keywords

  • Embedding model
  • Entity prediction
  • Knowledge base completion
  • Link prediction
  • Mixture model
  • Relation prediction
  • Triple classification