Exploring auxiliary context: discrete semantic transfer hashing for scalable image retrieval

Lei Zhu, Zi Huang, Zhihui Li, Liang Xie, Heng Tao Shen

Research output: Contribution to journal › Article › Research › peer-review

96 Citations (Scopus)

Abstract

Unsupervised hashing is well suited to scalable content-based image retrieval thanks to its independence from semantic labels and its memory and search efficiency. However, the learned hash codes embed only limited discriminative semantics due to the intrinsic limitations of image representation. To address this problem, in this paper we propose a novel hashing approach, dubbed discrete semantic transfer hashing (DSTH). The key idea is to directly augment the semantics of discrete image hash codes by exploring auxiliary contextual modalities. To this end, a unified hashing framework is formulated to simultaneously preserve the visual similarities of images and perform semantic transfer from contextual modalities. Furthermore, to guarantee direct semantic transfer and avoid information loss, we explicitly impose the discrete constraint, the bit-uncorrelation constraint, and the bit-balance constraint on the hash codes. A novel and effective discrete optimization method based on the augmented Lagrangian multiplier is developed to solve the optimization problem iteratively. The whole learning process has linear computational complexity and desirable scalability. Experiments on three benchmark data sets demonstrate the superiority of DSTH over several state-of-the-art approaches.
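The abstract refers to three constraints on the learned hash codes: the discrete constraint (each bit is ±1), the bit-balance constraint (each bit splits the data evenly, so column sums are near zero), and the bit-uncorrelation constraint (bits are pairwise uncorrelated, i.e. BᵀB ≈ nI). The following minimal sketch illustrates these constraints on toy data; the random projection `W` is a placeholder assumption, not the projection DSTH actually learns, and no semantic transfer or augmented Lagrangian optimization is performed here.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, r = 8, 16, 4  # number of images, feature dimension, hash bits (toy sizes)

X = rng.standard_normal((n, d))  # visual features
W = rng.standard_normal((d, r))  # hypothetical projection (a stand-in, not DSTH's)

# Discrete constraint: hash codes B live in {-1, +1}^{n x r}.
B = np.sign(X @ W)
B[B == 0] = 1

# Bit-balance constraint: each bit should split the data evenly,
# so each column sum of B should be close to 0.
balance = B.sum(axis=0)

# Bit-uncorrelation constraint: bits should be pairwise uncorrelated,
# so B^T B should be close to n * I.
gram = B.T @ B

# With +/-1 codes, Hamming distance follows from the inner product:
# dist(b_i, b_j) = (r - <b_i, b_j>) / 2.
def hamming(bi, bj):
    return int((r - bi @ bj) // 2)
```

The inner-product identity in `hamming` is why such constraints matter in practice: retrieval reduces to fast bitwise comparisons, and balanced, uncorrelated bits make each of the `r` bits maximally informative.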

Original language: English
Article number: 8291840
Pages (from-to): 5264-5276
Number of pages: 13
Journal: IEEE Transactions on Neural Networks and Learning Systems
Volume: 29
Issue number: 11
DOIs
Publication status: Published - Nov 2018
Externally published: Yes

Keywords

  • Content-based image retrieval
  • discrete optimization
  • semantic transfer
  • unsupervised hashing
  • visual similarities
