Compact representation of high-dimensional feature vectors for large-scale image recognition and retrieval

Yu Zhang, Jianxin Wu, Jianfei Cai

Research output: Contribution to journal › Article › Research › peer-review

15 Citations (Scopus)


In large-scale visual recognition and image retrieval tasks, feature vectors such as the Fisher vector (FV) or the vector of locally aggregated descriptors (VLAD) have achieved state-of-the-art results. However, the combination of large numbers of examples and high-dimensional vectors necessitates dimensionality reduction in order to keep storage and CPU costs within a reasonable range. Despite the popularity of various feature compression methods, this paper shows that feature (dimension) selection is a better choice for high-dimensional FV/VLAD than feature (dimension) compression methods such as product quantization. We show that strong correlation among the feature dimensions in the FV and the VLAD may not exist, which makes feature selection a natural choice. We also show that many dimensions in FV/VLAD are noise: discarding them through feature selection is better than compressing them together with the useful dimensions, as feature compression methods do. To choose features, we propose an efficient importance-sorting algorithm that covers both the supervised and the unsupervised case, for visual recognition and image retrieval, respectively. Combined with 1-bit quantization, feature selection achieves both higher accuracy and lower computational cost than feature compression methods, such as product quantization, on the FV and VLAD image representations.
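The pipeline the abstract describes, selecting the most important feature dimensions and then applying 1-bit quantization, can be sketched as follows. This is an illustrative sketch only: the per-dimension variance (unsupervised) and absolute label correlation (supervised) importance scores used here are assumptions for demonstration, and the paper's actual importance-sorting criterion may differ.

```python
import numpy as np

def select_and_binarize(X, k, labels=None):
    """Keep the top-k dimensions by an importance score, then 1-bit quantize.

    Hypothetical sketch: the importance scores below are stand-ins, not
    the paper's exact criterion.
    X: (n_samples, n_dims) array of FV/VLAD-like feature vectors.
    """
    if labels is None:
        # Unsupervised case (retrieval): rank dimensions by variance.
        scores = X.var(axis=0)
    else:
        # Supervised case (recognition): rank dimensions by absolute
        # correlation with the (binary) label vector.
        y = labels - labels.mean()
        Xc = X - X.mean(axis=0)
        denom = np.linalg.norm(Xc, axis=0) * np.linalg.norm(y) + 1e-12
        scores = np.abs(Xc.T @ y) / denom
    keep = np.argsort(scores)[::-1][:k]          # indices of top-k dimensions
    # 1-bit quantization: store only the sign of each selected dimension.
    bits = (X[:, keep] > 0).astype(np.uint8)
    return keep, bits

# Toy example: 8 feature vectors of 16 dimensions, compressed to 4 bits each.
rng = np.random.default_rng(0)
X = rng.standard_normal((8, 16))
keep, bits = select_and_binarize(X, k=4)
```

After this step, each image is represented by k bits instead of k floats, so similarity can be computed with cheap Hamming distance, which is where the storage and CPU savings come from.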

Original language: English
Pages (from-to): 2407-2419
Number of pages: 13
Journal: IEEE Transactions on Image Processing
Issue number: 5
Publication status: Published - May 2016
Externally published: Yes


Keywords:
  • feature selection
  • image representation
  • large scale
