Large-scale metric learning: a voyage from shallow to deep

Masoud Faraki, Mehrtash T. Harandi, Fatih Porikli

Research output: Contribution to journal › Article › Research › peer-review

8 Citations (Scopus)


Despite its attractive properties, the performance of the recently introduced Keep It Simple and Straightforward MEtric learning (KISSME) method depends heavily on principal component analysis as a preprocessing step. This dependence can lead to difficulties, e.g., when the dimensionality is not meticulously set. To address this issue, we devise a unified formulation for joint dimensionality reduction and metric learning based on the KISSME algorithm. Our joint formulation is expressed as an optimization problem on the Grassmann manifold, and hence enjoys the properties of Riemannian optimization techniques. Following the success of deep learning in recent years, we also devise end-to-end learning of a generic deep network for metric learning using our derivation.
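For context on the baseline the abstract builds on: the KISSME estimator learns a Mahalanobis matrix as the difference of inverse covariances of pair differences, computed separately over similar and dissimilar pairs. The sketch below is a minimal NumPy illustration of that baseline estimator only (not the paper's joint Grassmann-manifold formulation); all function and variable names are illustrative assumptions, and a simple eigenvalue clipping is used to keep the learned matrix positive semidefinite.

```python
import numpy as np

def kissme_metric(X_sim, X_dis, eps=1e-6):
    """Estimate a Mahalanobis matrix with the KISSME scheme.

    X_sim, X_dis: arrays of shape (n_pairs, 2, d) holding similar and
    dissimilar feature pairs. Returns a PSD matrix M used in the
    distance (x - y)^T M (x - y). Names here are illustrative.
    """
    # Pairwise difference vectors for each pair set
    d_sim = X_sim[:, 0] - X_sim[:, 1]
    d_dis = X_dis[:, 0] - X_dis[:, 1]
    # Zero-mean second-moment (covariance) matrices of the differences
    cov_sim = d_sim.T @ d_sim / len(d_sim)
    cov_dis = d_dis.T @ d_dis / len(d_dis)
    d = cov_sim.shape[0]
    # KISSME: difference of the inverse pair-difference covariances
    M = (np.linalg.inv(cov_sim + eps * np.eye(d))
         - np.linalg.inv(cov_dis + eps * np.eye(d)))
    # Project onto the PSD cone by clipping negative eigenvalues
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.clip(w, 0.0, None)) @ V.T

def mahalanobis(M, x, y):
    """Squared Mahalanobis distance under the learned matrix M."""
    diff = x - y
    return float(diff @ M @ diff)
```

In this plain form the estimator is typically applied after PCA, which is exactly the preprocessing dependence the abstract says the paper removes by learning the projection and the metric jointly.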

Original language: English
Pages (from-to): 4339-4346
Number of pages: 8
Journal: IEEE Transactions on Neural Networks and Learning Systems
Issue number: 9
Publication status: Published - Sept 2018
Externally published: Yes


  • Algorithm design and analysis
  • Deep metric learning
  • dimensionality reduction
  • Extraterrestrial measurements
  • Mahalanobis metric learning
  • Optimization
  • Principal component analysis
  • Riemannian geometry
  • Training
