Learning Boltzmann distance metric for face recognition

Truyen Tran, Dinh Q. Phung, Svetha Venkatesh

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

3 Citations (Scopus)


We introduce a new method for face recognition using a versatile probabilistic model known as the Restricted Boltzmann Machine (RBM). In particular, we propose to regularise the standard data-likelihood learning with an information-theoretic distance metric defined on intra-personal images. This results in an effective face representation which captures the regularities in the face space and minimises intra-personal variations. In addition, our method allows easy incorporation of multiple feature sets with a controllable level of sparsity. Our experiments on a high-variation dataset show that the proposed method is competitive against other metric-learning rivals. We also investigate the RBM method under a variety of settings, including fusing facial parts and utilising localised feature detectors at varying resolutions. In particular, accuracy is boosted from 71.8% with standard whole-face pixels to 99.2% with a combination of facial parts, localised feature extractors and appropriate resolutions.
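The core idea in the abstract, regularising RBM likelihood learning with an intra-personal distance term, can be sketched as a combined objective. The sketch below is a hypothetical illustration, not the authors' exact formulation: the RBM sizes, the squared-Euclidean distance on hidden representations, and the weight `lam` are all assumptions for the sake of a runnable example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy RBM parameters (hypothetical sizes, not from the paper).
n_visible, n_hidden = 16, 8
W = rng.normal(0.0, 0.1, (n_visible, n_hidden))
b = np.zeros(n_visible)  # visible bias
c = np.zeros(n_hidden)   # hidden bias

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def hidden_probs(v):
    """Posterior over hidden units -- the learned face representation."""
    return sigmoid(v @ W + c)

def free_energy(v):
    """RBM free energy; lower free energy means higher model probability."""
    return -v @ b - np.logaddexp(0.0, v @ W + c).sum(axis=-1)

# Two binary images of the same person (random stand-ins for pixel features).
v1 = rng.integers(0, 2, n_visible).astype(float)
v2 = rng.integers(0, 2, n_visible).astype(float)

lam = 0.5  # assumed weight balancing the likelihood and metric terms

# Combined objective: data free energy (likelihood term) plus an
# intra-personal distance penalty on the hidden representations.
h1, h2 = hidden_probs(v1), hidden_probs(v2)
loss = free_energy(v1) + free_energy(v2) + lam * np.sum((h1 - h2) ** 2)
```

Minimising such a loss over many intra-personal pairs would pull representations of the same person together while still fitting the data distribution, which is the regularisation effect the abstract describes.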

Original language: English
Title of host publication: Proceedings - 2012 IEEE International Conference on Multimedia and Expo
Subtitle of host publication: ICME 2012
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 6
Publication status: Published - 5 Nov 2012
Externally published: Yes
Event: IEEE International Conference on Multimedia and Expo 2012 - Melbourne Convention and Exhibition Center, Melbourne, Australia
Duration: 9 Jul 2012 - 13 Jul 2012
Conference number: 13th
http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=6297631 (IEEE Conference Proceedings)


Conference: IEEE International Conference on Multimedia and Expo 2012
Abbreviated title: ICME 2012
Internet address: http://ieeexplore.ieee.org/xpl/mostRecentIssue.jsp?punumber=6297631


  • Face recognition
  • information fusion
  • metric learning
  • Restricted Boltzmann Machines
