Deep metric learning meets deep clustering: a novel unsupervised approach for feature embedding

Binh X. Nguyen, Binh D. Nguyen, Gustavo Carneiro, Erman Tjiputra, Quang D. Tran, Thanh-Toan Do

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review


Unsupervised Deep Distance Metric Learning (UDML) aims to learn sample similarities in the embedding space from an unlabeled dataset. Traditional UDML methods usually use the triplet loss or pairwise loss, which require the mining of positive and negative samples w.r.t. anchor data points. This is, however, challenging in an unsupervised setting, as label information is not available. In this paper, we propose a new UDML method that overcomes that challenge. In particular, we propose to use a deep clustering loss to learn centroids, i.e., pseudo labels, that represent semantic classes. During learning, these centroids are also used to reconstruct the input samples. This ensures the representativeness of the centroids: each centroid represents visually similar samples. The centroids therefore provide information about positive (visually similar) and negative (visually dissimilar) samples. Based on these pseudo labels, we propose a novel unsupervised metric loss which enforces the positive concentration and negative separation of samples in the embedding space. Experimental results on benchmark datasets show that the proposed approach outperforms other UDML methods.
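The core idea in the abstract (cluster centroids serve as pseudo labels, which then drive a metric loss pulling samples toward their own centroid and pushing them away from the others) can be sketched in a minimal form. The sketch below is illustrative only, not the authors' implementation: the `kmeans` and `metric_loss` functions, the NumPy setting, and the hinge margin are all assumptions standing in for the paper's deep clustering and metric losses.

```python
import numpy as np

def kmeans(X, k, iters=10, seed=0):
    # Toy k-means: centroids act as pseudo-class prototypes (stand-in
    # for the paper's learned deep clustering centroids).
    rng = np.random.default_rng(seed)
    C = X[rng.choice(len(X), size=k, replace=False)].copy()
    labels = np.zeros(len(X), dtype=int)
    for _ in range(iters):
        d = ((X[:, None] - C[None]) ** 2).sum(-1)   # (n, k) squared distances
        labels = d.argmin(1)                        # pseudo labels
        for j in range(k):
            if (labels == j).any():
                C[j] = X[labels == j].mean(0)
    return C, labels

def metric_loss(X, C, labels, margin=1.0):
    # Positive concentration: distance to the sample's own centroid.
    # Negative separation: hinge pushing other centroids at least
    # `margin` away (margin value is an arbitrary illustrative choice).
    d = np.linalg.norm(X[:, None] - C[None], axis=-1)   # (n, k)
    pos = d[np.arange(len(X)), labels]
    mask = np.ones_like(d, dtype=bool)
    mask[np.arange(len(X)), labels] = False
    neg = d[mask].reshape(len(X), -1)                   # (n, k-1)
    return (pos + np.maximum(0.0, margin - neg).mean(1)).mean()
```

In the paper this loop would operate on embeddings produced by a deep network, with the clustering, reconstruction, and metric losses trained jointly; the sketch only shows how pseudo labels can supply the positive/negative structure that supervised triplet mining normally provides.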
Original language: English
Title of host publication: 31st British Machine Vision Conference, BMVC 2020
Editors: Neill Campbell
Place of publication: London, UK
Publisher: British Machine Vision Association and Society for Pattern Recognition
Number of pages: 13
Publication status: Published - 2020
Event: British Machine Vision Conference 2020 - Virtual, London, United Kingdom
Duration: 7 Sep 2020 – 10 Sep 2020
Conference number: 31st


Conference: British Machine Vision Conference 2020
Abbreviated title: BMVC 2020
Country/Territory: United Kingdom
