Contrastive learning for cold-start recommendation

Yinwei Wei, Xiang Wang, Qi Li, Liqiang Nie, Yan Li, Xuanping Li, Tat-Seng Chua

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research

250 Citations (Scopus)

Abstract

Recommending purely cold-start items is a long-standing and fundamental challenge in recommender systems. Without any historical interaction on cold-start items, the collaborative filtering (CF) scheme fails to leverage collaborative signals to infer user preference on these items. To solve this problem, extensive studies have been conducted to incorporate side information of items (e.g., content features) into the CF scheme. Specifically, they employ modern neural network techniques (e.g., dropout, consistency constraint) to discover and exploit the coalition effect of content features and collaborative representations. However, we argue that these works under-explore the mutual dependencies between content features and collaborative representations and lack sufficient theoretical support, thus resulting in unsatisfactory performance on cold-start recommendation. In this work, we reformulate cold-start item representation learning from an information-theoretic standpoint. It aims to maximize the mutual dependencies between item content and collaborative signals. Specifically, the representation learning is theoretically lower-bounded by the integration of two terms: mutual information between collaborative embeddings of users and items, and mutual information between collaborative embeddings and feature representations of items. To model such a learning process, we devise a new objective function founded upon contrastive learning and develop a simple yet efficient Contrastive Learning-based Cold-start Recommendation framework (CLCRec). In particular, CLCRec consists of three components: contrastive pair organization, contrastive embedding, and contrastive optimization modules. It allows us to preserve collaborative signals in the content representations for both warm and cold-start items. Through extensive experiments on four publicly accessible datasets, we observe that CLCRec achieves significant improvements over state-of-the-art approaches in both warm- and cold-start scenarios.
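The abstract describes two mutual-information terms that are optimized contrastively: one between user and item collaborative embeddings, and one between item collaborative embeddings and item content representations. The snippet below is a minimal, hypothetical sketch of that idea, not the authors' released CLCRec implementation: an InfoNCE-style loss that treats matching rows in a batch as positives and all other rows as negatives. The encoder architecture, temperature, dimensions, and batch construction are assumptions for illustration only.

```python
# Hypothetical sketch of an InfoNCE-style objective in the spirit of the abstract.
# Not the authors' CLCRec code; all hyperparameters and module names are assumptions.
import torch
import torch.nn.functional as F


def info_nce(anchor: torch.Tensor, positive: torch.Tensor, temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE loss: the i-th anchor's positive is the i-th row of `positive`;
    every other row in the batch acts as a negative sample."""
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    logits = anchor @ positive.t() / temperature              # (B, B) similarity matrix
    labels = torch.arange(anchor.size(0), device=anchor.device)
    return F.cross_entropy(logits, labels)


class ContentEncoder(torch.nn.Module):
    """Maps raw item content features into the collaborative embedding space."""
    def __init__(self, content_dim: int, embed_dim: int):
        super().__init__()
        self.net = torch.nn.Sequential(
            torch.nn.Linear(content_dim, embed_dim),
            torch.nn.ReLU(),
            torch.nn.Linear(embed_dim, embed_dim),
        )

    def forward(self, content: torch.Tensor) -> torch.Tensor:
        return self.net(content)


# Toy batch: collaborative embeddings of users/items plus item content features.
B, content_dim, embed_dim = 32, 128, 64
user_emb = torch.randn(B, embed_dim)        # collaborative embeddings of interacting users
item_emb = torch.randn(B, embed_dim)        # collaborative embeddings of the same items
item_content = torch.randn(B, content_dim)  # raw content features of those items

encoder = ContentEncoder(content_dim, embed_dim)
content_repr = encoder(item_content)

# Two contrastive terms, mirroring the abstract's two mutual-information terms:
# (1) user vs. item collaborative embeddings, (2) item collaborative embedding vs. content.
loss = info_nce(user_emb, item_emb) + info_nce(item_emb, content_repr)
loss.backward()
```

At inference time for a cold-start item, such a content encoder would stand in for the missing collaborative embedding, which is why preserving collaborative signal in content space matters.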

Original language: English
Title of host publication: Proceedings of the 29th ACM International Conference on Multimedia
Editors: Liqiang Nie, Qianru Sun, Peng Cui
Place of Publication: New York NY USA
Publisher: Association for Computing Machinery (ACM)
Pages: 5382-5390
Number of pages: 9
ISBN (Electronic): 9781450386517
DOIs
Publication status: Published - 2021
Externally published: Yes
Event: ACM International Conference on Multimedia 2021 - Chengdu, China
Duration: 20 Oct 2021 - 24 Oct 2021
Conference number: 29th
https://dl.acm.org/doi/proceedings/10.1145/3474085 (Proceedings)
https://2021.acmmm.org/ (Website)

Conference

Conference: ACM International Conference on Multimedia 2021
Abbreviated title: MM 2021
Country/Territory: China
City: Chengdu
Period: 20/10/21 - 24/10/21

Keywords

  • cold-start recommendation
  • collaborative filtering
  • contrastive learning
  • multimedia recommendation
  • recommender system
