A self-attention network based node embedding model

Dai Quoc Nguyen, Tu Dinh Nguyen, Dinh Phung

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

Although some progress has been made recently, limited research has been conducted on the inductive setting, where embeddings are required for newly unseen nodes – a setting commonly encountered in practical applications of deep learning for graph networks. This significantly affects the performance of downstream tasks such as node classification, link prediction, and community extraction. To this end, we propose SANNE – a novel unsupervised embedding model – whose central idea is to employ a transformer self-attention network to iteratively aggregate vector representations of the nodes in random walks. Our SANNE aims to produce plausible embeddings not only for present nodes but also for newly unseen nodes. Experimental results show that the proposed SANNE obtains state-of-the-art results for the node classification task on well-known benchmark datasets.
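The central idea described above – aggregating the vector representations of nodes along a random walk with self-attention – can be sketched as follows. This is a minimal illustrative toy in NumPy, not the paper's implementation: the graph, embeddings, and the unparameterised single-head attention are all assumptions for demonstration, whereas SANNE itself uses learned transformer layers trained without supervision.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph as an adjacency list (hypothetical, for illustration only).
graph = {0: [1, 2], 1: [0, 2], 2: [0, 1, 3], 3: [2]}

def random_walk(graph, start, length, rng):
    """Sample a uniform random walk of `length` nodes starting at `start`."""
    walk = [start]
    for _ in range(length - 1):
        walk.append(rng.choice(graph[walk[-1]]))
    return walk

def self_attention(X):
    """Single-head scaled dot-product self-attention with no learned
    projections: each row of X is re-expressed as a softmax-weighted
    combination of all rows (i.e. of every node in the walk)."""
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ X

dim = 8
# One embedding vector per node (randomly initialised for this sketch).
embeddings = rng.normal(size=(len(graph), dim))

walk = random_walk(graph, start=0, length=5, rng=rng)
X = embeddings[walk]               # (walk_length, dim) input sequence
aggregated = self_attention(X)     # each position attends to the whole walk

print(walk)
print(aggregated.shape)            # (5, 8)
```

In the full model, this aggregation step would be stacked and iterated, with the refined representations feeding an unsupervised objective so that an unseen node can be embedded at inference time simply by sampling walks from it.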

Original language: English
Title of host publication: European Conference, ECML PKDD 2020, Ghent, Belgium, September 14–18, 2020, Proceedings, Part III
Editors: Frank Hutter, Kristian Kersting, Jefrey Lijffijt, Isabel Valera
Place of Publication: Cham, Switzerland
Publisher: Springer
Pages: 364-377
Number of pages: 14
ISBN (Electronic): 9783030676643
ISBN (Print): 9783030676636
DOIs
Publication status: Published - 2021
Event: European Conference on Machine Learning and European Conference on Principles and Practice of Knowledge Discovery in Databases 2020 - Ghent, Belgium
Duration: 14 Sep 2020 – 18 Sep 2020
https://ecmlpkdd2020.net/
https://link.springer.com/book/10.1007/978-3-030-67661-2 (Proceedings)

Publication series

Name: Lecture Notes in Computer Science
Publisher: Springer
Volume: 12459
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: European Conference on Machine Learning and European Conference on Principles and Practice of Knowledge Discovery in Databases 2020
Abbreviated title: ECML PKDD 2020
Country/Territory: Belgium
City: Ghent
Period: 14/09/20 – 18/09/20

Keywords

  • Node classification
  • Node embeddings
  • Self-attention network
  • Transformer
