Abstract
We introduce a transformer-based GNN model, named UGformer, to learn graph representations. In particular, we present two UGformer variants: the first (released in September 2019) leverages the transformer on a set of sampled neighbors for each input node, while the second (released in May 2021) leverages the transformer on all input nodes. Experimental results demonstrate that the first UGformer variant achieves state-of-the-art accuracies on benchmark datasets for graph classification in both the inductive setting and the unsupervised transductive setting, and that the second UGformer variant obtains state-of-the-art accuracies for inductive text classification. The code is available at: https://github.com/daiquocnguyen/Graph-Transformer.
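The first variant described above — self-attention over a node plus a sample of its neighbors — can be sketched roughly as follows. This is an illustrative NumPy sketch, not the authors' implementation: the function names (`self_attention`, `ugformer_layer`), the single-head attention, the random weight initialization, and the `num_samples` parameter are all assumptions made for clarity; see the linked repository for the actual model.

```python
import numpy as np

rng = np.random.default_rng(0)

def self_attention(X, Wq, Wk, Wv):
    # Single-head scaled dot-product self-attention over a set of vectors.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(Q.shape[1])
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V

def ugformer_layer(features, adjacency, num_samples=4):
    # Hypothetical sketch of UGformer variant 1: for each node, run
    # self-attention over the node plus a sample of its neighbors,
    # then keep the node's own attended output as its new representation.
    n, d = features.shape
    Wq, Wk, Wv = (rng.standard_normal((d, d)) * 0.1 for _ in range(3))
    out = np.zeros_like(features)
    for v in range(n):
        neigh = np.flatnonzero(adjacency[v])
        if len(neigh) > num_samples:
            neigh = rng.choice(neigh, size=num_samples, replace=False)
        idx = np.concatenate(([v], neigh))  # the node itself goes first
        attended = self_attention(features[idx], Wq, Wk, Wv)
        out[v] = attended[0]  # output at position 0 = the node itself
    return out
```

For example, on a 3-node star graph with 8-dimensional features, `ugformer_layer(X, adj)` returns an updated `(3, 8)` feature matrix in which each node's representation mixes information from its sampled neighborhood.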
Original language | English |
---|---|
Title of host publication | WWW'22 - Companion Proceedings of the Web Conference 2022 |
Editors | Ivan Herman, Lionel Médini |
Place of Publication | New York NY USA |
Publisher | Association for Computing Machinery (ACM) |
Pages | 193-196 |
Number of pages | 4 |
ISBN (Electronic) | 9781450391306 |
DOIs | |
Publication status | Published - 2022 |
Event | International World Wide Web Conference 2022 - Online, France |
Duration | 25 Apr 2022 → 29 Apr 2022 |
Conference number | 31st |
Website | https://www2022.thewebconf.org/ |
Proceedings | https://dl.acm.org/doi/proceedings/10.1145/3487553 |
Conference
Conference | International World Wide Web Conference 2022 |
---|---|
Abbreviated title | WWW 2022 |
Country/Territory | France |
Period | 25/04/22 → 29/04/22 |
Internet address | https://www2022.thewebconf.org/ |
Keywords
- graph classification
- graph neural networks
- graph transformer
- inductive text classification
- unsupervised transductive learning