Long-short distance aggregation networks for positive unlabeled graph learning

Man Wu, Ivor Tsang, Shirui Pan, Xingquan Zhu, Lan Du, Bo Du

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

Graph neural networks are emerging tools for representing network nodes for classification. However, existing approaches typically suffer from two limitations: (1) they only aggregate information from short distances (e.g., 1-hop neighbors) in each round and fail to capture long-distance relationships in graphs; and (2) they require users to label data from several classes to facilitate the learning of discriminative models, whereas in reality users may only provide labels for a small number of nodes in a single class. To overcome these limitations, this paper presents novel long-short distance aggregation networks (LSDAN) for positive unlabeled (PU) graph learning. Our approach is to generate multiple graphs at different distances based on the adjacency matrix, and to develop a long-short distance attention model over these graphs. The short-distance attention mechanism captures the importance of neighboring nodes to a target node. The long-distance attention mechanism captures the propagation of information within a localized area of each node and helps model the weights of the different graphs for node representation learning. A non-negative risk estimator is further employed to aggregate the long- and short-distance networks for PU learning, with the loss minimized via back-propagation. Experiments on real-world datasets validate the effectiveness of our approach.
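
To make the abstract's ingredients concrete, the sketch below illustrates (it is not the authors' implementation) the two standard constructions the abstract alludes to: multi-distance graphs derived from powers of the adjacency matrix, and a non-negative PU risk estimator in the style of Kiryo et al. (2017). The function names (build_multi_distance_graphs, aggregate_long_short, nn_pu_risk) and the exact attention and normalization choices are illustrative assumptions, since the paper's precise formulation is not reproduced in this record.

import torch
import torch.nn.functional as F

def build_multi_distance_graphs(adj, max_hops):
    # Graph k connects nodes reachable within k hops; rows are normalized
    # so each graph acts as an averaging aggregator. (Illustrative choice.)
    graphs, reach = [], adj.clone()
    for _ in range(max_hops):
        graphs.append(reach / reach.sum(dim=1, keepdim=True).clamp(min=1e-8))
        reach = (reach @ adj).clamp(max=1.0)  # extend reachability by one hop
    return graphs

def aggregate_long_short(graphs, h, gamma):
    # Short distance: aggregate neighbor features inside each graph (A_k @ h).
    # Long distance: softmax weights `gamma` (one per hop count) mix the
    # per-distance representations, standing in for long-distance attention.
    per_distance = torch.stack([a_k @ h for a_k in graphs])  # (K, N, d)
    weights = torch.softmax(gamma, dim=0).view(-1, 1, 1)     # (K, 1, 1)
    return (weights * per_distance).sum(dim=0)               # (N, d)

def nn_pu_risk(scores_pos, scores_unl, prior):
    # Non-negative PU risk, pi * R_p^+ + max(0, R_u^- - pi * R_p^-),
    # following Kiryo et al. (2017); softplus serves as a surrogate loss,
    # with softplus(-s) penalizing low scores on positives.
    r_pos = F.softplus(-scores_pos).mean()     # positives treated as +1
    r_pos_neg = F.softplus(scores_pos).mean()  # positives treated as -1
    r_unl_neg = F.softplus(scores_unl).mean()  # unlabeled treated as -1
    return prior * r_pos + torch.clamp(r_unl_neg - prior * r_pos_neg, min=0.0)

A training step would score all nodes from the representation returned by aggregate_long_short and minimize nn_pu_risk over the labeled-positive and unlabeled index sets, which is how a non-negative risk estimator combines with back-propagation in PU learning.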

Original language: English
Title of host publication: Proceedings of the 28th ACM International Conference on Information & Knowledge Management
Editors: Peng Cui, Elke Rundensteiner, David Carmel, Qi He, Jeffrey Xu Yu
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery (ACM)
Pages: 2157-2160
Number of pages: 4
ISBN (electronic): 9781450369763
DOI: 10.1145/3357384.3358122
Publication status: Published - 2019
Event: ACM International Conference on Information and Knowledge Management 2019 - Beijing, China
Duration: 3 Nov 2019 - 7 Nov 2019
Conference number: 28th
Internet address: http://www.cikm2019.net/

Conference

Conference: ACM International Conference on Information and Knowledge Management 2019
Abbreviated title: CIKM 2019
Country: China
City: Beijing
Period: 3/11/19 - 7/11/19
Internet address: http://www.cikm2019.net/

Keywords

  • Graph neural networks
  • Positive unlabeled learning

Cite this

Wu, M., Tsang, I., Pan, S., Zhu, X., Du, L., & Du, B. (2019). Long-short distance aggregation networks for positive unlabeled graph learning. In P. Cui, E. Rundensteiner, D. Carmel, Q. He, & J. Xu Yu (Eds.), Proceedings of the 28th ACM International Conference on Information & Knowledge Management (pp. 2157-2160). New York, NY, USA: Association for Computing Machinery (ACM). https://doi.org/10.1145/3357384.3358122