Anomaly Detection in Dynamic Graphs via Transformer

Yixin Liu, Shirui Pan, Yu Guang Wang, Fei Xiong, Liang Wang, Qingfeng Chen, Vincent CS Lee

Research output: Contribution to journal › Article › Research › peer-review

15 Citations (Scopus)

Abstract

Detecting anomalies in dynamic graphs has drawn increasing attention due to their wide applications in social networks, e-commerce, and cybersecurity. Recent deep learning-based approaches have shown promising results over shallow methods. However, they fail to address two core challenges of anomaly detection in dynamic graphs: the lack of informative encoding for unattributed nodes and the difficulty of learning discriminative knowledge from coupled spatial-temporal dynamic graphs. To overcome these challenges, in this paper, we present a novel transformer-based anomaly detection framework for dynamic graphs (TADDY). Our framework constructs a comprehensive node encoding strategy to better represent each node's structural and temporal roles in an evolving graph stream. Meanwhile, TADDY captures informative representations from dynamic graphs with coupled spatial-temporal patterns via a dynamic graph transformer model. Extensive experimental results demonstrate that our proposed TADDY framework outperforms the state-of-the-art methods by a large margin on six real-world datasets.
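
The record contains no code listing; purely as a rough illustration of the idea summarized above, the minimal sketch below (in PyTorch, which is an assumption) scores candidate edges by feeding a short sequence of spatial-temporal node encodings through a generic transformer encoder. The class name EdgeAnomalyScorer, the tensor shapes, and the mean-pooling readout are illustrative choices, not the actual TADDY architecture.

# Minimal sketch (not the authors' code): scoring edges in a dynamic graph
# with a small transformer encoder, assuming each candidate edge is described
# by a short sequence of node encodings drawn from recent graph snapshots.
import torch
import torch.nn as nn

class EdgeAnomalyScorer(nn.Module):
    def __init__(self, enc_dim=32, n_heads=4, n_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(
            d_model=enc_dim, nhead=n_heads, dim_feedforward=64, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.readout = nn.Linear(enc_dim, 1)  # edge-level anomaly score

    def forward(self, node_seq):
        # node_seq: (batch, seq_len, enc_dim) - structural/temporal encodings
        # of the nodes around each candidate edge across recent snapshots
        h = self.encoder(node_seq)
        pooled = h.mean(dim=1)                # pool over the context sequence
        return torch.sigmoid(self.readout(pooled)).squeeze(-1)

# Usage: score a batch of 16 candidate edges, each with an 8-token context.
model = EdgeAnomalyScorer()
scores = model(torch.randn(16, 8, 32))       # values near 1 ~ more anomalous

In practice such a scorer would be trained with edge-level labels or a negative-sampling objective; those training details are beyond what the abstract specifies.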

Original language: English
Pages (from-to): 12081-12094
Number of pages: 14
Journal: IEEE Transactions on Knowledge and Data Engineering
Volume: 35
Issue number: 12
DOIs
Publication status: Published - 1 Dec 2023

Keywords

  • Anomaly detection
  • dynamic graphs
  • Encoding
  • Feature extraction
  • Image edge detection
  • Solid modeling
  • Task analysis
  • transformer
  • Transformers
