Exploiting scene graphs for Human-Object Interaction detection

Tao He, Lianli Gao, Jingkuan Song, Yuan-Fang Li

Research output: Conference Paper in Book/Report/Conference proceeding (peer-reviewed)

Abstract

Human-Object Interaction (HOI) detection is a fundamental visual task aiming at localizing and recognizing interactions between humans and objects. Existing works focus on the visual and linguistic features of humans and objects. However, they do not capitalise on the high-level and semantic relationships present in the image, which provide crucial contextual and detailed relational knowledge for HOI inference. We propose a novel method to exploit this information, through the scene graph, for the Human-Object Interaction (SG2HOI) detection task. Our method, SG2HOI, incorporates the SG information in two ways: (1) we embed a scene graph into a global context clue, serving as the scene-specific environmental context; and (2) we build a relation-aware message-passing module to gather relationships from objects' neighborhoods and transfer them into interactions. Empirical evaluation shows that our SG2HOI method outperforms the state-of-the-art methods on two benchmark HOI datasets: V-COCO and HICO-DET. Code will be available at https://github.com/ht014/SG2HOI.
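The relation-aware message passing described in the abstract can be illustrated with a minimal sketch: each object node aggregates messages from its scene-graph neighbors, with each message conditioned on the relation embedding of the connecting edge. The function name, shapes, and update rule below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def relation_aware_message_passing(node_feats, edges, rel_feats, W_msg, W_upd):
    """One round of relation-aware message passing over a scene graph (sketch).

    node_feats: (N, D) object feature vectors
    edges:      list of (src, dst) node-index pairs, one per relation
    rel_feats:  (E, D) relation embeddings, aligned with `edges`
    W_msg, W_upd: (D, D) learned projection matrices (here just arrays)
    """
    n, d = node_feats.shape
    messages = np.zeros((n, d))
    counts = np.zeros(n)
    for e, (src, dst) in enumerate(edges):
        # A message fuses the source object's feature with the edge's
        # relation embedding before projection.
        messages[dst] += (node_feats[src] + rel_feats[e]) @ W_msg
        counts[dst] += 1
    # Mean-aggregate incoming messages; nodes with no neighbors keep zeros.
    aggregated = messages / np.maximum(counts, 1)[:, None]
    # Residual update: each node feature is refined by its neighborhood context.
    return node_feats + np.tanh(aggregated @ W_upd)
```

In the paper's full pipeline these refined object features would then feed an interaction classifier, alongside the global scene-graph context embedding; this sketch only shows the neighborhood-aggregation step.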
Original language: English
Title of host publication: Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), 2021
Editors: Eric Mortensen
Place of Publication: Piscataway NJ USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Pages: 15984-15993
Number of pages: 10
Publication status: Published - 2021
Event: IEEE International Conference on Computer Vision 2021 - Online, United States of America
Duration: 11 Oct 2021 – 17 Oct 2021
https://iccv2021.thecvf.com/home (Website)
https://ieeexplore.ieee.org/xpl/conhome/9709627/proceeding (Proceedings)

Conference

Conference: IEEE International Conference on Computer Vision 2021
Abbreviated title: ICCV 2021
Country/Territory: United States of America
City: Online
Period: 11/10/21 – 17/10/21
