VideoHandles: replicating gestures to search through action-camera video

Jarrod Knibbe, Sue Ann Seah, Mike Fraser

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

5 Citations (Scopus)


We present VideoHandles, a novel interaction technique that supports rapid review of wearable video camera data by re-performing gestures as a search query. The availability of wearable video capture devices has led to a significant increase in activity logging across a range of domains. However, searching through and reviewing footage for data curation can be a laborious process. In this paper we showcase the use of gestures as search queries to support the review and navigation of video data. By exploring example self-captured footage across a range of activities, we propose two gesture-based video navigation styles: prospective gesture tagging and retrospective gesture searching. We describe VideoHandles' interaction design and motivation, and report the results of a pilot study.
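The abstract does not specify how a re-performed gesture is matched against the recorded stream. One common way to realise retrospective gesture searching is to slide the query gesture's sensor trace over the logged trace and rank windows by dynamic time warping (DTW) distance. The sketch below is a minimal illustration under that assumption, not the paper's implementation: the function names, the hop size, and the 3-axis accelerometer representation are all hypothetical.

```python
import numpy as np

def dtw_distance(query, window):
    """Dynamic time warping distance between two (T, D) sensor
    traces, e.g. 3-axis accelerometer samples (D = 3)."""
    query = np.asarray(query, dtype=float)
    window = np.asarray(window, dtype=float)
    n, m = len(query), len(window)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = np.linalg.norm(query[i - 1] - window[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # insertion
                                 cost[i, j - 1],      # deletion
                                 cost[i - 1, j - 1])  # match
    return cost[n, m]

def search_gesture(log, query, hop=10):
    """Slide the re-performed gesture over the logged trace and
    return (start_index, distance) pairs, best match first."""
    log = np.asarray(log, dtype=float)
    w = len(query)
    hits = [(s, dtw_distance(query, log[s:s + w]))
            for s in range(0, len(log) - w + 1, hop)]
    return sorted(hits, key=lambda h: h[1])
```

Windows with the smallest DTW distance would then be mapped back to video timestamps and surfaced as candidate moments for review.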

Original language: English
Title of host publication: SUI'14 - Proceedings of the 2nd ACM Symposium on Spatial User Interaction, SUI 2014
Editors: Frank Steinicke, Evan Suma, Wolfgang Stuerzlinger
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery (ACM)
Number of pages: 4
ISBN (Electronic): 9781450328203
Publication status: Published - 2014
Externally published: Yes
Event: Symposium on Spatial User Interaction 2014 - Honolulu, United States of America
Duration: 4 Oct 2014 - 5 Oct 2014
Conference number: 2nd


Conference: Symposium on Spatial User Interaction 2014
Abbreviated title: SUI 2014
Country/Territory: United States of America
