Abstract
We present VideoHandles, a novel interaction technique to support rapid review of wearable video camera data by re-performing gestures as a search query. The availability of wearable video capture devices has led to a significant increase in activity logging across a range of domains. However, searching through and reviewing footage for data curation can be a laborious and painstaking process. In this paper we showcase the use of gestures as search queries to support review and navigation of video data. By exploring example self-captured footage across a range of activities, we propose two gesture-based video data navigation styles: prospective gesture tagging and retrospective gesture searching. We describe the interaction design and motivation of VideoHandles and report the results of a pilot study.
Original language | English |
---|---|
Title of host publication | SUI'14 - Proceedings of the 2nd ACM Symposium on Spatial User Interaction, SUI 2014 |
Editors | Frank Steinicke, Evan Suma, Wolfgang Stuerzlinger |
Place of Publication | New York, NY, USA
Publisher | Association for Computing Machinery (ACM) |
Pages | 50-53 |
Number of pages | 4 |
ISBN (Electronic) | 9781450328203 |
DOIs | |
Publication status | Published - 2014 |
Externally published | Yes |
Event | Symposium on Spatial User Interaction 2014 - Honolulu, United States of America; Duration: 4 Oct 2014 → 5 Oct 2014; Conference number: 2nd
Conference
Conference | Symposium on Spatial User Interaction 2014 |
---|---|
Abbreviated title | SUI 2014 |
Country/Territory | United States of America |
City | Honolulu |
Period | 4/10/14 → 5/10/14 |