Demo: Multi-scale gestural interaction for augmented reality

Barrett Ens, Aaron Quigley, Hui-Shyong Yeo, Pourang Irani, Mark Billinghurst

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Other

1 Citation (Scopus)


We present a multi-scale gestural interface for augmented reality applications. With virtual objects, gestural interactions such as pointing and grasping can be convenient and intuitive; however, they are also imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures at two scales: arm and hand motions (macro-scale) and finger gestures (micro-scale). Microgestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with microgestures for precise interaction beyond the capabilities of direct manipulation alone.
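The two-scale design described above can be pictured as a dispatcher that routes recognized gestures to coarse or fine handlers depending on which sensor produced them. The sketch below is a hypothetical illustration, not the authors' implementation: the sensor names (`head_mounted`, `belt_worn`), the `GestureEvent` type, and the handler actions are all assumed for the example.

```python
from dataclasses import dataclass
from typing import Callable, Dict

# Hypothetical event: which sensor produced it and the recognized gesture.
@dataclass
class GestureEvent:
    source: str   # e.g. "head_mounted" (macro) or "belt_worn" (micro)
    gesture: str  # e.g. "grasp", "point", "finger_swipe"

class MultiScaleDispatcher:
    """Routes gesture events to macro- or micro-scale handlers by sensor source."""

    def __init__(self) -> None:
        # Assumed mapping: head-mounted sensor sees arm/hand motion (macro),
        # belt-worn sensor sees relaxed-posture finger gestures (micro).
        self._scale_of_source: Dict[str, str] = {
            "head_mounted": "macro",
            "belt_worn": "micro",
        }
        self._handlers: Dict[str, Callable[[GestureEvent], str]] = {}

    def on(self, scale: str, handler: Callable[[GestureEvent], str]) -> None:
        self._handlers[scale] = handler

    def dispatch(self, event: GestureEvent) -> str:
        # Unknown sources default to macro-scale direct manipulation.
        scale = self._scale_of_source.get(event.source, "macro")
        return self._handlers[scale](event)

# Coarse direct manipulation at macro scale, precise refinement at micro scale.
dispatcher = MultiScaleDispatcher()
dispatcher.on("macro", lambda e: f"move object ({e.gesture})")
dispatcher.on("micro", lambda e: f"fine-adjust object ({e.gesture})")

print(dispatcher.dispatch(GestureEvent("head_mounted", "grasp")))       # move object (grasp)
print(dispatcher.dispatch(GestureEvent("belt_worn", "finger_swipe")))   # fine-adjust object (finger_swipe)
```

Routing by sensor source lets the same object be grabbed and moved with a macro gesture, then precisely adjusted with a microgesture, without an explicit mode switch by the user.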

Original language: English
Title of host publication: Proceedings - SA '17 SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications
Subtitle of host publication: Bangkok, Thailand, November 27-30, 2017
Editors: Surapong Lertsithichai, Pavadee Sompagdee
Place of publication: New York NY USA
Publisher: Association for Computing Machinery (ACM)
Number of pages: 2
ISBN (Electronic): 9781450354103
Publication status: Published - 2017
Externally published: Yes
Event: Symposium on Mobile Graphics and Interactive Applications 2017 - Bangkok, Thailand
Duration: 27 Nov 2017 - 30 Nov 2017
Conference number: 27th


Conference: Symposium on Mobile Graphics and Interactive Applications 2017
Abbreviated title: MGIA 2017


  • Augmented reality
  • Gesture interaction
  • Microgestures
