Pinpointing: precise head- and eye-based target selection for augmented reality

Mikko Kytö, Barrett Ens, Thammathip Piumsomboon, Gun A. Lee, Mark Billinghurst

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

Abstract

Head and eye movement can be leveraged to improve the user's interaction repertoire for wearable displays. Head movements are deliberate and accurate, and provide the current state-of-the-art pointing technique. Eye gaze can potentially be faster and more ergonomic, but suffers from low accuracy due to calibration errors and drift of wearable eye-tracking sensors. This work investigates precise, multimodal selection techniques using head motion and eye gaze. A comparison of speed and pointing accuracy reveals the relative merits of each method, including the achievable target size for robust selection. We demonstrate and discuss example applications for augmented reality, including compact menus with deep structure, and a proof-of-concept method for on-line correction of calibration drift.
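The multimodal approach the abstract describes — a fast but drift-prone gaze estimate refined by a small, precise head movement — can be illustrated with a minimal sketch. This is not the paper's implementation; the degree-of-visual-angle coordinates, function names, and numeric values below are all illustrative assumptions.

```python
import math
from dataclasses import dataclass

@dataclass
class Point:
    x: float  # horizontal position, degrees of visual angle
    y: float  # vertical position, degrees of visual angle

def refine_selection(gaze: Point, head_offset: Point) -> Point:
    """Nudge the coarse gaze point by a head-motion refinement offset."""
    return Point(gaze.x + head_offset.x, gaze.y + head_offset.y)

def hits_target(cursor: Point, target: Point, radius_deg: float) -> bool:
    """True if the cursor lands within a circular target of the given angular radius."""
    return math.hypot(cursor.x - target.x, cursor.y - target.y) <= radius_deg

# Calibration drift puts the gaze estimate 1.5 degrees off a 1-degree target;
# a small corrective head movement completes the selection.
target = Point(10.0, 5.0)
gaze = Point(11.2, 5.9)                    # coarse, drift-affected gaze estimate
assert not hits_target(gaze, target, 1.0)  # gaze alone misses the small target
cursor = refine_selection(gaze, Point(-1.2, -0.9))
assert hits_target(cursor, target, 1.0)    # head refinement lands the selection
```

The sketch mirrors the trade-off the abstract states: gaze alone fails on targets smaller than its error bound, while the head-based refinement step recovers precision at the cost of an extra movement.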

Original language: English
Title of host publication: CHI 2018 - Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems
Subtitle of host publication: April 21–26, 2018, Montréal, QC, Canada
Editors: Mark Perry, Anna Cox
Place of publication: New York, NY, USA
Publisher: Association for Computing Machinery (ACM)
Number of pages: 14
ISBN (Electronic): 9781450356206, 9781450356213
DOI: 10.1145/3173574.3173655
Publication status: Published - 2018
Externally published: Yes
Event: International Conference on Human Factors in Computing Systems 2018 - Montreal, Canada
Duration: 21 Apr 2018 – 26 Apr 2018
Internet address: https://chi2018.acm.org/

Conference

Conference: International Conference on Human Factors in Computing Systems 2018
Abbreviated title: CHI 2018
Country: Canada
City: Montreal
Period: 21/04/18 – 26/04/18
Internet address: https://chi2018.acm.org/

Keywords

  • Augmented reality
  • Eye tracking
  • Gaze interaction
  • Head-worn display
  • Refinement techniques
  • Target selection

Cite this

Kytö, M., Ens, B., Piumsomboon, T., Lee, G. A., & Billinghurst, M. (2018). Pinpointing: precise head- and eye-based target selection for augmented reality. In M. Perry, & A. Cox (Eds.), CHI 2018 - Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems: April 21–26, 2018, Montréal, QC, Canada. New York, NY, USA: Association for Computing Machinery (ACM). https://doi.org/10.1145/3173574.3173655