Developing future wearable interfaces for human-drone teams through a virtual drone search game

Marlena R. Fraune, Ahmed S. Khalaf, Mahlet Zemedie, Poom Pianpak, Zahra NaminiMianji, Sultan A. Alharthi, Igor Dolgov, Bill Hamilton, Son Tran, Phoebe O. Toups Dugas

Research output: Contribution to journal › Article › peer-review

17 Citations (Scopus)

Abstract

Autonomous robotic vehicles (i.e., drones) are potentially transformative for search and rescue (SAR). This paper works toward wearable interfaces through which humans team with multiple drones. We introduce the Virtual Drone Search Game as a first step in creating a mixed reality simulation for humans to practice drone teaming and SAR techniques. Our goals are to (1) evaluate input modalities for the drones, derived from an iterative narrowing of the design space, (2) improve our mixed reality system for designing input modalities and training operators, and (3) collect data on how participants socially experience the virtual drones with which they work. In our study, 17 participants played the game with two input modalities (Gesture condition, Tap condition) in counterbalanced order. Results indicated that participants performed best in the Gesture condition. Participants found the multiple controls challenging, and future studies might include more training with the devices and game. Participants felt like a team with the drones and found them moderately agentic. In future work, we will extend this testing to a more externally valid mixed reality game.

Original language: English
Article number: 102573
Number of pages: 16
Journal: International Journal of Human-Computer Studies
Volume: 147
Publication status: Published - Mar 2021
Externally published: Yes

Keywords

  • Drones
  • Empirical study
  • Gesture interface
  • HMD
  • Human-drone teams
  • Mixed reality
  • Wearables
