A multimodal system using augmented reality, gestures, and tactile feedback for robot trajectory programming and execution

Wesley P. Chan, Camilo Perez Quintero, Matthew K. X. J. Pan, Maram Sakr, H. F. Machiel Van der Loos, Elizabeth Croft

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research


Currently available interfaces for programming industrial robots, e.g., teach pendants and computer consoles, are often unintuitive, making the teaching of robot tasks a slow and tedious process. Kinesthetic teaching, i.e., teaching robot motions by placing the robot in a gravity-compensated state and then moving it through the desired motions, provides an alternative for small robots for which safe interaction can be guaranteed. However, for many large industrial robots, physical interaction is not an option. Emerging augmented reality technology offers an alternative interface with the potential to make robot programming faster, safer, and more intuitive. Augmented reality admits the presentation of large amounts of rich, visual, in-situ information. However, it may also overload the user's visual information capacity, or may fail to provide sufficient feedback about the state of the robot. By adding gestural control and tactile feedback to augmented reality, we propose a system that allows users to program and execute robot tasks efficiently and intuitively, providing relevant feedback through different channels to maximize clear communication of task commands and outcomes.
Original language: English
Title of host publication: 2018 ICRA Workshop on Robotics in Virtual Reality
Place of publication: Piscataway, NJ, USA
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Number of pages: 7
Publication status: Published - 2018
Event: ICRA Workshop on Robotics in Virtual Reality 2018 - Brisbane, Australia
Duration: 25 May 2018 - 25 May 2018


Conference: ICRA Workshop on Robotics in Virtual Reality 2018