Charlie rides the elevator - Integrating vision, navigation and manipulation towards multi-floor robot locomotion

Daniel Troniak, Junaed Sattar, Ankur Gupta, James J. Little, Wesley Chan, Ergun Calisgan, Elizabeth Croft, Machiel Van Der Loos

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Other

Abstract

This paper presents the design, implementation and experimental evaluation of a semi-humanoid robotic system for autonomous multi-floor navigation. The robot, a Personal Robot 2 named Charlie, is capable of operating an elevator to travel between rooms located on separate floors. Our goal is to create a robotic assistant capable of locating points of interest, manipulating objects, and navigating between rooms in a multi-storied environment equipped with an elevator. Taking the elevator requires the robot to (1) map and localize within its operating environment, (2) navigate to an elevator door, (3) press the up or down elevator call button, (4) enter the elevator, (5) press the control button associated with the target floor, and (6) exit the elevator at the correct floor. To that end, this work integrates the advanced sensorimotor capabilities of the robot (laser rangefinders, stereo and monocular vision systems, and robotic arms) into a complete, task-driven autonomous system. While the design and implementation of the individual sensorimotor processing components is a challenge in and of itself, their complete integration into an intelligent system often presents an even greater one. This paper presents our approach to designing the individual components, with a focus on machine vision, manipulation, and systems integration. We present and discuss quantitative results from the live robotic system, describe the difficulties faced, and expose potential pitfalls.
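
The six-step decomposition above maps naturally onto a sequential task executor. The following is a minimal, plain-Python sketch of that control flow only; the helper names (localize_in_map, navigate_to, press_button) are hypothetical stand-ins for the robot's actual navigation and manipulation interfaces, which the abstract does not specify.

# Minimal sketch of the elevator-riding task as a sequential pipeline.
# All helpers below are illustrative stubs, not the authors' PR2/ROS code.

def localize_in_map():
    """Step 1: load the building map and localize the robot within it."""
    print("localizing against stored multi-floor map")
    return True

def navigate_to(goal):
    """Steps 2, 4 and 6: drive the base to a named goal location."""
    print(f"navigating to {goal}")
    return True

def press_button(label):
    """Steps 3 and 5: find the labelled button with vision, press it with the arm."""
    print(f"detecting and pressing the '{label}' button")
    return True

def ride_elevator(direction, target_floor):
    """Execute the six steps in order; stop at the first failure."""
    steps = [
        lambda: localize_in_map(),
        lambda: navigate_to("elevator door"),
        lambda: press_button(direction),          # 'up' or 'down' call button
        lambda: navigate_to("inside elevator"),
        lambda: press_button(str(target_floor)),  # floor button inside the car
        lambda: navigate_to(f"hallway on floor {target_floor}"),
    ]
    return all(step() for step in steps)

if __name__ == "__main__":
    ride_elevator("up", target_floor=3)

In practice each call would wrap the corresponding perception, navigation or manipulation subsystem integrated on the robot.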

Original language: English
Title of host publication: Proceedings - 2013 International Conference on Computer and Robot Vision, CRV 2013
Pages: 1-8
Number of pages: 8
ISBN: 9780769549835
DOI: 10.1109/CRV.2013.12
Publication status: Published - 9 Sep 2013
Externally published: Yes
Event: International Conference on Computer and Robot Vision 2013 - Regina, SK, Canada
Duration: 29 May 2013 - 31 May 2013
Conference number: 10th

Conference

Conference: International Conference on Computer and Robot Vision 2013
Abbreviated title: CRV 2013
Country: Canada
City: Regina, SK
Period: 29/05/13 - 31/05/13

Keywords

  • Multi-Floor Navigation
  • Robot Elevator Operation
  • Robot Vision
  • Robotics
  • Service Robots

Cite this

Troniak, D., Sattar, J., Gupta, A., Little, J. J., Chan, W., Calisgan, E., ... Van Der Loos, M. (2013). Charlie rides the elevator - Integrating vision, navigation and manipulation towards multi-floor robot locomotion. In Proceedings - 2013 International Conference on Computer and Robot Vision, CRV 2013 (pp. 1-8). [6569177] https://doi.org/10.1109/CRV.2013.12