Robust visual tracking for non-instrumented augmented reality

Georg Klein, Tom Drummond

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

52 Citations (Scopus)

Abstract

This paper presents a robust and flexible framework for augmented reality which does not require instrumenting either the environment or the workpiece. A model-based visual tracking system is combined with rate gyroscopes to produce a system which can track the rapid camera rotations generated by a head-mounted camera, even if images are substantially degraded by motion blur. This tracking yields estimates of head position at video field rate (50 Hz), which are used to align computer-generated graphics on an optical see-through display. Nonlinear optimisation is used to calibrate the display parameters, which include a model of optical distortion. Rendered visuals are pre-distorted to correct the optical distortion of the display.
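The abstract's combination of rate-gyroscope measurements with model-based visual tracking can be illustrated with a small sketch. The snippet below is a minimal illustration under my own assumptions, not the paper's implementation: it integrates hypothetical gyro readings over one 50 Hz video field to predict the camera rotation, which a visual tracker could then refine against the degraded image; the function names, constant-rate integration, and example rates are illustrative only.

```python
import numpy as np

def so3_exp(w):
    """Rodrigues' formula: map an axis-angle vector w (radians) to a rotation matrix."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    k = w / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def predict_rotation(R_prev, gyro_rates, dt):
    """Integrate body-frame rate-gyro readings over one video field (dt = 1/50 s)
    to predict the camera rotation before the visual measurement is taken."""
    return R_prev @ so3_exp(np.asarray(gyro_rates, dtype=float) * dt)

# Hypothetical example: a fast head yaw of 3 rad/s across one 50 Hz field.
R_prev = np.eye(3)
omega = np.array([0.0, 3.0, 0.0])           # rad/s, assumed gyro reading
R_pred = predict_rotation(R_prev, omega, dt=1.0 / 50.0)
```

In a gyro-aided tracker of this kind, the predicted rotation serves as the prior pose from which the model-based visual measurement is initialised, which is what allows tracking to survive the motion blur caused by rapid head rotations.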

Original language: English
Title of host publication: Proceedings - 2nd IEEE and ACM International Symposium on Mixed and Augmented Reality, ISMAR 2003
Publisher: IEEE, Institute of Electrical and Electronics Engineers
Pages: 113-122
Number of pages: 10
ISBN (Electronic): 0769520065
DOIs
Publication status: Published - 1 Jan 2003
Externally published: Yes
Event: 2nd IEEE and ACM International Symposium on Mixed and Augmented Reality, ISMAR 2003 - Tokyo, Japan
Duration: 7 Oct 2003 - 10 Oct 2003

Conference

Conference: 2nd IEEE and ACM International Symposium on Mixed and Augmented Reality, ISMAR 2003
Country/Territory: Japan
City: Tokyo
Period: 7/10/03 - 10/10/03