Tightly integrated sensor fusion for robust visual tracking

G. S.W. Klein, T. W. Drummond

Research output: Contribution to journal › Article › peer-review

39 Citations (Scopus)

Abstract

This paper presents a novel method for increasing the robustness of visual tracking systems by incorporating information from inertial sensors. We show that more can be achieved than simply combining the sensor data within a statistical filter: besides using inertial data to provide predictions for the visual sensor, this data can be used to dynamically tune the parameters of each feature detector in the visual sensor. This allows the visual sensor to provide useful information even in the presence of substantial motion blur. Finally, the visual sensor can be used to calibrate the parameters of the inertial sensor to eliminate drift.
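Illustrative sketch (not the authors' implementation): the abstract's core idea is that a gyroscope reading can do more than predict inter-frame motion; it can also adapt the visual feature detectors to the motion blur that the same rotation causes. The snippet below shows one plausible way to do this, predicting the image-plane shift over a frame and scaling a detector's search range and matching-kernel width with the expected blur streak length. All constants, function names, and parameter choices here are assumptions for illustration only.

```python
# Hypothetical sketch of gyro-aided prediction and blur-adaptive detector tuning.
# Not the method from Klein & Drummond (2004); constants and names are assumed.
import numpy as np

FOCAL_LENGTH_PX = 500.0      # assumed camera focal length in pixels
FRAME_DT = 1.0 / 30.0        # assumed frame interval (s)
EXPOSURE = 1.0 / 100.0       # assumed shutter (exposure) time (s)

def predict_shift(gyro_rate_rad_s: np.ndarray) -> np.ndarray:
    """Approximate image-plane shift (px) caused by rotation about the
    camera x/y axes over one frame interval (small-angle model)."""
    return FOCAL_LENGTH_PX * gyro_rate_rad_s[:2] * FRAME_DT

def blur_length(gyro_rate_rad_s: np.ndarray) -> float:
    """Expected motion-blur streak length (px) during the exposure."""
    return float(FOCAL_LENGTH_PX * np.linalg.norm(gyro_rate_rad_s[:2]) * EXPOSURE)

def tune_detector(base_search_px: float, gyro_rate_rad_s: np.ndarray) -> dict:
    """Scale detector parameters with predicted blur: search further along
    the feature's normal and match against a wider (blurred) edge profile."""
    blur = blur_length(gyro_rate_rad_s)
    return {
        "search_range_px": base_search_px + blur,
        "kernel_width_px": max(3.0, blur),
    }

if __name__ == "__main__":
    gyro = np.array([0.0, 2.0, 0.1])   # rad/s: a fast pan about the y axis
    print("predicted shift (px):", predict_shift(gyro))
    print("detector params:", tune_detector(10.0, gyro))
```

In this toy model, a 2 rad/s pan with a 1/100 s shutter yields roughly a 10 px blur streak, so the detector's search range and kernel width grow accordingly rather than failing on smeared edges.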

Original language: English
Pages (from-to): 769-776
Number of pages: 8
Journal: Image and Vision Computing
Volume: 22
Issue number: 10 SPEC. ISS.
DOIs
Publication status: Published - 1 Sept 2004
Externally published: Yes

Keywords

  • Real-time vision
  • Sensor fusion
  • Visual tracking
