Overcoming occlusions in eye-in-hand visual search

Sina Radmard, David Meger, Elizabeth A. Croft, James J. Little

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper

5 Citations (Scopus)

Abstract

In this paper we propose a method for handling persistent visual occlusions that disrupt visual tracking for eye-in-hand systems. This approach provides an efficient strategy for the robot to look behind the occlusion while respecting the robot's physical constraints. Specifically, we propose a decoupled search strategy that combines a naïve pan-tilt search with a sensor placement approach to reduce the strategy's computational cost. We proceed by mapping limited environmental data into the robot configuration space and then planning within a constrained region. We use a particle filter to continuously estimate the target location, while our configuration-based cost function plans a goal location for the camera frame, taking into account robot singularity, self-collision and joint limit constraints. To validate our algorithm, we implemented it on an eye-in-hand robot system. Experimental results for various situations support the feasibility of our approach for quickly recovering fully occluded moving targets. Finally, we discuss the implications of this approach for mobile robot platforms.
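To make the pipeline in the abstract concrete, the following is a minimal sketch (not the authors' implementation) of its two estimation/planning ingredients: a particle filter that maintains a belief over the target's position while the camera sees nothing, and a cost function that scores candidate camera placements. The motion model, visibility callbacks (`is_visible`, `visible_from`), penalty term, and weights are illustrative assumptions, not details from the paper.

```python
# Sketch of occlusion-aware target estimation and camera-goal scoring.
# All models and parameters here are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)

class OccludedTargetFilter:
    def __init__(self, n_particles=500, bounds=((-1.0, 1.0), (-1.0, 1.0))):
        lo, hi = np.array(bounds)[:, 0], np.array(bounds)[:, 1]
        self.particles = rng.uniform(lo, hi, size=(n_particles, 2))
        self.weights = np.full(n_particles, 1.0 / n_particles)

    def predict(self, motion_std=0.02):
        # Random-walk motion model for the (possibly moving) target.
        self.particles += rng.normal(0.0, motion_std, self.particles.shape)

    def update_no_detection(self, is_visible):
        # If the camera saw nothing, particles inside the currently visible
        # region are unlikely: downweight them so probability mass stays
        # behind the occlusion.
        visible = np.array([is_visible(p) for p in self.particles])
        self.weights[visible] *= 0.1
        self.weights /= self.weights.sum()
        self._resample_if_needed()

    def _resample_if_needed(self):
        n_eff = 1.0 / np.sum(self.weights ** 2)
        if n_eff < 0.5 * len(self.weights):
            idx = rng.choice(len(self.weights), len(self.weights), p=self.weights)
            self.particles = self.particles[idx]
            self.weights[:] = 1.0 / len(self.weights)


def camera_goal_cost(candidate_pose, particles, weights, visible_from,
                     joint_limit_penalty, w_vis=1.0, w_joint=0.5):
    # Reward poses expected to reveal more of the target's probability mass;
    # penalize poses near joint limits (singularity and self-collision terms
    # would be added analogously in a configuration-based cost).
    expected_visibility = np.sum(
        weights * np.array([visible_from(candidate_pose, p) for p in particles]))
    return -w_vis * expected_visibility + w_joint * joint_limit_penalty(candidate_pose)
```

In this sketch, the planner would evaluate `camera_goal_cost` over candidate camera frames within the constrained configuration-space region and move toward the minimizer; the paper's actual cost terms and search strategy differ in detail.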

Original language: English
Title of host publication: 2012 American Control Conference, ACC 2012
Pages: 4102-4107
Number of pages: 6
Publication status: Published - 26 Nov 2012
Externally published: Yes
Event: American Control Conference 2012 - Montreal, QC, Canada
Duration: 27 Jun 2012 - 29 Jun 2012

Conference

Conference: American Control Conference 2012
Abbreviated title: ACC 2012
Country: Canada
City: Montreal, QC
Period: 27/06/12 - 29/06/12

Cite this

Radmard, S., Meger, D., Croft, E. A., & Little, J. J. (2012). Overcoming occlusions in eye-in-hand visual search. In 2012 American Control Conference, ACC 2012 (pp. 4102-4107). [6315690]