Overcoming occlusions in eye-in-hand visual search

Sina Radmard, David Meger, Elizabeth A. Croft, James J. Little

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Other

6 Citations (Scopus)


In this paper we propose a method for handling persistent visual occlusions that disrupt visual tracking for eye-in-hand systems. This approach provides an efficient strategy for the robot to look behind the occlusion while respecting its physical constraints. Specifically, we propose a decoupled search strategy that combines a naïve pan-tilt search with a sensor placement approach to reduce the strategy's computational cost. We proceed by mapping limited environmental data into the robot configuration space and then planning within a constrained region. We use a particle filter to continuously estimate the target location, while our configuration-based cost function plans a goal location for the camera frame, accounting for singularity, self-collision, and joint-limit constraints. To validate our algorithm, we implemented it on an eye-in-hand robot system. Experimental results for various situations support the feasibility of our approach for quickly recovering fully occluded moving targets. Finally, we discuss the implications of this approach for mobile robot platforms.
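The particle-filter step described in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: the 2D state space, random-walk motion model, Gaussian measurement likelihood, and the noise parameters are all illustrative assumptions. The key behavior for the occlusion scenario is that the measurement update is simply skipped while the target is hidden, so the particle cloud diffuses and encodes where the target could have moved.

```python
import numpy as np

rng = np.random.default_rng(0)

def predict(particles, motion_noise=0.05):
    """Diffuse particles with an assumed random-walk motion model."""
    return particles + rng.normal(0.0, motion_noise, particles.shape)

def update(particles, weights, measurement, meas_noise=0.1):
    """Reweight particles by an assumed Gaussian measurement likelihood."""
    d2 = np.sum((particles - measurement) ** 2, axis=1)
    weights = weights * np.exp(-0.5 * d2 / meas_noise**2)
    weights += 1e-300  # guard against total weight collapse
    return weights / weights.sum()

def resample(particles, weights):
    """Multinomial resampling; resets weights to uniform."""
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx], np.full(len(particles), 1.0 / len(particles))

# Initialize particles uniformly over a toy 2D search region.
n = 500
particles = rng.uniform(-1.0, 1.0, (n, 2))
weights = np.full(n, 1.0 / n)

for step in range(20):
    particles = predict(particles)
    target_visible = step < 10  # hypothetical: target occluded halfway through
    if target_visible:
        measurement = np.array([0.2, 0.3])  # illustrative camera observation
        weights = update(particles, weights, measurement)
        particles, weights = resample(particles, weights)
    # While occluded, no update runs: the cloud spreads, representing
    # growing uncertainty about the hidden target's location.

estimate = np.average(particles, axis=0, weights=weights)
```

In the paper's system this belief would feed the configuration-based cost function that selects the next camera goal; here `estimate` simply summarizes the current belief mean.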

Original language: English
Title of host publication: 2012 American Control Conference, ACC 2012
Number of pages: 6
Publication status: Published - 26 Nov 2012
Externally published: Yes
Event: American Control Conference 2012 - Montreal, Canada
Duration: 27 Jun 2012 – 29 Jun 2012
https://ieeexplore.ieee.org/xpl/conhome/6297579/proceeding (Proceedings)


Conference: American Control Conference 2012
Abbreviated title: ACC 2012
