Motion planning from demonstrations and polynomial optimization for visual servoing applications

Tiantian Shen, Sina Radmard, Ambrose Chan, Elizabeth A. Croft, Graziano Chesi

Research output: Chapter in Book/Report/Conference proceeding › Conference Paper

6 Citations (Scopus)


Vision feedback control techniques are desirable for a wide range of robotics applications due to their robustness to image noise and modeling errors. However, in the case of a robot-mounted camera, they encounter difficulties when the camera traverses large displacements. This scenario necessitates continuous visual feedback of the target during the robot motion, while simultaneously respecting the robot's internal and external constraints. Herein, we propose to combine workspace (Cartesian space) path planning with robot teach-by-demonstration to address the visibility constraint, joint limits, and 'whole arm' collision avoidance for vision-based control of a robot manipulator. User demonstration data generates safe regions for robot motion with respect to joint limits and potential 'whole arm' collisions. Our algorithm uses these safe regions to generate new feasible trajectories that satisfy the visibility constraint and achieve the desired view of the target (e.g., a pre-grasping location) in new, undemonstrated locations. Experiments with a 7-DOF articulated arm validate the proposed method.
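As a rough illustration of the idea described in the abstract, the sketch below parameterizes a camera path between two demonstrated poses with a low-order polynomial and rejects paths along which the target leaves a simple conical field of view. All names, the fixed optical axis, and the field-of-view model are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def cubic_path(p0, p1, s):
    """Cubic polynomial path from p0 to p1 (zero end velocities)."""
    h = 3 * s**2 - 2 * s**3          # smoothstep blend, h(0)=0, h(1)=1
    return (1 - h) * p0 + h * p1

def target_visible(cam_pos, target, half_fov_deg=30.0):
    """Crude visibility test: camera assumed to look along +z;
    target must lie inside a cone of the given half-angle."""
    d = target - cam_pos
    axis = np.array([0.0, 0.0, 1.0])  # assumed fixed optical axis
    cos_angle = d @ axis / np.linalg.norm(d)
    return cos_angle >= np.cos(np.radians(half_fov_deg))

def path_is_feasible(p0, p1, target, n_samples=50):
    """Sample the polynomial path and reject it if the target
    ever leaves the field of view at a sampled point."""
    return all(
        target_visible(cubic_path(p0, p1, s), target)
        for s in np.linspace(0.0, 1.0, n_samples)
    )

# Two hypothetical demonstrated camera positions and a target point.
p0 = np.array([0.0, 0.0, 0.0])
p1 = np.array([0.2, 0.1, 0.5])
target = np.array([0.1, 0.05, 2.0])
print(path_is_feasible(p0, p1, target))  # → True
```

The actual method additionally enforces joint limits and whole-arm collision avoidance via the demonstrated safe regions, and optimizes over the polynomial coefficients rather than checking a single fixed path.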

Original language: English
Title of host publication: IROS 2013
Subtitle of host publication: New Horizon, Conference Digest - 2013 IEEE/RSJ International Conference on Intelligent Robots and Systems
Number of pages: 6
Publication status: Published - 1 Dec 2013
Externally published: Yes
Event: IEEE/RSJ International Conference on Intelligent Robots and Systems 2013 - Tokyo, Japan
Duration: 3 Nov 2013 - 7 Nov 2013


Conference: IEEE/RSJ International Conference on Intelligent Robots and Systems 2013
Abbreviated title: IROS 2013
