Abstract
This paper proposes a novel approach for motion primitive segmentation from continuous full-body human motion captured on monocular video. The proposed approach requires neither a kinematic model of the person nor any markers on the body. Instead, optical flow computed directly in the image plane is used to estimate the location of segment points. The approach first detects tracking features in the image using the Shi and Tomasi algorithm [1]. The optical flow at each feature point is then estimated with the pyramidal Lucas-Kanade optical flow estimation algorithm [2]. The feature points are clustered and tracked online to find regions of the image with coherent movement. The appearance and disappearance of these coherent clusters indicate the start and end points of motion primitive segments. The algorithm's performance is validated on full-body motion video sequences and compared to a joint-angle, motion-capture-based approach. The results show that the segmentation performance is comparable to that of the motion-capture-based approach, while using much simpler hardware and requiring lower computational effort.
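The pipeline described above can be sketched with standard OpenCV primitives: cv2.goodFeaturesToTrack implements Shi-Tomasi feature detection and cv2.calcOpticalFlowPyrLK implements the pyramidal Lucas-Kanade tracker. The sketch below is illustrative only and is not the authors' implementation; in particular, DBSCAN is used as a stand-in for the paper's online clustering of coherently moving features, and the input file name, motion threshold, and all parameter values are assumptions.

```python
# Illustrative sketch of the abstract's pipeline: Shi-Tomasi features,
# pyramidal Lucas-Kanade flow, clustering of coherently moving features,
# and segment-point candidates where the set of clusters changes.
import cv2
import numpy as np
from sklearn.cluster import DBSCAN

cap = cv2.VideoCapture("motion.avi")      # hypothetical input video
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)

prev_clusters = 0
frame_idx = 0
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # Shi-Tomasi corner detection on the previous frame.
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                  qualityLevel=0.01, minDistance=7)
    if pts is not None:
        # Pyramidal Lucas-Kanade optical flow at each feature point.
        nxt, status, _ = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, pts, None, winSize=(21, 21), maxLevel=3)
        good = status.ravel() == 1
        p0 = pts[good].reshape(-1, 2)
        p1 = nxt[good].reshape(-1, 2)
        flow = p1 - p0

        # Keep only features that are actually moving (threshold assumed).
        moving = np.linalg.norm(flow, axis=1) > 1.0
        if moving.any():
            # Cluster moving features by image position and flow vector;
            # DBSCAN here approximates the paper's online clustering.
            feats = np.hstack([p0[moving] / 20.0, flow[moving]])
            labels = DBSCAN(eps=1.5, min_samples=5).fit(feats).labels_
            n_clusters = len(set(labels)) - (1 if -1 in labels else 0)
        else:
            n_clusters = 0

        # A change in the number of coherent clusters suggests the start
        # or end of a motion primitive segment.
        if n_clusters != prev_clusters:
            print(f"frame {frame_idx}: candidate segment point "
                  f"({prev_clusters} -> {n_clusters} coherent clusters)")
        prev_clusters = n_clusters

    prev_gray = gray
    frame_idx += 1

cap.release()
```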
Original language | English |
---|---|
Title of host publication | 2009 IEEE International Conference on Robotics and Automation, ICRA '09 |
Publisher | IEEE, Institute of Electrical and Electronics Engineers |
Pages | 3166-3172 |
Number of pages | 7 |
ISBN (Print) | 9781424427895 |
DOIs | |
Publication status | Published - 2 Nov 2009 |
Externally published | Yes |
Event | IEEE International Conference on Robotics and Automation 2009 - Kobe, Japan. Duration: 12 May 2009 → 17 May 2009. https://ieeexplore.ieee.org/xpl/conhome/5076472/proceeding (Proceedings) |
Publication series
Name | Proceedings - IEEE International Conference on Robotics and Automation |
---|---|
ISSN (Print) | 1050-4729 |
Conference
Conference | IEEE International Conference on Robotics and Automation 2009 |
---|---|
Abbreviated title | ICRA 2009 |
Country/Territory | Japan |
City | Kobe |
Period | 12/05/09 → 17/05/09 |
Internet address | https://ieeexplore.ieee.org/xpl/conhome/5076472/proceeding |