Authors:
Li Ding¹; Jiaxin Wang¹; Christophe Chaillou² and Chunhong Pan³
Affiliations:
¹ Tsinghua University, China; ² INRIA Lille Nord Europe, France; ³ Chinese Academy of Sciences, China
Keyword(s):
Hand Tracking, Object Occlusion, Possibility Support Map.
Related Ontology Subjects/Areas/Topics:
Applications; Computer Vision, Visualization and Computer Graphics; Human-Computer Interaction; Methodologies and Methods; Model-Based Object Tracking in Image Sequences; Motion and Tracking; Motion, Tracking and Stereo Vision; Pattern Recognition; Physiological Computing Systems; Real-Time Vision
Abstract:
Tracking hands correctly under occlusion is a considerable challenge for real-time hand locating with monocular vision. This paper proposes a robust hand locating method that generates a possibility support map (PSM) by integrating information from a color model, a position model, and a motion model. For better accuracy, hands are modeled as ellipses. The PSM depends both on the models' previous state and on the relationships between the models. A hand pattern search is then performed on the generated map in two steps: the first locates the center position of the hand, and the second determines its size and orientation. Experimental results show that the proposed method remains effective when one hand is occluded by the other. Our current prototype system processes images at 10-14 frames per second.
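To make the pipeline concrete, the following minimal sketch (not the authors' implementation) shows how per-pixel color, position, and motion cues could be fused into a possibility support map, and how a two-step search might first locate the hand center and then recover an ellipse's size and orientation. All function names, the weighted-product fusion, the weights, and the support threshold are illustrative assumptions.

```python
import numpy as np

def possibility_support_map(color_prob, position_prob, motion_prob,
                            weights=(0.5, 0.3, 0.2)):
    """Fuse per-pixel cue maps (values in [0, 1]) into one support map.
    The weighted-product fusion and the weights are assumptions."""
    wc, wp, wm = weights
    psm = (color_prob ** wc) * (position_prob ** wp) * (motion_prob ** wm)
    return psm / (psm.max() + 1e-9)

def locate_center(psm):
    """Step 1: support-weighted centroid as an estimate of the hand center."""
    ys, xs = np.indices(psm.shape)
    total = psm.sum() + 1e-9
    return float((xs * psm).sum() / total), float((ys * psm).sum() / total)

def fit_ellipse(psm, threshold=0.3):
    """Step 2: derive size and orientation from the second-order moments of
    pixels whose support exceeds a threshold (threshold is an assumption)."""
    ys, xs = np.nonzero(psm > threshold)
    if len(xs) < 2:
        return 0.0, 0.0, 0.0
    cov = np.cov(np.vstack([xs, ys]).astype(float))
    eigvals, eigvecs = np.linalg.eigh(cov)
    major, minor = 2.0 * np.sqrt(eigvals[::-1])       # semi-axes, descending
    angle = np.arctan2(eigvecs[1, 1], eigvecs[0, 1])  # major-axis orientation
    return float(major), float(minor), float(angle)

# Toy usage: random cue maps standing in for the color/position/motion models.
rng = np.random.default_rng(0)
h, w = 120, 160
psm = possibility_support_map(rng.random((h, w)),
                              rng.random((h, w)),
                              rng.random((h, w)))
print(locate_center(psm), fit_ellipse(psm))
```

In this sketch the ellipse parameters come from the eigen-decomposition of the support-weighted pixel covariance, which is one common way to realize the "size and orientation" step; the paper's actual search procedure may differ.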