
modification of our segmentation module; the other
modules will remain the same.
We have also successfully implemented a simple
gesture-based human-robot interactive system for
mimicking operations, using a robot named ROBOVIE.
We believe that vision systems will replace attached
physical sensors for human-robot interaction in the
near future. A particular user may assign distinct
commands to specific hand gestures and thus control
various intelligent robots using hand gestures.
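Such a per-user gesture-to-command assignment can be sketched as a simple lookup table. The gesture labels and robot commands below are illustrative assumptions for this sketch, not the command set of the actual system:

```python
# Illustrative sketch: a per-user mapping from recognized hand gestures
# to robot commands. Gesture names and command strings are hypothetical.

def make_dispatcher(gesture_map):
    """Return a function that translates a recognized gesture label
    into a robot command string, or None if the gesture is unmapped."""
    def dispatch(gesture):
        return gesture_map.get(gesture)
    return dispatch

# One user's personal assignment of gestures to commands.
user_map = {
    "open_palm":   "STOP",
    "thumb_up":    "MOVE_FORWARD",
    "point_left":  "TURN_LEFT",
    "point_right": "TURN_RIGHT",
}

dispatch = make_dispatcher(user_map)
print(dispatch("thumb_up"))   # MOVE_FORWARD
print(dispatch("fist"))       # None (unmapped gesture)
```

Because the mapping is data rather than code, a different user, or the same user controlling a different robot, only needs to supply a different table.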
The significant issues in gesture recognition for
our method are the simplification of the algorithm
and the reduction of processing time when issuing
commands to the robot. Our next step is to make
the detection system more robust and to recognize
dynamic facial and hand gestures for interaction
with different robots such as AIBO, ROBOVIE,
SCOUT, and MELFA. Our ultimate goal is to
establish a symbiotic society for all of the distributed
autonomous intelligent components, so that they
share their resources and work cooperatively with
human beings.
VISION-BASED HAND GESTURES RECOGNITION FOR HUMAN-ROBOT INTERACTION