Cooperative Gesture Recognition - Learning Characteristics of Classifiers and Navigating the User to an Ideal Situation
Hiromasa Yoshimoto, Yuichi Nakamura
2015
Abstract
This paper introduces a novel gesture-interface scheme that guides the user toward better recognition performance and usability. The accuracy of gesture recognition is heavily affected by how the user poses and moves, as well as by environmental conditions such as lighting. The usability of a gesture interface can therefore be improved by notifying the user of when and how better accuracy can be obtained. For this purpose, we propose a method for estimating the performance of gesture recognition under the current condition, and a method for suggesting to the user possible ways to improve performance. In performance estimation, the accuracy under the current condition is estimated by supervised learning from a large number of samples and their corresponding ground truths. If the estimated accuracy is insufficient, the module searches for better conditions that can be reached with the user's cooperation. If a worthwhile improvement is possible, the way to achieve it is communicated to the user through visual feedback, which shows how to avoid, or how to recover from, the undesirable condition. In this way, the user gains better accuracy and usability by cooperating with the gesture interface.
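The abstract outlines a three-step loop: estimate accuracy under the current condition with a supervised model, search for a reachable better condition, and feed the suggestion back to the user. The following is a minimal Python sketch of that loop, not the authors' implementation; the condition features (sensor distance, lighting), the stand-in accuracy estimator, and the feedback messages are all hypothetical placeholders.

from dataclasses import dataclass

@dataclass
class Condition:
    distance_m: float   # hypothetical feature: user-to-sensor distance
    lighting: float     # hypothetical feature: normalized illumination level

def estimate_accuracy(cond: Condition) -> float:
    """Stand-in for a regressor trained on (condition, ground-truth) pairs."""
    # Assumed model: accuracy degrades when the user is too close/far or
    # the scene is too dark.
    dist_score = max(0.0, 1.0 - abs(cond.distance_m - 2.0) / 2.0)
    return 0.5 * dist_score + 0.5 * min(1.0, cond.lighting)

def suggest_improvement(cond: Condition, threshold: float = 0.8):
    """If estimated accuracy is insufficient, search nearby conditions the
    user could reach and return feedback describing the best one."""
    if estimate_accuracy(cond) >= threshold:
        return None  # current condition is already good enough
    candidates = [
        (Condition(cond.distance_m + d, cond.lighting + l), msg)
        for d, l, msg in [(-0.5, 0.0, "step closer"),
                          (0.5, 0.0, "step back"),
                          (0.0, 0.2, "turn on more light")]
    ]
    best, msg = max(candidates, key=lambda c: estimate_accuracy(c[0]))
    if estimate_accuracy(best) > estimate_accuracy(cond):
        return msg  # would be shown to the user as visual feedback
    return None

# Example: a user standing too far away is told to step closer.
print(suggest_improvement(Condition(distance_m=3.5, lighting=0.9)))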
Paper Citation
in Harvard Style
Yoshimoto H. and Nakamura Y. (2015). Cooperative Gesture Recognition - Learning Characteristics of Classifiers and Navigating the User to an Ideal Situation. In Proceedings of the International Conference on Pattern Recognition Applications and Methods - Volume 2: ICPRAM, ISBN 978-989-758-077-2, pages 210-218. DOI: 10.5220/0005206902100218
in Bibtex Style
@conference{icpram15,
author={Hiromasa Yoshimoto and Yuichi Nakamura},
title={Cooperative Gesture Recognition - Learning Characteristics of Classifiers and Navigating the User to an Ideal Situation},
booktitle={Proceedings of the International Conference on Pattern Recognition Applications and Methods - Volume 2: ICPRAM},
year={2015},
pages={210-218},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005206902100218},
isbn={978-989-758-077-2},
}
in EndNote Style
TY - CONF
JO - Proceedings of the International Conference on Pattern Recognition Applications and Methods - Volume 2: ICPRAM
TI - Cooperative Gesture Recognition - Learning Characteristics of Classifiers and Navigating the User to an Ideal Situation
SN - 978-989-758-077-2
AU - Yoshimoto H.
AU - Nakamura Y.
PY - 2015
SP - 210
EP - 218
DO - 10.5220/0005206902100218