Design of a Low-false-positive Gesture for a Wearable Device

Ryo Kawahata, Atsushi Shimada, Takayoshi Yamashita, Hideaki Uchiyama, Rin-ichiro Taniguchi

2016

Abstract

As smartwatches become more widely used, gesture recognition is attracting attention as an important means of interacting with them. The accelerometer incorporated in such a device is often used to recognize gestures. However, a gesture is often falsely detected when a similar motion pattern occurs during daily activity. In this paper, we present a novel method of designing gestures that reduce false detection; we refer to such a gesture as a low-false-positive (LFP) gesture. The proposed method enables a gesture design system to suggest LFP motion gestures automatically, so that users can design LFP gestures more easily and quickly than in previous work. Our method creates an LFP gesture by combining primitive gestures, and the combination of primitives is recognized quickly and accurately by a random forest algorithm. We experimentally demonstrate that a gesture designed with our method achieves a high recognition rate without false detection.
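
The full method is in the paper; as a rough illustration of the pipeline the abstract describes (windowed accelerometer features classified into primitive gestures by a random forest, with an LFP gesture defined as an ordered combination of primitives), here is a minimal Python sketch using scikit-learn. The feature set, window length, primitive labels, and sequence-matching rule are illustrative assumptions, not the authors' implementation.

# Minimal sketch (illustrative, not the authors' implementation):
# classify windows of 3-axis accelerometer data into primitive gestures
# with a random forest, then match the predicted sequence against a
# designed LFP gesture (an ordered combination of primitives).
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(window):
    # Simple per-axis statistics for one (n_samples, 3) accelerometer window.
    return np.concatenate([window.mean(axis=0), window.std(axis=0),
                           window.min(axis=0), window.max(axis=0)])

# Placeholder training data: windows labelled with primitive-gesture ids.
rng = np.random.default_rng(0)
train_windows = [rng.normal(size=(50, 3)) for _ in range(200)]
train_labels = rng.integers(0, 4, size=200)  # four hypothetical primitives

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(np.stack([extract_features(w) for w in train_windows]), train_labels)

# A designed LFP gesture, expressed as an ordered combination of primitives.
LFP_GESTURE = (2, 0, 3)

def detect_lfp(stream, win=50):
    # Predict a primitive for each non-overlapping window of the stream,
    # then look for the LFP combination in the predicted sequence.
    preds = tuple(int(clf.predict(extract_features(stream[i:i + win])[None, :])[0])
                  for i in range(0, len(stream) - win + 1, win))
    return any(preds[i:i + len(LFP_GESTURE)] == LFP_GESTURE
               for i in range(len(preds) - len(LFP_GESTURE) + 1))

print(detect_lfp(rng.normal(size=(300, 3))))

A real implementation would train on labelled sensor recordings rather than random placeholders and tune the window length to the device's sampling rate.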

Paper Citation


in Harvard Style

Kawahata R., Shimada A., Yamashita T., Uchiyama H. and Taniguchi R. (2016). Design of a Low-false-positive Gesture for a Wearable Device. In Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM, ISBN 978-989-758-173-1, pages 581-588. DOI: 10.5220/0005701905810588


in Bibtex Style

@conference{icpram16,
author={Ryo Kawahata and Atsushi Shimada and Takayoshi Yamashita and Hideaki Uchiyama and Rin-ichiro Taniguchi},
title={Design of a Low-false-positive Gesture for a Wearable Device},
booktitle={Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM},
year={2016},
pages={581-588},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005701905810588},
isbn={978-989-758-173-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 5th International Conference on Pattern Recognition Applications and Methods - Volume 1: ICPRAM
TI - Design of a Low-false-positive Gesture for a Wearable Device
SN - 978-989-758-173-1
AU - Kawahata R.
AU - Shimada A.
AU - Yamashita T.
AU - Uchiyama H.
AU - Taniguchi R.
PY - 2016
SP - 581
EP - 588
DO - 10.5220/0005701905810588
ER -