Cho, S.-J., Oh, J.-K., Bang, W.-C., Chang, W., Choi, E., Jing, Y., Cho, J., and Kim, D.-Y. (2004). Magic wand: a hand-drawn gesture input device in 3-D space with inertial sensors. In Ninth International Workshop on Frontiers in Handwriting Recognition (IWFHR-9 2004), pages 106–111.
Hartmann, B. and Link, N. (2010). Gesture recognition with inertial sensors and optimized DTW prototypes. In Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Istanbul, Turkey, 10–13 October 2010, pages 2102–2109. IEEE.
Hauptmann, A. G. and McAvinney, P. (1993). Gestures with speech for graphic manipulation. International Journal of Man-Machine Studies, 38(2):231–249.
Kim, D., Hilliges, O., Izadi, S., Butler, A. D., Chen, J., Oikonomidis, I., and Olivier, P. (2012). Digits: freehand 3D interactions anywhere using a wrist-worn gloveless sensor. In Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, UIST '12, pages 167–176, New York, NY, USA. ACM.
Kim, J., Mastnik, S., and André, E. (2008). EMG-based hand gesture recognition for realtime biosignal interfacing. In Proceedings of the 13th International Conference on Intelligent User Interfaces, IUI '08, pages 30–39, New York, NY, USA. ACM.
Li, Y., Chen, X., Tian, J., Zhang, X., Wang, K., and Yang, J. (2010). Automatic recognition of sign language subwords based on portable accelerometer and EMG sensors. In International Conference on Multimodal Interfaces and the Workshop on Machine Learning for Multimodal Interaction, ICMI-MLMI '10, article 17, New York, NY, USA. ACM.
Mistry, P., Maes, P., and Chang, L. (2009). WUW - Wear Ur World: a wearable gestural interface. In CHI '09 Extended Abstracts on Human Factors in Computing Systems, pages 4111–4116. ACM.
Oviatt, S. (1999). Ten myths of multimodal interaction.
Communications of the ACM, 42(11):74–81.
Rabiner, L. (1989). A tutorial on hidden Markov models and selected applications in speech recognition. Proceedings of the IEEE, 77(2):257–286.
Rekimoto, J. (2001). GestureWrist and GesturePad: unobtrusive wearable interaction devices. In Proceedings of the Fifth International Symposium on Wearable Computers. IEEE.
Samadani, A.-A. and Kulić, D. (2014). Hand gesture recognition based on surface electromyography. In 2014 36th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC).
Saponas, T. S., Tan, D. S., Morris, D., and Balakrishnan, R. (2008). Demonstrating the feasibility of using forearm electromyography for muscle-computer interfaces. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, CHI '08, pages 515–524, New York, NY, USA. ACM.
Wolf, M. T., Assad, C., Stoica, A., You, K., Jethani, H., Vernacchia, M. T., Fromm, J., and Iwashita, Y. (2013). Decoding static and dynamic arm and hand gestures from the JPL BioSleeve. In 2013 IEEE Aerospace Conference, pages 1–9.
Zhang, X., Chen, X., Li, Y., Lantz, V., Wang, K., and Yang, J. (2011). A framework for hand gesture recognition based on accelerometer and EMG sensors. IEEE Transactions on Systems, Man, and Cybernetics, Part A: Systems and Humans, 41(6):1064–1076.