Lara, O. D. and Labrador, M. A. (2013). A Survey on Human Activity Recognition Using Wearable Sensors. IEEE Communications Surveys & Tutorials, 15:1192–1209.
Escalera, S., González, J., Baró, X., Reyes, M., Guyon, I., Athitsos, V., Escalante, H., Sigal, L., Argyros, A., Sminchisescu, C., Bowden, R., and Sclaroff, S. (2013). ChaLearn multi-modal gesture recognition 2013: grand challenge and workshop summary. In Proceedings of the 15th ACM on International Conference on Multimodal Interaction, pages 365–368.
Hochreiter, S. and Schmidhuber, J. (1997). Long short-term memory. Neural Computation, 9(8):1735–1780.
Kim, T.-K. and Cipolla, R. (2008). Canonical correlation
analysis of video volume tensors for action catego-
rization and detection. IEEE Transactions on Pattern
Analysis and Machine Intelligence, 31(8):1415–1428.
Kopuklu, O., Rong, Y., and Rigoll, G. (2019). Talking with your hands: Scaling hand gestures and recognition with CNNs. In Proceedings of the IEEE International Conference on Computer Vision Workshops.
Liu, L. and Shao, L. (2013). Learning discriminative representations from RGB-D video data. In Twenty-Third International Joint Conference on Artificial Intelligence.
Marin, G., Dominio, F., and Zanuttigh, P. (2016). Hand Gesture Recognition with Jointly Calibrated Leap Motion and Depth Sensor. Multimedia Tools and Applications, 75(22):14991–15015.
McConnell, R. (1986). Method of and apparatus for pattern recognition. US Patent 4,567,610.
Memo, A., Minto, L., and Zanuttigh, P. (2015). Exploiting
Silhouette Descriptors and Synthetic Data for Hand
Gesture Recognition. In Giachetti, A., Biasotti, S.,
and Tarini, M., editors, Smart Tools and Apps for
Graphics - Eurographics Italian Chapter Conference.
The Eurographics Association.
Kehtarnavaz, N. (2008). Digital Signal Processing System Design: LabVIEW-Based Hybrid Programming.
Ni, B., Wang, G., and Moulin, P. (2011). RGBD-HuDaAct: A Color-Depth Video Database for Human Daily Activity Recognition. In IEEE International Conference on Computer Vision Workshops, pages 1147–1153.
Radu, V., Tong, C., Bhattacharya, S., Lane, N. D., Mascolo,
C., Marina, M. K., and Kawsar, F. (2018). Multimodal
Deep Learning for Activity and Context Recognition.
Proc. ACM Interact. Mob. Wearable Ubiquitous Tech-
nol., 1(4):157:1–157:27.
Ranasinghe, S., Machot, F. A., and Mayr, H. C. (2016).
A review on applications of activity recognition sys-
tems with regard to performance and evaluation. In-
ternational Journal of Distributed Sensor Networks,
12(8):1550147716665520.
Romdhane, R., Crispim-Junior, C. F., Bremond, F., and
Thonnat, M. (2013). Activity Recognition and Un-
certain Knowledge in Video Scenes. In IEEE Inter-
national Conference on Advanced Video and Signal-
Based Surveillance (AVSS), Krakow, Poland.
Rusu, R. B., Blodow, N., Marton, Z. C., and Beetz, M.
(2008). Aligning point cloud views using persistent
feature histograms. In 2008 IEEE/RSJ International
Conference on Intelligent Robots and Systems, pages
3384–3391. IEEE.
Sachara, F., Kopinski, T., Gepperth, A., and Handmann, U. (2017). Free-hand gesture recognition with 3D-CNNs for in-car infotainment control in real-time. In 2017 IEEE 20th International Conference on Intelligent Transportation Systems (ITSC), pages 959–964.
Sarkar, A., Gepperth, A., Handmann, U., and Kopinski, T. (2017). Dynamic hand gesture recognition for mobile systems using deep LSTM. In Horain, P., Achard, C., and Mallem, M., editors, Intelligent Human Computer Interaction, pages 19–31, Cham. Springer International Publishing.
Schak, M. and Gepperth, A. (2019). Robustness of deep LSTM networks in freehand gesture recognition. In Artificial Neural Networks and Machine Learning – ICANN 2019: Image Processing, pages 330–343. Springer International Publishing.
Sharma, S., Kiros, R., and Salakhutdinov, R. (2016). Action
Recognition using Visual Attention. ICLR.
Wan, J., Zhao, Y., Zhou, S., Guyon, I., Escalera, S., and Li, S. Z. (2016). ChaLearn Looking at People RGB-D isolated and continuous datasets for gesture recognition. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition Workshops, pages 56–64.
Freeman, W. T. and Roth, M. (1994). Orientation histograms for hand gesture recognition. Technical Report TR94-03, MERL - Mitsubishi Electric Research Laboratories, Cambridge, MA 02139.
Zhang, M. and Sawchuk, A. A. (2012). USC-HAD: A Daily Activity Dataset for Ubiquitous Activity Recognition Using Wearable Sensors. In International Conference on Ubiquitous Computing, pages 1036–1043.
Zhang, Y., Cao, C., Cheng, J., and Lu, H. (2018). EgoGes-
ture: A New Dataset and Benchmark for Egocentric
Hand Gesture Recognition. IEEE Transactions on
Multimedia, 20(5):1038–1050.