
Authors: Simon Senecal; Niels A. Nijdam and Nadia Magnenat Thalmann

Affiliation: University of Geneva, Geneva, Switzerland

Keyword(s): Modelling of Natural Scenes and Phenomena, Motion Analysis, Couple Dance, Motion Features, Machine Learning.

Related Ontology Subjects/Areas/Topics: Animation Algorithms and Techniques; Animation and Simulation; Computer Vision, Visualization and Computer Graphics; Computer-Supported Education; e-Learning; e-Learning Applications and Computer Graphics; Games for Education and Training; Geometry and Modeling; Interactive Environments; Model Validation; Modeling and Algorithms; Modeling of Natural Scenes and Phenomena

Abstract: Learning a couple dance such as salsa is a challenge for the modern human, as it requires assimilating and correctly understanding all of the dance parameters. Couple dance is traditionally learned with a teacher, but certain situations and the variability of the dance-class environment can affect the learning process. A better understanding, from a motion-analysis perspective, of what makes a good salsa dancer would provide valuable knowledge and could complement learning. In this paper, we propose a set of music- and interaction-based motion features to classify the performance of salsa dancing couples into three learning levels (beginner, intermediate and expert). These motion features are an interpretation of components gathered through interviews with teachers and professionals, together with other dance features found in a systematic review of the literature. For the presented study, a motion capture database (SALSA) was recorded of 26 different couples, spanning the three skill levels, dancing at 10 different tempos (260 clips). Each recorded clip contains a basic-steps sequence and an extended improvisation sequence, lasting two minutes in total at 120 frames per second. Each of the 27 motion features was computed on a sliding window corresponding to the 8-beat reference for dance. Several multiclass classifiers were tested, mainly k-nearest neighbours, random forest and support vector machine, with a classification accuracy of up to 81% for three levels and 92% for two levels. A subsequent feature analysis validates 23 of the 27 proposed features. The work presented here has profound implications for future studies of motion analysis, couple dance learning and human-human interaction.
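
The classifier comparison described in the abstract (27 per-window motion features fed to k-NN, random forest and SVM) could be sketched as follows. This is a minimal illustration and not the authors' implementation: the SALSA database and the actual feature extraction are not reproduced here, so the feature matrix, labels and hyperparameters below are placeholders.

# Minimal sketch, assuming the 27 motion features have already been extracted
# into one row per 8-beat sliding window, with labels
# {0: beginner, 1: intermediate, 2: expert}. Data and settings are illustrative.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 27))      # placeholder: 500 windows x 27 motion features
y = rng.integers(0, 3, size=500)    # placeholder: beginner / intermediate / expert

classifiers = {
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    "Random forest": RandomForestClassifier(n_estimators=200, random_state=0),
    "SVM (RBF)": SVC(kernel="rbf", C=1.0, gamma="scale"),
}

for name, clf in classifiers.items():
    # Standardise features (needed for k-NN and SVM, harmless for the forest)
    # and estimate accuracy with 5-fold cross-validation.
    pipe = make_pipeline(StandardScaler(), clf)
    scores = cross_val_score(pipe, X, y, cv=5)
    print(f"{name}: mean accuracy = {scores.mean():.2f}")

In practice the cross-validation folds would be split by couple rather than by window, so that windows from the same recording never appear in both training and test sets.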

CC BY-NC-ND 4.0

Paper citation in several formats:
Senecal, S.; Nijdam, N. and Thalmann, N. (2019). Classification of Salsa Dance Level using Music and Interaction based Motion Features. In Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2019) - GRAPP; ISBN 978-989-758-354-4; ISSN 2184-4321, SciTePress, pages 100-109. DOI: 10.5220/0007399701000109

@conference{grapp19,
author={Simon Senecal and Niels A. Nijdam and Nadia Magnenat Thalmann},
title={Classification of Salsa Dance Level using Music and Interaction based Motion Features},
booktitle={Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2019) - GRAPP},
year={2019},
pages={100-109},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0007399701000109},
isbn={978-989-758-354-4},
issn={2184-4321},
}

TY - CONF

JO - Proceedings of the 14th International Joint Conference on Computer Vision, Imaging and Computer Graphics Theory and Applications (VISIGRAPP 2019) - GRAPP
TI - Classification of Salsa Dance Level using Music and Interaction based Motion Features
SN - 978-989-758-354-4
IS - 2184-4321
AU - Senecal, S.
AU - Nijdam, N.
AU - Thalmann, N.
PY - 2019
SP - 100
EP - 109
DO - 10.5220/0007399701000109
PB - SciTePress
ER -