Authors:
Juan Felipe Beltran 1; Xiaohua Liu 1; Nishant Mohanchandra 2 and Godfried Toussaint 2
Affiliations:
1 New York University Abu Dhabi - Faculty of Science, United Arab Emirates
2 New York University Abu Dhabi, United Arab Emirates
Keyword(s):
Musical Rhythm, Similarity Measures, Transformations, Inter-Onset Interval Histograms, Mallows Distance, Edit Distance, Statistical Features, Pattern Recognition, Music Information Retrieval, Mantel Test.
Related Ontology Subjects/Areas/Topics:
Applications; Audio and Speech Processing; Data Engineering; Digital Signal Processing; Information Retrieval; Multimedia; Multimedia Signal Processing; Ontologies and the Semantic Web; Pattern Recognition; Perception; Software Engineering; Telecommunications
Abstract:
Two approaches to measuring the similarity between symbolically notated musical rhythms are compared with human judgments of perceived similarity. The first is the edit distance, a popular transformation method, applied to the rhythm sequences themselves. The second operates on the histograms of the inter-onset intervals (IOIs) of these rhythm sequences. In addition, two ways of comparing the histograms are contrasted: the Mallows distance and a set of standard statistical features. The results provide further evidence, from the aural domain, that transformation methods are superior to feature-based methods for predicting human judgments of similarity. The results also support the hypothesis that statistical features applied to the IOI histograms of the rhythms predict these judgments better than music-theoretical structural features applied to the rhythms themselves.
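To make the compared quantities concrete, the sketch below (an illustrative Python example, not the authors' implementation) derives a cyclic inter-onset-interval histogram from a binary onset pattern, computes a plain Levenshtein edit distance between two rhythm strings, and evaluates a first-order Mallows (1-D Wasserstein) distance between two IOI histograms. The function names, the choice of p = 1 for the Mallows distance, and the clave examples are assumptions made for this sketch.

```python
# Minimal sketch (not the authors' implementation) of the quantities named in
# the abstract: a cyclic inter-onset-interval (IOI) histogram, the Levenshtein
# edit distance between rhythm strings, and a first-order Mallows distance
# between IOI histograms. Function names and example rhythms are illustrative.
from collections import Counter


def ioi_histogram(onsets):
    """Histogram of intervals (in pulses) between successive onsets,
    treating the pattern as cyclic so the last onset wraps to the first."""
    positions = [i for i, x in enumerate(onsets) if x]
    n = len(onsets)
    intervals = [(positions[(k + 1) % len(positions)] - p) % n
                 for k, p in enumerate(positions)]
    return Counter(intervals)


def edit_distance(a, b):
    """Plain Levenshtein distance between two rhythm strings
    (e.g. 'x..x..x.' with 'x' = onset, '.' = silent pulse)."""
    dp = list(range(len(b) + 1))          # dp[j] = distance a[:i] -> b[:j]
    for i, ca in enumerate(a, 1):
        prev, dp[0] = dp[0], i
        for j, cb in enumerate(b, 1):
            prev, dp[j] = dp[j], min(dp[j] + 1,              # deletion
                                     dp[j - 1] + 1,          # insertion
                                     prev + (ca != cb))      # substitution
    return dp[-1]


def mallows_distance(h1, h2):
    """First-order Mallows (1-D Wasserstein) distance between two IOI
    histograms, treated as discrete distributions over interval lengths."""
    n1, n2 = sum(h1.values()), sum(h2.values())
    cdf1 = cdf2 = dist = 0.0
    for v in range(max(max(h1), max(h2)) + 1):
        cdf1 += h1.get(v, 0) / n1
        cdf2 += h2.get(v, 0) / n2
        dist += abs(cdf1 - cdf2)          # area between the two CDFs
    return dist


if __name__ == "__main__":
    son = [1,0,0,1,0,0,1,0,0,0,1,0,1,0,0,0]     # 16-pulse son clave
    rumba = [1,0,0,1,0,0,0,1,0,0,1,0,1,0,0,0]   # 16-pulse rumba clave
    print(ioi_histogram(son))                   # Counter({3: 2, 4: 2, 2: 1})
    print(edit_distance("x..x..x...x.x...",
                        "x..x...x..x.x..."))    # 2
    print(mallows_distance(ioi_histogram(son),
                           ioi_histogram(rumba)))  # 0.0
```

In this toy example the son and rumba claves yield identical IOI histograms (Mallows distance 0) even though their onset sequences differ (edit distance 2), which illustrates the kind of divergence between histogram-level and sequence-level similarity that the comparison above examines.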