Jiang, J., Chen, Y., Hao, D., and Li, K. (2019). DPC-LG: Density peaks clustering based on logistic distribution and gravitation. Physica A: Statistical Mechanics and its Applications, 514:25–35.
Kadir, S. N., Goodman, D. F., and Harris, K. D. (2014). High-dimensional cluster analysis with the masked EM algorithm. Neural Computation.
Klawonn, F. and Keller, A. (1999). Fuzzy clustering based on modified distance measures. In International Symposium on Intelligent Data Analysis, pages 291–301. Springer.
Kou, G., Peng, Y., and Wang, G. (2014). Evaluation of clustering algorithms for financial risk analysis using MCDM methods. Information Sciences, 275:1–12.
Kumar, S., Pant, M., Kumar, M., and Dutt, A. (2018). Colour image segmentation with histogram and homogeneity histogram difference using evolutionary algorithms. International Journal of Machine Learning and Cybernetics, 9(1):163–183.
Liu, A., Su, Y., Nie, W., and Kankanhalli, M. S. (2017). Hierarchical clustering multi-task learning for joint human action grouping and recognition. IEEE Transactions on Pattern Analysis and Machine Intelligence, 39(1):102–114.
Lu, J., Zhu, Q., and Wu, Q. (2018). A novel data clustering algorithm using heuristic rules based on k-nearest neighbors chain. Engineering Applications of Artificial Intelligence, 72:213–227.
Maneewongvatana, S. and Mount, D. M. (1999). It’s okay to be skinny, if your friends are fat. In Center for Geometric Computing 4th Annual Workshop on Computational Geometry, volume 2, pages 1–8.
Mei, J.-P., Wang, Y., Chen, L., and Miao, C. (2017). Large
scale document categorization with fuzzy clustering.
IEEE Transactions on Fuzzy Systems, 25(5):1239–
1251.
Orlandic, R., Lai, Y., and Yee, W. G. (2005). Clustering high-dimensional data using an efficient and effective data space reduction. In Proceedings of the 14th ACM International Conference on Information and Knowledge Management, pages 201–208. ACM.
Pal, N. R., Bezdek, J. C., and Hathaway, R. J. (1996). Sequential competitive learning and the fuzzy c-means clustering algorithms. Neural Networks, 9(5):787–796.
Pandit, S., Gupta, S., et al. (2011). A comparative study on distance measuring approaches for clustering. International Journal of Research in Computer Science, 2(1):29–31.
Parmar, M., Wang, D., Zhang, X., Tan, A.-H., Miao, C., Jiang, J., and Zhou, Y. (2019). REDPC: A residual error-based density peak clustering algorithm. Neurocomputing, 348:82–96.
Paukkeri, M.-S., Kivimäki, I., Tirunagari, S., Oja, E., and Honkela, T. (2011). Effect of dimensionality reduction on different distance measures in document clustering. In International Conference on Neural Information Processing, pages 167–176. Springer.
Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V.,
Thirion, B., Grisel, O., Blondel, M., Prettenhofer,
P., Weiss, R., Dubourg, V., Vanderplas, J., Passos,
A., Cournapeau, D., Brucher, M., Perrot, M., and
Duchesnay, E. (2011). Scikit-learn: Machine learning
in Python. Journal of Machine Learning Research,
12:2825–2830.
Qaddoura, R., Al Manaseer, W., Abushariah, M. A., and Alshraideh, M. A. (2020a). Dental radiography segmentation using expectation-maximization clustering and grasshopper optimizer. Multimedia Tools and Applications.
Qaddoura, R., Faris, H., and Aljarah, I. (2020b). An efficient clustering algorithm based on the k-nearest neighbors with an indexing ratio. International Journal of Machine Learning and Cybernetics, 11(3):675–714.
Qaddoura, R., Faris, H., Aljarah, I., and Castillo, P. A. (2020c). EvoCluster: An open-source nature-inspired optimization clustering framework in Python. In International Conference on the Applications of Evolutionary Computation (Part of EvoStar), pages 20–36. Springer.
Santos, B. O., Valença, J., and Júlio, E. (2017). Detection of cracks on concrete surfaces by hyperspectral image processing. In Automated Visual Inspection and Machine Vision II, volume 10334, page 1033407. International Society for Optics and Photonics.
Sfetsos, A. and Siriopoulos, C. (2004). Combinatorial time series forecasting based on clustering algorithms and neural networks. Neural Computing & Applications, 13(1):56–64.
Shirkhorshidi, A. S., Aghabozorgi, S., and Wah, T. Y. (2015). A comparison study on similarity and dissimilarity measures in clustering continuous data. PLoS ONE, 10(12):e0144059.
Silva, S., Suresh, R., Tao, F., Votion, J., and Cao, Y. (2017).
A multi-layer k-means approach for multi-sensor data
pattern recognition in multi-target localization. arXiv
preprint arXiv:1705.10757.
Song, W., Wang, H., Maguire, P., and Nibouche, O. (2017). Local partial least square classifier in high dimensionality classification. Neurocomputing, 234:126–136.
Tolentino, J. A. and Gerardo, B. D. (2019). Enhanced Manhattan-based clustering using fuzzy c-means algorithm for high dimensional datasets. International Journal on Advanced Science Engineering Information Technology, 9:766–771.
Trivedi, N. and Kanungo, S. (2017). Performance enhancement of k-means clustering algorithm for gene expression data using entropy-based centroid selection. In 2017 International Conference on Computing, Communication and Automation (ICCCA), pages 143–148. IEEE.
Tzortzis, G. and Likas, A. (2014). The MinMax k-means clustering algorithm. Pattern Recognition, 47(7):2505–2516.
Zhang, T., Ramakrishnan, R., and Livny, M. (1996). BIRCH: An efficient data clustering method for very large databases. In Proceedings of the 1996 ACM SIGMOD International Conference on Management of Data, SIGMOD ’96, pages 103–114, New York, NY, USA. Association for Computing Machinery.
NCTA 2020 - 12th International Conference on Neural Computation Theory and Applications