it divides the feature space into a number of clusters,
under the assumption that each cluster contains correlated
features. Selecting one feature per cluster yields only
relevant and non-redundant features at each step of the
algorithm. This approach not only speeds up the search
but also guarantees a compact and discriminant feature
space. Two experiments were conducted: the first on six
real-world, numerical, ready-to-use datasets and the
second on three color texture image databases. The
obtained results were compared with four clustering-based
feature selection approaches and three other feature
selection schemes. They show that the proposed algorithm
outperforms wrapper approaches while retaining the
advantages of filter ones. Compared with other filter
model based approaches, our solution provides a high
level of dimensionality reduction and high classification
accuracy, with reasonable processing time and no
parameter to adjust.
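The clustering-then-representative scheme described above can be sketched as follows. This is a minimal illustration, not the paper's actual algorithm: it assumes a greedy grouping of features whose pairwise absolute correlation exceeds a threshold, and keeps the feature most correlated with the class labels as each cluster's representative. The function name, the threshold value, and the correlation-based relevance measure are all illustrative choices.

```python
import numpy as np

def cluster_select(X, y, threshold=0.8):
    """Greedy correlation-based feature clustering (illustrative sketch).

    Features whose pairwise |correlation| exceeds `threshold` join the
    same cluster; the feature most correlated with the labels y is kept
    as each cluster's representative, giving a relevant, non-redundant
    subset. Returns the sorted indices of the selected features.
    """
    n_features = X.shape[1]
    # feature-feature absolute correlation matrix (columns are features)
    corr = np.abs(np.corrcoef(X, rowvar=False))
    # feature-class relevance: |correlation| of each feature with y
    relevance = np.abs([np.corrcoef(X[:, j], y)[0, 1]
                        for j in range(n_features)])
    assigned = np.full(n_features, -1)
    clusters = []
    for j in range(n_features):
        if assigned[j] >= 0:
            continue  # already placed in an earlier cluster
        # group all still-unassigned features strongly correlated with j
        members = [k for k in range(n_features)
                   if assigned[k] < 0 and corr[j, k] > threshold]
        for k in members:
            assigned[k] = len(clusters)
        clusters.append(members)
    # keep the most class-relevant feature of each cluster
    return sorted(max(c, key=lambda k: relevance[k]) for c in clusters)

# Usage: two near-duplicate features collapse to one representative,
# while the independent noise feature forms its own cluster.
rng = np.random.default_rng(0)
f0 = rng.normal(size=200)
y = (f0 > 0).astype(float)
X = np.column_stack([f0,
                     f0 + 0.01 * rng.normal(size=200),  # redundant copy
                     rng.normal(size=200)])             # irrelevant noise
selected = cluster_select(X, y, threshold=0.8)
```

Because the search only compares representatives rather than all feature pairs, each selection step stays cheap, which is the speed-up the text refers to.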
VISAPP 2021 - 16th International Conference on Computer Vision Theory and Applications