BHHO-2Stage approach to other well-known GS approaches using the same fitness function.
8 CONCLUSIONS
This paper presents a new GS method based on the BHHO algorithm and the KNN classifier. Two new fitness functions are proposed. The first combines the classification performance and the size of the selected gene subset in a single formula, using a linear weight to balance the two components. The second is a two-stage fitness function that optimizes the classification performance in the first stage and the number of selected genes in the second stage. The two new fitness functions were compared with a common fitness function that uses only the classification performance in a BHHO-based wrapper GS approach. The results show that BHHO with the fitness function that uses only the classification performance can improve the classification performance over that obtained with all genes. In almost all the data sets, BHHO with either of the two proposed fitness functions achieved higher classification performance and selected fewer genes than BHHO with the overall classification performance as the fitness function. BHHO with the two-stage fitness function outperforms the linearly changing weight fitness function on most problems in terms of both the classification performance and the number of selected genes. Overall, BHHO with the proposed fitness functions can successfully reduce the number of genes while achieving higher classification performance. In the future, we will investigate a BHHO-based evolutionary multi-objective GS approach to explore the Pareto front of non-dominated solutions.
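For illustration, the two proposed fitness functions can be sketched as follows. This is a minimal sketch only, assuming a binary gene mask, a scikit-learn KNN classifier evaluated with cross-validation, and illustrative names and values (the weight alpha, its linear schedule, and the stage switch) that are not taken verbatim from the paper; the two-stage variant is rendered as a lexicographic comparison, which is one possible reading of the two-stage idea.

import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def classification_error(mask, X, y, k=5):
    # Wrapper evaluation: cross-validated error of KNN on the selected genes.
    selected = np.flatnonzero(mask)
    if selected.size == 0:               # an empty subset is the worst case
        return 1.0
    knn = KNeighborsClassifier(n_neighbors=k)
    accuracy = cross_val_score(knn, X[:, selected], y, cv=5).mean()
    return 1.0 - accuracy

def linear_weight(t, t_max, a_start=1.0, a_end=0.9):
    # Illustrative linearly changing weight over the iterations (assumed schedule).
    return a_start + (a_end - a_start) * t / t_max

def weighted_fitness(mask, X, y, alpha):
    # Fitness 1 (to minimise): linear weight between the classification error
    # and the relative size of the selected gene subset.
    error = classification_error(mask, X, y)
    size_ratio = mask.sum() / mask.size
    return alpha * error + (1.0 - alpha) * size_ratio

def two_stage_fitness(mask, X, y, stage):
    # Fitness 2 (to minimise): stage 1 considers only the classification error;
    # stage 2 additionally prefers fewer genes. Returned as a tuple so that,
    # under lexicographic comparison, the error always dominates the gene count.
    error = classification_error(mask, X, y)
    gene_count = int(mask.sum()) if stage > 1 else 0
    return (error, gene_count)

A BHHO wrapper would then minimise weighted_fitness(mask, X, y, linear_weight(t, t_max)) during the run, or switch two_stage_fitness from stage 1 to stage 2 at a chosen point of the search; both choices are assumptions of this sketch rather than the paper's exact settings.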
ACKNOWLEDGMENTS
This work is supported by the Ministerio español de Economía y Competitividad under project TIN2017-85727-C4-2-P (UGR-DeepBio).