iii) Min_Acc: The minimum accuracy value obtained when a given algorithm is repeated for 10 independent runs.
iv) Avg_Acc: The average of all the accuracy values obtained when a given algorithm is repeated for 10 independent runs.
v) Max_NFeat: The maximum number of features reported by a given algorithm during the 10 independent runs.
vi) Min_NFeat: The minimum number of features reported by a given algorithm during the 10 independent runs.
vii) Avg_NFeat: The average number of features reported by a given algorithm during the 10 independent runs (a minimal aggregation sketch is given after this list).
viii) Dataset: The dataset used for experimentation, as listed in Table 1.
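For clarity, the sketch below (illustrative Python, not the authors' code) shows how these run-level statistics can be aggregated once the accuracy and selected-subset size of each of the 10 independent runs have been recorded; the function name and the sample numbers are hypothetical.

```python
# Illustrative aggregation of the reported run statistics.
# `runs` holds (accuracy, n_selected_features) pairs from 10 independent runs.

def summarise_runs(runs):
    accuracies = [acc for acc, _ in runs]
    n_features = [n for _, n in runs]
    return {
        "Min_Acc": min(accuracies),
        "Avg_Acc": sum(accuracies) / len(accuracies),
        "Max_NFeat": max(n_features),
        "Min_NFeat": min(n_features),
        "Avg_NFeat": sum(n_features) / len(n_features),
    }

# Hypothetical (accuracy, #features) results of 10 runs on one dataset.
runs = [(0.95, 12), (0.93, 15), (0.96, 11), (0.94, 14), (0.95, 13),
        (0.92, 16), (0.96, 12), (0.94, 13), (0.95, 14), (0.93, 15)]
print(summarise_runs(runs))
```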
It should be pointed out that, for each of the considered gene expression datasets, the best result achieved in each column is highlighted in bold, while the worst is italicized.
Concerning classification accuracy, as presented in Table 3, the proposed EBGWO algorithm outperformed all the competing algorithms when the fitness function (see Equation 24) was adopted. Regarding the selection of an informative subset of genes, the proposed EBGWO again identified the subsets with the fewest features while achieving the highest classification accuracy on all seven high-dimensional microarray datasets considered in this paper.
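Equation 24 is defined in an earlier section and is not reproduced here. Purely to illustrate the trade-off that such a wrapper fitness encodes, the sketch below assumes the commonly used form combining classification error with the relative size of the selected gene subset; the weight alpha and the function name are assumptions, not the paper's exact formulation.

```python
# Assumed generic wrapper fitness (not necessarily the paper's Equation 24):
# lower is better; it rewards high accuracy obtained with few selected genes.

def fitness(accuracy, n_selected, n_total, alpha=0.99):
    error = 1.0 - accuracy
    subset_ratio = n_selected / n_total
    return alpha * error + (1.0 - alpha) * subset_ratio

# Example: 97% accuracy using 20 of 2000 genes.
print(fitness(0.97, 20, 2000))  # approximately 0.0298
```

Under a fitness of this kind, an algorithm can only improve its score by raising accuracy, shrinking the selected subset, or both, which is the behaviour reported for EBGWO in Table 3.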
6 CONCLUSION AND FUTURE
WORKS
An excited grey wolf optimizer (EGWO) is proposed in this paper. In the proposed algorithm, the concept of the complete current response of a direct-current (DC) excited resistor-capacitor (RC) circuit is innovatively utilized to make the non-linear control strategy of the GWO parameter $a$ adaptive. Since this scheme allocates a larger proportion of the iterations to global exploration than to local exploitation, the convergence speed of the proposed EGWO algorithm is enhanced while the risk of becoming trapped in local minima is reduced. Moreover, since the proposed scheme assigns each wolf a value of the parameter $a$ that is proportional to its fitness value in both the search space and the current iteration (generation), the diversity and the quality of the solutions are improved as well.
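The paper's exact update rule for $a$ appears in its earlier sections; the sketch below is only a hedged illustration of the general idea. It borrows the fact that the current of a DC-excited RC circuit decays exponentially with the time constant $\tau$, and lets $a$ decay from 2 towards 0 in the same fashion, which keeps $a > 1$ (exploration) for a larger share of the iterations than the standard linear decrease $a = 2(1 - t/T)$; the per-wolf fitness scaling shown is likewise an assumed form.

```python
import math

# RC-response-style control of the GWO parameter `a` (assumed form for
# illustration only). A DC-excited RC circuit's current decays as
# i(t) = I0 * exp(-t / tau); here `a` decays from a0 = 2 towards 0 in the
# same way, staying above the linear schedule 2 * (1 - t / T) throughout.

def a_schedule(t, max_iter, a0=2.0, tau_ratio=1.0):
    tau = tau_ratio * max_iter  # time constant, assumed equal to T here
    return a0 * math.exp(-t / tau)

def a_per_wolf(t, max_iter, wolf_fitness, best_fitness, worst_fitness):
    """Assumed fitness-proportional variant (minimisation assumed): better
    wolves receive a smaller `a` (finer exploitation), worse wolves a
    larger `a` (wider exploration)."""
    base = a_schedule(t, max_iter)
    if worst_fitness == best_fitness:
        return base
    rank = (wolf_fitness - best_fitness) / (worst_fitness - best_fitness)
    return base * (0.5 + 0.5 * rank)  # illustrative scaling only

for t in (0, 25, 50, 75, 100):
    print(t, round(a_schedule(t, 100), 3))
```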
To overcome premature convergence (a limitation of existing GWO variants) while still maintaining the social hierarchy of the pack, a new position-update equation that utilizes the fitness values of the vectors $\vec{X}_1$, $\vec{X}_2$ and $\vec{X}_3$ is proposed for determining new candidate individuals.
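The proposed position-update equation itself is presented earlier in the paper. As one possible (assumed) way of realising the idea, the sketch below weights the leader-guided vectors $\vec{X}_1$, $\vec{X}_2$ and $\vec{X}_3$ by their fitness values instead of averaging them equally as standard GWO does; minimisation and inverse-fitness weights are assumptions, not the authors' exact equation.

```python
# Illustrative fitness-weighted combination of the leader-guided vectors
# X1, X2, X3 (standard GWO simply averages them with equal weights).
# Minimisation is assumed, so a lower fitness value earns a larger weight.

def weighted_position_update(x1, x2, x3, f1, f2, f3, eps=1e-12):
    inv = [1.0 / (f + eps) for f in (f1, f2, f3)]  # inverse-fitness weights
    total = sum(inv)
    w1, w2, w3 = (v / total for v in inv)
    return [w1 * a + w2 * b + w3 * c for a, b, c in zip(x1, x2, x3)]

# Example with three 4-dimensional candidate vectors.
x_new = weighted_position_update(
    [0.2, 0.4, 0.1, 0.9], [0.3, 0.5, 0.2, 0.8], [0.1, 0.6, 0.3, 0.7],
    f1=0.05, f2=0.10, f3=0.20)
print(x_new)
```

Weighting by fitness pulls new candidates more strongly towards the currently better leaders, while the contribution of the weaker leaders preserves the pack hierarchy and some diversity.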
As a feature selector, EBGWO was compared with five existing metaheuristic algorithms, i.e., BGWO1, BGWO2, BPSO, BDE and BGA. The experimental results revealed that EBGWO yielded the best performance among the compared algorithms: it not only attained the highest classification accuracy but also selected the smallest subsets of informative features (genes). In conclusion, the proposed EBGWO is effective and well suited for use as a feature selector on high-dimensional datasets. For future work, a chaotic map can be adopted to fine-tune the parameters of EBGWO. Utilizing EBGWO in a hybrid filter-wrapper feature selection scheme, in order to evaluate the generality of the selected attributes, would be another valuable contribution. Moreover, EGWO will be applied to other optimization areas, such as neural network training, the knapsack problem, and numerical optimization problems.