Brownlee, J. (2014). Feature selection to improve accuracy and decrease training time. https://machinelearningmastery.com/feature-selection-to-improve-accuracy-and-decrease-training-time/. (accessed on September 30, 2022).
Brownlee, J. (2019). How to choose a feature selection method for machine learning. https://machinelearningmastery.com/feature-selection-with-real-and-categorical-data/. (accessed on July 22, 2022).
Brownlee, J. (2020). How to calculate feature importance with Python. https://machinelearningmastery.com/calculate-feature-importance-with-python/. (accessed on March 2, 2022).
BSA | The Software Alliance (2015). What is the big deal with data? https://data.bsa.org/wp-content/uploads/2015/12/bsadatastudy_en.pdf. (accessed on September 4, 2022).
Bullnheimer, B., Hartl, R. F., and Strauss, C. (1997). A new rank-based version of the Ant System: A computational study.
Chen, T. and Guestrin, C. (2016). XGBoost: A scalable tree boosting system. In Proceedings of the 22nd ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, pages 785–794.
Chen, Y.-W. and Lin, C.-J. (2006). Combining SVMs with various feature selection strategies. In Feature Extraction, pages 315–324. Springer.
Dong, G. and Liu, H. (2018). Feature engineering for machine learning and data analytics. CRC Press.
Dorigo, M., Maniezzo, V., and Colorni, A. (1991). Positive feedback as a search strategy.
Dorigo, M., Maniezzo, V., and Colorni, A. (1996). Ant system: Optimization by a colony of cooperating agents. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 26(1):29–41.
Dorigo, M. and Stützle, T. (2019). Ant colony optimization: Overview and recent advances. Handbook of Metaheuristics, pages 311–351.
Dua, D. and Graff, C. (2017). UCI Machine Learning Repository.
Ferreira, C. A. (2019). MLP classifier. https://medium.com/@carlosalbertoff/mlp-classifier-526978d1c638. (accessed on March 2, 2022).
Gajawada, S. K. (2019). ANOVA for feature selection in machine learning. https://towardsdatascience.com/anova-for-feature-selection-in-machine-learning-d9305e228476. (accessed on August 3, 2022).
García, S., Luengo, J., and Herrera, F. (2015). Data preprocessing in data mining, volume 72. Springer.
Gaspar-Cunha, A., Takahashi, R., and Antunes, C. H. (2012). Manual de computação evolutiva e metaheurística. Coimbra University Press.
Ghosh, M., Guha, R., Sarkar, R., and Abraham, A. (2019). A wrapper-filter feature selection technique based on ant colony optimization. Neural Computing & Applications, 32(12):7839–7857.
Ho, T. K. (1995). Random decision forests. In Proceedings of the 3rd International Conference on Document Analysis and Recognition, volume 1, pages 278–282. IEEE.
IBM (2020). What is the k-nearest neighbors algorithm? https://www.ibm.com/topics/knn#:~:text=The%20k%2Dnearest%20neighbors%20algorithm%2C%20also%20known%20as%20KNN%20or,of%20an%20individual%20data%20point. (accessed on March 2, 2022).
Kuhn, M., Johnson, K., et al. (2013). Applied predictive modeling, volume 26. Springer.
López-Ibáñez, M., Dubois-Lacoste, J., Cáceres, L. P., Birattari, M., and Stützle, T. (2016). The irace package: Iterated racing for automatic algorithm configuration. Operations Research Perspectives, 3:43–58.
Luz, F. (2018). Algoritmo KNN para classificação. https://inferir.com.br/artigos/algoritimo-knn-para-classificacao/. (accessed on March 2, 2022).
Mohanty, A. (2019). Multi-layer perceptron (MLP) models on real-world banking data. https://becominghuman.ai/multi-layer-perceptron-mlp-models-on-real-world-banking-data-f6dd3d7e998f. (accessed on March 2, 2022).
Moreira, S. (2018). Rede neural perceptron multicamadas. https://medium.com/ensina-ai/rede-neural-perceptron-multicamadas-f9de8471f1a9#:~:text=Perceptron%20Multicamadas%20(PMC%20ou%20MLP,sa%C3%ADda%20desejada%20nas%20camadas%20intermedi%C3%A1rias. (accessed on March 2, 2022).
Naviani, A. (2018). Understanding random forests classifiers in Python tutorial. https://www.datacamp.com/tutorial/random-forests-classifier-python. (accessed on March 22, 2022).
R Core Team (2015). R: A language and environment for statistical computing. https://www.R-project.org. (accessed on September 20, 2022).
Santos, G. (2021). Estatística para seleção de atributos. https://medium.com/data-hackers/estat%C3%ADstica-para-sele%C3%A7%C3%A3o-de-atributos-81bdc274dd2c. (accessed on July 10, 2022).
Sharda, R., Delen, D., and Turban, E. (2019). Business Intelligence e Análise de Dados para Gestão do Negócio, 4th edition. Bookman Editora.
Silva, T. A. (2018). Como implementar as métricas precisão, revocação, acurácia e medida-f. https://tiago.blog.br/precisao-revocacao-acuracia-e-medida-/#:~:text=Medida%20F%20(F%20Measure%2C%20F1,medida%20de%20confiabilidade%20da%20acur%C3%A1cia. (accessed on September 20, 2022).
Spearman, C. (1904). The proof and measurement of association between two things. American Journal of Psychology, 15(1):72–101.
Stützle, T., Dorigo, M., et al. (1999). ACO algorithms for the traveling salesman problem. Evolutionary Algorithms in Engineering and Computer Science, 4:163–183.
Tabakhi, S., Moradi, P., and Akhlaghian, F. (2014). An unsupervised feature selection algorithm based on ant colony optimization. Engineering Applications of Artificial Intelligence, 32:112–123.
Uthayakumar, J., Metawa, N., Shankar, K., and Lakshmanaprabu, S. (2020). Financial crisis prediction