other paradigms such as Local learning or Ensemble learning for more powerful machine learning techniques.
ICSOFT 2021 - 16th International Conference on Software Technologies