REFERENCES
Asch, P. and Quandt, R. E. (1988). Betting bias in ‘exotic’ bets. Economics Letters, 28(3):215–219.
Bäck, T. and Schwefel, H.-P. (1993). An overview of evolutionary algorithms for parameter optimization. Evolutionary Computation, 1(1):1–23.
Bies, R. R., Muldoon, M. F., Pollock, B. G., Manuck, S., Smith, G., and Sale, M. E. (2006). A genetic algorithm-based, hybrid machine learning approach to model selection. Journal of Pharmacokinetics and Pharmacodynamics, 33(2):195.
Bodnar, C., Day, B., and Liò, P. (2020). Proximal distilled evolutionary reinforcement learning. In Proceedings of the AAAI Conference on Artificial Intelligence, volume 34, pages 3283–3290.
Booker, L. B., Goldberg, D. E., and Holland, J. H. (1989). Classifier systems and genetic algorithms. Artificial Intelligence, 40(1-3):235–282.
Buchtala, O., Klimek, M., and Sick, B. (2005). Evolutionary optimization of radial basis function classifiers for data mining applications. IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics), 35(5):928–947.
de Lacerda, E., de Carvalho, A., and Ludermir, T. (2002). A study of cross-validation and bootstrap as objective functions for genetic algorithms. In VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings., pages 118–123.
Devos, O., Downey, G., and Duponchel, L. (2014). Simultaneous data pre-processing and SVM classification model selection based on a parallel genetic algorithm applied to spectroscopic data of olive oils. Food Chemistry, 148:124–130.
Forrest, S. (1996). Genetic algorithms. ACM Computing Surveys (CSUR), 28(1):77–80.
Frazier, P. I. (2018). Bayesian optimization. In Recent Advances in Optimization and Modeling of Contemporary Problems, pages 255–278. INFORMS.
Giacobini, M., Alba, E., Tettamanzi, A., and Tomassini, M. (2004). Modeling selection intensity for toroidal cellular evolutionary algorithms. In Genetic and Evolutionary Computation Conference, pages 1138–1149. Springer.
Giacobini, M., Tomassini, M., and Tettamanzi, A. (2003). Modeling selection intensity for linear cellular evolutionary algorithms. In International Conference on Artificial Evolution (Evolution Artificielle), pages 345–356. Springer.
Gong, Y.-J., Chen, W.-N., Zhan, Z.-H., Zhang, J., Li, Y., Zhang, Q., and Li, J.-J. (2015). Distributed evolutionary algorithms and their models: A survey of the state-of-the-art. Applied Soft Computing, 34:286–300.
Grefenstette, J. J. (1993). Genetic algorithms and machine learning. In Proceedings of the Sixth Annual Conference on Computational Learning Theory, pages 3–4.
Guerbai, Y., Chibani, Y., and Meraihi, Y. (2022). Techniques for selecting the optimal parameters of one-class support vector machine classifier for reduced samples. International Journal of Applied Metaheuristic Computing (IJAMC), 13(1):1–15.
Holland, J. (1975). Adaptation in Natural and Artificial Systems. University of Michigan Press, Ann Arbor.
Inman, H. F. and Bradley Jr., E. L. (1989). The overlapping coefficient as a measure of agreement between probability distributions and point estimation of the overlap of two normal densities. Communications in Statistics - Theory and Methods, 18(10):3851–3874.
LeCun, Y., Cortes, C., and Burges, C. (2010). MNIST handwritten digit database. AT&T Labs [Online]. Available: http://yann.lecun.com/exdb/mnist, 2.
Lessmann, S., Stahlbock, R., and Crone, S. F. (2006). Genetic algorithms for support vector machine model selection. In The 2006 IEEE International Joint Conference on Neural Network Proceedings, pages 3063–3069. IEEE.
Nogueira, F. (2014–). Bayesian Optimization: Open source constrained global optimization tool for Python.
Paterlini, S. and Minerva, T. (2010). Regression model selection using genetic algorithms. In Proceedings of the 11th WSEAS International Conference on Neural Networks and 11th WSEAS International Conference on Evolutionary Computing and 11th WSEAS International Conference on Fuzzy Systems, pages 19–27. World Scientific and Engineering Academy and Society (WSEAS).
Reichhuber, S. and Tomforde, S. (2021). Bet-based evolutionary algorithms: Self-improving dynamics in offspring generation. In ICAART (2), pages 1192–1199.
Reichhuber, S. and Tomforde, S. (2022). Evolving Gaussian mixture models for classification. In ICAART (3), pages 964–974.
Shahriari, B., Swersky, K., Wang, Z., Adams, R. P., and De Freitas, N. (2015). Taking the human out of the loop: A review of Bayesian optimization. Proceedings of the IEEE, 104(1):148–175.
Snoek, J., Larochelle, H., and Adams, R. P. (2012). Practical Bayesian optimization of machine learning algorithms. Advances in Neural Information Processing Systems, 25.
Young, S. R., Rose, D. C., Karnowski, T. P., Lim, S.-H., and Patton, R. M. (2015). Optimizing deep learning hyper-parameters through an evolutionary algorithm. In Proceedings of the Workshop on Machine Learning in High-Performance Computing Environments, pages 1–5.
Exotic Bets: Evolutionary Computing Coupled with Bet Mechanisms for Model Selection