inspired metaheuristic algorithms. Being coded in Python with its NumPy extension, the implemented algorithms benefit from NumPy's powerful and efficient multi-dimensional array objects. In comparison with Matlab, EvoloPy achieves shorter running times on high-dimensional problems with a large number of variables. This shows that the open-source Python ecosystem should be preferred over proprietary tools for carrying out experiments in metaheuristics and, for that matter, scientific computing in general: besides being free and gratis, it offers better efficiency and, thanks to its open-source nature, reproducibility.
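As a rough illustration of where this advantage comes from (a minimal sketch, not code taken from EvoloPy; the PSO-style update, array shapes, and parameter values are illustrative assumptions), a single vectorized NumPy expression can update the positions of an entire population at once, instead of looping over every individual and every dimension in interpreted Python:

# Minimal sketch (not EvoloPy code): one vectorized statement updates
# the whole population, rather than nested Python loops.
import numpy as np

rng = np.random.default_rng(seed=0)
pop_size, dim = 50, 1000              # high dimension, as in the experiments

positions = rng.uniform(-100, 100, size=(pop_size, dim))
velocities = np.zeros((pop_size, dim))
personal_best = positions.copy()
global_best = positions[0].copy()

# PSO-style update applied to the entire (pop_size x dim) matrix at once;
# the element-wise arithmetic runs in NumPy's compiled routines.
w, c1, c2 = 0.7, 1.5, 1.5
r1 = rng.random((pop_size, dim))
r2 = rng.random((pop_size, dim))
velocities = (w * velocities
              + c1 * r1 * (personal_best - positions)
              + c2 * r2 * (global_best - positions))
positions += velocities

Because the per-element arithmetic is executed in compiled code, the interpreter overhead is paid once per statement rather than once per variable, which is precisely where high-dimensional problems benefit most.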
The first step after this paper will be to profile the tool in order to identify inefficiencies and make it run faster. After that, we plan to add more recent and robust metaheuristic algorithms to the framework. It would also be interesting to extend the experiments presented in this paper to a comparison with Matlab or other implementations on more challenging and sophisticated optimization functions rather than simple ones.
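One possible way to carry out that profiling step is sketched below, assuming Python's standard cProfile and pstats modules; the sphere objective and the random_search routine are stand-ins for an EvoloPy optimizer and benchmark, not part of the framework's actual API:

# Hypothetical profiling run: execute one optimizer under cProfile and
# rank functions by cumulative time to spot inefficiencies.
import cProfile
import pstats
import numpy as np

def sphere(x):
    # Simple benchmark objective used here only as a placeholder.
    return float(np.sum(x ** 2))

def random_search(objective, dim=30, iters=10000):
    # Stand-in optimizer; any EvoloPy optimizer could be profiled instead.
    rng = np.random.default_rng(seed=1)
    best = np.inf
    for _ in range(iters):
        best = min(best, objective(rng.uniform(-100, 100, dim)))
    return best

profiler = cProfile.Profile()
profiler.enable()
random_search(sphere)
profiler.disable()

# Print the ten most expensive calls by cumulative time.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)

The same pattern can be pointed at each of the implemented optimizers to see which NumPy calls or remaining Python loops dominate the running time.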
ACKNOWLEDGEMENTS
This paper has been supported in part by the GeNeura Team (http://geneura.wordpress.com) and by project TIN2014-56494-C4-3-P (Spanish Ministry of Economy and Competitiveness).