our CGP framework, we recommend investigating regularization techniques such as dropout (Srivastava et al., 2014). The evolution of recurrent structures within the CNNs may also be explored for improved performance.
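As a concrete illustration, inverted dropout, the variant implemented by Keras's Dropout layer, can be sketched in a few lines of NumPy. The stand-alone function below is an illustrative sketch, not part of our framework:

```python
import numpy as np

def dropout(x, rate, rng, training=True):
    """Inverted dropout: zero each unit with probability `rate` during
    training and rescale the survivors, so inference needs no change."""
    if not training or rate == 0.0:
        return x
    keep = rng.random(x.shape) >= rate  # Boolean keep-mask
    return x * keep / (1.0 - rate)      # rescale to preserve E[x]
```

Because the surviving activations are rescaled at training time, the layer becomes the identity at inference (`training=False`), which is why no weight rescaling is needed in the deployed network.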
Future work may explore measures of structural similarity that act on the DAG representation of the CGP genotype, such as Graph Edit Distance (Sanfeliu and Fu, 1983). In addition, the method may be applied to other CNN tasks, such as image segmentation.
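For tiny unlabeled digraphs, such a Graph Edit Distance can be sketched by brute force over node mappings. The `ged` function below is an illustrative assumption only: it is exponential in graph size and ignores the node labels (layer operations) that a real CGP genotype would carry:

```python
from itertools import permutations

def ged(nodes1, edges1, nodes2, edges2):
    """Exact graph edit distance between two tiny unlabeled digraphs.

    Unit costs: node insertion/deletion = 1, edge insertion/deletion = 1.
    Brute force over node mappings, so only suitable for small graphs.
    """
    # Always map the smaller node set into the larger one; with unit,
    # unlabeled costs, a full injective mapping is never worse than
    # deleting a node from one graph while inserting one into the other.
    if len(nodes1) > len(nodes2):
        nodes1, edges1, nodes2, edges2 = nodes2, edges2, nodes1, edges1
    e1, e2 = set(edges1), set(edges2)
    node_cost = len(nodes2) - len(nodes1)  # unmatched nodes are inserted
    best = float("inf")
    for image in permutations(nodes2, len(nodes1)):
        mapping = dict(zip(nodes1, image))
        mapped = {(mapping[u], mapping[v]) for (u, v) in e1}
        # Edges whose image is missing are deleted; edges of the larger
        # graph that are never hit are inserted.
        edge_cost = len(e1) + len(e2) - 2 * len(mapped & e2)
        best = min(best, node_cost + edge_cost)
    return best
```

For example, appending one node and one connecting edge to a three-node chain yields a distance of 2 (one node insertion plus one edge insertion).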
Lastly, an investigation of alternative SLS algorithms may yield further reductions in computational cost.
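One such alternative, simulated annealing (Kirkpatrick et al., 1983), is sketched below as a generic minimizer; the function and its toy bitstring demo are illustrative assumptions, not the SLS variant used in our experiments:

```python
import math
import random

def simulated_annealing(start, neighbor, cost, t0=1.0, cooling=0.95,
                        steps=200, seed=0):
    """Generic simulated annealing for minimization: always accept
    improvements, accept worsening moves with probability exp(-delta/t)."""
    rng = random.Random(seed)
    current, best = start, start
    t = t0
    for _ in range(steps):
        candidate = neighbor(current, rng)
        delta = cost(candidate) - cost(current)
        if delta <= 0 or rng.random() < math.exp(-delta / t):
            current = candidate
            if cost(current) < cost(best):
                best = current
        t *= cooling  # geometric cooling schedule
    return best

# Toy demo: drive an 8-bit string towards all zeros.
def bit_cost(bits):
    return sum(bits)

def flip_one(bits, rng):
    i = rng.randrange(len(bits))
    flipped = list(bits)
    flipped[i] ^= 1
    return tuple(flipped)

best = simulated_annealing((1,) * 8, flip_one, bit_cost)
```

The temperature schedule controls how long worsening moves remain likely, which is the knob that trades exploration against the number of candidate evaluations, and hence against training cost when each evaluation requires fitting a network.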
ACKNOWLEDGEMENTS
We acknowledge the support of the Natural Sci-
ences and Engineering Research Council of Canada
(NSERC).
REFERENCES
Baldominos, A., Saez, Y., and Isasi, P. (2018). Evolutionary convolutional neural networks: An application to handwriting recognition. Neurocomputing, 283:38–52.
Caruana, R., Lawrence, S., and Giles, L. (2000). Overfitting in neural nets: Backpropagation, conjugate gradient, and early stopping. In NIPS’00, pages 381–387, Cambridge, MA, USA. MIT Press.
Chollet, F. et al. (2015). Keras. https://keras.io.
Cover, T. and Hart, P. (1967). Nearest neighbor pattern classification. IEEE Transactions on Information Theory, 13(1):21–27.
Deb, K., Pratap, A., Agarwal, S., and Meyarivan, T. (2002). A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2):182–197.
D’Souza, R., Sekaran, C., and Kandasamy, A. (2010). Improved NSGA-II based on a novel ranking scheme. Journal of Computing, 2.
Fernandes Junior, F. E. and Yen, G. G. (2019). Particle swarm optimization of deep neural networks architectures for image classification. Swarm and Evolutionary Computation, 49:62–74.
Kingma, D. P. and Ba, J. (2017). Adam: A method for stochastic optimization. arXiv preprint arXiv:1412.6980.
Kirkpatrick, S., Gelatt, C. D., and Vecchi, M. P. (1983). Optimization by simulated annealing. Science, 220(4598):671–680.
LeCun, Y., Bottou, L., Bengio, Y., and Haffner, P. (1998). Gradient-based learning applied to document recognition. Proceedings of the IEEE, 86(11):2278–2324.
LeCun, Y. and Cortes, C. (2010). MNIST handwritten digit database.
Lehman, J. and Stanley, K. O. (2011). Novelty search and the problem with objectives. In Riolo, R., Vladislavleva, E., and Moore, J. H., editors, Genetic Programming Theory and Practice IX, pages 37–56. Springer New York, New York, NY.
McGhie, A., Xue, B., and Zhang, M. (2020). GPCNN: Evolving convolutional neural networks using genetic programming. In 2020 IEEE Symposium Series on Computational Intelligence (SSCI), pages 2684–2691, Canberra, ACT, Australia. IEEE.
Miller, B. and Goldberg, D. (1995). Genetic algorithms, tournament selection, and the effects of noise. Complex Systems, 9.
Miller, J. F. and Thomson, P. (2000). Cartesian genetic programming. In Proceedings of the European Conference on Genetic Programming (EuroGP 2000). Springer.
Sanfeliu, A. and Fu, K.-S. (1983). A distance measure between attributed relational graphs for pattern recognition. IEEE Transactions on Systems, Man, and Cybernetics, SMC-13(3):353–362.
Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., and Salakhutdinov, R. (2014). Dropout: A simple way to prevent neural networks from overfitting. Journal of Machine Learning Research, 15(1):1929–1958.
Stanley, K. O., D’Ambrosio, D. B., and Gauci, J. (2009). A hypercube-based encoding for evolving large-scale neural networks. Artificial Life, 15(2):185–212.
Stanley, K. O. and Miikkulainen, R. (2002). Evolving neural networks through augmenting topologies. Evolutionary Computation, 10(2):99–127.
Suganuma, M., Shirakawa, S., and Nagao, T. (2017). A genetic programming approach to designing convolutional neural network architectures. In Proceedings of the Genetic and Evolutionary Computation Conference, pages 497–504, Berlin, Germany. ACM.
Sun, Y., Xue, B., Zhang, M., Yen, G. G., and Lv, J. (2020). Automatically designing CNN architectures using the genetic algorithm for image classification. IEEE Transactions on Cybernetics, 50(9):3840–3854.
Sun, Y., Yen, G. G., and Yi, Z. (2019). Evolving unsupervised deep neural networks for learning meaningful representations. IEEE Transactions on Evolutionary Computation, 23(1):89–103.
Verbancsics, P. and Harguess, J. (2015). Image classification using generative neuro evolution for deep learning. In 2015 IEEE Winter Conference on Applications of Computer Vision, pages 488–493.
Wang, B., Sun, Y., Xue, B., and Zhang, M. (2018). Evolving deep convolutional neural networks by variable-length particle swarm optimization for image classification. In 2018 IEEE Congress on Evolutionary Computation (CEC), pages 1–8.
Xie, L. and Yuille, A. (2017). Genetic CNN. In 2017 IEEE International Conference on Computer Vision (ICCV), pages 1388–1397, Venice. IEEE.
A Hybrid Evolutionary Algorithm, Utilizing Novelty Search and Local Optimization, Used to Design Convolutional Neural Networks for Handwritten Digit Recognition