Authors:
Christian Nieber¹; Douglas Dias²; Enrique Naredo Garcia³ and Conor Ryan¹
Affiliations:
¹ Department of Computer Science and Information Systems, University of Limerick, Limerick, Ireland
² Department of Computer Science & Applied Physics, Atlantic Technological University, Galway, Ireland
³ Departamento de Ciencias Básicas, Universidad del Caribe, Cancun, Mexico
Keyword(s):
Step Size Control, Evolutionary Algorithms, Neural Architecture Search, Lightweight CNNs, MNIST, LeNet.
Abstract:
This work examines how evolutionary Neural Architecture Search (NAS) algorithms can be improved by controlling the step size of mutations of numerical parameters. The proposed NAS algorithms are based on F-DENSER, a variation of Dynamic Structured Grammatical Evolution (DSGE). Overall, a (1+5) Evolution Strategy is used. Two methods of controlling the step size of mutations of numeric values are compared to Random Search and F-DENSER: decay of the step size over time and adaptive step size for mutations. The search for lightweight, LeNet-like CNN architectures for MNIST classification is used as a benchmark, optimizing for both accuracy and small architecture size. An architecture is described by about 30 evolvable parameters. Experiments show that with step size control, convergence is faster and better-performing neural architectures are found on average, with lower variance. The smallest architecture found during the experiments reached an accuracy of 98.8% on MNIST with only 5,450 free parameters, compared to the 62,158 parameters of LeNet-5.
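
The two step size control schemes named in the abstract can be illustrated with a minimal sketch. The Python code below is an illustration, not the authors' F-DENSER implementation; the fitness function, decay rate, and adaptation factors are assumed values. It shows a (1+5) Evolution Strategy mutating a vector of numeric parameters, with either deterministic per-generation decay of the step size or success-based adaptation.

    # Minimal sketch (not the authors' implementation) of a (1+5) Evolution
    # Strategy over numeric parameters, with the two step size control
    # schemes named in the abstract. All constants are illustrative assumptions.
    import random

    LAMBDA = 5                         # offspring per generation in a (1+5) ES
    DECAY_RATE = 0.95                  # assumed per-generation decay factor
    ADAPT_UP, ADAPT_DOWN = 1.5, 0.75   # assumed adaptation factors

    def mutate(params, step_size):
        """Gaussian mutation of every numeric parameter with the given step size."""
        return [p + random.gauss(0.0, step_size) for p in params]

    def es_one_plus_five(fitness, parent, step_size, generations, scheme="decay"):
        """(1+5) ES: keep the best of the parent and 5 offspring each generation."""
        best, best_fit = parent, fitness(parent)
        for _ in range(generations):
            offspring = [mutate(best, step_size) for _ in range(LAMBDA)]
            fits = [fitness(o) for o in offspring]
            improved = max(fits) > best_fit
            if improved:
                i = fits.index(max(fits))
                best, best_fit = offspring[i], fits[i]
            if scheme == "decay":        # step size shrinks over time
                step_size *= DECAY_RATE
            elif scheme == "adaptive":   # grow on success, shrink on failure
                step_size *= ADAPT_UP if improved else ADAPT_DOWN
        return best, best_fit

    # Toy usage: maximize a simple quadratic fitness
    if __name__ == "__main__":
        f = lambda xs: -sum(x * x for x in xs)
        result, fit = es_one_plus_five(f, [2.0, -3.0], step_size=1.0, generations=50)
        print(result, fit)

Under the decay scheme the step size shrinks on a fixed schedule regardless of progress; under the adaptive scheme it follows a success-based rule in the spirit of classic ES step size adaptation, growing after an improving generation and shrinking otherwise.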