
donesia, December 8–12, 2021, Proceedings, Part II,
pages 335–347, Berlin, Heidelberg. Springer-Verlag.
Dong, X. and Yang, Y. (2020). NAS-Bench-201: Extending
the scope of reproducible neural architecture search. In
International Conference on Learning Representations.
Elsken, T., Metzen, J., and Hutter, F. (2019). Neural archi-
tecture search: A survey. Journal of Machine Learn-
ing Research, 20.
Elsken, T., Metzen, J.-H., and Hutter, F. (2017). Simple
And Efficient Architecture Search for Convolutional
Neural Networks. arXiv:1711.04528 [cs, stat].
Goodfellow, I., Bengio, Y., and Courville, A. (2016). Deep
Learning. MIT Press.
Hu, X., Chu, L., Pei, J., Liu, W., and Bian, J. (2021).
Model Complexity of Deep Learning: A Survey.
arXiv:2103.05127 [cs].
Laredo, D., Qin, Y., Schütze, O., and Sun, J.-Q. (2019).
Automatic Model Selection for Neural Networks.
arXiv:1905.06010 [cs, stat].
Lee, N., Ajanthan, T., and Torr, P. H. S. (2019). SNIP:
Single-shot network pruning based on connection sensitivity.
In International Conference on Learning Representations.
Luong, N. H., Phan, Q. M., Vo, A., Pham, T. N., and
Bui, D. T. (2024). Lightweight multi-objective evolu-
tionary neural architecture search with low-cost proxy
metrics. Information Sciences, 655:119856.
Mellor, J., Turner, J., Storkey, A., and Crowley, E. J.
(2021). Neural Architecture Search without Training.
arXiv:2006.04647 [cs, stat].
Mokhtari, N., Nedelec, A., Gilles, M., and De Loor, P.
(2022). Improving Neural Architecture Search by
Mixing a FireFly algorithm with a Training Free Eval-
uation. volume 2022-July.
Phan, Q. M. and Luong, N. H. (2021). Enhancing multi-
objective evolutionary neural architecture search with
surrogate models and potential point-guided local
searches. In Fujita, H., Selamat, A., Lin, J. C.-W.,
and Ali, M., editors, Advances and Trends in Artificial
Intelligence. Artificial Intelligence Practices, pages
460–472, Cham. Springer International Publishing.
Real, E., Aggarwal, A., Huang, Y., and Le, Q. V. (2018).
Regularized Evolution for Image Classifier Architecture
Search. arXiv:1802.01548 [cs].
Real, E., Aggarwal, A., Huang, Y., and Le, Q. V. (2019).
Aging Evolution for Image Classifier Architecture
Search. In AAAI Conference on Artificial Intelligence.
Schwartz, R., Dodge, J., Smith, N. A., and Etzioni, O.
(2019). Green AI. arXiv:1907.10597 [cs].
Simonyan, K. and Zisserman, A. (2015). Very Deep Con-
volutional Networks for Large-Scale Image Recogni-
tion. arXiv:1409.1556 [cs].
Strubell, E., Ganesh, A., and McCallum, A. (2019). Energy
and Policy Considerations for Deep Learning in NLP.
arXiv:1906.02243 [cs].
Tan, M. and Le, Q. V. (2020). EfficientNet: Rethinking
Model Scaling for Convolutional Neural Networks.
arXiv:1905.11946 [cs, stat].
Tanaka, H., Kunin, D., Yamins, D. L. K., and Ganguli, S.
(2020). Pruning neural networks without any data by
iteratively conserving synaptic flow. In Advances in
Neural Information Processing Systems.
Theis, L., Korshunova, I., Tejani, A., and Huszár, F. (2018).
Faster gaze prediction with dense networks and Fisher
pruning. arXiv:1801.05787 [cs, stat].
Wang, C., Zhang, G., and Grosse, R. (2020). Picking
winning tickets before training by preserving gradient
flow. In International Conference on Learning Representations.
Wei, J., Tay, Y., Bommasani, R., Raffel, C., Zoph,
B., Borgeaud, S., Yogatama, D., Bosma, M.,
Zhou, D., Metzler, D., Chi, E. H., Hashimoto, T.,
Vinyals, O., Liang, P., Dean, J., and Fedus, W.
(2022). Emergent Abilities of Large Language Mod-
els. arXiv:2206.07682 [cs].
Yang, X. (2010a). Nature-inspired Metaheuristic Algo-
rithms. Luniver Press.
Yang, X.-S. (2010b). Firefly algorithm, stochastic test functions
and design optimisation. International Journal of
Bio-Inspired Computation, 2(2):78–84.
Ying, C., Klein, A., Real, E., Christiansen, E., Murphy, K.,
and Hutter, F. (2019). NAS-Bench-101: Towards
reproducible neural architecture search. In Proceedings
of the 36th International Conference on Machine Learning,
pages 12334–12348.
Zitzler, E. (2012). Evolutionary Multiobjective Optimization.
In Rozenberg, G., Bäck, T., and Kok, J. N.,
editors, Handbook of Natural Computing, pages 871–
904. Springer Berlin Heidelberg, Berlin, Heidelberg.
Zoph, B. and Le, Q. V. (2017). Neural Architecture Search
with Reinforcement Learning. arXiv:1611.01578 [cs].
Zoph, B., Vasudevan, V., Shlens, J., and Le, Q. V. (2018).
Learning Transferable Architectures for Scalable Im-
age Recognition. arXiv:1707.07012 [cs, stat].
Neural Architecture Search: Tradeoff Between Performance and Efficiency