Evolutionary Techniques for Neural Network Optimization
Eva Volná
2005
Abstract
The idea of evolving artificial neural networks by evolutionary algorithms is based on a powerful metaphor: the evolution of the human brain. The application of evolutionary algorithms to neural network optimization is an active field of study. The success and speed of neural network training depend on the initial parameter settings, such as the architecture, initial weights, learning rates, and others. Much research addresses how to find the optimal network architecture and parameter settings for the problem the network has to learn. One possible solution is to apply evolutionary algorithms to neural network optimization. Two separate issues can be distinguished here: weight training on the one hand and architecture optimization on the other. In what follows, we focus on architecture optimization and, in particular, on a comparison of different strategies for encoding neural network architectures for the purposes of the evolutionary algorithm.
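As an illustration of one of the encoding strategies compared in the paper, the sketch below shows a minimal direct encoding: each genotype is a bit string that switches individual connections of a fixed maximum feedforward topology on or off, and a simple genetic algorithm searches over these bit strings. This is not the paper's algorithm; the 2-4-1 topology bound, the toy XOR task, and the random-search weight fitting used as a stand-in for proper training are all illustrative assumptions.

```python
# A minimal, hypothetical sketch of direct architecture encoding (not the paper's method).
# Genotype: one bit per possible connection of a fixed maximum 2-4-1 feedforward network.
# Fitness: how well a short random search over the unmasked weights fits the XOR task.
import random, math

N_IN, N_HID, N_OUT = 2, 4, 1
GENOME_LEN = N_IN * N_HID + N_HID * N_OUT          # one bit per possible connection
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def decode(genome):
    """Split the bit string into input->hidden and hidden->output connection masks."""
    ih = [genome[i * N_HID:(i + 1) * N_HID] for i in range(N_IN)]
    ho = genome[N_IN * N_HID:]
    return ih, ho

def forward(ih, ho, w_ih, w_ho, x):
    """Propagate input x through the masked network with tanh units."""
    hid = [math.tanh(sum(x[i] * w_ih[i][j] * ih[i][j] for i in range(N_IN)))
           for j in range(N_HID)]
    return math.tanh(sum(hid[j] * w_ho[j] * ho[j] for j in range(N_HID)))

def fitness(genome, trials=30):
    """Best XOR fit found by random weight sampling, minus a small connection penalty."""
    ih, ho = decode(genome)
    best = float("inf")
    for _ in range(trials):
        w_ih = [[random.uniform(-2, 2) for _ in range(N_HID)] for _ in range(N_IN)]
        w_ho = [random.uniform(-2, 2) for _ in range(N_HID)]
        err = sum((forward(ih, ho, w_ih, w_ho, x) - (2 * y - 1)) ** 2 for x, y in XOR)
        best = min(best, err)
    return -best - 0.01 * sum(genome)

def evolve(pop_size=20, generations=40, p_mut=0.05):
    """Plain generational GA: truncation selection, one-point crossover, bit-flip mutation."""
    pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(pop_size)]
    for _ in range(generations):
        parents = sorted(pop, key=fitness, reverse=True)[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, GENOME_LEN)   # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g for g in child]
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("best connection mask:", best, "fitness:", round(fitness(best), 3))
```

Indirect (e.g. grammar-based or graph-generation) encodings differ only in the decode step: the genotype would store production rules rather than an explicit connection mask.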
References
- Arena, P. - Caponetto, R. - Fortuna, L. - Xibilia, M.G.: M.L.P. optimal topology via genetic algorithms. Proceedings of the international conference in Innsbruck, Austria, pp. 671-674, 1993.
- Beale, R. - Jackson, T.: Neural Computing: An Introduction. J W Arrowsmith Ltd, Bristol, Great Britain 1992.
- Goldberg, D. E.: Genetic Algorithms in Search, Optimization and Machine Learning. Addison-Wesley, Reading, Massachusetts 1989.
- Hertz, J. - Krogh, A. - Palmer, R. G.: Introduction to the Theory of Neural Computation. Addison-Wesley Publishing Company, Redwood City 1991.
- Davis, L. (ed.): Handbook of Genetic Algorithms. Van Nostrand Reinhold, New York 1991.
- Köhn, P. Genetic encoding strategies for neural networks. Master's thesis, University of Tennessee, Knoxville, IPMU, 1996.
- Maniezzo, V. Searching among search spaces: Hastening the genetic evolution of feedforward neural networks. In R. F. Albrecht, C. R. Reeves, and N. C. Steele, editors: Artificial Neural Nets and Genetic Algorithms. Springer-Verlag, pp. 635-643 (1993).
- Koza, J. R. and J. P. Rice. Genetic generation of both the weights and architecture for a neural network. In Proceedings of the International Joint Conference on Neural Networks, Volume II, IEEE Press, 1991.
- Harp, S. A., Samad, T., and Guha, A.: Towards the genetic synthesis of neural networks. In J. D. Schaffer, editor: Proceedings of the Third International Conference on Genetic Algorithms and Their Applications. Morgan Kaufmann, San Mateo, CA, pp. 360-369, 1989.
- Jacob, C., Rehder, J.: Evolution of neural net architectures by a hierarchical grammar-based genetic system. In Proceedings of the International Joint Conference on Neural Networks and Genetic Algorithms, Innsbruck, pp. 72-79, 1993.
- Angeline, P. J., Saunders, G. M., and Pollack, J. B.: An evolutionary algorithm that constructs recurrent neural networks. IEEE Transactions on Neural Networks, 5: 54-65, 1993.
- Lindenmayer, A.: Mathematical models for cellular interactions in development, Parts I and II. Journal of Theoretical Biology, 18: 280-315, 1968.
- Kitano, H.: Designing neural networks using genetic algorithms with graph generation system. Complex Systems, 4: 461-476, 1990.
- Nolfi, S., Parisi, D. Genotypes for neural networks. In M. A. Arbib, editor: The Handbook of Brain Theory and Neural Networks. MIT Press 1995.
- Boers, E. J. W., Kuiper, H., Happel, B. L. M., and Sprinkhuizen-Kuyper, I. G.: Designing modular artificial neural networks. In H. A. Wijshoff, editor: Proceedings of Computing Science in The Netherlands, pp. 87-96. SION, Stichting Mathematisch Centrum, Amsterdam 1993.
- Channon, A. D. and Damper, R. I.: Evolving novel behaviors via natural selection. In Adami, C., Belew, R. K., Kitano, H., and Taylor, C. E., eds.: Proceedings of the Sixth International Conference on Artificial Life, pp. 384-388. Bradford Books/MIT Press, Cambridge, MA, 1998.
- Aho, I., Kemppainen, H., Koskimies, K., Mäkinen, E., and Niemi, T. Searching neural network structures with L systems and genetic algorithms. International Journal of Computer Mathematics, 73 (1): 55-75, 1999.
- Gruau, F., D. Whitley, and L. Pyeatt. A comparison between cellular encoding and direct encoding for genetic neural networks. In J. R. Koza, D. E. Goldberg, D. B. Fogel, and R. L. Riolo, editors, Genetic Programming 1996: Proceedings of the First Annual Conference, pp. 81-89, Cambridge, MA, MIT Press, 1996.
- Opitz, D. W. and J. W. Shavlik. Connectionist theory refinement: Genetically searching the space of network topologies. Journal of Artificial Intelligence Research, 6: 177-209, 1997.
- Yao, X. and Y. Liu. Towards designing artificial neural networks by evolution. Applied Mathematics and Computation, 91(1): 83-90, 1996.
- Volná, E.: Learning algorithm which learns both architectures and weights of feedforward neural networks. Neural Network World: Int. Journal on Neural & Mass-Parallel Comp. and Inf. Systems, 8(6): 653-664, 1998.
Paper Citation
in Harvard Style
Volná, E. (2005). Evolutionary Techniques for Neural Network Optimization. In Proceedings of the 1st International Workshop on Artificial Neural Networks and Intelligent Information Processing - Volume 1: ANNIIP, (ICINCO 2005) ISBN 972-8865-36-8, pages 3-11. DOI: 10.5220/0001191800030011
in Bibtex Style
@conference{anniip05,
author={Eva Volná},
title={Evolutionary Techniques for Neural Network Optimization},
booktitle={Proceedings of the 1st International Workshop on Artificial Neural Networks and Intelligent Information Processing - Volume 1: ANNIIP, (ICINCO 2005)},
year={2005},
pages={3-11},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0001191800030011},
isbn={972-8865-36-8},
}
in EndNote Style
TY - CONF
JO - Proceedings of the 1st International Workshop on Artificial Neural Networks and Intelligent Information Processing - Volume 1: ANNIIP, (ICINCO 2005)
TI - Evolutionary Techniques for Neural Network Optimization
SN - 972-8865-36-8
AU - Volná E.
PY - 2005
SP - 3
EP - 11
DO - 10.5220/0001191800030011