References
1. R. Battiti and F. Masulli. BFGS optimization for faster and automated supervised learning.
In Proceedings of the International Neural Network Conference (INNC 90), Paris, pages 757–760, 1990.
2. César Daltoé Berci. Observadores Inteligentes de Estado: Propostas. Master's thesis,
LCSI/FEEC/UNICAMP, Campinas, Brazil, 2008.
3. César Daltoé Berci and Celso Pascoli Bottura. Observador inteligente adaptativo neural não
baseado em modelo para sistemas não lineares. In Proceedings of the 7th Brazilian Conference on
Dynamics, Control and Applications, Presidente Prudente, Brazil, 7:209–215, 2008.
4. Jürgen Branke. Evolutionary algorithms for neural network design and training. In 1st Nordic
Workshop on Genetic Algorithms and its Applications, Vaasa, Finland, January 1995.
5. D.J. Chalmers. The evolution of learning: An experiment in genetic connectionism. In
Proceedings of the 1990 Connectionist Summer School, pages 81–90, 1990.
6. Charles Darwin. On the Origin of Species by Means of Natural Selection, or the Preservation
of Favoured Races in the Struggle for Life. John Murray, London, 1859.
7. D.C. Dennett. Darwin's Dangerous Idea: Evolution and the Meanings of Life. Penguin
Books, 1995.
8. A. Fiszelew, P. Britos, A. Ochoa, H. Merlino, E. Fernández, and R. García-Martínez. Finding
optimal neural network architecture using genetic algorithms. Technical report, Software & Knowledge
Engineering Center, Intelligent Systems Laboratory, School of Engineering, Buenos Aires Institute
of Technology / University of Buenos Aires, 2004.
9. C. Fyfe. Artificial Neural Networks. Department of Computing and Information Systems,
The University of Paisley, Edition 1.1, 1996.
10. D.G. Luenberger. Linear and Nonlinear Programming. Addison-Wesley, 2nd edition, 1984.
11. M.F. Møller. Learning by conjugate gradients. In The 6th International Meeting of Young
Computer Scientists, 1990.
12. M.F. Møller. A scaled conjugate gradient algorithm for fast supervised learning. Computer
Science Department, University of Aarhus, Denmark, 6:525–533, 1990.
13. D. Montana and L. Davis. Training feedforward neural networks using genetic algorithms.
In Proceedings of the International Joint Conference on Artificial Intelligence, pages 762–767,
1989.
14. D.E. Rumelhart, R. Durbin, R. Golden, and Y. Chauvin. Backpropagation: The basic theory.
Lawrence Erlbaum Associates, Inc., 1995.
15. D.E. Rumelhart, G.E. Hinton, and R.J. Williams. Learning internal representations by
error propagation. In D.E. Rumelhart and J.L. McClelland, editors, Parallel Distributed
Processing: Explorations in the Microstructure of Cognition, pages 318–362. MIT Press,
Cambridge, MA, 1986.
16. Udo Seiffert. Multiple layer perceptron training using genetic algorithms. In ESANN'2001
Proceedings - European Symposium on Artificial Neural Networks, pages 159–164, 2001.
17. Zhi-Hua Zhou, Jian-Xin Wu, Yuan Jiang, and Shi-Fu Chen. Genetic algorithm based selective
neural network ensemble. In Proceedings of the 17th International Joint Conference on
Artificial Intelligence, 2:797–802, 2001.