structure's significance as a means of rejecting the negative influence of interference on neural network adaptation. In a non-modular (unsplit) network, the hidden units act as a shared input-processing stage for all output units, while the output units must classify patterns according to diametrically distinct criteria (e.g., the network has to classify patterns by their form, location, color, etc.). At the beginning of the adaptation process, interference can therefore cause the output units to receive information about object classifications other than the one they are intended to learn. This negative influence of interference on the adaptive process is removed precisely by the modular neural network architecture, as the results of the performed experiment confirm. The winning modular network architecture emerged through evolutionary algorithms. The neural network plays a special role in the evolutionary algorithm here: because of its structure and properties, it can be readily transformed into an individual of the evolutionary algorithm.
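The idea of treating the network architecture itself as an individual in an evolutionary algorithm can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method: the binary connectivity genome, the toy modularity fitness, and the hill-climbing loop (in the spirit of hill climbing with learning) are all assumptions introduced for the example.

```python
import random

random.seed(0)

HIDDEN = 8    # number of hidden units (assumed toy size)
OUTPUTS = 2   # two tasks, e.g. "what" vs. "where"

# Genome: a flat 0/1 mask; genome[o * HIDDEN + h] == 1 means
# hidden unit h feeds output (task) o.
def random_genome():
    return [random.randint(0, 1) for _ in range(HIDDEN * OUTPUTS)]

def fitness(genome):
    """Toy fitness: prefer architectures in which each hidden unit
    serves exactly one task (a fully modular split), penalising
    shared units that would cause cross-task interference."""
    score = 0
    for h in range(HIDDEN):
        tasks = sum(genome[o * HIDDEN + h] for o in range(OUTPUTS))
        if tasks == 1:      # dedicated to a single module: rewarded
            score += 1
        elif tasks > 1:     # shared between tasks: interference
            score -= 1
    return score

def mutate(genome, rate=0.1):
    # Flip each connection bit independently with probability `rate`.
    return [1 - g if random.random() < rate else g for g in genome]

def hill_climb(steps=500):
    # Simple mutation-based hill climber over connectivity masks.
    best = random_genome()
    for _ in range(steps):
        candidate = mutate(best)
        if fitness(candidate) >= fitness(best):
            best = candidate
    return best

best = hill_climb()
print(fitness(best))  # approaches HIDDEN as the mask becomes fully modular
```

In this encoding, a fully modular architecture (each hidden unit wired to exactly one task) maximises the toy fitness, mirroring the argument above that splitting the hidden layer into task-specific modules removes cross-task interference.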