zing Map (HSOM) to the initial weights of its 
neurons is removed. Further advantages of applying 
PCA-based code assignment, compared to HSOM, are 
tighter control over the upper bound of the 
Vapnik-Chervonenkis (V.C.) dimension and lower 
complexity during the training phase. Since the 
principal components form an orthogonal set of basis 
vectors, testing only the first major components is 
sufficient to optimize the set of raw codes for each 
sub-mapping. 
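The orthogonality argument above can be illustrated with a small numerical sketch. The snippet below is not the paper's actual code-assignment procedure; it only demonstrates, on synthetic data, the two properties the argument relies on: the principal components form an orthonormal basis, and the leading components carry the major share of the variance.

```python
import numpy as np

# Toy data: 200 samples, 4 correlated features (illustrative only;
# the paper's datasets, e.g. Forest Cover Type, are not reproduced here).
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 2))
mixing = np.array([[1.0, 0.5, 0.2, 0.0],
                   [0.0, 0.3, 1.0, 0.8]])
X = latent @ mixing + 0.05 * rng.normal(size=(200, 4))

# PCA via eigendecomposition of the sample covariance matrix.
Xc = X - X.mean(axis=0)
cov = np.cov(Xc, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigh returns ascending order
order = np.argsort(eigvals)[::-1]        # re-sort descending by variance
eigvals, components = eigvals[order], eigvecs[:, order]

# Orthogonality: the components form an orthonormal basis,
# so W^T W is the identity matrix.
assert np.allclose(components.T @ components, np.eye(4), atol=1e-8)

# Variance ordering: the first components explain the major variance,
# which is why testing only the leading components suffices.
explained = eigvals / eigvals.sum()
print(np.round(explained, 3))
```

Because the fractions of explained variance are sorted in decreasing order, truncating the search to the first few components discards only directions that contribute little variance.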
In addition, applying the fine variables of the 
major principal components increases the accuracy 
of the results in comparison to the sequential 
feature partitioning approach. The orthogonality of 
the components reduces the probability of selecting 
a variable more than once. It is demonstrated that 
the accuracy of enhanced M²OR is comparable with 
state-of-the-art methods on the Forest Cover Type 
and Wall Following Robot datasets, with considerably 
lower computational complexity; however, further 
improvements in accuracy are required for other 
datasets. Therefore, we propose applying enhanced 
M²OR in a Reproducing Kernel Hilbert Space (RKHS) 
in future work. Online learning is another important 
direction for extending the abilities of M²OR. 
ACKNOWLEDGEMENTS 
This work was supported in part by Information and 
Communication Technology (ICT) under grant T-
19259-500 and by the National Elites Foundation of 
Iran. 