Gaussian Nonlinear Line Attractor for Learning Multidimensional Data

Theus H. Aspiras, Vijayan K. Asari, Wesam Sakla

Abstract

The human brain’s ability to extract information from multidimensional data is modeled by the Nonlinear Line Attractor (NLA) network, in which nodes are connected by polynomial weight sets. This architecture assumes complete connectivity, with every neuron connected to all other neurons, creating a huge web of connections. We envision instead that each neuron should be connected only to a group of surrounding neurons, with connection strengths that decrease with distance from the neuron. To develop this weighted NLA architecture, we use a Gaussian weighting strategy to model proximity, which also reduces computation time significantly. Once all data has been trained in the NLA network, the weight set can be reduced using a locality-preserving nonlinear dimensionality reduction technique, which lowers the number of outputs needed for recognition tasks. An appropriate distance measure can then be used to compare test data with the trained data when processed through the NLA architecture. We observe that the proposed Gaussian NLA (GNLA) algorithm reduces training time significantly and provides better recognition with fewer dimensions than the original NLA algorithm. We have tested the algorithm on several datasets, including the EO Synthetic Vehicle database and the Sheffield face database, and shown that it performs well.
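As a rough illustration of the Gaussian proximity weighting described above, the sketch below builds a connection-strength mask in which each neuron's influence on another falls off with a Gaussian of their separation. All names, the use of index distance as the proximity measure, and the value of sigma are assumptions for illustration; the abstract gives no formulas, so this is not the paper's exact construction.

```python
import numpy as np

def gaussian_weight_mask(n_neurons, sigma=3.0):
    """Hypothetical proximity mask: the strength of the connection
    between neurons i and j decays as a Gaussian of their index
    distance, so nearby neurons couple strongly and distant ones
    are effectively disconnected (sigma is a free choice here)."""
    idx = np.arange(n_neurons)
    # Pairwise index distances |i - j| between all neurons.
    dist = np.abs(idx[:, None] - idx[None, :])
    return np.exp(-(dist ** 2) / (2.0 * sigma ** 2))

mask = gaussian_weight_mask(5, sigma=1.0)
```

Multiplying a fully connected weight set element-wise by such a mask (or zeroing entries below a threshold) is one way the "web of connections" could be localized, trading a small loss of long-range coupling for a large reduction in effective connections to train.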



Paper Citation


in Harvard Style

Aspiras, T. H., Asari, V. K. and Sakla, W. (2015). Gaussian Nonlinear Line Attractor for Learning Multidimensional Data. In Proceedings of the 7th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (ECTA 2015) ISBN 978-989-758-157-1, pages 130-137. DOI: 10.5220/0005597801300137


in Bibtex Style

@conference{ncta15,
author={Theus H. Aspiras and Vijayan K. Asari and Wesam Sakla},
title={Gaussian Nonlinear Line Attractor for Learning Multidimensional Data},
booktitle={Proceedings of the 7th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (ECTA 2015)},
year={2015},
pages={130-137},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005597801300137},
isbn={978-989-758-157-1},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 7th International Joint Conference on Computational Intelligence - Volume 3: NCTA, (ECTA 2015)
TI - Gaussian Nonlinear Line Attractor for Learning Multidimensional Data
SN - 978-989-758-157-1
AU - Aspiras, T. H.
AU - Asari, V. K.
AU - Sakla, W.
PY - 2015
SP - 130
EP - 137
DO - 10.5220/0005597801300137