A Walk in the Statistical Mechanical Formulation of Neural Networks - Alternative Routes to Hebb Prescription

Elena Agliari, Adriano Barra, Andrea Galluzzi, Daniele Tantari, Flavia Tavani

2014

Abstract

Neural networks are nowadays both powerful operational tools (e.g., for pattern recognition, data mining, and error-correcting codes) and complex theoretical models at the focus of scientific investigation. On the research side, neural networks are handled and studied by psychologists, neurobiologists, engineers, mathematicians and theoretical physicists. In theoretical physics, in particular, the key instrument for the quantitative analysis of neural networks is statistical mechanics. From this perspective, here we review attractor networks: starting from ferromagnets and spin-glass models, we discuss the underlying philosophy and retrace the path paved by Hopfield and by Amit, Gutfreund and Sompolinsky. As a sideline of this walk, we derive an alternative route (with respect to the original Hebb proposal) to the Hebbian paradigm, obtained by mixing ferromagnets with spin glasses. Further, as these notes are intended for an engineering audience, we also highlight the mapping between ferromagnets and operational amplifiers, hoping that such a bridge serves as a concrete prescription for capturing the beauty of robotics from the statistical mechanical perspective.
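As a concrete illustration of the Hebbian prescription and the attractor dynamics reviewed in the paper, the following is a minimal Python sketch of the standard Hopfield construction: couplings J_ij = (1/N) Σ_μ ξ_i^μ ξ_j^μ built from stored patterns via the Hebb rule, followed by zero-temperature asynchronous dynamics that retrieves a pattern from a noisy cue. This is not the paper's alternative derivation; the network size, number of patterns and noise level are illustrative assumptions.

```python
# Minimal sketch of a Hopfield attractor network with Hebbian couplings
# (illustrative parameters, not taken from the paper).
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5                              # network size and number of stored patterns (assumed)
xi = rng.choice([-1, 1], size=(P, N))      # random binary patterns xi^mu in {-1,+1}^N

# Hebb prescription for the synaptic matrix, with self-couplings removed
J = (xi.T @ xi) / N
np.fill_diagonal(J, 0.0)

# Start from a corrupted version of pattern 0 (20% of spins flipped)
sigma = xi[0].copy()
flip = rng.choice(N, size=N // 5, replace=False)
sigma[flip] *= -1

# Zero-temperature asynchronous dynamics: sigma_i <- sign(sum_j J_ij sigma_j)
for _ in range(20):
    changed = False
    for i in rng.permutation(N):
        h_i = J[i] @ sigma                 # local field acting on neuron i
        new = 1 if h_i >= 0 else -1
        if new != sigma[i]:
            sigma[i] = new
            changed = True
    if not changed:                        # fixed point (attractor) reached
        break

# Mattis overlap m = (1/N) sum_i xi_i^0 sigma_i; close to 1 means successful retrieval
print("overlap with stored pattern:", (xi[0] @ sigma) / N)
```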

References

  1. Agliari, E., Barra, A., Burioni, R., Di Biasio, A., and Uguzzoni, G. (2013). Collective behaviours: from biochemical kinetics to electronic circuits. Scientific Reports, 3.
  2. Amit, D. J. (1992). Modeling brain function. Cambridge University Press.
  3. Bardeen, J., Cooper, L. N., and Schrieffer, J. R. (1957). Theory of superconductivity. Physical Review, 108:1175.
  4. Barra, A., Genovese, G., Guerra, F., and Tantari, D. (2012). How glassy are neural networks? Journal of Statistical Mechanics: Theory and Experiment, P07009.
  5. Bean, C. P. (1962). Magnetization of hard superconductors. Physical Review Letters, 8:250.
  6. Castellana, M., Decelle, A., Franz, S., Mezard, M., and Parisi, G. (2010). The hierarchical random energy model. Physical Review Letters, 104:127206.
  7. Castiglione, P., Falcioni, M., Lesne, A., and Vulpiani, A. (2012). Chaos and coarse graining in statistical mechanics. Cambridge University Press.
  8. Coolen, A. C. C., Kühn, R., and Sollich, P. (2005). Theory of neural information processing systems. Oxford University Press.
  9. Domhoff, G. W. (2003). Neural networks, cognitive development, and content analysis. American Psychological Association.
  10. Ellis, R. (2005). Entropy, large deviations, and statistical mechanics, volume 1431. Taylor & Francis.
  11. Hagan, M. T., Demuth, H. B., and Beale, M. H. (1996). Neural network design. PWS Publishing, Boston.
  12. Harris-Warrick, R. M., editor (1992). Dynamic biological networks. MIT press.
  13. Hebb, D. O. (1949). The organization of behavior: A neuropsychological theory. Psychology Press.
  14. Hertz, J., Krogh, A., and Palmer, R. G. (1991). Introduction to the theory of neural computation. Addison-Wesley.
  15. Hopfield, J. J. (1982). Neural networks and physical systems with emergent collective computational abilities. Proceedings of the National Academy of Sciences, 79(8):2554-2558.
  16. Kittel, C. (2004). Elementary statistical physics. Courier Dover Publications.
  17. Martindale, C. (1991). Cognitive psychology: A neural-network approach. Thomson Brooks/Cole Publishing Co.
  18. McCulloch, W. S. and Pitts, W. (1943). A logical calculus of the ideas immanent in nervous activity. The Bulletin of Mathematical Biophysics, 5(4):115-133.
  19. Mézard, M., Parisi, G., and Virasoro, M. A. (1987). Spin glass theory and beyond, volume 9. World scientific, Singapore.
  20. Miller, W. T., Werbos, P. J., and Sutton, R. S., editors (1995). Neural networks for control. MIT press.
  21. Reichl, L. E. and Prigogine, I. (1980). A modern course in statistical physics, volume 71. University of Texas Press, Austin.
  22. Rolls, E. T. and Treves, A. (1998). Neural networks and brain function. Oxford University Press.
  23. Rosenblatt, F. (1958). The perceptron: a probabilistic model for information storage and organization in the brain. Psychological review, 65(6):386.
  24. Saad, D., editor (2009). On-line learning in neural networks, volume 17. Cambridge University Press.
  25. Tuckwell, H. C. (2005). Introduction to theoretical neurobiology, volume 8. Cambridge University Press.
  26. Wilson, K. G. (1971). Renormalization group and critical phenomena. Physical Review B, 4:3174.


Paper Citation


in Harvard Style

Agliari E., Barra A., Galluzzi A., Tantari D. and Tavani F. (2014). A Walk in the Statistical Mechanical Formulation of Neural Networks - Alternative Routes to Hebb Prescription. In Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2014), ISBN 978-989-758-054-3, pages 210-217. DOI: 10.5220/0005077902100217


in Bibtex Style

@conference{ncta14,
author={Elena Agliari and Adriano Barra and Andrea Galluzzi and Daniele Tantari and Flavia Tavani},
title={A Walk in the Statistical Mechanical Formulation of Neural Networks - Alternative Routes to Hebb Prescription},
booktitle={Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2014)},
year={2014},
pages={210-217},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005077902100217},
isbn={978-989-758-054-3},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2014)
TI - A Walk in the Statistical Mechanical Formulation of Neural Networks - Alternative Routes to Hebb Prescription
SN - 978-989-758-054-3
AU - Agliari E.
AU - Barra A.
AU - Galluzzi A.
AU - Tantari D.
AU - Tavani F.
PY - 2014
SP - 210
EP - 217
DO - 10.5220/0005077902100217