EVOLVED DUAL WEIGHT NEURAL ARCHITECTURES TO FACILITATE INCREMENTAL LEARNING

John A. Bullinaria

Abstract

This paper explores techniques for improving incremental learning performance for generalization tasks. The idea is to generalize well from past input-output mappings that become available in batches over time, without the need to store past batches. Standard connectionist systems have previously been optimized for this problem using an evolutionary computation approach. Here that approach is explored more generally and rigorously, and dual weight architectures are incorporated into the evolutionary neural network approach and shown to result in improved performance over existing incremental learning systems.
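The paper's evolved dual weight architecture itself is not detailed in this abstract, but the underlying idea descends from the fast/slow-weights scheme of Hinton & Plaut (1987, reference 8): each connection carries a stable slow weight plus a rapidly adapting, decaying fast weight, which lets new batches be absorbed quickly without overwriting the long-term store. The sketch below is purely illustrative under that assumption; the class name, learning rates, and decay constant are all invented for the demo and do not come from the paper.

```python
import numpy as np

# Illustrative sketch only: follows the generic fast/slow-weights idea of
# Hinton & Plaut (1987), not the paper's evolved architecture. All names
# and hyperparameters here are assumptions chosen for the demo.

rng = np.random.default_rng(0)

class DualWeightLayer:
    """Linear layer whose effective weights are slow + fast components."""
    def __init__(self, n_in, n_out, lr_slow=0.01, lr_fast=0.2, decay=0.95):
        self.w_slow = rng.normal(0, 0.1, (n_in, n_out))  # stable long-term store
        self.w_fast = np.zeros((n_in, n_out))            # plastic, decaying store
        self.lr_slow, self.lr_fast, self.decay = lr_slow, lr_fast, decay

    def forward(self, x):
        # Output is driven by the sum of both weight components.
        return x @ (self.w_slow + self.w_fast)

    def train_step(self, x, target):
        err = target - self.forward(x)      # per-sample error
        grad = np.outer(x, err)             # delta-rule gradient
        self.w_slow += self.lr_slow * grad  # slow store: small, lasting updates
        # Fast store: large updates, but exponentially decaying over time.
        self.w_fast = self.decay * self.w_fast + self.lr_fast * grad
        return float((err ** 2).mean())

layer = DualWeightLayer(4, 1)
x = np.array([0.5, -0.5, 0.5, -0.5])  # fixed unit-norm input for the demo
errors = [layer.train_step(x, np.array([1.0])) for _ in range(50)]
```

The fast weights let the layer track a new mapping within a few steps, while the slow weights change little per step, which is the property exploited when old batches are no longer available for retraining.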

References

  1. Ans, B., Rousset, S., French, R.M., Musca, S., 2002. Preventing Catastrophic Interference in Multiple-Sequence Learning Using Coupled Reverberating Elman Networks. Proceedings of the Twenty-Fourth Annual Conference of the Cognitive Science Society, 71-76. Mahwah, NJ: LEA.
  2. Bishop, C.M., 1995. Neural Networks for Pattern Recognition. Oxford: Oxford University Press.
  3. Blake, C.L., Merz, C.J., 1998. UCI Repository of Machine Learning Databases. University of California, Irvine. http://www.ics.uci.edu/mlearn/MLRepository.html
  4. Bullinaria, J.A., 2007. Using Evolution to Improve Neural Network Learning: Pitfalls and Solutions. Neural Computing & Applications, 16, 209-226.
  5. Frean, M., Robins, A., 1999. Catastrophic Forgetting in Simple Neural Networks: An Analysis of the Pseudorehearsal Solution. Network: Computation in Neural Systems, 10, 227-236.
  6. French, R.M., 1999. Catastrophic Forgetting in Connectionist Networks. Trends in Cognitive Sciences, 3, 128-135.
  7. Giraud-Carrier, C., 2000. A Note on the Utility of Incremental Learning. AI Communications, 13, 215-223.
  8. Hinton, G.E., Plaut, D.C., 1987. Using Fast Weights to Deblur Old Memories. Proceedings of the Ninth Annual Conference of the Cognitive Science Society, 177-186. Hillsdale, NJ: LEA.
  9. McClelland, J.L., McNaughton, B.L., O'Reilly, R.C., 1995. Why There Are Complementary Learning Systems in the Hippocampus and Neocortex: Insights From the Successes and Failures of Connectionist Models of Learning and Memory. Psychological Review, 102, 419-457.
  10. Polikar, R., Byorick, J., Krause, S., Marino, A., Moreton, M., 2002. Learn++: A Classifier Independent Incremental Learning Algorithm for Supervised Neural Networks. Proceedings of the 2002 International Joint Conference on Neural Networks, 2, 1742-1747.
  11. Polikar, R., Udpa, L., Udpa, S.S., Honavar, V., 2001. Learn++: An Incremental Learning Algorithm for Multi-Layer Perceptron Networks. IEEE Transactions on Systems, Man, and Cybernetics-Part C: Applications and Reviews, 31, 497-508.
  12. Seipone, T., Bullinaria, J.A., 2005a. The Evolution of Minimal Catastrophic Forgetting in Neural Systems. Proceedings of the Twenty-Seventh Annual Conference of the Cognitive Science Society, 1991-1996. Mahwah, NJ: LEA.
  13. Seipone, T., Bullinaria, J.A., 2005b. Evolving Improved Incremental Learning Schemes for Neural Network Systems. Proceedings of the 2005 IEEE Congress on Evolutionary Computation (CEC 2005), 273-280. Piscataway, NJ: IEEE.
  14. Yao, X., 1999. Evolving Artificial Neural Networks. Proceedings of the IEEE, 87, 1423-1447.


Paper Citation


in Harvard Style

Bullinaria J. (2009). EVOLVED DUAL WEIGHT NEURAL ARCHITECTURES TO FACILITATE INCREMENTAL LEARNING. In Proceedings of the International Joint Conference on Computational Intelligence - Volume 1: ICNC, (IJCCI 2009) ISBN 978-989-674-014-6, pages 427-434. DOI: 10.5220/0002315304270434


in Bibtex Style

@conference{icnc09,
author={John A. Bullinaria},
title={EVOLVED DUAL WEIGHT NEURAL ARCHITECTURES TO FACILITATE INCREMENTAL LEARNING},
booktitle={Proceedings of the International Joint Conference on Computational Intelligence - Volume 1: ICNC, (IJCCI 2009)},
year={2009},
pages={427-434},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0002315304270434},
isbn={978-989-674-014-6},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Joint Conference on Computational Intelligence - Volume 1: ICNC, (IJCCI 2009)
TI - EVOLVED DUAL WEIGHT NEURAL ARCHITECTURES TO FACILITATE INCREMENTAL LEARNING
SN - 978-989-674-014-6
AU - Bullinaria J.
PY - 2009
SP - 427
EP - 434
DO - 10.5220/0002315304270434