Extreme Learning Machines with Simple Cascades

Tom Gedeon, Anthony Oakden

2015

Abstract

We compare extreme learning machines with cascade correlation on a standard benchmark dataset for comparing cascade networks, along with another commonly used dataset. We introduce a number of hybrid cascade extreme learning machine (ELM) topologies, ranging from simple shallow cascade ELM networks to full cascade ELM networks. We found that the simplest cascade topology provided a surprising benefit when a cascade-correlation-style cascade is used with small extreme learning machine layers. Our full cascade ELM architecture achieved high performance with even a single neuron per ELM cascade, suggesting that our approach may have general utility, though further work on additional datasets is needed. We suggest extensions of our cascade ELM approach involving network analysis, the addition of noise, and the unfreezing of weights.
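As a rough illustration of the building blocks described in the abstract, the sketch below shows a basic ELM (a random, frozen hidden layer with output weights solved in closed form by least squares) and one possible cascade of small ELM blocks, in which each new block sees the original inputs together with the outputs of earlier blocks and is fitted to the remaining error. This is a minimal sketch under our own assumptions, not the authors' implementation; the topologies and training details in the paper differ, and names such as fit_cascade_elm are illustrative only.

import numpy as np

# Minimal ELM: a random, frozen hidden layer; only the output weights are
# fitted, in closed form, by least squares.
def fit_elm(X, y, n_hidden, rng):
    W = rng.normal(size=(X.shape[1], n_hidden))   # random input-to-hidden weights (never trained)
    b = rng.normal(size=n_hidden)                 # random hidden biases (never trained)
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)  # output weights via least squares
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# One possible cascade arrangement (an assumption, not the paper's exact topology):
# each new ELM block receives the original inputs plus the predictions of all
# earlier blocks, and is fitted to the residual error they leave behind.
def fit_cascade_elm(X, y, n_blocks, n_hidden, seed=0):
    rng = np.random.default_rng(seed)
    blocks = []
    residual = y.astype(float).copy()
    extra = np.empty((X.shape[0], 0))             # predictions of earlier blocks
    for _ in range(n_blocks):
        X_aug = np.hstack([X, extra])             # cascade connection: inputs + earlier outputs
        W, b, beta = fit_elm(X_aug, residual, n_hidden, rng)
        pred = predict_elm(X_aug, W, b, beta)
        residual = residual - pred                # the next block models what is left
        extra = np.hstack([extra, pred.reshape(X.shape[0], -1)])
        blocks.append((W, b, beta))
    return blocks

def predict_cascade_elm(X, blocks):
    extra = np.empty((X.shape[0], 0))
    total = np.zeros(X.shape[0])
    for W, b, beta in blocks:
        X_aug = np.hstack([X, extra])
        pred = predict_elm(X_aug, W, b, beta)
        total = total + pred
        extra = np.hstack([extra, pred.reshape(X.shape[0], -1)])
    return total

For example, blocks = fit_cascade_elm(X_train, y_train, n_blocks=5, n_hidden=1) would build a cascade of five single-neuron ELM blocks, loosely echoing the single-neuron-per-cascade setting mentioned in the abstract; X_train and y_train are assumed to be a NumPy feature matrix and target vector.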



Paper Citation


in Harvard Style

Gedeon T. and Oakden A. (2015). Extreme Learning Machines with Simple Cascades. In Proceedings of the 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications - Volume 1: SIMULTECH, ISBN 978-989-758-120-5, pages 271-278. DOI: 10.5220/0005539502710278


in BibTeX Style

@conference{simultech15,
  author={Tom Gedeon and Anthony Oakden},
  title={Extreme Learning Machines with Simple Cascades},
  booktitle={Proceedings of the 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications - Volume 1: SIMULTECH},
  year={2015},
  pages={271--278},
  publisher={SciTePress},
  organization={INSTICC},
  doi={10.5220/0005539502710278},
  isbn={978-989-758-120-5},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 5th International Conference on Simulation and Modeling Methodologies, Technologies and Applications - Volume 1: SIMULTECH
TI - Extreme Learning Machines with Simple Cascades
SN - 978-989-758-120-5
AU - Gedeon T.
AU - Oakden A.
PY - 2015
SP - 271
EP - 278
DO - 10.5220/0005539502710278