Author:
Jacek Kabzinski
Affiliation:
Lodz University of Technology, Poland
Keyword(s):
Machine Learning, Feedforward Neural Network, Extreme Learning Machine, Neural Approximation.
Related Ontology Subjects/Areas/Topics:
Artificial Intelligence; Artificial Intelligence and Decision Support Systems; Biomedical Engineering; Biomedical Signal Processing; Computational Intelligence; Computer-Supported Education; Domain Applications and Case Studies; Enterprise Information Systems; Fuzzy Systems; Health Engineering and Technology Applications; Human-Computer Interaction; Industrial, Financial and Medical Applications; Learning Paradigms and Algorithms; Methodologies and Methods; Neural Network Software and Applications; Neural Networks; Neurocomputing; Neurotechnology, Electronics and Informatics; Pattern Recognition; Physiological Computing Systems; Sensor Networks; Signal Processing; Soft Computing; Stability and Instability in Artificial Neural Networks; Supervised and Unsupervised Learning; Support Vector Machines and Applications; Theory and Methods
Abstract:
The main aim of this paper is to stress that sufficient variability of the activation functions (AFs) is important for the approximation accuracy and applicability of an Extreme Learning Machine (ELM). A slight modification of the standard ELM procedure is proposed that increases the variance of each AF without sacrificing much of the simplicity of random parameter selection. The proposed modification does not significantly increase the computational complexity of ELM training. Enhancing the variation of the AFs results in a reduced output-weight norm, better numerical conditioning of the output-weight calculation, and smaller errors for the same number of hidden neurons. The proposed approach works efficiently together with Tikhonov regularization of the ELM.
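For reference, below is a minimal sketch of the baseline that the abstract refers to: a standard ELM with randomly drawn hidden-layer parameters and a Tikhonov-regularized least-squares computation of the output weights. It is written in Python with NumPy; the function names, the uniform sampling range, and the parameters n_hidden and lam are illustrative assumptions, and the paper's specific variance-enhancing modification of the AF parameters is not reproduced here.

import numpy as np

def elm_fit(X, T, n_hidden, lam=1e-3, rng=None):
    """Train an ELM: random hidden-layer parameters, Tikhonov-regularized output weights."""
    rng = np.random.default_rng(rng)
    n_features = X.shape[1]
    # Standard ELM step: input weights and biases drawn at random.
    # (The paper modifies this selection step to increase each AF's variance.)
    W = rng.uniform(-1.0, 1.0, size=(n_features, n_hidden))
    b = rng.uniform(-1.0, 1.0, size=n_hidden)
    H = np.tanh(X @ W + b)  # hidden-layer output matrix
    # Tikhonov-regularized least squares: beta = (H'H + lam*I)^{-1} H'T.
    # Better conditioning of H'H is one of the benefits the abstract claims.
    beta = np.linalg.solve(H.T @ H + lam * np.eye(n_hidden), H.T @ T)
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Usage: approximate a 1-D test function.
X = np.linspace(-3.0, 3.0, 200).reshape(-1, 1)
T = np.sin(X)
W, b, beta = elm_fit(X, T, n_hidden=40, lam=1e-4, rng=0)
max_err = np.max(np.abs(elm_predict(X, W, b, beta) - T))

Only the output weights beta are trained; the hidden-layer parameters W and b stay fixed after random initialization, which is what keeps ELM training a single linear solve.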