Boosting of Neural Networks over MNIST Data
Eva Volna, Vaclav Kocian, Martin Kotyrba
2014
Abstract
The methods proposed in the article derive from a technique called boosting, which is based on the principle of combining a large number of so-called weak classifiers into a strong classifier. The article focuses on increasing the efficiency of the algorithms through their appropriate combination, and in particular on increasing their reliability and reducing their time requirements. Time requirements here refer not to the running time of the algorithm itself, nor to its development, but to the time needed to apply the algorithm to a particular problem domain. Simulations and experiments with the proposed processes were performed in a purpose-built application environment. The experiments were conducted on the MNIST database of handwritten digits, which is commonly used for training and testing in the field of machine learning. Finally, a comparative experimental study with other approaches is presented, and all results are summarized in the conclusion.
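The core idea the abstract describes, re-weighting training examples and combining many weak classifiers into a weighted-majority vote, is the classic AdaBoost scheme. The sketch below is not the authors' implementation; it is a minimal, generic illustration that uses one-feature threshold stumps as weak learners and assumes binary ±1 labels (e.g. one digit class versus the rest):

```python
# Minimal discrete AdaBoost sketch (illustrative only, not the authors' code).
# Weak learner: a one-feature threshold stump; labels are +1/-1.
import numpy as np

def train_stump(X, y, w):
    """Pick the (feature, threshold, sign) with the lowest weighted error."""
    best = (0, 0.0, 1, np.inf)                  # feature, threshold, sign, error
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = np.where(X[:, j] > t, s, -s)
                err = np.sum(w[pred != y])
                if err < best[3]:
                    best = (j, t, s, err)
    return best

def adaboost(X, y, rounds=10):
    n = len(y)
    w = np.full(n, 1.0 / n)                     # start with uniform example weights
    ensemble = []
    for _ in range(rounds):
        j, t, s, err = train_stump(X, y, w)
        err = max(err, 1e-10)                   # avoid division by zero
        alpha = 0.5 * np.log((1 - err) / err)   # weight of this weak classifier
        pred = np.where(X[:, j] > t, s, -s)
        w *= np.exp(-alpha * y * pred)          # up-weight misclassified examples
        w /= w.sum()
        ensemble.append((alpha, j, t, s))
    return ensemble

def predict(ensemble, X):
    """Strong classifier: sign of the weighted vote of all weak classifiers."""
    score = sum(a * np.where(X[:, j] > t, s, -s) for a, j, t, s in ensemble)
    return np.sign(score)
```

In the article's setting the weak learners are neural networks rather than stumps, but the weighting and weighted-voting scheme is the same.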
Paper Citation
in Harvard Style
Volna, E., Kocian, V. and Kotyrba, M. (2014). Boosting of Neural Networks over MNIST Data. In Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2014), ISBN 978-989-758-054-3, pages 256-263. DOI: 10.5220/0005131802560263
in Bibtex Style
@conference{ncta14,
author={Eva Volna and Vaclav Kocian and Martin Kotyrba},
title={Boosting of Neural Networks over MNIST Data},
booktitle={Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2014)},
year={2014},
pages={256-263},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0005131802560263},
isbn={978-989-758-054-3},
}
in EndNote Style
TY - CONF
JO - Proceedings of the International Conference on Neural Computation Theory and Applications - Volume 1: NCTA, (IJCCI 2014)
TI - Boosting of Neural Networks over MNIST Data
SN - 978-989-758-054-3
AU - Volna E.
AU - Kocian V.
AU - Kotyrba M.
PY - 2014
SP - 256
EP - 263
DO - 10.5220/0005131802560263
ER -