Boosted Random Forest

Yohei Mishina, Masamitsu Tsuchiya, Hironobu Fujiyoshi

Abstract

Random forests generalize better than many other multi-class classifiers because of the combined effect of bagging and random feature selection. However, since random forests are based on ensemble learning, they require many decision trees to achieve high performance, which makes them unsuitable for implementation on small-scale hardware such as embedded systems. In this paper, we propose a boosted random forest in which a boosting algorithm is introduced into random forest training. Experimental results show that the proposed method, while consisting of fewer decision trees, has higher generalization ability than the conventional method.
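The general idea the abstract describes can be illustrated with a minimal sketch: instead of bagging many independent trees, boosting reweights the training samples after each tree so that later trees focus on earlier mistakes, allowing a much smaller ensemble. The sketch below is not the authors' implementation; it combines a standard discrete multi-class AdaBoost loop (SAMME) with shallow decision trees that each consider a random subset of features per split, using scikit-learn trees and an illustrative choice of 10 rounds, depth 3, and the Iris dataset.

```python
# Hedged sketch: boosting over randomized trees (SAMME-style), not the
# paper's exact algorithm. Each round fits one shallow tree with random
# feature selection on the current sample weights, then up-weights the
# samples that tree misclassified.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier

rng = np.random.RandomState(0)
X, y = load_iris(return_X_y=True)
classes = np.unique(y)
K = len(classes)

n = len(X)
w = np.full(n, 1.0 / n)               # sample weights, updated each round
trees, alphas = [], []

for t in range(10):                   # far fewer trees than a typical forest
    tree = DecisionTreeClassifier(max_depth=3, max_features="sqrt",
                                  random_state=rng)
    tree.fit(X, y, sample_weight=w)   # trees are weight-aware, unlike bagging
    pred = tree.predict(X)
    err = np.sum(w * (pred != y)) / np.sum(w)
    if err >= 1 - 1.0 / K:            # no better than chance: discard and stop
        break
    err = max(err, 1e-10)
    alpha = np.log((1 - err) / err) + np.log(K - 1)   # SAMME tree weight
    w *= np.exp(alpha * (pred != y))                  # up-weight mistakes
    w /= w.sum()
    trees.append(tree)
    alphas.append(alpha)

def predict(X):
    """Weighted vote over the boosted trees."""
    votes = np.zeros((len(X), K))
    for tree, alpha in zip(trees, alphas):
        votes[np.arange(len(X)), tree.predict(X)] += alpha
    return classes[votes.argmax(axis=1)]

print("training accuracy:", (predict(X) == y).mean())
```

Because each tree corrects the weighted errors of its predecessors rather than averaging over independent bootstrap samples, a handful of trees can match the accuracy that plain bagging needs a large forest to reach, which is the trade-off the paper targets for embedded hardware.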



Paper Citation


in Harvard Style

Mishina Y., Tsuchiya M. and Fujiyoshi H. (2014). Boosted Random Forest. In Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2014) ISBN 978-989-758-004-8, pages 594-598. DOI: 10.5220/0004739005940598


in Bibtex Style

@conference{visapp14,
author={Yohei Mishina and Masamitsu Tsuchiya and Hironobu Fujiyoshi},
title={Boosted Random Forest},
booktitle={Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2014)},
year={2014},
pages={594-598},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0004739005940598},
isbn={978-989-758-004-8},
}


in EndNote Style

TY - CONF
JO - Proceedings of the 9th International Conference on Computer Vision Theory and Applications - Volume 2: VISAPP, (VISIGRAPP 2014)
TI - Boosted Random Forest
SN - 978-989-758-004-8
AU - Mishina Y.
AU - Tsuchiya M.
AU - Fujiyoshi H.
PY - 2014
SP - 594
EP - 598
DO - 10.5220/0004739005940598
ER -