INTERLEAVING FORWARD BACKWARD FEATURE SELECTION

Michael Siebers, Ute Schmid

2010

Abstract

Selecting appropriate features has become a key task when dealing with high-dimensional data. We present a new algorithm designed to find an optimal solution for classification tasks. Our approach combines forward selection, backward elimination, and exhaustive search. We demonstrate its capabilities and limits on artificial and real-world data sets. On the artificial data sets, interleaving forward backward selection performs similarly to other well-known feature selection methods.
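
The abstract does not spell out the procedure, but the following Python sketch illustrates how forward selection and backward elimination can be interleaved in a single wrapper loop. The names interleaved_selection and score_subset, and the greedy stopping rule, are assumptions made for illustration only; the exhaustive-search component mentioned in the abstract is omitted, so this is not the authors' exact algorithm.

# Minimal, illustrative sketch of a wrapper-style selector that alternates
# forward and backward steps. Not the authors' exact algorithm.

from typing import Callable, FrozenSet, List, Set


def interleaved_selection(features: List[str],
                          score_subset: Callable[[FrozenSet[str]], float]) -> Set[str]:
    """Alternate single-feature additions and removals until neither
    improves the subset score reported by score_subset (higher is better)."""
    selected: Set[str] = set()
    best = score_subset(frozenset(selected))
    improved = True
    while improved:
        improved = False
        # Forward step: add the first feature whose inclusion strictly helps.
        for f in features:
            if f not in selected:
                score = score_subset(frozenset(selected | {f}))
                if score > best:
                    selected.add(f)
                    best, improved = score, True
                    break
        # Backward step: drop a feature if doing so does not hurt
        # (ties favour the smaller subset -- an assumption of this sketch).
        for f in sorted(selected):
            score = score_subset(frozenset(selected - {f}))
            if score >= best:
                selected.discard(f)
                best, improved = score, True
                break
    return selected

With a cross-validated classifier accuracy supplied as score_subset, this behaves like a classical wrapper method; the interleaving simply allows an early, greedy addition to be undone by a later backward step.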



Paper Citation


in Harvard Style

Siebers M. and Schmid U. (2010). INTERLEAVING FORWARD BACKWARD FEATURE SELECTION. In Proceedings of the International Conference on Knowledge Discovery and Information Retrieval - Volume 1: KDIR, (IC3K 2010), ISBN 978-989-8425-28-7, pages 454-457. DOI: 10.5220/0003093204540457


in Bibtex Style

@conference{kdir10,
author={Michael Siebers and Ute Schmid},
title={INTERLEAVING FORWARD BACKWARD FEATURE SELECTION},
booktitle={Proceedings of the International Conference on Knowledge Discovery and Information Retrieval - Volume 1: KDIR, (IC3K 2010)},
year={2010},
pages={454-457},
publisher={SciTePress},
organization={INSTICC},
doi={10.5220/0003093204540457},
isbn={978-989-8425-28-7},
}


in EndNote Style

TY - CONF
JO - Proceedings of the International Conference on Knowledge Discovery and Information Retrieval - Volume 1: KDIR, (IC3K 2010)
TI - INTERLEAVING FORWARD BACKWARD FEATURE SELECTION
SN - 978-989-8425-28-7
AU - Siebers M.
AU - Schmid U.
PY - 2010
SP - 454
EP - 457
DO - 10.5220/0003093204540457