Author:
Hee-Joong Kang
Affiliation:
Hansung University, Republic of Korea
Keyword(s):
Multiple classifier system, Multiple classifiers, Conditional entropy, Mutual information, Selection method.
Related Ontology Subjects/Areas/Topics:
Artificial Intelligence; Biomedical Engineering; Biomedical Signal Processing; Data Manipulation; Health Engineering and Technology Applications; Human-Computer Interaction; Methodologies and Methods; Neurocomputing; Neurotechnology, Electronics and Informatics; Pattern Recognition; Physiological Computing Systems; Sensor Networks; Soft Computing
Abstract:
In addition to the study of how to combine multiple classifiers in a multiple classifier system, the question of how to select multiple classifiers from a classifier pool has recently been investigated, because the performance of a multiple classifier system depends on the selected classifiers as well as on the combination method. Previous studies on the selection of multiple classifiers either choose a classifier set under the assumption that the number of selected classifiers is fixed in advance, or rely on clustering followed by diversity criteria within the overproduce-and-choose paradigm. In this paper, a new selection method is devised that minimizes the conditional entropy, an upper bound on the Bayes error rate, and imposes no prior limit on the number of selected classifiers, as illustrated with examples.
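The abstract describes the idea only at a high level. As a rough illustrative sketch (not the paper's actual algorithm), the snippet below greedily grows a classifier subset from a pool by estimating the conditional entropy H(true class | joint classifier outputs) on a labeled validation set and adding a candidate only while the estimate keeps decreasing, so no ensemble size is fixed in advance. The function names, the plug-in entropy estimate from output counts, and the greedy stopping rule are all assumptions made for illustration.

```python
# Illustrative sketch only: the selection criterion and stopping rule are
# assumptions, not the exact method proposed in the paper.
from collections import Counter
from math import log2
from typing import List, Sequence


def conditional_entropy(joint_outputs: List[tuple], labels: Sequence) -> float:
    """Plug-in estimate of H(Y | O) from (classifier-output tuple, true label) pairs."""
    n = len(labels)
    joint_counts = Counter(zip(joint_outputs, labels))   # counts of (o, y)
    output_counts = Counter(joint_outputs)                # counts of o
    h = 0.0
    for (o, _y), c in joint_counts.items():
        p_oy = c / n                                      # empirical P(O = o, Y = y)
        p_y_given_o = c / output_counts[o]                # empirical P(Y = y | O = o)
        h -= p_oy * log2(p_y_given_o)
    return h


def select_by_conditional_entropy(pool, X_val, y_val):
    """Greedily add the classifier that lowers H(Y | selected outputs) the most,
    stopping when no remaining candidate lowers it further (no preset ensemble size)."""
    preds = [clf.predict(X_val) for clf in pool]          # cache validation predictions
    selected: List[int] = []
    best_h = float("inf")
    while True:
        best_idx = None
        for i in range(len(pool)):
            if i in selected:
                continue
            cols = selected + [i]
            joint = list(zip(*(preds[j] for j in cols)))  # joint output tuples
            h = conditional_entropy(joint, y_val)
            if h < best_h:
                best_h, best_idx = h, i
        if best_idx is None:                              # no improvement: stop
            return [pool[i] for i in selected]
        selected.append(best_idx)
```

Since the plug-in estimate is computed on a finite validation set, in practice the entropy tends to keep shrinking as more classifiers are added (the joint output space grows), so a regularized estimator or a minimum-improvement threshold would likely be needed; the sketch omits that detail.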