Abstract
MetaCost is a recently proposed procedure that converts an error-based learning algorithm into a cost-sensitive algorithm. This paper investigates two important issues centered on the procedure that were ignored in the paper proposing MetaCost. First, no comparison was made between MetaCost's final model and the internal cost-sensitive classifier on which MetaCost depends. It is plausible that the internal cost-sensitive classifier outperforms the final model, without the additional computation required to derive the final model. Second, MetaCost assumes that its internal cost-sensitive classifier is obtained by applying a minimum expected cost criterion. It is unclear whether violating this assumption affects MetaCost's performance. We study these issues using two boosting procedures and compare against the performance of the original form of MetaCost, which employs bagging.
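The minimum expected cost criterion mentioned in the abstract predicts, for an example x, the class i that minimizes the expected cost sum over j of P(j|x)·C(i,j). A minimal sketch, assuming `class_probs` holds the estimated class probabilities (e.g. averaged over an ensemble's votes) and `cost_matrix[i][j]` is the cost of predicting class i when the true class is j (names are illustrative, not from the paper):

```python
def min_expected_cost_label(class_probs, cost_matrix):
    """Return the class index i minimizing sum_j P(j|x) * C(i, j).

    class_probs: list of P(j|x), one entry per class j.
    cost_matrix: cost_matrix[i][j] = cost of predicting i when truth is j.
    """
    expected = [
        sum(p * c for p, c in zip(class_probs, row))  # expected cost of predicting i
        for row in cost_matrix
    ]
    return min(range(len(expected)), key=expected.__getitem__)


# With asymmetric costs, the criterion can prefer a less probable class:
# misclassifying class 1 as class 0 costs 10, the reverse costs 1.
probs = [0.7, 0.3]
costs = [[0, 10],
         [1, 0]]
print(min_expected_cost_label(probs, costs))
```

With these example numbers the expected cost of predicting class 0 is 0.3·10 = 3.0, while predicting class 1 costs only 0.7·1 = 0.7, so the criterion selects class 1 even though class 0 is more probable — the behavior that makes the resulting classifier cost-sensitive.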
| Original language | English |
| --- | --- |
| Title of host publication | Machine Learning: ECML 2000 |
| Subtitle of host publication | 11th European Conference on Machine Learning, Barcelona, Catalonia, Spain, May 31 – June 2, 2000, Proceedings |
| Editors | Ramon Lopez de Mantaras, Enric Plaza |
| Place of publication | Berlin, Germany |
| Publisher | Springer |
| Pages | 413-425 |
| Number of pages | 13 |
| ISBN (Print) | 3540676023 |
| Publication status | Published - 2000 |
| Event | European Conference on Machine Learning 2000 (11th), Barcelona, Spain, 31 May 2000 → 2 Jun 2000. Proceedings: https://link.springer.com/book/10.1007/3-540-45164-1 |
Publication series

| Name | Lecture Notes in Computer Science |
| --- | --- |
| Publisher | Springer |
| Volume | 1810 |
| ISSN (Print) | 0302-9743 |
Conference

| Conference | European Conference on Machine Learning 2000 |
| --- | --- |
| Abbreviated title | ECML 2000 |
| Country/Territory | Spain |
| City | Barcelona |
| Period | 31/05/00 → 2/06/00 |