An empirical study of MetaCost using boosting algorithms

Kai Ming Ting

    Research output: Chapter in Book/Report/Conference proceeding › Conference Paper › Research › peer-review

    18 Citations (Scopus)


    MetaCost is a recently proposed procedure that converts an error-based learning algorithm into a cost-sensitive algorithm. This paper investigates two important issues centred on the procedure that were ignored in the paper proposing MetaCost. First, no comparison was made between MetaCost’s final model and the internal cost-sensitive classifier on which MetaCost depends. It is plausible that the internal cost-sensitive classifier outperforms the final model, without the additional computation required to derive the final model. Second, MetaCost assumes that its internal cost-sensitive classifier is obtained by applying a minimum expected cost criterion. It is unclear whether violation of this assumption affects MetaCost’s performance. We study these issues using two boosting procedures, and compare the results with the performance of the original form of MetaCost, which employs bagging.
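    The minimum expected cost criterion mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation; the cost matrix and probability estimates are hypothetical:

    ```python
    import numpy as np

    def min_expected_cost_class(probs, cost):
        """Pick the class that minimises expected misclassification cost.

        probs : (n_classes,) estimated P(true class = i | x)
        cost  : (n_classes, n_classes) where cost[i, j] is the cost of
                predicting class j when the true class is i
        """
        expected = probs @ cost  # expected[j] = sum_i P(i|x) * cost[i, j]
        return int(np.argmin(expected))

    # Hypothetical two-class cost matrix: a false negative (true class 1
    # predicted as 0) costs 5, a false positive costs 1, correct costs 0.
    cost = np.array([[0.0, 1.0],
                     [5.0, 0.0]])

    # With P(class 1 | x) = 0.3, an error-based classifier predicts
    # class 0, but the expected-cost prediction is class 1,
    # since 0.3 * 5 = 1.5 > 0.7 * 1 = 0.7.
    probs = np.array([0.7, 0.3])
    print(min_expected_cost_class(probs, cost))  # prints 1
    ```

    In MetaCost, class probabilities like `probs` are estimated by an internal ensemble (bagging in the original proposal; boosting in this paper's variants), the training examples are relabelled with the cost-minimising class, and the learner is retrained on the relabelled data to produce the final model.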
    Original language: English
    Title of host publication: Machine Learning: ECML 2000
    Subtitle of host publication: 11th European Conference on Machine Learning, Barcelona, Catalonia, Spain, May 31 – June 2, 2000, Proceedings
    Editors: Ramon Lopez de Mantaras, Enric Plaza
    Place of Publication: Berlin, Germany
    Number of pages: 13
    ISBN (Print): 3540676023
    Publication status: Published - 2000
    Event: European Conference on Machine Learning 2000 - Barcelona, Spain
    Duration: 31 May 2000 – 2 Jun 2000
    Conference number: 11th (Proceedings)

    Publication series

    Name: Lecture Notes in Computer Science
    ISSN (Print): 0302-9743


    Conference: European Conference on Machine Learning 2000
    Abbreviated title: ECML 2000
