AAAI 2021

Precision-based Boosting

Conference Paper · AAAI Technical Track on Machine Learning III · Artificial Intelligence

Abstract

AdaBoost is a highly popular ensemble classification method for which many variants have been published. This paper proposes a generic refinement of all of these AdaBoost variants. Instead of assigning weights based on the total error of the base classifiers (as in AdaBoost), our method uses class-specific error rates. On instance x it assigns a higher weight to a classifier predicting label y on x if that classifier is less likely to make a mistake when it predicts class y. Like AdaBoost, our method is guaranteed to boost weak learners into strong learners. An empirical study on AdaBoost and one of its multi-class versions, SAMME, demonstrates the superiority of our method on datasets with more than 1,000 instances as well as on datasets with more than three classes.
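
The abstract does not spell out the update rule, but the idea of class-specific (precision-based) weighting can be illustrated with a small sketch. The Python code below is a hypothetical adaptation of a standard AdaBoost loop: each weak learner receives a separate weight per predicted class, derived from its weighted error on the instances it assigns to that class. The formula alpha_c = 0.5 * log((1 - err_c) / err_c) mirrors AdaBoost's classifier weight and is an illustrative assumption, not the paper's exact rule; the function names and the use of decision stumps are likewise placeholders.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def fit_precision_boost(X, y, n_rounds=50, eps=1e-10):
    """Boost decision stumps with per-class (precision-based) weights."""
    classes = np.unique(y)
    n = len(y)
    w = np.full(n, 1.0 / n)                  # instance weights
    stumps, alphas = [], []                  # alphas[t][c]: weight of stump t when it predicts c
    for _ in range(n_rounds):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        pred = stump.predict(X)
        alpha = {}
        for c in classes:
            mask = pred == c                 # instances this stump labels as c
            total = w[mask].sum()
            if total < eps:
                alpha[c] = 0.0
                continue
            # class-specific error: weighted share of mistakes among predictions of c
            # (one minus the stump's weighted precision for class c)
            err_c = np.clip(w[mask & (y != c)].sum() / total, eps, 1 - eps)
            alpha[c] = 0.5 * np.log((1.0 - err_c) / err_c)
        # reweight instances using the weight attached to the class each instance received
        a = np.array([alpha[c] for c in pred])
        w = w * np.exp(np.where(pred == y, -a, a))
        w = w / w.sum()
        stumps.append(stump)
        alphas.append(alpha)
    return stumps, alphas, classes


def predict_precision_boost(stumps, alphas, classes, X):
    """Each stump votes for its predicted class with that class's weight."""
    idx = {c: j for j, c in enumerate(classes)}
    scores = np.zeros((len(X), len(classes)))
    for stump, alpha in zip(stumps, alphas):
        for i, c in enumerate(stump.predict(X)):
            scores[i, idx[c]] += alpha[c]
    return classes[np.argmax(scores, axis=1)]
```

At prediction time each stump votes for its predicted class with the weight attached to that class, rather than with a single global coefficient as in AdaBoost or SAMME; this is the sketch's reading of "a higher weight to a classifier predicting label y on x, if that classifier is less likely to make a mistake when it predicts class y."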

Keywords

No keywords are indexed for this paper.

Context

Venue
AAAI Conference on Artificial Intelligence
Archive span
1980-2026
Indexed papers
28718
Paper id
915372150274120180