Multi-class AdaBoost with hypothesis margin

Xiaobo Jin*, Xinwen Hou, Cheng-Lin Liu

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference Proceeding › peer-review

16 Citations (Scopus)

Abstract

Most AdaBoost algorithms for multi-class problems, such as AdaBoost.MH and LogitBoost, have to decompose the multi-class classification problem into multiple binary problems. This paper proposes a new multi-class AdaBoost algorithm based on the hypothesis margin, called AdaBoost.HM, which directly combines multi-class weak classifiers. The hypothesis margin maximizes the output on the positive class while minimizing the maximal output over the negative classes. We discuss upper bounds on the training error of AdaBoost.HM and of a previous multi-class learning algorithm, AdaBoost.M1. Our experiments, using feedforward neural networks as weak learners, show that the proposed AdaBoost.HM yields higher classification accuracies than AdaBoost.M1 and AdaBoost.MH, while remaining computationally efficient in training.
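For concreteness, the hypothesis margin described in the abstract is commonly formalized as follows; this is the standard multi-class margin definition consistent with the abstract's wording, though the paper's exact formulation may differ. For a classifier with per-class outputs $f_c(x)$ and true label $y$, the margin of an example $(x, y)$ is

\[
\rho(x, y) \;=\; f_y(x) \;-\; \max_{c \neq y} f_c(x),
\]

so that maximizing $\rho(x, y)$ jointly raises the positive-class output $f_y(x)$ and suppresses the largest output among the negative classes, and $\rho(x, y) > 0$ exactly when the example is correctly classified.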

Original language: English
Title of host publication: Proceedings - 2010 20th International Conference on Pattern Recognition, ICPR 2010
Pages: 65-68
Number of pages: 4
DOIs
Publication status: Published - 2010
Externally published: Yes
Event: 2010 20th International Conference on Pattern Recognition, ICPR 2010 - Istanbul, Turkey
Duration: 23 Aug 2010 - 26 Aug 2010

Publication series

Name: Proceedings - International Conference on Pattern Recognition
ISSN (Print): 1051-4651

Conference

Conference: 2010 20th International Conference on Pattern Recognition, ICPR 2010
Country/Territory: Turkey
City: Istanbul
Period: 23/08/10 - 26/08/10
