Constructing weak learner and performance evaluation in AdaBoost

Mian Zhou*, Hong Wei

*Corresponding author for this work

Research output: Chapter in Book or Report/Conference proceeding › Conference Proceeding › peer-review

2 Citations (Scopus)

Abstract

This paper presents a detailed investigation into the AdaBoost algorithm, which is used to boost the performance of any given learning algorithm. Within AdaBoost, weak learners are crucial and primitive parts of the algorithm. Since weak learners must be trained on weighted samples, two types of weak learner are designed: an Artificial Neural Network weak learner and a naive Bayes weak learner. The results show that AdaBoost with naive Bayes weak learners is superior to AdaBoost with Artificial Neural Network weak learners, and that it matches the generalisation ability of the Support Vector Machine.
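To illustrate the setting the abstract describes, the sketch below shows discrete AdaBoost driving a weak learner that accepts per-sample weights. The weak learner here is a minimal weighted Gaussian naive Bayes written for this example; it is a generic illustration of the boosting loop and weight-aware training, not the authors' actual implementation, and all class and function names are hypothetical.

```python
import numpy as np

class WeightedGaussianNB:
    """Minimal Gaussian naive Bayes that trains on per-sample weights,
    so it can serve as an AdaBoost weak learner (illustrative sketch)."""
    def fit(self, X, y, w):
        self.classes_ = np.unique(y)
        self.theta_, self.var_, self.prior_ = [], [], []
        for c in self.classes_:
            Xc, wc = X[y == c], w[y == c]
            mean = np.average(Xc, axis=0, weights=wc)
            var = np.average((Xc - mean) ** 2, axis=0, weights=wc) + 1e-9
            self.theta_.append(mean)
            self.var_.append(var)
            self.prior_.append(wc.sum() / w.sum())
        return self

    def predict(self, X):
        # Pick the class with the largest weighted log-posterior.
        log_post = []
        for mean, var, prior in zip(self.theta_, self.var_, self.prior_):
            ll = -0.5 * np.sum(np.log(2 * np.pi * var)
                               + (X - mean) ** 2 / var, axis=1)
            log_post.append(ll + np.log(prior))
        return self.classes_[np.argmax(log_post, axis=0)]

def adaboost(X, y, rounds=20):
    """Discrete AdaBoost (Freund & Schapire); labels must be in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)          # uniform initial sample weights
    learners, alphas = [], []
    for _ in range(rounds):
        h = WeightedGaussianNB().fit(X, y, w)
        pred = h.predict(X)
        err = w[pred != y].sum()     # weighted training error
        if err <= 0:                 # perfect weak learner: keep it and stop
            learners.append(h)
            alphas.append(1.0)
            break
        if err >= 0.5:               # no better than chance: stop boosting
            break
        alpha = 0.5 * np.log((1 - err) / err)
        w *= np.exp(-alpha * y * pred)   # up-weight misclassified samples
        w /= w.sum()
        learners.append(h)
        alphas.append(alpha)
    return learners, alphas

def predict(learners, alphas, X):
    """Weighted-majority vote of the weak learners."""
    score = sum(a * h.predict(X) for h, a in zip(learners, alphas))
    return np.sign(score)
```

An ANN weak learner would plug into the same loop, provided its training procedure also honours the per-sample weights (e.g. by weighting the loss), which is the design point the paper examines.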

Original language: English
Title of host publication: Proceedings - 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009
DOIs
Publication status: Published - 2009
Externally published: Yes
Event: 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009 - Wuhan, China
Duration: 11 Dec 2009 - 13 Dec 2009

Publication series

Name: Proceedings - 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009

Conference

Conference: 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009
Country/Territory: China
City: Wuhan
Period: 11/12/09 - 13/12/09
