Abstract
This paper presents an in-depth investigation of the AdaBoost algorithm, which is used to boost the performance of any given learning algorithm. Within AdaBoost, weak learners are the crucial, primitive building blocks of the algorithm. Since weak learners must be trained on weighted samples, two types of weak learner are designed: an Artificial Neural Network weak learner and a naive Bayes weak learner. The results show that AdaBoost with naive Bayes weak learners is superior to AdaBoost with Artificial Neural Network weak learners, and that it matches the generalisation ability of a Support Vector Machine.
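To make the setup concrete, below is a minimal sketch of discrete AdaBoost paired with a weight-aware Gaussian naive Bayes weak learner, the pairing the abstract reports as strongest. The class `WeightedNaiveBayes`, the function `adaboost`, and all parameter choices are illustrative assumptions; the paper's actual weak-learner construction and experimental settings are not reproduced here.

```python
import numpy as np

class WeightedNaiveBayes:
    """Gaussian naive Bayes that accepts per-sample weights, so it can
    serve as an AdaBoost weak learner (an illustrative sketch, not the
    paper's exact design)."""
    def fit(self, X, y, w):
        self.classes = np.unique(y)
        self.priors, self.means, self.vars = {}, {}, {}
        for c in self.classes:
            Xc, wc = X[y == c], w[y == c]
            self.priors[c] = wc.sum() / w.sum()
            self.means[c] = np.average(Xc, axis=0, weights=wc)
            self.vars[c] = np.average((Xc - self.means[c]) ** 2,
                                      axis=0, weights=wc) + 1e-9
        return self

    def predict(self, X):
        # Log-posterior score per class; pick the argmax.
        scores = []
        for c in self.classes:
            log_lik = -0.5 * np.sum(np.log(2 * np.pi * self.vars[c])
                                    + (X - self.means[c]) ** 2 / self.vars[c],
                                    axis=1)
            scores.append(np.log(self.priors[c]) + log_lik)
        return self.classes[np.argmax(scores, axis=0)]

def adaboost(X, y, T=50):
    """Discrete AdaBoost; labels y must be encoded as -1/+1."""
    n = len(y)
    w = np.full(n, 1.0 / n)                  # uniform initial sample weights
    learners, alphas = [], []
    for _ in range(T):
        h = WeightedNaiveBayes().fit(X, y, w)
        pred = h.predict(X)
        err = w[pred != y].sum()             # weighted training error
        if err >= 0.5:                       # weak learner no better than chance
            break
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        w *= np.exp(-alpha * y * pred)       # up-weight misclassified samples
        w /= w.sum()
        learners.append(h)
        alphas.append(alpha)
    return lambda Xq: np.sign(sum(a * h.predict(Xq)
                                  for a, h in zip(alphas, learners)))
```

A usage sketch, assuming ±1-encoded labels: `clf = adaboost(X_train, y_train, T=30)` followed by `y_hat = clf(X_test)`. The re-weighting step is why the weak learner must accept per-sample weights, the requirement the abstract highlights.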
Original language | English
---|---
Title of host publication | Proceedings - 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009
DOIs | 10.1109/CISE.2009.5362581
Publication status | Published - 2009
Externally published | Yes
Event | 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009, Wuhan, China. Duration: 11 Dec 2009 → 13 Dec 2009
Publication series

Name | Proceedings - 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009
---|---
Conference

Conference | 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009
---|---
Country/Territory | China
City | Wuhan
Period | 11/12/09 → 13/12/09
Cite this
Zhou, M., & Wei, H. (2009). Constructing weak learner and performance evaluation in AdaBoost. In Proceedings - 2009 International Conference on Computational Intelligence and Software Engineering, CiSE 2009 (Article 5362581). https://doi.org/10.1109/CISE.2009.5362581